Ravens PHP Scripts: Forums
 

 

Ravens PHP Scripts And Web Hosting Forum Index -> NukeSentinel™ Enhancement Requests
Raven
Site Admin/Owner


Joined: Aug 27, 2002
Posts: 17086

PostPosted: Tue Dec 25, 2007 11:27 am

I would like to start a single, separate thread on enhancements and/or fixes that you all feel are needed to raise the bar for NukeSentinel(tm).

Some of the things to consider besides the actual functionality are:
* Is the concept/approach we started with still viable by just adding more functionality?
* Do we need to redesign our concept/approach?
* Is it bloated?
* Is everything configurable that needs to be or should be?
* Does it do enough by just looking at the $_GET and $_POST arrays?
* Are there any inherent flaws?
* Is it really doing the job it's intended to do?
* If time and effort were not an issue, what could/should it be doing?
* Is it really needed if *nuke was coded correctly?
 
warren-the-ape
Worker


Joined: Nov 19, 2007
Posts: 196
Location: Netherlands

PostPosted: Wed Dec 26, 2007 7:11 am

An updated manual (hehe, no kidding).


I've been using it for less than a week now, so I'm perhaps not in a position to say much about it, but so far it looks pretty good.
Installation itself was pretty smooth and felt 'light' (the actual adding of the tables), although the feedback from the system afterwards could use a bit of explanation.

It was a bit... 'okay, we have installed it for you and you are on your own from here on'.

The tooltips are a big pro and really help explain all the various functions.

Some opinions:
- What is the need of having Country Listing in that place? It looks like a big feature, but it's pretty much a simple list. It's more of a sub-function in my eyes, but I could be wrong.

- I'm missing some numbered (ordered) lists, especially within the ban section. Of course I have the 'number of shameful hackers' block, but within the administration I can't really tell how many bans there are. Not sure if other people feel the same about this?


I'm sorry that I can't talk about any technical stuff; it's just not my field of expertise.

Raven wrote:
* Is it really needed if *nuke was coded correctly?


According to what I've read from you guys it isn't, but we're not living in a perfect world. Until now I have only received abuse-filter messages, which would be blocked by patched *nuke versions anyway, but I do like the feeling that they are being blocked before getting executed.

All in all, NukeSentinel does give you more insight into your website users and their actions; it even displays some things Google Analytics doesn't.
 
Guardian2003
Site Admin


Joined: Aug 28, 2003
Posts: 6793
Location: Ha Noi, Viet Nam

PostPosted: Wed Dec 26, 2007 9:23 am

Quote:
* Is the concept/approach we started with still viable by just adding more functionality?

No.
NS by its nature is more reactive than pro-active. What I mean by that is that it lets *things* into your site and lets them do *stuff* before it can determine what action to take.
I would prefer a more pro-active approach: for example, rather than creating a huge list of user-agents to block, simply block them all and only *allow* the ones we want.
Similarly with referrers: why wait until a referring string does something when we can do a simple reverse lookup to ensure the IP and referring domain actually match? This is one of the ways in which Spam Stopper works.
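The reverse-lookup idea above can be sketched as a forward-confirmed reverse DNS check. This is only an illustration, not Spam Stopper's actual code: `forward_confirmed` is a hypothetical helper, and in live use its last two arguments would come from `gethostbyaddr()` and `gethostbynamel()`.

```php
// Hypothetical helper: decide whether a claimed domain really owns an IP.
// $ptr_host is the reverse (PTR) lookup of $remote_ip, and $forward_ips is
// the forward lookup of that hostname; passing them in keeps the logic
// testable without touching the network.
function forward_confirmed($claimed_domain, $remote_ip, $ptr_host, $forward_ips)
{
    if ($ptr_host === false || $forward_ips === false) {
        return false; // no PTR or no forward record: fail closed
    }
    if (!in_array($remote_ip, $forward_ips, true)) {
        return false; // hostname does not map back to the calling IP
    }
    // The confirmed hostname must be the claimed domain or a subdomain of it.
    $suffix = '.' . $claimed_domain;
    return ($ptr_host === $claimed_domain)
        || (substr($ptr_host, -strlen($suffix)) === $suffix);
}

// Live usage might look like:
// $host = gethostbyaddr($ip);
// $ok = forward_confirmed('googlebot.com', $ip, $host, gethostbynamel($host));
```

A bot claiming to be Googlebot from an IP whose PTR record points elsewhere would fail this check before it ever reached the blockers.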

Quote:
* Do we need to redesign our concept/approach?
* Is it bloated?
* Is everything configurable that needs to be or should be?

I wouldn't say it is bloated; it just has such a huge job to do, when it might be possible to streamline that by taking a more pro-active approach.

Quote:
* Are there any inherent flaws?

Not flaws exactly, but without a full IP2C dataset there can be issues, which we have seen here in the forums (invalid IP etc.).
Keeping current with IP2C data is a pain - it isn't hard, just time-consuming for each webmaster. It would be fantastic to have a 'feed' that NS itself could use to update this data automagically.

Quote:
* Is it really doing the job it's intended to do?

What can I say? It's a very brave webmaster who doesn't use NS!!

Quote:
* If time and effort were not an issue, what could/should it be doing?

Oh boy!
Combine it with some of Spam Stopper's methodology; have an XML or SOAP feed for the IP2C dataset.
Allow *sharing* of ban data between installations back to a central 'repository'.
Every single installation of NS could then update from that one central repository when new attacks, bad bots, etc. are detected.

Quote:
* Is it really needed if *nuke was coded correctly?

Nuke would certainly be less open to misuse if it were coded correctly, but I think it would be naive to say it *wouldn't* be needed rather than *shouldn't* be needed.
You can filter input and output until the cows come home, BUT this is not going to protect you if you add third-party code such as additional modules.

Would I run any of my sites without it?
Hell no!!
 
fkelly
Former Moderator in Good Standing


Joined: Aug 30, 2005
Posts: 3312
Location: near Albany NY

PostPosted: Wed Dec 26, 2007 10:21 am

It is a good thing I waited for Guardian to reply, because now I don't have to say very much. I'm with him almost all the way. Let me just add:

    NS, taken in total, is an amazing product. Looking through the code as I have from time to time, I am drawn to say "wow, Bob must have spent YEARS writing all this", and indeed he must have. It is much better designed than base Nuke and much better written, though of course not perfect. And of course it's free and one heck of a contribution to the community. So that should be said first, along with "THANKS".

    Now to Raven's points. In theory, if Nuke were perfect, you wouldn't need NS. But of course Guardian's point comes into play: if you add any other module that has vulnerabilities, you need NS or you will surely get hacked. So even if the RN team could go through and rewrite what's in the distribution to filter every form and other input, you'd still need NS at the end as a catch-all. And of course NS does other types of "filtering" besides looking at POST and GET strings.

    Having said that, it would still be beneficial to go through the RN distribution module by module and form by form, look at the programs the forms are submitted to, and explicitly filter (validate) the form inputs rather than relying on unfiltered variables and GPC. There should be some way of communicating to NS that "this form was already filtered, so you don't have to", so that eventually NS would have less of this work to do. (Which of course would mean building XSS filters and the like into the filtering of individual forms, and generally making sure that a "forged form" can't bypass our filters.)

    And one other thing, if I may. There is a whole section of code that starts out:

    Code:
      // Check for SCRIPTING attack
    
      // Copyright 2004(c) ChatServ


    and proceeds to check the post and get arrays for various strings using eregis. It really would be helpful to have a "specification" for what that section is trying to do - in other words, what exact strings or combinations are we trying to protect against? My instinct and some experience tell me that this section of code is the source of many of the "false positive" results we get, banning people who try to post legitimate text that happens to contain "body" with < > around it, or various other quasi-HTML codes. And I know we've made some improvements in this for 2.20, but if there is an eregi wizard out there who could go through this, interpret it, and figure out what we want to retain and what we don't need, it sure would be beneficial.

    (Example: we were banning "body" when it had any other character before the "b", which resulted in banning "tbody", which was showing up with increasing frequency whenever anyone created a table in the new wysiwyg editor. That specific problem has been fixed for 2.20, but the more generic issue of why we are banning certain variants of "body" remains, as does the same issue with many other codes.)
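The "tbody" problem described above comes down to pattern anchoring. As a sketch (using `preg_match` rather than the eregi calls NS actually uses, and with patterns that are illustrative, not NS's real ones):

```php
// Overly broad pattern of the kind described above: it matches "body"
// anywhere, so harmless "<tbody>" markup trips it too.
$broad = '/body/i';

// Tightened pattern: only match "body" used as a tag name, i.e. directly
// after "<" or "</", with a word boundary so longer words still pass.
$tight = '#</?body\b#i';

// A legitimate post containing a table, and an actual injection attempt:
$posted = '<table><tbody><tr><td>hi</td></tr></tbody></table>';
$attack = '<body onload="evil()">';
// preg_match() returns 1 on a hit and 0 on no hit: $broad flags both
// strings, while $tight flags only the attack.
```

The same anchoring idea applies to most of the other strings the scripting section checks for.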

    Well, enough. I said I didn't have to say very much, and there I went.
 
montego
Site Admin


Joined: Aug 29, 2004
Posts: 9453
Location: Arizona

PostPosted: Wed Dec 26, 2007 11:03 am

.... And I, too, am glad that I didn't see this until now. So much good stuff already said!

I, too, would like to place a "common repository" high on my list. I run several sites off the same VPS, but I don't really want to maintain more than one IP2C database, and it would be great if I could share the "blocks". It might also be nice to have a way to automatically sync the blocks to the various .htaccess files on a periodic basis, or maybe provide a dump of the deny statements so we can apply them manually. But one would have to be careful about wholesale blocks of countries: what might be good for one site might not be right for another, for example.

I, too, would like to see the REGEXes (or the approach) tightened up a bit so they are not so aggressive (to fkelly's point about "tbody"). It would also be nice if somehow the style and other such tags could be "tested" in a more "granular" way. For example, there is NO security risk in "color:blue", right? Or in "text-decoration:underline"?
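One way to make the style checks more granular, as a sketch: instead of blocking on the style keyword outright, parse each declaration and allow only a short list of known-harmless properties. The function name and the allowed list here are made up for illustration, not taken from NS.

```php
// Hypothetical granular style check: a style attribute passes only if
// every declaration uses a property from a known-safe list and its value
// cannot smuggle script.
function style_is_safe($style)
{
    $allowed = array('color', 'background-color', 'text-decoration',
                     'font-weight', 'font-style', 'text-align');
    foreach (explode(';', $style) as $decl) {
        $decl = trim($decl);
        if ($decl === '') {
            continue; // trailing semicolon
        }
        $parts = explode(':', $decl, 2);
        if (count($parts) !== 2) {
            return false; // malformed declaration
        }
        $prop  = strtolower(trim($parts[0]));
        $value = trim($parts[1]);
        if (!in_array($prop, $allowed, true)) {
            return false;
        }
        // Reject values that could carry expression(), url(javascript:), etc.
        if (preg_match('/[(\\\\]|expression|javascript/i', $value)) {
            return false;
        }
    }
    return true;
}
```

So "color:blue; text-decoration:underline" would pass, while anything involving `expression()` or an unlisted property would still be rejected.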

One thing I was thinking of: what about a module "white list"? Assuming we get a module rock-solid, so that it would not need NukeSentinel, we could add it to our white list and NS would be bypassed (at least certain blockers).

There might be other possible uses for "white lists" too, but for some reason I am having a "brain fart"... lol.
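For what it's worth, the module white list described above could be as simple as a lookup before the blockers run. This is a hypothetical option, not an existing NS setting, and the names are invented:

```php
// Hypothetical white list of modules whose input NS would not re-check,
// on the theory that those modules already filter everything themselves.
function module_is_whitelisted($module, array $whitelist)
{
    return in_array($module, $whitelist, true);
}

// Configured by the admin, e.g.:
$ns_module_whitelist = array('News', 'Your_Account');

// NS would consult it early in the request: if the requested module is
// listed, skip (at least some of) the blockers for this request.
// if (module_is_whitelisted($_GET['name'], $ns_module_whitelist)) { ... }
```

The strict `in_array(..., true)` comparison matters here so that a crafted module name can never loosely compare equal to a whitelisted one.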

Guardian2003
PostPosted: Wed Dec 26, 2007 12:27 pm

Just as further clarification on the 'common repository': I agree with M that we couldn't block whole countries, for the reasons he gave. There are some other 'blocks' we would not be able to do wholesale either, which is why there would have to be some manual approval of 'blocks' before they made it into any common repository.
I'd also just like to mention that by 'common repository' I did not mean just one single source. For something like this to work efficiently, there would have to be a small network of 'sources' that update from the main repository, so the actual load could be spread over that small network.

I would also like to see these 'blocks' written to .htaccess rather than relying on them reaching NS - and yes, that would mean reading in the .htaccess, adding the new blocks, and writing it all back out to file. Why? In case there is any other data in .htaccess that the site needs, like URL rewriting etc.
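The read-merge-write cycle just described might look like the following. A marker-delimited section keeps the NS-managed deny lines separate from the site's own rules; the function and marker names are invented for the sketch, and a real implementation would also need file locking.

```php
// Sketch: merge "deny from" lines into an existing .htaccess without
// touching unrelated rules (rewrites etc.). Everything outside the marker
// comments is preserved verbatim; the managed section is replaced whole.
function write_blocks($file, array $ips)
{
    $begin = "# BEGIN NukeSentinel blocks\n";
    $end   = "# END NukeSentinel blocks\n";

    $body = $begin;
    foreach ($ips as $ip) {
        $body .= "deny from $ip\n";
    }
    $body .= $end;

    $current = file_exists($file) ? file_get_contents($file) : '';
    $pattern = '/# BEGIN NukeSentinel blocks\n.*?# END NukeSentinel blocks\n/s';
    if (preg_match($pattern, $current)) {
        // Replace the previously managed section in place.
        $current = preg_replace($pattern, $body, $current);
    } else {
        // First run: append the managed section after the site's own rules.
        $current .= $body;
    }
    file_put_contents($file, $current);
}
```

Because the managed section is replaced as a whole each run, stale deny lines disappear automatically when an IP is unbanned.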

Anyway, I'm focusing too specifically on one area now...
 
montego
PostPosted: Wed Dec 26, 2007 1:35 pm

BTW, my "common repository" verbiage was probably bad: I was referring more to being able to have one IP2C (and possibly some common "blocks") database on my server and then have multiple sites with NS pointing to it. It would really help me to keep current with the IP2C data, as well as share my "active" manual blocks with all my sites.

But I am still in agreement with the direction you are taking this, G, as is evident in some of our other posts about some type of "service".
 
Guardian2003
PostPosted: Wed Dec 26, 2007 1:49 pm

We could do both!?
A webmaster who has multiple sites on one server *should* be able to deploy that data amongst his/her own sites, regardless of whether or not it makes it into 'the big kahuna' (whatever that is, but it sounds good).
I am talking 'ideally' here, of course, but as you remind me, we have traversed the topic of "a service" a couple of times before. It just has such universal appeal that I cannot get away from it, lol, and of course it has been on my wishlist for Spam Stopper for sooo long.
Right, I'm off to figure this SOAP thing out; I just have to do it.
 
Susann
Moderator


Joined: Dec 19, 2004
Posts: 3191
Location: Germany (Moderator, German NukeSentinel Support)

PostPosted: Wed Dec 26, 2007 4:49 pm

It seems you all prefer sharing banned data between installations via a central 'repository' or something like that. I'm still against this, and I'm pretty sure I will not use it for my sites.

E.g., one of my sites is currently listed at Project Honeypot for nothing, and on another site I lost admin access because of such a "central service". It was a mistake and was fixed in a new version, but it only showed me that nothing is perfect.
NukeSentinel is also not 100% perfect, but it's the best product we can get to protect our Nuke sites.


With the flood blocker activated, my sites are all not W3C valid; this wasn't a problem in earlier NukeSentinel versions.


Several of the NSN regional sites listed in the files of every NukeSentinel release are outdated, like the manual. I've not seen any information about the German NukeSentinel Support Site, even though we have supported NukeSentinel since 2005. I don't mind.
The German NS users know how they can find us.

NukeSentinel is still needed, and I would not run a site without this powerful tool.
A whitelist would be a good feature in NukeSentinel. I'm using a simple string blocker in my blog software, and there is also a way to whitelist via the administration.
 
Raven
PostPosted: Wed Dec 26, 2007 7:00 pm

Just as an aside, much of NukeSentinel(tm) is NOT HTML or XHTML compliant. This cleanup is a separate project from RavenNuke(tm), although it is a direct result of RavenNuke(tm).
 
technocrat
Life Cycles Becoming CPU Cycles


Joined: Jul 07, 2005
Posts: 511

PostPosted: Mon Dec 31, 2007 10:47 am

I think the real question is how far you want to take the level of protection. Obviously, as already stated, the root of all the issues is the abysmal level of protection provided by the core, and the propagation of that to other modules and blocks.

To help combat that, we took it as far as I dared by providing a class (which I have posted and will post again if you want) that allows for data fetching and type checking. It also strips magic quotes and applies SQL-safe escaping to all variables. I honestly didn't want to do that last part, but after seeing so many of my users getting attacked via GPC, I knew I had to do something to help combat it.

So again, how far do you want Sentinel to stretch? Where do you draw the line?

Personally, my biggest issue is/was with the false scripting blocks. I understand the reasoning behind them, but it's just not good enough anymore. In our next release we have actually done away with them: we implemented HTMLPurifier instead, which gives us much better HTML checking and filtering. They currently don't have the ability to process an XSS attempt beyond stripping it, but they have honored my request to do so and have it planned for their next release (http://htmlpurifier.org/live/TODO).

This now allows HTML to be used anywhere safely and without fear of Sentinel doing something it shouldn't.

I have said before that regexes should be used to clean up some of the coding. The ability to cache some of these settings would also help speed things up.

The ability to use a central repository would be a nice feature, but one that should be well thought out, because there is a potential for abuse there. A proxy repository would be extremely helpful, since many attacks come from those addresses.

fkelly
PostPosted: Mon Dec 31, 2007 11:15 am

Quote:
Personally, my biggest issue is/was with the false scripting blocks. I understand the reasoning behind them, but it's just not good enough anymore. In our next release we have actually done away with them: we implemented HTMLPurifier instead, which gives us much better HTML checking and filtering. They currently don't have the ability to process an XSS attempt beyond stripping it, but they have honored my request to do so and have it planned for their next release (http://htmlpurifier.org/live/TODO).


Yeah! I have been after the scripting blockers too, but (a) I don't know what the original specs were for what they were SUPPOSED to block, and (b) I'm not good enough with eregis, even using RegexBuddy, to fix them. (And of course, not knowing what exact threats we are trying to eliminate makes it hard to fix anything.)

Let me ask what may be a too-obvious question: where do you plug the HTMLPurifier code in? The "good" thing about NukeSentinel is that every $_POST and $_GET request by a non-admin goes through the filter. Does Purifier work the same way, and could we just plug it in instead of the current code in the Scripting section of NS? Or does each form that does a POST, and each GET, have to be explicitly sent through Purifier? I'm just trying to get a sense of the scope of the effort needed to make the change.

Or could we plug Purifier-type code into each program that is the target of an action from a form in the system, and thus ensure that, regardless of whether the post strings came from a real form or a forged one, they'd get filtered before anything was acted on or put in the database?
 
technocrat
PostPosted: Mon Dec 31, 2007 11:41 am

We ended up building it as part of the strip/slash functionality of the variable class.

So here is what happens. Again, I should preface this by saying I wish it didn't have to be this way, but FB is at fault here: he made these mistakes and now they have spread throughout the nuke world.

The class gets called in mainfile as one of the first files. The constructor runs and first checks the validity of the input variables (like phpBB's opening loop in common.php). Then it takes $_POST, $_GET, $_REQUEST, and $_COOKIE and does a deep array strip of magic quotes (if on). Then everything that is a string gets passed through HTMLPurifier; HTMLPurifier is also set to XHTML 1.0, so it will clean up HTML code on that pass as well. Then it goes back through with:
Code:
    // $function, $data, and $db are set by the caller.
    switch ($function) {

        case 'mysql_real_escape_string':
            return mysql_real_escape_string($data);
        case 'mysql_escape_string':
            return mysql_escape_string($data);
        case 'mysqli_real_escape_string':
            return mysqli_real_escape_string($db->connection_id, $data);
        case 'mysqli_escape_string':
            return mysqli_escape_string($db->connection_id, $data);
        case 'addslashes':
        default:
            return addslashes($data);
    }

Adding the appropriate slashes. There is string checking, and it's a deep array loop to make sure everything gets done.

So now, no matter where the outside input is used, it has been cleaned up. Even in the worst modules using GPC, there is no way to take in bad data.
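The "deep array strip" step mentioned above can be sketched as a recursive stripslashes over the superglobals. The function name is hypothetical; the actual Evo class presumably does more than this.

```php
// Recursively undo magic_quotes_gpc so every string, however deeply nested
// in the request arrays, is in a known unescaped state before purifying
// and re-escaping. Keys are preserved because array_map() keeps them when
// given a single array.
function deep_stripslashes($value)
{
    if (is_array($value)) {
        return array_map('deep_stripslashes', $value);
    }
    return is_string($value) ? stripslashes($value) : $value;
}

// Applied once, early in mainfile, e.g.:
// if (get_magic_quotes_gpc()) {
//     $_POST    = deep_stripslashes($_POST);
//     $_GET     = deep_stripslashes($_GET);
//     $_REQUEST = deep_stripslashes($_REQUEST);
//     $_COOKIE  = deep_stripslashes($_COOKIE);
// }
```

Stripping first and escaping once at the end avoids the classic double-escaping bugs that magic quotes causes.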

Is this the most efficient way? Heck, no. But after pondering this for a while, I see no way to properly clean up FB's mess beyond doing this, unless we can get everyone on the same page and there is a massive purge and cleanup of Nuke.

The class is also used for fetching and type checking of input.

Here are 2 examples of that:
Code:
$id = $_GETVAR->get('id', 'POST', 'int', 0);

$email = $_GETVAR->get('email', 'POST', 'email');
 
Raven
PostPosted: Mon Dec 31, 2007 11:49 am

Just out of curiosity: in order to type check, are you using the MySQL schema/meta tables, or are you hard-coding the type to check?
 
technocrat
PostPosted: Mon Dec 31, 2007 11:55 am

Hard coding. We have done all the core Evo modules and such.
 
Raven
PostPosted: Mon Dec 31, 2007 12:04 pm

Shocked Smack ROTFL
 
technocrat
PostPosted: Mon Dec 31, 2007 12:08 pm

Well, honestly, I don't see how you would get that working in many situations, since not all data is DB-bound. Plus, that would be one extra query per variable unless you cached it. I am not sure I see the value in that.
 
Raven
PostPosted: Mon Dec 31, 2007 12:44 pm

I was referring to the DB vars. What I envision is a temporary table that is loaded once and stores all the vars you will be using, from the database tables and from your application. Your non-database vars could possibly be stored as constants and loaded into the database from there. This is the approach I plan on using in a future release. It's more for organization - one-stop shopping - and it's always current without having to change any code. I would even go as far as loading the non-database vars into a table of their own. Then the application truly is database driven and controlled, imo.
 
technocrat
PostPosted: Mon Dec 31, 2007 2:28 pm

I am still not sure I see the value in doing it that way. Maybe I am just misinterpreting how you plan to do it.

So the way I am envisioning your implementation: in the DB you have `id` = int. Then, using my $_GETVAR as an example, it would be $id = $_GETVAR->get('id', 'POST'); and it then type-checks against the DB as an int?

If that's sort of the way you're thinking, I am just not sure how much it helps. I mean, how many times do you really change a DB or variable type? Not often. So it doesn't really seem to deliver the value of quickly changing the type when that is so rarely taken advantage of.

It also makes the code harder to read, whereas $id = $_GETVAR->get('id', 'POST', 'int'); makes it easy to understand that it will be an int. And what if I expect $id to be a username? That can create a problem.
 
fkelly
PostPosted: Mon Dec 31, 2007 4:05 pm

I'm sort of a thud-thud, clank-clank type of guy, so let me ask the wizards some basic questions.

1. Even if we went through the existing RN distribution line by line and made sure that every single form was explicitly filtered with a $varxxx = somefilter($_POST['xxx']), and added application-specific filters on top of that (likewise with examining GETs):

a. we'd still want classes or some other standardized way of applying filters, along the lines of HTMLPurifier or KSES, in order to avoid having each developer reinvent the wheel and leave out some spokes
b. we'd still need a catch-all filter system for third-party modules or anything added from outside our distribution (similar to what NS does now)
c. we'd probably want a way to "exempt" input that already went through our standardized filters from going through the catch-all
d. we'd want the actual application of filters in the catch-all to be identical to the code in the standardized classes, probably even using the same classes under the covers

No?
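Point 1's "somefilter" could take a shape like the following. The name `fetch_filtered` and the type set are invented for illustration, and `filter_var` needs PHP 5.2+.

```php
// Hypothetical standardized filter: fetch a request variable, coerce it to
// the expected type, and fall back to a default, so raw $_POST values
// never reach application code directly.
function fetch_filtered(array $source, $key, $type, $default = null)
{
    if (!isset($source[$key])) {
        return $default;
    }
    $value = $source[$key];
    switch ($type) {
        case 'int':
            return (int) $value;
        case 'email':
            $email = filter_var($value, FILTER_VALIDATE_EMAIL);
            return ($email === false) ? $default : $email;
        case 'string':
        default:
            // Escape HTML metacharacters so the value is safe to echo.
            return htmlspecialchars(trim($value), ENT_QUOTES);
    }
}

// Usage in a form handler:
// $id    = fetch_filtered($_POST, 'id', 'int', 0);
// $email = fetch_filtered($_POST, 'email', 'email');
```

Because every call names its expected type, a forged form can at worst deliver a harmless default, never an unfiltered string.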

I've taken a good look at the thread Raven recently reopened from 2005, and we are still hacking away at many of the same filtering issues that were being discussed back then. I'm not smart enough or experienced enough with PHP and HTML issues to determine a direction for this, but set up some procedures, coding standards, and some good examples of how we want to recode this stuff, assign me some modules, and I'm ready to go in 2008. I'd love to see fractured filtering and its associated issues become a rapidly receding speed bump in our rear-view mirror.

And while we are at it... let's take into account the resources required to implement any proposed solution while maintaining compatibility. I mean, if the decision is made to just totally fork, that's one thing; but if we want to maintain a high level of compatibility, then let's at least factor in how much time it would take to recode things to the new standard. I can see, for instance:

Take the News module. Find every form. Make a list of all the "variables" implicit in that form. Find the functions they are used in and make sure they are filtered through an appropriate class for the variable's type and use. Then make sure application-specific filters are applied. When you think the module is done, run it in some kind of test bed where you can verify it runs without GPC. Do the same for all GETs. Cross News off your list.
 
technocrat
PostPosted: Mon Dec 31, 2007 4:56 pm

You're correct. You'll find implementation in most modules is fairly easy; we did all of ours in less than a month.

We are in the same boat we were in during the previous thread: standing here looking at the same issues and possible solutions. I have said it before and I will say it again: a fork should have been the way to go. Really, FB's code should have been dumped and something new put in its place. But everyone (including me) took the easy road up till now and just reused it.

I plan to stop that cycle and completely divorce myself from Nuke starting in 2008, refocusing on something new and better that is made largely from phpBB v3. Last week I finished a technical demo for my fellow Evo programmers that shows how we can make a CMS or portal using phpBB and create a bridge back to Nuke modules and blocks. I have to say that it looks good, and I am excited to move forward on it.

Well now I have taken this completely off topic.
 
Raven
PostPosted: Mon Dec 31, 2007 4:58 pm

technocrat, I don't want to belabor this. It's one way of doing things. I have built systems (when I had a real job) and I did this sort of thing all the time; it was required for several reasons. All variables were stored in meta data tables of some kind or another, from the ground up. And that's where our different perspectives come into play: there is a big difference between building something from the ground up and building it remedially. So, basically, you have this meta data of stored information about a variable, and you verify user input against the meta data with no hard coding. The benefit of that goes without saying.

So, for instance, let's assume I want to verify the data a new user is trying to enter when registering. I would do a query (possibly) like this:
Code:
SELECT `COLUMN_NAME`, `DATA_TYPE`, `COLUMN_TYPE`
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'nuke_users'
AND `COLUMN_NAME` IN ('name', 'username', 'user_email', 'user_password');

Which would yield the following data:
Code:


COLUMN_NAME      DATA_TYPE    COLUMN_TYPE
name             varchar      varchar(60)
username         varchar      varchar(25)
user_email       varchar      varchar(255)
user_password    varchar      varchar(40)

There are many more columns that could be extracted for use. I would then write a function/class to validate the data against it. This is just an example that I hope gets the thought across; it is one way of doing things. It facilitates/promotes standardization (all programs/scripts validate against the same data all the time), ease of maintenance, etc.
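A validation routine driven by that meta data might look like this sketch, taking one row of the query result above. Only varchar lengths and ints are handled here, and the function name is invented; a real class would cover all column types.

```php
// Validate a single value against one row of INFORMATION_SCHEMA.COLUMNS
// meta data (keys DATA_TYPE and COLUMN_TYPE, as in the query above).
function validate_against_meta($value, array $meta)
{
    if ($meta['DATA_TYPE'] === 'varchar') {
        // Pull the maximum length out of e.g. "varchar(25)".
        if (preg_match('/\((\d+)\)/', $meta['COLUMN_TYPE'], $m)) {
            return is_string($value) && strlen($value) <= (int) $m[1];
        }
        return false; // malformed column type
    }
    if ($meta['DATA_TYPE'] === 'int') {
        // Accept only values that survive an integer round-trip intact.
        return (string) (int) $value === (string) $value;
    }
    return false; // unknown type: reject by default
}
```

The point of the approach is that when a column changes in the database, the validation rule changes with it, with no code edits.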
 
Raven
PostPosted: Mon Dec 31, 2007 5:16 pm

Frank,

I have only been able to allude to my plans for RavenNuke(tm) after v2.2 in public, and a bit more in private. Since we (the RN Team) are all preoccupied for the moment with getting v2.2 out, we are unable to delve into what comes after. As you know, we have been scratching many of the itches that this process has revealed/caused; remediation as the primary goal has almost been exhausted, imo. But we have been laying some solid groundwork for the transition, whatever that will be. The next phase/stage will be much easier, as we have been making some changes that are not 100% backwards compatible. So, imo, we have already put our stake in the ground. We know for sure there will be a v2.3 of some kind, as the masses will discover stuff - and then we still have the "future considerations" that we may want/need to add before we move to a v3.0 branch, so to speak.

I don't know if I have helped or not. I have been very obvious in saying much but revealing little; it is both intentional and unavoidable. Since we, as a Team, are still discussing the roadmap, we just aren't ready to go public. I will say that a fork is much easier to design from a spoon when you gradually reshape the spoon instead of just stamping a fork right out of it.
 
fkelly
PostPosted: Thu Feb 07, 2008 12:25 pm

I know we are all consumed with 2.20 right now, but I saw an article in the Times today about Google forms. The original link is here:

[link visible to registered users only]

Basically, they've written an application that builds forms and then submits the data to Google, where it is published in a spreadsheet that can be shared. In the context of this thread, I started thinking: "I wonder how they validate the data?" In other words, how do they do the things we have talked about in this thread? This led me to:

[link visible to registered users only]

Now, this is only the PHP version (they have similar classes in other languages), but it's very interesting. Some of you are much more comfortable with classes than I am at the moment, and I can just see you salivating over this.

I don't know how much of this we could use within some future Ravennuke, but the approach seems to be pretty much what we'd want.

1. Have a form builder that generates all the forms in a standard format.
2. Build the validation as part of building the form - you can't have a form in the system that is not automatically validated.
3. Use standard classes and methods for validation. There would be ways to make exceptions (the Google class provides this... to quote):

Quote:
you can retrieve the unescaped value using the getUnescaped() method, but you must write code to use the value safely, and avoid security issues such as vulnerability to cross-site scripting attacks


In my view, if we had "core" Ravennuke (that is, everything we include in the distribution) built this way, we wouldn't need NukeSentinel for parsing POST strings at all, or we'd just have it there as a backup for 3rd-party modules that don't conform to our standards (and even there, we might be able to use the validation methods in our classes instead of the current eregi approach).

And just an edit or add-on to this: I've also looked at HTML Purifier a bit, and it looks great for what we want to do; I know Technocrat has already put it into play for Evo. In general, I think we've got to settle on a "library" for these functions rather than trying to reinvent the wheel with our own filters. It's an entire specialty, and there is just no way we can or should try to duplicate it within Ravennuke.

Second edit. Oh man, you all have to look at this ...

[link visible to registered users only]
 
Raven
PostPosted: Thu Feb 07, 2008 1:17 pm

Yes, it is great, for sure. It does require PHP v5.1 or greater, however.

In the process of writing the Ravenstaller(tm) (the v2.2 installer), I changed course many times. That's why it may give everyone a headache if they look too closely at the internals. If I am to release it with v2.2, then you all will just have to accept the code as-is and understand that it will be cleaned up in the next release.

I have taken/started a similar form class construction to standardize all installer forms/objects and am using it, such as it is, for now. Basically, these are the steps. Please keep in mind that it is very much still under construction, but the approach is the same. However, I am also cognizant of the fact that not all browsers are yet sophisticated enough to handle all the DOM manipulation that would make this much easier. So, rather than having to program in a graceful fallback, I have chosen to use a little more code and have the class add standard HTML form elements statically, instead of adding them dynamically (domObject.addElement). I have also chosen to use just simple Ajax rather than JSON, SAJAX, etc. I may decide to go that route in the future, but for just an installer all the rest is way more bloat than is needed. The field edits/validations will be built into the form class in very much the same way.

<?php
$fw = new formObject;
$fw->formOpen('step03', 'post', $_SERVER['PHP_SELF']);

// Text input for the table prefix.
$fw->htmlLabel('Database Table User Prefix', 'db_table_user_prefix');
$fw->tag='input';
$fw->type='text';
$fw->name='db_table_user_prefix';
$fw->id='db_table_user_prefix';
$fw->value='rnuke';
$fw->class=$fw->style=$fw->checked=$fw->selected=$fw->label='';
$fw->htmlTagGenerate();

// Radio pair sharing one base name.
$fw->htmlBreak(1);
$ln = 'autoActivateUsers';
$fw->htmlLabel('Auto Activate User Membership', $ln);
$fw->tag='input';
$fw->type='radio';
$fw->name=$ln;
$fw->id=$ln.'_true';
$fw->value='0';
$fw->checked='';
$fw->label='True';
$fw->class=$fw->style=$fw->selected=$fw->option='';
$fw->htmlTagGenerate();

$fw->tag='input';
$fw->type='radio';
$fw->name=$ln;
$fw->id=$ln.'_false';
$fw->value='1';
$fw->checked='checked';
$fw->label='False';
$fw->class=$fw->style=$fw->selected=$fw->option='';
$fw->htmlTagGenerate();

// Preview button wired to the Ajax handler.
$fw->htmlBreak(2);
$fw->tag='input';
$fw->type='button';
$fw->name='previewSettings';
$fw->id='previewSettings';
$fw->value='Preview Settings';
$fw->onclick='formPreviewSettings()';
$fw->class=$fw->style=$fw->checked=$fw->selected=$fw->label='';
$fw->htmlTagGenerate();

$fw->formClose();
?>
 



Powered by phpBB © 2001-2007 phpBB Group
All times are GMT - 6 Hours
 