xisto Community


Posts posted by ethergeek


  1. On another note, you're just asking for a denial-of-service attack...all I'd have to do is find a nice big Linux ISO and request it over and over until your bandwidth is used up (or your PHP script chokes and crashes trying to build a PNG out of a file that isn't an image). You might want to sanitize the inputs to your script so that only requests up to a certain size are honored, and add some per-client transfer limits so they can't do this; something like the sketch below.
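
    Just as a rough illustration (the parameter name, the upload directory, and the size cap here are all made up, and the real script will differ), the idea is to whitelist what you'll accept before GD ever touches the file:

    <?php
    // Hypothetical hardening sketch for an image-preview script: reject anything
    // that isn't a small, known image type before handing it to GD.
    $maxBytes = 2 * 1024 * 1024;                      // assumed ~2 MB cap
    $allowed  = array('png', 'jpg', 'jpeg', 'gif');   // assumed whitelist

    $name = basename($_GET['file']);                  // strips "../" and path parts
    $ext  = strtolower(pathinfo($name, PATHINFO_EXTENSION));
    $path = '/home/user/public_html/images/' . $name; // assumed base directory

    if (!in_array($ext, $allowed) || !is_file($path) || filesize($path) > $maxBytes) {
        header('HTTP/1.0 403 Forbidden');
        exit('Request refused');
    }

    $img = imagecreatefromstring(file_get_contents($path));
    if ($img === false) {                             // not actually an image
        header('HTTP/1.0 415 Unsupported Media Type');
        exit('Not a valid image');
    }
    header('Content-Type: image/png');
    imagepng($img);                                   // re-encode and send it out
    imagedestroy($img);
    ?>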


  2. I would still have the issue with a MacBook and the software that I would like to use...

    You could always use Boot Camp to dual-boot Mac OS X and whatever other operating system you wanted. Apple provides drivers for the MacBook hardware for XP and Vista...

  3. Well, if it weren't for the steep prices on Apples I would probably go out and buy one right away....
    My one issue is that I have so much software that runs only on Windows that switching would not be very easy for me at the moment....other than that I would probably go right out and switch....


    Apple's stuff is a little higher-priced...but despite what a lot of people rip on Apple hardware for (as long as you don't get anything with a revision 1 logic board, lol), the hardware is very reliable...and lasts a lot longer than its PC counterparts. The latest versions of OS X run on much older hardware than the latest version of Windows does...so you don't need to upgrade a machine to use new software.

  4. It took 72 hours of marathon coding to be done. A potential vulnerability which allowed access to upper folders via ../ has been resolved. It would have compromised my hosting files, so I have disabled all paths with ../. The File type customization is now working and currently I am looking for cool file type icons. Another thing I am thinking of is to read the icons embedded in files such as EXEs.

    Utilizing the open_basedir configuration option in PHP is safer, since it restricts filesystem access at the runtime level; you don't have to litter your code with path checks, just handle the errors gracefully (see the sketch after the manual excerpt below).

     

    open_basedir string

     

    Limit the files that can be opened by PHP to the specified directory-tree, including the file itself. This directive is NOT affected by whether Safe Mode is turned On or Off.

     

    When a script tries to open a file with, for example, fopen() or gzopen(), the location of the file is checked. When the file is outside the specified directory-tree, PHP will refuse to open it. All symbolic links are resolved, so it's not possible to avoid this restriction with a symlink. If the file doesn't exist then the symlink couldn't be resolved and the filename is compared to (a resolved) open_basedir.

     

    The special value . indicates that the working directory of the script will be used as the base-directory. This is, however, a little dangerous as the working directory of the script can easily be changed with chdir().

     

    In httpd.conf, open_basedir can be turned off (e.g. for some virtual hosts) the same way as any other configuration directive with "php_admin_value open_basedir none".

     

    Under Windows, separate the directories with a semicolon. On all other systems, separate the directories with a colon. As an Apache module, open_basedir paths from parent directories are now automatically inherited.

     

    The restriction specified with open_basedir is actually a prefix, not a directory name. This means that "open_basedir = /dir/incl" also allows access to "/dir/include" and "/dir/incls" if they exist. When you want to restrict access to only the specified directory, end with a slash. For example: "open_basedir = /dir/incl/"

     

    Note: Support for multiple directories was added in 3.0.7.

     

    The default is to allow all files to be opened.
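
    A rough sketch of how that plays out in code (the vhost path below is an assumption; the directive syntax is the one from the manual excerpt above):

    <?php
    // Assuming open_basedir is already set for the vhost, e.g. in httpd.conf:
    //   php_admin_value open_basedir /home/user/public_html/
    // (trailing slash matters -- per the manual, the value is treated as a prefix)

    $requested = isset($_GET['path']) ? $_GET['path'] : '.';

    // fopen() raises a warning and returns false when the resolved path falls
    // outside the open_basedir tree, so handle that instead of scanning for "../".
    $fh = @fopen($requested, 'r');
    if ($fh === false) {
        header('HTTP/1.0 403 Forbidden');
        exit('Requested path is outside the allowed directory tree.');
    }
    echo fread($fh, 8192);   // send the first chunk of the file
    fclose($fh);
    ?>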


  5. Google Images and imagefilez are probably blocked because there's no control or rating on the images. You can pull up pornographic images on either, and there's no way to regulate it other than blocking the whole site. Well, they could probably do something with SafeSearch, but I'm sure Live Search already has such an integration anyway, so it would be in Microsoft's best interest not to waste time on Google.


  6. I'd say your best bet would be to enable access from % and choose a good, STRONG password, then remove the % access host from that account when you aren't going to be working with it for a while (a connection along the lines of the sketch below would then work from the other site). You might also want to let the site admins know, so they don't think you're violating the TOS by using the SQL databases as storage for another site (I'm not sure whether this violates the TOS or not, but when doing something nonstandard and/or unexpected like this, it's generally a good idea to let the sysops know ahead of time what you're doing).
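
    For what it's worth, once the % host is enabled, the other site would connect with something along these lines (the hostname, user, password, and database names here are all made up):

    <?php
    // Hypothetical remote connection from the other site's code, only possible
    // while the MySQL account accepts connections from any host (%).
    $link = mysql_connect('mysql.example-host.com', 'dbuser', 'aLongRandomPassword');
    if ($link === false) {
        die('Could not connect: ' . mysql_error());
    }
    mysql_select_db('remote_storage_db', $link);
    $result = mysql_query('SELECT COUNT(*) FROM stored_items', $link);
    // ...use the data, then close so the connection is not left open.
    mysql_close($link);
    ?>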


  7. Here's why I think time machines will never exist. Let's say you want to go back in time to assassinate Hitler before he comes into power...like, drown him as a baby or something. If you create a time machine, go back in time, and do it, one of two things can happen.

    1. He's dead now, most of WW2 doesn't happen, and world events change. In this eventuality, where time coalesces back into itself (i.e., doesn't break off onto a tangent of alternate events), what reason would your future self have to build the time machine in the first place? Taking the limit of this logic, nobody ever has any reason to build a time machine, so nobody ever makes one.

    2. An alternate timeline is generated, but the timeline you came from is untouched. This can happen two ways. Say you go back to the timeline you came from: life has not changed for you, and all you did was create a better timeline that didn't exist in the first place, so you didn't really repair anything. Now, what if your trip back to your own time period landed you in the parallel, alternate timeline? Well...you'd be there, but everyone you knew would be in the other one, and most likely you wouldn't even exist in the alternate timeline, so you'd have to start your life over from scratch.

    In any of these cases, you personally see no net improvement in your own life...seems kind of pointless, unless our ideas about time and its nonlinearity are really backwards.


  8. Ok, so previously I had installed the Java SDK on my computer (the one that comes with NetBeans); as of right now it is version 1.4.2_13, and I really didn't pay much attention to the version. Everything seemed to work OK for what I was doing, although I wasn't using NetBeans; instead I have been using Eclipse. (Since I am doing this for a class at school and that is their IDE of choice.)
    Anyway, my programs have to compile at the command line (not just work in Eclipse). I have been able to get them to work in Eclipse fine, but I am having trouble when I go to the command line, specifically with things such as the Scanner class (which was introduced in Java 1.5). Noticing that I have the SDK at version 1.4, I was thinking that was the problem. But what I can't figure out is that that seems to be the latest version available on Sun's website. Do I have to download the JDK (one of the same version as the Java Runtime Environment)?

    It just doesn't make much sense to me right now....why is Eclipse working, yet I can't compile at the command line? (I have gotten it a little further by including the JRE jar directory in the classpath, but that spits out errors about version numbers.)

    So do I simply have to uninstall the Java SDK and install the JDK? If so, why is it that Eclipse is working just fine for me?


    Probably a few things. For one, Eclipse manages your classpath for you, so you don't need to fill your classpath with all the jars yourself; you may be missing a jar on the command line. Also, you may be compiling 1.5 code with a 1.4 compiler, which also will not work. Use javac -version to determine what version of the compiler you're using; if it's 1.4, you need to tweak your path so that it points to a 1.5 javac.

    So you know, Eclipse doesn't use javac -- it uses its own internal compiler (which is why you can still run applications in Eclipse even if they have errors; you can't do this with NetBeans).

  9. Cygwin means ported code - someone actually went through each POSIX utility and created a .exe binary that runs natively under Windows. Cygwin provides an emulation DLL called cygwin1.dll, but only for those ported binaries.
    Lina is full-blown emulation, which means you take a Linux binary utility that nobody ever thought of porting to Win32, and you FTP it to a Windows machine, you run it on top of Lina and Lina intercepts any Linux system calls attempted by that program and does them itself.


    OK...I read that site backwards then...thanks for the clarification. This seems pretty cool, but how does it handle things like library dependencies? I assume Lina handles dynamic linking and all, but could I, say, pop KDE and all its prerequisites over to my Windows machine and use kicker for my launcher, kdesktop for my shell, and kwin for window management?

  10. That's not true anymore. Although it was true of old fiber-optic systems, advances both in the understanding of how light travels through devices with high indexes of refraction and in manufacturing technique allow fiber-optic cable to be extremely flexible, especially fiber-optic cable that's wrapped. For example, Yale has been using extremely flexible fiber-optic cable as a security system in their computer labs for years now. The cable runs from a source through the lock slot on their computers to the input spot on the source. If the signal ever ceases, then a computer has been moved since the cable has broken, and the alarm goes off. It used to be quite annoying because it meant monitors couldn't be moved very far, but now it's not an issue. Also, one fiber-optic cable is extremely cheap. In fact, the gold used in high-quality USB 2.0 devices (which adds maybe 10 to 50 cents to the cable's cost) probably costs more than the amount of optic cable being put in USB 3.0 devices. Large bundles of fiber-optic cable that are inches thick are expensive, but each individual cable is about as thick as a hair. I think this is an interesting idea. It could even make networking via USB far more feasible, which would be cool.
    ~Viz


    I'm glad someone took the time to research this...I knew that fiber had gotten cheaper in the past few years, but I had no idea that the gold plating cost more than the fiber...though if you remove all the manufacturing costs, fiber is really nothing but sand anyway, and we have tons of that compared to gold.

    As for increasing costs...that's inevitable. SATA devices used to be more expensive in the PATA days, but now SATA is standard and costs no more than PATA devices of similar size. Once we start a big push to fiber, manufacturing capacity will increase, costs will go down, and we'll all be happier. It's the early adopters who will basically pay the premium to subsidize the R&D as always anyway :blink: