xisto Community

evought (Member; 244 posts)

Everything posted by evought

  1. I have just started working on the CrystalSpace/CEL/Crystal Core project. They are working on some major engine overhauls and are pushing forward CrystalCore as a capability demo and working example. CrystalSpace is a very good engine, powerful and flexible, especially once the current round of enhancements gets added. There is a lot of control for the level designer over shading and lighting. The project manager, for instance, just added the ability to control how doorways affect the lighting of adjacent sectors. I would not say it is easy to learn, even for an experienced programmer. There is a good deal of learning curve in figuring out how it is organized and how to make use of all those abilities. Right now, it is fast-moving and the stable (non-CVS) release is a little old. There is quite a lot of example code, though, and a very active/helpful community.
  2. I had a copy of Cutting-Edge 3D Game Programming With C++ by John de Goes and it was decent. It is a bit older, mostly DOS-based, but it goes through a good deal of the concepts, walks you through the construction of a basic 3-D engine and so forth. If you have looked at the inside of the open-source Quake code, this book will hold no surprises for you as far as the engine design goes. The C++ class/object organization may be new to you. Another good book is O'Reilly's Physics for Game Developers by David Bourg. The examples are all in C/C++. It is Windows-centric (I am a Mac/UNIX programmer), but the meat of the examples is portable. It does a very good job of teaching the basic principles of physics, which most people miss (or forget) from school, in the context of game design. I lost my copy in a move and miss it regularly.
  3. C# is 'newer', being only a few years old now. C++ goes back about 20 years. That is actually C++'s advantage: it has been around for quite a while and has been steadily improved. There are many good books about C++ development and lots of existing code to look at. C++ is supported on all kinds of systems (Windows, UNIX, Linux, Mac, embedded systems in cars and robotics, etc.). C# is a 'new' language which is sort of based on C++ and looks a lot like Java. It is only supported on MS Windows, and only on certain versions. It is still rather new, is not necessarily stable, and has not been used by as many people. There are fewer books, and most are not as good. There are fewer existing programmers, so it will be harder to get your questions answered as you learn. C++ has proved itself and is not going away anytime soon. C#'s fate is still uncertain. My advice is to learn C++ first. Once you learn the programming basics, you can (and should) start learning a few more languages, especially new ones, but not when you are just starting out.
  4. No, it will not slow anything down, and it will not do anything at all except when you actually serve a page to someone. TCP/IP is designed for lightning-fast routing in these kinds of situations. The only annoyance you will get is the inevitable mass of script kiddies and zombies trying to connect. If you keep up to date, they should not be able to hurt you, but it will make your router work a little harder than having all ports closed.
  5. Installed it with no problem. I went ahead with it to make sure that I had the patch for the recent Apache-PHP worm. I have not noticed any issues with the new version other than the usual pain of updating all of my keychains.
  6. First of all, redirecting port 80 from your router to this computer won't prevent your other PCs from accessing the Internet. It only redirects *incoming* traffic. The downside is that you can only do it once. In other words, if you had two PCs, both of which wanted to use web sharing and both accessible from the outside, it is not easy to do. I have no idea what the address you are being shown is. It could be your name on the ISP's internal network. As the above poster said, you can use whatismyip.com to get a number you should be able to give out to others, but it may change on a regular basis. With some ISPs it may change less than once a week, at random. I have had an ISP that forcibly changed the address every four hours like clockwork. I have used dyndns.org before to get an actual domain name for my PC. You run a script to tell them what your address-of-the-minute is and they send traffic to the right place. Sorry it took so long to get back to you.
  7. You have to forward port 80 from the router to your machine or make your machine the 'DMZ host' (all requests get sent to your machine). If you have multiple systems which serve port 80 behind the same NAT firewall, you are essentially out of luck (with personal web sharing). You can always set up the webserver itself on a different port and serve pages, but the automatic personal web sharing setup will not work that way. The Mac installer for Zope/Plone sets up a fairly comprehensive web portal on any port you want. If you have a dynamic IP, other local systems (with Rendezvous, either Macs or Linux/Windows boxes that have had Rendezvous added) can access your system by name (yoursystem.local). Some routers also support automatically adding your name to their local nameserver when you connect to the network. For outside users, if your router's IP is dynamic, you will have to try dyndns.org or similar services.
  8. OK, stupid question. I am trying to run Quake II on my Mac Powerbook. I have gotten sick of RtCW for a while and decided to go back and replay some of the Quake II stuff (single player; my reflexes, particularly during the winter, are not really good enough for online FPSes) this winter. I have the Mac OS X Quake II port, and I have the PC Quake II CD. Now, the directions for the port say to install the PC version and then add the Mac patch. OK, how in %$^#& do I install the PC version of Quake II on a Mac Powerbook running OS X???!!!! I can copy files off the CD if I know exactly how to copy them, but obviously the PC installer will not actually run. The only thing I can think of off hand is to do the install on a PC and dump the install dir to a firewire drive to move to the Mac. I can then add the patches and burn an actual Mac image to throw back into storage. It seems like I am missing something, though. Is there no installer for Mac? I cannot find one on Google. How hard is a manual install? Is it something I could write a bash script for? Anyway, thought I would ask. I am looking at playing through the Mac ports of the Medal of Honor games if I decide to bother picking up a copy.
  9. "throws IOException" tells the compiler that the method is allowed to throw an exception derived from IOException to indicate a problem. When an exception is thrown, it travels up the stack, back through all the methods that have been called, until it reaches a "catch" statement which handles it. If there is no 'catch', the exception goes all the way up to main and exits the program. The reason you do not need it in your case is that the method you were adding it to *was* main. There is nowhere else for the exception to go, so do not bother declaring it. If an exception is thrown in main, it will just exit the program anyway. If you wanted to handle the error in some way, like printing out a message or giving the user another chance, you would put the read calls inside a try...catch block.
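To make that concrete, here is a minimal sketch of catching the exception yourself instead of declaring it; the class and method names (ReadDemo, readOneLine) are my own invention for illustration:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class ReadDemo {
    // Handles the IOException locally instead of declaring
    // "throws IOException" and letting it travel up the stack.
    static String readOneLine(BufferedReader in) {
        try {
            return in.readLine();
        } catch (IOException e) {
            // This is where you would print a message, give the user
            // another chance, etc. Here we just report it.
            System.out.println("Read failed: " + e.getMessage());
            return null;
        }
    }

    public static void main(String[] args) {
        // StringReader stands in for user input so the example is self-contained.
        BufferedReader in = new BufferedReader(new StringReader("hello"));
        System.out.println("You typed: " + readOneLine(in));
    }
}
```

If readLine() fails, control jumps straight into the catch block rather than unwinding out of main.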
  10. I spent most of last year living in an 18-foot medieval pavilion in the middle of nowhere. We hauled water, cooked over a fire, made calzones in a stone oven from fresh goat's-milk ricotta cheese (the goat had been milked the night before) and ... well, I won't even get into the bathroom facilities. Point is, living like that gives you a real quick idea of what you can or cannot live without. Funny thing is, we did have a laptop and occasional Internet access. Every once in a while we would scrape up enough power from a small battery bank to watch MPEGs of stupid British comedies (Black Adder, in particular). Now I'm living on a small farm not quite in the middle of nowhere. It has neat features like water that comes out of a tap and electricity which does not need to be budgeted out to the last Watt. We raise animals for food, spin wool, weave clothing and so forth. I make regular use of a wood maul and a froe for light carpentry, fencing repair and so forth. We make our own soap and I braid rope from inner bark and milkweed stalks. And, yes, I still have a computer and Internet access. I think that if more people lived close by who were doing the same things, I could forgo the computer easily, but we live in a very disconnected world and the Internet is often the only way we have to connect to others. When I want to buy a sheep, I look online. If I need to know something about an herb I have located and I am not finding anything on my bookshelf, I go to the Internet. If I want to get together with other area folks to work on spinning or weaving, I send an email. The computer is an odd connection between traditional homesteading and the 'modern' world. I would trade it for an active farming community, but such are very hard to find and disappearing fast. Just a couple weeks ago we had a bunch of Amish folks staring at us because they had never seen someone spin wool or dip taper candles before.
  11. Well, for computer graphics, you want to get yourself a Macintosh (sorry, couldn't resist...). Seriously, at that point, many other things become more important than the processor itself, in particular the video card, bus speed, and RAM speed. That is why Macs were able to continue to hold an edge with the G4 processor even when its clock speed was falling behind the rest of the market: the whole system was optimized for graphics work, not just the CPU. Software can be a limiting factor as well. Many very good boards have been saddled with poor drivers and bad application support. In the same way, when choosing between an Intel and an AMD CPU, you want to look for motherboards and chipsets that will do what you want and choose the processor which goes with that board. You want to make sure that your bandwidth from memory to CPU to graphics card is as high as it can be. Hit the library, start reading magazine reviews, and look at various online benchmarks. This is something which changes all of the time as the various manufacturers bring out new products and update old ones. Look at reviews where they try to perform real-world tasks; don't just look at manufacturer specs. As I said, poor software support can make good hardware unusable. When you want to buy, choose a *combination* which works for you. And, seriously, you might consider looking at a Macintosh. Much of the graphics editing world still uses them and will probably continue to do so since Apple does a lot to cater to them. I have used both and there are reasons to go either way, but it is worth thinking about.
  12. Unfortunately, those D-Link cards are very squirrelly. Even for Windows the D-Link drivers are very flaky, and D-Link does not release the information necessary for anyone else to write drivers. I have had to give up on my card for Windows 2000, let alone the Linux install on the same box. They also tend to be extremely touchy about what routers they will talk to. I have played with the in-development Linux drivers, but, unless you have a good deal of technical knowledge and a lot of patience, I would recommend not trying it. You have the advantage of a working Windows install on the same box with the same card, which will make it a bit easier, but I would still recommend against it. For one thing, the custom software breaks most of the auto-update tools, so you end up doing a lot of other things manually at the same time. As I said, I have given up on mine under both Linux and Windows and have just run cat-5 to plug it straight into the switch. The Windows driver sorta works, but every few days the card stops working until I re-install. I have gotten the Linux driver to mostly almost work, but have had a lot of trouble recompiling for the wireless driver without screwing up all the other devices on the system. I am going to put FC-4 on the system in a few days and may try again. I have a much older D-Link card which I know someone else has had success with and may just swap it out. Either way, good luck.
  13. Besides the PHP and Javascript routes, there is the XML stylesheet route: Generate your site content in an XML dialect (e.g. docbook) and use a stylesheet to generate the X/HTML and provide the CSS. Then, the navbar, footer, header, *is* actually embedded in each (HTML) page, but not in your original source (the XML) and the stylesheet keeps everything up to date. ---- The downside to all of these is problems with caching of documents. With the PHP or Javascript route, the last-modified date usually ends up being the modified time of the main page. If the navbar is newer, the client may not reload it. With the XML route, every time you update the navbar, header, footer, etc, all pages get marked as new and no caching happens. With frames or the Object tag, the browser can reload only the pieces which are actually out of date and performance is generally much better. As you have noted, however, some browsers are still squirrelly with the Object tag or with iFrames. It is possible using PHP to look at the browser and use either an Object tag or a direct include based on the browser type. This ends up with the best possible performance but gets complex. When my site got to the point where this was a pain, I just went to a CMS that handled all of the overhead for me.
  14. Books, and online sources. For HTML, the official source is the World Wide Web Consortium's HTML page. Of special note is the HTML validator, which you can use to check your work. There are links to several good tutorials off of that page. HTML is not a programming language per se, just a document format. If you wanted to get started with, say, Ruby: * Ruby Central will get you started. Included there is the online text of Programming Ruby (which I helped review), which is also available in print. More authors are publishing their books both on paper and online now. * The Mondrian IDE is an Integrated Development Environment (editor, debugger, etc.) for Ruby that is itself written in Ruby and is OpenSource. So, you can do your work in it, it is a good source of sample code, and you can work on it as well. For general programming: * SourceForge is a central repository of OpenSource projects with regular 'help wanted' postings. * The Safari Online Bookshelf is an online repository of electronic versions of programming books. It works on subscription, but is not expensive. Other than that, find other programmers, local or online, that you can talk to on a regular basis as you go. That is also one of the benefits of working on a project: you get to learn from your peers.
  15. One thing that is a big help is to have existing source code to work on. By that I mean, do not start by just writing things from scratch. Find an OpenSource project you are interested in, download the source and start figuring out how it works. Make small changes and see what happens. It is often helpful to pick out two projects, one fairly small (a small utility, an iTunes plugin, a web app), and one larger (one of the Apache projects, Mozilla, various Linux/UNIX apps, JEdit, etc.). The smaller project will be easier to change, but the larger project will expose you to different programming styles, organization, structure, etc. If you are working with code you are interested in, you will probably stick with it longer; standard textbook programming examples convey simple concepts, but you don't understand them until you see how they work (or fail to work) in a real-world situation. Choose your first language/platform/technologies according to what the project uses. When I started learning Object Oriented Programming, I learned Turbo Pascal (Pascal, or the various flavors of it, is *great* for learning, by the way; it was designed for it) and was developing/adapting Telegard BBS code for local system operators. The Telegard code was hacked, ugly, insecure, and bug-ridden. Fortunately, Borland shipped TP with the TurboVision utility library, which came with complete source code. It helped immensely to be able to look at well-written, organized, and commented code in order to figure out how to fix the broken BBS code. I think that the ability to see good and bad practices side-by-side is what developed my good programming habits the most. All of the things that the books said were 'good practice' were suddenly very clear. If you just read the textbooks, you cannot see *why* certain things are important.
If you just write a program from scratch to learn something and then throw it away, you never learn how hard it can be to maintain your own code or make your code work with an existing structure. This is why I really dislike standard computer science courses. After I learned those important lessons, learning different languages was easy. Since then I have programmed in C, C++, Modula, Perl, Python, Ruby, Ada (YUCK!), various assembly languages, Java, VB (YUCK! YUCK!), etc., etc. Once you learn the concepts well, picking up new languages is not a big deal. They become just another tool in your toolbox. Another immediate advantage to joining an OpenSource project is that you can read through their discussion list archives and ask your own questions. *Most* projects are very helpful with new volunteers; if they are not, volunteer elsewhere. If you go to SourceForge and set up an account, you can search the 'volunteer wanted' lists. I do recommend that beginning programmers look at the various 'Pragmatic Programmer' books. For experienced programmers, a lot of what they (Andy Hunt, Dave Thomas) say may seem obvious, but most of us had to learn it the hard way. All other things being equal, I will say that Python, Ruby, and Pascal (e.g. Delphi) are somewhat better for beginning programmers because they were designed for ease of learning. The languages enforce certain structural conventions that new programmers often have trouble with. They were also designed from scratch, so they are internally consistent and therefore less confusing. Languages like C, C++, Perl, Visual Basic, etc., kind of grew out of a lot of older technologies. They have a lot of old baggage and can be *very* confusing, even though they can be very versatile/powerful as well. In particular, I would not recommend starting with Visual Basic. It is in some ways easy to learn and can be powerful in its way, but if you learn it first, it may make it harder to transition to other languages.
VB does not make it easy to write good, clean code, and much of the sample code available to look at is very badly written as a result. VB is more of a glue for sticking various bits of the Windows API together than a language in its own right (largely the same is true of AppleScript). It is a useful tool, but watch your fingers. In the same vein, there is an awful lot of bad C and C++ source code out there and a little that is very good. Python or Ruby (or TCL for that matter) will get you to the point of being able to write your own from-scratch programs faster. They are scripting languages, so you can write a single-line program that actually does something. Their interpreters make it very easy for you to try small changes, examine the way the program is running and really see what is happening. C++ is going through a lot of changes still. Folks are still adopting the STL (Standard Template Library) and new template constructs which fundamentally change how you write C++ programs. Source code you find will show some three generations of completely different styles of code. There are now good books that show you how to make some sense of this (Plauger's book on the STL, Scott Meyers' Effective C++ series, the Boost libraries and documentation), but it is still a bit dicey. Personally, I think it is better to learn another OO language first and let the C++ folks sort things out a little longer. After you know the basic OO concepts, the arguments in the C++ world will make more sense to you. Read articles on your language and programming in general, not just books. Books take a while to get into print and are less controversial. Read some of the decent journals out there (Dr. Dobb's, C/C++ User's Journal, etc., or online equivalents). Track some of the arguments in online discussion lists or standards orgs. Some of the debates may not make sense to you or seem important at first, but as you pick up more experience, they will start to fit in.
Eventually, you will have some of your own zealous opinions on the *right* way to program.
  16. Look here for cpanel specific instructions: https://www.visn.co.uk/cpanel-docs/error-pages.html and here for more general information about apache, php and error pages: http://www.onlamp.com/pub/a/onlamp/2003/02davidsklar.html In short, you must instruct Apache to use the custom pages by putting directives in the .htaccess file. Cpanel has some support for doing this for you. If you want smart error pages (e.g.: doing a site search and coming back with related pages), you need to do some PHP work.
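As a quick sketch of the directives involved, a minimal .htaccess might contain something like the following; the file paths are placeholders for pages you create yourself:

```
# Serve custom pages for 'not found' and 'server error' responses:
ErrorDocument 404 /errors/404.html
ErrorDocument 500 /errors/500.html
```

For a 'smart' 404, you would point the ErrorDocument directive at a PHP script instead of a static page.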
  17. The tools on that site seem to be mostly for getting rid of DDOS clients, trojans and worms on your own system, not for getting rid of zombies on someone else's system attacking you. I do not know what RID is precisely, but, in a stroke of irony, their site is timing out. Unfortunately, dealing with attacking zombies is mainly a process of trying to identify as many of the attacking systems (or networks) as possible and getting their provider to pull the plug or route them out upstream. Many of the DDOS attacks these days look like legitimate traffic and may be flooding *upstream* from you, where changes to your own system or routers have no real effect. To top things off, the packets often have spoofed source addresses. One of the really annoying tricks these days is to send a bad packet somewhere else (usually many different places) with your source address spoofed so that the NAKs all get sent back to you. You do not have the original bad packet, so you cannot analyze it to determine its origin. All the servers being sent the bad packets don't get enough of a traffic increase to know that anything is wrong. Personally, I think all the routers should be configured to drop any packets coming from a Windows box. That would get rid of 99.99% of all the zombies right there.
  18. wutske: To reinforce OpaQue's statement, my site (also on panda, I believe) has also been out for the better part of 24 hours. It looks like a couple of other folks have reported it as well. Several other sites I use on a regular basis (outside of Xisto) are also affected; some are not. opaque: On the shoutbox a day or two ago, someone suggested the free hosting was short on Linux admins. If this is true, I would be happy to lend a hand. My physical condition is up and down, but when I'm up, I'm happy to help. I recently ran Diversity Ink's servers as a volunteer and have done a lot of miscellaneous adminning, including at the five-sided puzzle palace. I also program (perl, python, C/C++, shell, various markup languages, etc.). I obviously cannot do site development until the problem is resolved, so I may as well pitch in.
  19. My site has been inaccessible for about 16 hours now as well. Rather annoying, since I have some photos I have been asked to put up and cannot get to it. Oh well.
  20. Well, the article said 27 Pounds ($60?) per 1-meter sheet. If you only use them for the front-page photos, you can probably get quite a few newspapers out of one sheet, so add only, say, 20 cents per newspaper to the cost, which can be made up by one more advertisement. The newspapers are almost all ads these days anyway. The newsstand price is not what the paper costs; papers have sold at a loss for years. The ads pay for the production. Durability I cannot speak to, but they are apparently very low-power and I know there are paper-thin batteries in production. They should last long enough for anyone but my mother to read them (she piles up papers for weeks), but I would imagine library archiving will be difficult. What they should do is put a small mylar solar-cell on every paper so you can recharge them.
  21. The first thing I thought of was picking up a paper and seeing a moving wanted poster, right out of Harry Potter. Truth stranger than fiction, huh? I do not think this will kill the written word (by itself, anyway). People are already inundated with TV and Internet video. Also, it will be quite expensive to use on food packaging for some time. The article I read said about 27 Pounds per square meter. What, $60 a sheet?
  22. If you are asking, "How does another computer figure out my address to send data to it?", then the answer depends on the networking protocol. I'll explain how two of them work. First, there is Windows NetBEUI, or what everyone refers to as "Windows Networking". NetBEUI computers do not have IP addresses, they just have names. When you try to connect to a system by name, a request gets broadcast to the network saying "Who answers to 'FRED'S COMPUTER'?" The computer that recognizes its own name responds by sending back its MAC address and the connection happens. NetBEUI is *really* inefficient for large networks because of all of this broadcast traffic on every request, although some newer technologies, like Master Browsers and Domain Controllers, make it a bit better. Then there is TCP/IP, the standard Internet protocol. TCP/IP uses ARP, the Address Resolution Protocol, to figure out MAC addresses. When a data packet gets to its final hop (the routing tables say that the target machine is not behind another gateway), an ARP request gets broadcast: "WHO_HAS 212.112.181.001?" The system with that IP address responds with its MAC (physical) address, the address gets saved in a table called an ARP cache, and the connection proceeds. It actually gets a bit more complex than this with routers and bridges and PPP connections, but you get the idea. So, the short answer is, the sending system broadcasts an ARP request and the receiver sends its MAC address back. This only works on the same physical network. On UNIX or Mac systems, the arp program can be used to display the table of MAC addresses your system currently has cached. ("arp -a" does it on my Macintosh.) I think various flavors of Windows have arp as well, but I do not know the right syntax off hand.
  23. It has to do with the fact that Windows did a partial format. Rather than erase every byte, it just put an allocation table at the top of the disk which said the rest was free space. Now, grub had a disk offset stored in the bootloader which pointed to the space where your linux partition used to be and where the bootloader files just happened to still exist. Grub does not deal with file systems per se, just the hard physical addresses, so it had no way to know that file had been 'deleted', and it continued to function just fine. Now, at some point, Windows actually allocated and used those blocks of disk, probably for the mpeg you were burning. Now, all of a sudden, grub looks for its data and finds *video* and has a fit. It is basically a random time bomb. Some of the viruses that can unformat themselves work essentially the same way.
  24. Another thing to consider is creating a partition to share between the two OSes. If you have, say, 20 GB, you can use 6 GB for Linux, 6 for Windows and 8 for shared storage. Make the shared partition FAT32, which can be used from both sides. Then put your documents and a lot of your data on the shared partition, such as your word processor stuff, your MP3s, etc. That way, you don't need to copy files back and forth as much, you can access the same files from both sides, and it becomes easier if you decide to get rid of one or the other. If someday you decide to go all the way and blow away Windows, for instance, you can delete the Windows partition, reformat it for Linux, move everything over and away you go. It gives you a bit more room to play with if you end up with too little space on one side or the other and makes it much easier to rearrange. When I originally made the switch (like '92 or so), I forced myself to do everything in Linux for a month. That's enough time to get over the culture shock and start figuring out how Linux is supposed to work. It is not a "Windows substitute", it's an "alternative", and it has a very different mindset and philosophy. It's like learning a language: when you hit the point of not translating word by word, you start realizing there are whole ideas in one language which cannot be expressed in another. Anyway, good luck and have fun.
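To mount the shared partition automatically at boot on the Linux side, an /etc/fstab line along these lines works; this is just a sketch, and the device name, mount point, and uid/gid values are assumptions you would adjust for your own setup:

```
# Shared FAT32 data partition, writable by the regular user (uid/gid 500):
/dev/hda3  /mnt/shared  vfat  defaults,uid=500,gid=500,umask=002  0 0
```

The uid/gid/umask options matter because FAT32 has no ownership or permission bits of its own; these tell Linux who should appear to own the files.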
  25. MacJournal does look interesting. I use Circus Ponies' NoteBook myself. It has many of the same features but is a little more oriented toward taking notes than diary/blogging-style journalling. Its clipping features (a Service to clip text, images or documents directly to a notebook) really make organizing odd bits of information much easier. It also has the blogging capabilities of MacJournal. I have not tried VoodooPad. I have not used Cyberduck before, but looking at it, I think I will now. I usually use command-line ncftp, but a good GUI client does have its uses. iTerm is a program I would have trouble parting with. It is a Terminal replacement that supports tabs and has a lot better session control. When I am doing administration work, I use it heavily. My desktop gets too cluttered with Terminal windows otherwise. I have also found that its terminal emulation seems to be just a bit better.