xisto Community

rayzoredge

Members
  • Content Count

    1,047
  • Joined

  • Last visited

Everything posted by rayzoredge

  1. I think that this is very subjective to your hardware, considering that my install of Windows 7 x64 Ultimate took less than half an hour, if I remember correctly, on an Intel dual-core 2.13GHz processor and 4GB of DDR2 SDRAM.
Upgrading was never really the way to go... it's just for people who want to keep everything without having to back up and reinstall everything all over again. From the sounds of this, it doesn't save you any time to do it that way, but again, it's all subjective to the hardware that you're working with.
Always wipe the hard drive clean and start anew when you can. Yes, it might be a pain to back everything up and to install all of your software again, but it's always best to do it that way and work off of a clean slate.
Also, I wouldn't write off XP machines as being inadequate for Windows 7. I've successfully run Windows 7 pretty well on a piece-of-crap work laptop spec'd with an Intel Pentium 4 1.3GHz and 1GB of RAM, and Windows 7 is supposed to work well with netbooks, which boast similar, "low-end" specifications.
  2. Windows 7 shouldn't be too much of an expense if you are a student or if you recently bought a PC... or if you know a student, or someone who recently bought a PC and doesn't care to upgrade to Windows 7 (since they'd probably go for a full-out Linux installation or something like that). If you are a student, you have the option of providing a student e-mail to snag Windows 7 Home Premium for only $30 using the student promotion. If you just bought a PC with Vista Home Premium on it, you get an upgrade to Windows 7 Home Premium for free. If you are willing to talk to some friends or family, you can get the Windows 7 Family Pack, effectively paying only $50 for each license. And of course, if you decide to go by way of piracy, you can get it for free. Even with a legitimate purchase, Windows 7 is still pretty affordable... if you know how to get it at an affordable price.
  3. I definitely have to say that Windows got a lot prettier, somewhat more user-friendly, and more productive.
Jumping straight from an XP perspective with little Vista experience, I can say that I don't miss any performance issues. The only gripes I have so far are that it doesn't play well with some of my programs, like Microsoft Office XP. (PowerPoint loves to go unresponsive.) Gaming is great on Windows 7, with Fallout 3 at maximum graphics and CPU settings on a 2.13GHz dual-core Pentium, 4GB of DDR2 RAM, and a GeForce 9800M GS graphics chip.
I still think that Windows 7 is slow to boot and still suffers from the strange issue of taking much longer to boot up when you use the logon screen (as opposed to just booting into the Windows environment seamlessly). Shutting down is pretty quick.
Updates are nice and frequent and retain the ease of being selective about what you want updated or not. The Action Center is actually useful in troubleshooting problems and reminds you of useful things, like setting a backup option, having anti-virus, etc., without the stupid balloon annoying you too much. The user interface is good too, with the taskbar and being able to preview windows and use jump lists. It's funny that one of the cooler things with Aero was Windows+Tab, but I never use it. I do like waving my cursor over window previews and how Windows 7 applies Aero Peek to every other window but the one you hover over, to show it to you.
There's a lot that I probably didn't cover, but Windows 7 is pretty good in my book so far. I haven't experienced a BSOD yet... but I have run into one occurrence of not being able to shut the computer down for some reason. (I would click on Shut Down, then try Restart, then try something else... but it would just sit there and do nothing. You could still use programs and do whatever you needed to do, just as if you hadn't even tried to shut down in the first place.)
  4. I actually discussed this in another thread.
You cannot upgrade a graphics chip in a laptop UNLESS you have a dedicated graphics chip to begin with, it is a compatible form factor, AND you can get your hands on a graphics chip of the same form factor. Simply: yes, you can, but most likely you will not be able to.
I've actually upgraded a laptop's graphics chip before. I used to have a Dell Inspiron 8600 and had access to a Dell Latitude "something." Both laptops had an NVIDIA dedicated graphics chip in them, and I found out through discovery (and the fact that Dell probably makes things easy to assemble by using the same type of parts for everything) that both graphics chips were of the same form factor (presumably MXM). I took both laptops apart and literally swapped chips, which brought my Inspiron 8600 from a 32MB NVIDIA card to a 64MB NVIDIA card, enough of a difference to warrant the upgrade. (I forget which graphics chip was in it.)
You will have to actually get a laptop with a dedicated graphics solution if you want to game on a laptop. That's actually how I choose my laptops, working from what's there for a graphics chip first, since you can't really replace the graphics chip easily, if at all. You might as well assume that you can't replace a graphics chip in a laptop.
As far as the PCIe slot goes on laptops, I believe it is reserved more for accessory cards than for a replacement graphics solution. And rvalkass is right about the size of a PCIe card, not to mention the special power-consumption and performance specifications tailored to a laptop in comparison with a desktop variant...
  5. About the Magic Mouse: I don't know too much about it, but a quick Google search tells me that you can use it in Boot Camp, so why not on a native Windows machine? I don't think you can take advantage of the gesture features or anything fancy like that, negating any reason to spend money on the Apple tax, but if you really want it, I'm sure that there are a ton of enthusiastic geeks working on a way to make it work well in Windows and even Linux.
About "gaming" mice in general: I never really understood why there is even such a thing, but I speak from a biased point of view. Being the cheap bum that I am, I don't really want to spend money on Razer peripherals, but I will spend money on Logitech devices simply because they have so many useful and productive features. Aside from an ultra-high DPI specification and the classification of a "gaming" mouse, why would you get one over a full-featured mouse like the Logitech Performance Mouse? Some gaming mice do have the nice ability of adding weights and adjusting DPI on the fly, but is it really worth paying the premium price over a "normal" mouse?
  6. I actually just vented about this a few months back. What doesn't annoy me about posting in old threads includes the good topics that have good discussions, good information, good posts, and anything else that's worth reading. What does annoy me about old threads being resurrected is that new members suddenly post a million short one-liners trying to bring up their post count by adding something completely irrelevant, retarded (subjective), stupid, or what-have-you to topics that are so out-of-date that you have to wonder why we even allow for dumb topics to keep coming up. (Again, subjective.) I think one of the largest reasons that this is happening is the poor search function for Xisto (which doesn't let you find anything that's actually relevant to the subject matter you're searching for) and the Similar Topics section that, although brilliantly employed to bring up similar topics based on keywords, doesn't discriminate between what's a few years old and what is still an open and current issue or topic.
So, instead of just whining about it some more, I think that we should:
- try to actually utilize working code to create a more effective search function, or even utilize Google within Xisto to search the forum, including Boolean functions and the option to exclude content like signatures (which probably add to the confusion of irrelevant topics in search results)
- mandate a sort order for Similar Topics by date, and not include so many choices
- allow users to close their own topics if their thread is about a problem they need solved, to keep others from posting in an XX-year-old thread answering a question from a member who may not even be around anymore, with automatic expiry (a month or so of inactivity) before the thread is closed automatically, and give the OP the choice to reopen their topic if they so choose
- utilize the Get New Posts page as the default opening page for users logging in
Yay? Nay?
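As an aside, the "utilize Google within Xisto" idea above doesn't need anything fancy server-side to prototype: Google's public `site:` operator already restricts results to one domain, and a search box could simply redirect to a Google URL with that operator baked in. Here is a minimal sketch in Python; the domain and keywords are placeholders, not a description of any actual Xisto feature:

```python
from urllib.parse import urlencode

def forum_search_url(query, domain="xisto.com"):
    # Build a Google search URL restricted to one site, so the usual
    # Boolean operators (quoted phrases, OR, -term) keep working.
    params = urlencode({"q": f"site:{domain} {query}"})
    return f"https://www.google.com/search?{params}"

# Example: find Windows 7 upgrade threads, excluding Vista mentions.
print(forum_search_url('"windows 7" upgrade -vista'))
```

The same trick works with any hosted "custom search" offering; the point is just that a decent forum search can be bolted on without touching the forum's own (weak) search code.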
  7. We didn't miss you at all. I kind of miss your contributions to Windows 7 topics. I just got Windows 7 Ultimate x64 myself. Live life. We all understand that we have one. Except for the mods, of course.
  8. True... I forgot that his hardware is going to be liquid-cooled. I'm so into the mindset that overclocking won't make that much of a noticeable difference nowadays, and with the added heat killing the hardware that much faster, it just isn't worth it. Now, back in the day when you could hit that Turbo button to go from 33MHz to 66MHz, that was amazing... I do remember from prior research back in the day that it was basically the combination of the speed and latency of the RAM that gave it the oomph we describe. But isn't RAM stability an issue, as with overclocking a processor? You say a second or so of difference... to a tinkerer, that's a very big deal, but to the average user, it's not worth investing the time and effort to gain that extra second. I'm not belittling the art of overclocking, but from an "average user" standpoint, a second or even a few seconds isn't that big of a deal. When we start talking double digits, then it's on (10, 20, even 30 seconds would be very noticeable). I read somewhere that multi-core processing is evolving to the point where a load will be balanced amongst cores as opposed to just taxing one when software is the limiting factor, so you're right on that. The concern right now is that there isn't a heck of a lot of software that actually utilizes multiple cores, which is changing as we start seeing more dual- and quad-core processors on the mainstream side. You are definitely 100% good to go with compiling kernels on a quad-core processor; however, this leaves a wide audience [that excludes programmers and advanced users] still wondering whether their processors are actually being utilized to their full potential for their programs, games, and everyday tasks.
Like I said, I'm not saying that quad-cores or even dual-cores are useless, but maybe I'm under the misconception that you are limited to clock speeds and performance when software limits hardware utilization?
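The single-core-vs-multi-core point above is easy to see in miniature. The sketch below (a made-up CPU-bound workload, using Python's standard multiprocessing module) fans the same kind of job out across worker processes; whether it actually finishes faster depends entirely on whether the work can be split, which is exactly the software limitation being described:

```python
import multiprocessing

def busy_sum(n):
    # A CPU-bound toy task: sum the first n integers the slow way.
    total = 0
    for i in range(n):
        total += i
    return total

def run_parallel(chunks):
    # Fan the chunks out across one process per available core.
    # With only one chunk (or work that can't be divided), the
    # extra cores sit idle: the "one taxed core" situation.
    with multiprocessing.Pool() as pool:
        return pool.map(busy_sum, chunks)

if __name__ == "__main__":
    results = run_parallel([100_000] * 4)
    print(results[0])
```

On a quad-core, the four chunks can run at once; hand the same code one big chunk and you are back to single-core speed, clock rate and all.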
  9. I like this argument because you are right: it does help to branch your knowledge base out so that you can be familiar with and assist others with different operating systems. I'm glad that I jumped into Linux when I did, and even though I still consider myself quite the newbie with it, I don't balk that much at seeing unfamiliar commands and Linux CLI strings online (especially when it comes to hacking my iPod nowadays). It's actually a nice feeling to be able to look at once-unfamiliar stuff and actually be able to read it. However, in the career market, I don't know how useful knowing OSX would be unless you were applying to be a PC tech or specialist of some sort, since with the ~10% market share of Mac users come ~10% of your customers having an issue with their Mac. (Then again, the way that some of these elitists come forth with bragging about how awesome their Apple products are, they shouldn't be coming to you with any problems.) Linux is definitely a helpful knowledge base to at the very least touch on, but I think that a combination of Linux and Windows Server would be best for an IT position. That's just a guess, though. It depends on how comfortable you are with approaching each avenue. A lot of people are so used to using a GUI and take the CLI for granted, since they never really see it. Also, a lot of people nowadays are of the "gimme" attitude and won't actually do the research to begin with. Those that do venture into this void will either find a solution that they can cope with, figure out a solution that fixes their problem, or abandon it because no one has a solution to share or even before a solution becomes apparent because typing in a bunch of commands that you have no idea what they do and hitting enter repeatedly is not the best way to go, not to mention that people LIKE knowing what they're doing. 
Thankfully enough and thanks to its current users, Linux has become more of a GUI and user-friendly environment with its CLI working for it in the background, much like Windows in that aspect, which will attract more users in the long run once people start spreading the word. The common correlation with the word "Linux" automatically brings us to geeks and people who are computer savvy, but I'm sure that will change over time. Once developers start seeing an audience for Linux, I'm sure software support will follow suit, which will bring even more users in, etc.
  10. I'm actually running it right now (as I figured I'd give it a test run again) and I like and don't like it.
First impressions had me disappointed with the same baby-blue Fisher-Price theme, but I was very glad to find out that you can now change your theme VERY easily in "real-time" without a restart. The way that Google omitted the command toolbar and put everything to the right with the two drop-down menus is still a nice touch. I like how the status bar only comes up when it's required. All this, so far, shaves valuable real estate off the browser, which makes it "feel" bigger, even though in all reality you've only shaved off maybe 20-30 pixels. Browsing feels a tad peppier, which is nice. Nice to see that sandboxing is still in effect. I don't like how you can't re-open a tab you may have accidentally closed (or do I just not know how to?), but I love the history and how it's displayed. The location bar is very intuitive and I can search as well as go to my frequent sites without too much trouble; Firefox has a bit of a lag when it realizes that you typed in a search term instead of a URL. I like how downloads work out of the box, reminding me of DownloadStatusBar for Firefox.
However, there are a few glaring issues that will bring me back to Firefox after I finish this Chrome session. Ironically, GMail STILL has issues displaying in a browser that I'm trying out... and for it to be slower in Chrome than Firefox is an irony that I have to question. Sometimes my CPU becomes so taxed for some reason that Chrome will become unresponsive for seconds at a time, then, when switching tabs, will do a slow "fill" downward to view the page. (After the "fill," it performs normally.)
I like how the tabs resize themselves when you close ones you don't need, but I don't like how Chrome tries to fit all the tabs you have open in the available space instead of shoving them off to the side like Firefox does and adding an arrow to scroll through them. (It makes it impossible to figure out what tabs are open when you're busy middle-clicking, but I'm sure this only affects the few weird people that do that sort of thing. )I know I have only tried this browser for a good five hours, but so far, it's not enough to make me switch. However, I have to commend Google for a pretty good browser "right out of the box" and would encourage others to give it a whirl. Obviously, people like it (or are just under Google's vats drinking the Kool-Aid), so maybe it has more credit than I give it. Also, keep in mind that I'm biased towards Firefox anyway, but gave this one a go, so your mileage may vary.Addendum: Google Chrome does NOT play well on an older machine. I'm running a 2.2GHz processor with 1GB of RAM on Windows XP SP3 and even though these specs aren't too shabby, Chrome just started to crawl badly enough to the point where it was unusable (long unresponsive periods of time, slow response to scrolling, etc.). It's like the longer this thing runs, the slower it gets. Again, YMMV.
  11. Patient. I'm currently running a 2.13GHz dual-core processor, which I suppose is pretty darn good compared to what I'm used to (1.6GHz dual core, 3.2GHz single core, 1.2GHz single core...), and personally, I don't know if you could REALLY tell a difference between my computer's performance and yours unless you count seconds or tens of seconds... unless you were doing something that made use of all four cores. That's how I figure it in my head, anyway... if software doesn't utilize multiple cores, you're stuck with 2.66GHz, which is probably a handful of seconds faster than mine. If you overclock, maybe you'll have something that's actually much faster... but in the best interests of hardware longevity, think about what overclocking will really do for you versus making that quad core last as long as possible... unless we get octocores before that i7 dies. Windows 7 plays pretty darn well with games. I'm working on Fallout 3 right now and the NVIDIA 9800M GS with 512MB in this laptop runs great. I've also tried running Command & Conquer 3 and it runs pretty well without lag, plus I usually have Firefox open in the background along with a handful of other things I may have been working on. Think of XP performance with DirectX 10 capability and you've got W7 gaming performance in a nutshell. If you have a working case, I say you keep it unless you find something better down the road. Apparently you did your research and you did ask for peer review (or just a chance to brag about your future rig, right?), so I think you'll be set with the case you ordered UNLESS for some reason it doesn't fulfill your expectations. I know that you were going for uniqueness, but in all reality, a gaming tower is a gaming tower unless you have it custom made AND in a form factor that would turn heads, like making it out of Legos or out of an old NES case. Otherwise, I personally recommend you go for function and cooling efficiency... 
and since you already bought the case, I'm not going to bother looking at the reviews for it. However, I do remember some of my choices that you might want to take a look at in case you had an oopsies! moment and want to take another look at cases. (ThermalTake and Lian-Li have some very good cases, the latter being much more expensive because of the name.)
  12. I LOL'd. Sorry Bani... calling out someone's English deficiencies and then following up with your own was just plain funny. All in good fun. If you read up on Karmic Koala, the goal was for a 25 second boot-up time, which I'm sure doesn't include POST, but that's still faster than my current Windows 7 boot time of about a minute and a half. The release after that is looking at a 10 second timeframe, and if you want to look at Linux as a whole on this goal, they've already got me booting up my ExpressGate "distro" in five seconds, INCLUDING POST.
  13. Yikes... I hope that this setup is as stable as you want it to be. Overclocking would be unnecessary, but then again, I know how it feels to be able to "just do it." I'm pretty sure that you'll be happy with Windows 7... personally, I am so far. I like rvalkass's pointers... have you considered them at all? To build such a massive machine only to bottleneck yourself with hard drive access speeds, RAM latencies, and the efficiency of cooling in a "standard" case might be of concern. A little too late now, but hopefully you have a good return policy. Hope that graphics card fits too. It'll just suck if it doesn't... Let us know how your building and operating experience goes.
  14. I wouldn't say that it "passed" Opera... and I can't say much about Safari because I don't use it on a computer. (A better term would be that it "caught up in a matter of months," which is very impressive. Then again, look at the company footprint of Google.) I use Opera as my "backup browser" (... that just sounds silly) and as my reach out to try something new, and I really like it. The only thing that keeps me from using it all the time as a competitive alternative is that the last time I used it, GMail was having wicked issues with Opera, and things were just SLOW. I'm actually very glad that it is a viable alternative to Firefox, though. I tried Google Chrome back in the beta stages, and although I do love the sandboxing technique, I personally haven't had a lot of crashing going on with Firefox, not as much as I used to back in the infancy of its 2.0 release. (I actually can't remember it crashing... watch it crash on me now. ) What I do remember about Google Chrome, specifically, was its Fisher-Price-looking baby-blue theme that turned me off very quickly and got me back into Firefox with my at-least-decent-looking default theme skinned with WindowBlinds. I'm sure that's changed now, which is why I'll be back to play with Chrome a bit more to see whether it's a viable alternative to Firefox or not (which I'm sure it is).
  15. I wanted to make the comparison with Karmic Koala based on what it boasts for features, along with the reliability and goodness that is Jaunty Jackalope, since it would be unfair to pit an older Ubuntu distro against Snow Leopard and Windows 7. (Then again, I'm pitting just Ubuntu, which isn't fair in itself, but it is one of the most popular Linux distributions.) FouGilang is the Joe Schmoe I'm talking about, and the kind of user I want to base some of the OS-level debate in this topic around. Joe Schmoe wants to do basic stuff like surfing the Internet, word processing, and the essentials. Wouldn't Ubuntu be the brain-dead choice, since it's secure thanks to frequent updates, a great community, and the whole nature of it being obscure in the operating system market (so far)? He could also go with OSX (on an OS-strict level) and be happy with Snow Leopard, because it still has an advantage over Windows in being secure through obscurity. (However, Ubuntu still wins because it's free and free to install on any machine, whereas OSX is only available to the general consumer via a less-cost-efficient Macintosh.) If for some reason Joe wants Windows because it runs everything and developers make everything for it, he trades the market obscurity for that versatility in software IF he chooses to play games outside the scope of WINE'ing in Linux or what's supported on OSX. I know that Joe probably isn't savvy enough to work a dual-boot or even know what it is, but even if he did, wouldn't the evident choice be to stay with Windows as the all-purpose OS (albeit the target of every piece of malware imaginable), instead of doing what a power user (note: not expert) like me would do, dual-booting because I can and because it probably would be best to do my casual work in Ubuntu 9.10 and game in Windows 7? Here's my personal issue. 
From a practical standpoint, it makes sense to me to just stick with Windows 7 because I do everything on it, including gaming. It already has everything that I want, thanks to the fact that developers can only make the most money by focusing their efforts and offering their products to a Windows-saturated market. If Ubuntu had developers replicating those efforts and had as great a software library to pick from, the brain-dead choice would be Ubuntu... but unfortunately, it doesn't. If I decided to dual-boot Ubuntu and Windows, I would have to switch to the other operating system every time I wanted to play a game that works best with Windows, which is a pain in the balls / bollocks / what-have-you. If I didn't play games, it would make sense for me to just go with Ubuntu, because in every other department it rocks as long as your hardware is supported, AND it's actually very powerful once you get to know it as far as customization, programming, and file and access control go. Why dual-boot, then, never mind triple-boot? The bookmarking issue can be rectified by utilizing Delicious or something along those lines... I haven't jumped onto that bandwagon yet, but it makes sense to. (One of these days... ) The history is going to be a pain, though... I know exactly how you feel when I'm trying to find a website I looked at earlier, until I realize that I looked at it in Ubuntu instead of Windows. I usually work off of an external hard drive or my USB flash drive for all of my frequently-accessed documents and such, so I don't run into too much of an issue trying to get back at a document in a different operating system. (Yes, you lose the convenience of accessing it via the Recent Documents feature, but I don't use it often enough for it to be much of a concern.) Ubuntu supports NTFS pretty well now, but if you do decide to jump into Microsoft's new exFAT file system, things might get a little hairy there. 
The waiting period is huge with Windows, and that was one of the things I mentioned earlier. If I want to play a game, I want to just run it then and there... not wait to shut down Ubuntu, reboot, choose Windows in the bootloader CORRECTLY, then wait the minute and a half to two minutes to boot into Windows to play my game. In all reality, it's not that much time, so it isn't a realistic complaint since you can just take a break from your computer, but what are you going to naturally do when you quit your game and want to surf the web? Shut down Windows and boot up Ubuntu to fire up Firefox, or run Firefox from Windows? Haha... I want to work for your company. When you NEED to have a copy of CounterStrike on your work laptop, you know you've struck gold in your career.
  16. Depending on what you already have for technology, 2TB might actually be as essential as 2GB was back in the day when we were happy with 200MB hard drives. With the massive amount of movies, music, and pictures I have on my hard drives, I had to upgrade first from 200GB to 500GB, then from 500GB to 1.5TB because I actually filled up my 500GB. And of course, I had to buy in pairs for a physical backup, so it's actually rather expensive, but worth it, considering that no one thinks about how cold your blood runs when you hear that dreaded clicking sound from a hard drive that contains your digital life. (I read a post somewhere by some elitist saying that he never had to back up his crap because he was running OSX. I can't wait for his hard drive to fail. Yes, I'm evil. ) I gave my 200GB to a friend of mine and now have two 500GB hard drives and my new 1.5TB hard drive. I have one 500GB posing as my media drive that I plug straight into my Xbox 360 so I can avoid streaming movies (formatted to FAT32 using GParted), use the other 500GB to hold my backup minus my movies (since those are on the media drive, "backed up"), and use the new 1.5TB as my external hard drive that holds everything. If you're one of those folks who think XXXGB is enough, think about future-proofing yourself. With games getting larger, programs leaving a larger hard-drive footprint, HD movies, and the hundreds to thousands of pictures you take with your camera (from 1 to 3MB each)... it's easier to justify getting what seems like a humongous and unnecessary capacity than to face running out of room. Wait as long as you can for price-per-GB to drop as low as it can, then buy the second-newest thing. Put an extended warranty on it if you use your drives frequently enough to worry about a failure down the road, and drive on.
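The "wait for price-per-GB to drop, then buy the second-newest thing" heuristic is just a division, but it's worth doing explicitly when comparing drives. A quick sketch, where the prices and capacities are made-up example numbers, not real quotes:

```python
def price_per_gb(price_usd, capacity_gb):
    # Cost efficiency of a drive: dollars per gigabyte of capacity.
    return price_usd / capacity_gb

# Hypothetical listings: (label, price in USD, capacity in GB).
drives = [
    ("500GB", 60.0, 500),
    ("1.5TB", 110.0, 1500),
    ("2TB", 200.0, 2000),
]

for label, price, cap in drives:
    print(f"{label}: ${price_per_gb(price, cap):.3f}/GB")
```

With numbers like these, the newest, biggest drive is often the worst deal per gigabyte, which is exactly why the second-newest capacity tends to be the sweet spot.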
  17. This kind of dumbs down the morals of the question and relies on simple practicality. One loss is better than ten, so you would pick the one. However, let's throw a twist in here. Let's say that the one is a six-year-old girl, and the ten people are full-grown adults. Or one adult and ten elderly people. Since you don't know any of them, how easy would it be to think in the more practical way? I think that instinctively people would make a quick decision based on the age group and the numbers, then make that swerve. No matter what, the consequences afterward are always going to keep you up at night, even if you find out that the one or ten people turned out to be douche bags or wanted criminals.
  18. The thing is that I play 720p HD video on my aging HP Pavilion zd8000 laptop, which is spec'd with an Intel Pentium 4 3.2GHz processor, 2GB of DDR2 SDRAM, and a dedicated ATI Mobility Radeon X600 graphics chip (PCIe). As old as this thing is, it plays 720p without too many hiccups, but if I want true fluidity without any hiccups with larger files, I need to keep Firefox closed, etc. Comparing the two, the X600 and the 3650 are comparable in benchmarks until you start getting into the newer 3DMark versions. I don't know if this translates into HD performance or not, but I shouldn't be getting better HD playback than you do. Then again, I'm running Windows XP SP2 on that machine, so maybe it's the codec that's slowing you down, or how Linux is handling that codec? If you're serious about watching HD, I'd try another operating system just to see, and of course, for the good of all computer users wondering the same thing. Although you wouldn't want to (and it's counter-productive), reducing your resolution might help with lag, especially if your video card doesn't even support the resolution you're trying for. (You should be good, because the specifications noted for the 3400 series say that you can go up to 1920x1080. Emphasis on should.) The form factor of anything is the physical property, size, and shape of a component. Basically, I was stating the obvious: you can't shove a square peg into a round hole and expect it to work. NVIDIA has/had MXM as a mobile solution, but I've only done this once, and with two similar Dells that I presume both had MXM form factor chips. That stupid Vista-Ready sticker is the equivalent of slapping a Greddy, AEM, or Shelby sticker on my Honda Civic. Just because I have the sticker doesn't mean the hardware under the hood is any better, and it definitely doesn't make me go faster. 
Try the XP solution if you have some time on your hands and install the latest version of the K-Lite Mega Codec Pack for brain-dead codec management and inclusion of the most popular codecs. Fire up VLC or Windows Media Player and let us know.
  19. LOL... duh me. I'm retarded. For those of you who have a jailbroken iPod Touch / iPhone and want free apps, check out Appulous.
  20. ... but if your wireless capability on your laptop goes kaput, how do you tether the iPod Touch to your laptop...? Magic? And if you get a WiFi signal to share with your iPod Touch, what are the chances that your laptop is probably getting that same signal too? The only reason that I can see here to use this is to expand your wireless coverage using the iPod Touch as a repeater... but that's just a waste of battery power, not to mention that you leave your iPod Touch out of your hands. And I'm not willing to leave my iPod 10 feet away from me. I've already had a $300 Creative Zen Vision M stolen, thank you very much. [Am I missing something here?]
  21. Nice... I'm loving the responses. (I don't mean that in a sarcastic way, either!) Don't get me wrong... I actually loved working with Ubuntu 9.04 when I had my Dell (before I sold it) and preferred it over Windows on my work laptop after-hours because it "felt" so much faster and better. I bring this topic up because, from my non-developer's perspective, I didn't see enough reasons for Joe Schmoe to divert away from the 90% market share to try something else. The people I affectionately refer to as elitist Macheads seem to flood every forum and every comment section on the Internet with how Macs are so much better than Windows-based machines, and of course, the Linux user occasionally chimes in. Linux has come a LONG way from what it was before, and I agree that it is very easy to use in comparison to how it was for me back in the day when I tried out Kubuntu 7.04. I'm actually a command-line kind of guy (as anyone can witness from my AutoCAD habits), which truefusion nailed, but it was a little frustrating for me to learn simple things that I wasn't used to, like make'ing and compiling, getting used to sudo, using shell scripts, etc. Not to mention that almost every time I installed Ubuntu onto a machine, I ALWAYS had to figure out how to make SOMETHING work, whether it was the stylus for a TPC, or the wireless capability on a network card, etc. Even with the forums out there, it truly isn't always a one-size-fits-all solution, and that's when I start getting frustrated with it, wondering why I'm jumping that hurdle instead of just sticking with Windows, which already has the drivers. From my perspective, however, it seems redundant for me to install yet another operating system to do what I already do with Windows. I'm probably going to end up installing Ubuntu 9.10 anyway, just because I can, plus I do like Ubuntu. 
I'm not really sure where I was going with my initial post, but it aroused my curiosity as to why people boot multiple operating systems when, for the most part, one of the three has most of the market share and therefore the most software available for it. And if a user didn't care for what Windows had to offer, wouldn't dual-booting Linux and OSX be kind of redundant? FYI, I threw out what I knew of the pros and cons for each operating system to try to give some sort of fair perspective on all of them... but it's kind of hard when I only have experience with Ubuntu and Windows. @truefusion: If you want to try OSX out, look at Chameleon 2.0 RC3 and give that a whirl. If you can get your hands on a SL CD, maybe you can give a good comparison between Linux and OSX?
  22. What a crappy card... NVIDIA 8400 GS, eh? What happened with snagging the 8800? Was it not available? One of the things that I would have personally done was to aim for a graphics card with at least 512MB of RAM and GDDR3 to future-proof myself... but then again, that's me, plus I don't know how much money you really wanted to spend. (I know that prices can be daunting when you have a budget nagging in the back of your brain...) You're going to have to tell us how it performs, because Tom's Hardware doesn't even have it on their benchmark comparison list. As far as that cheap shot at Ash-Bash goes, he may be a cocky one, but he's not idiotic, as far as I can tell.
  23. So we've debated, all over the Internet, which operating system is better than the others, and maybe even thrown in some factual information as to WHY it's better. I'm probably going to attract trolls with this topic, but I want some factual information on which to base my decision to stay with one operating system, dual-boot, or even triple-boot. I saw an article today on Cnet about Psystar, still with an Apple bulls-eye target on it, which released yet another piece of controversy: Rebel EFI. It is a bootloader that allows you to install the highly-touted and esteemed OSX, including Snow Leopard, onto a PC with an Intel processor. (AMD fans will have to wait.) Wading through the fanboyism, I saw an interesting little snippet: that this was nothing new, and that there was already a project out there called Chameleon that essentially did the same thing. Google led me to a thread about triple-booting Windows 7, Ubuntu, and Snow Leopard, and even though I don't see any immediate desire or need to do so, I thought it would be a pretty cool idea to have three awesome operating systems on my machine... just because I can. Then I came to think about these evil things we call reason and practicality, and that's when I wondered to myself: Why would anybody actually choose to triple-boot three operating systems on one machine? Heck, if you're a native Windows user, why would you even want to consider Linux or OSX? The unfortunate thing about this post is that the Windows platform is ubiquitous, and it makes sense that developers don't really focus their software on the less-than-10% market share held by Apple and Linux users. The Windows users love to say how Linux and Apple aren't gaming platforms, that productivity is severely limited with big-name business programs being exclusive to the Windows platform (like Peachtree), that PCs are more cost-efficient than an Apple machine, etc.
On the consumer end, the reasons you would want Linux on your machine are security and the fact that it is a full-featured operating system available FOR FREE, with a large supporting community, frequent updates, and a general advancement toward what users want. However, Linux falls short on the consumer end if the end-user wants to play popular games requiring DirectX, and you definitely can't just WINE everything. (It does, however, offer some competitive free alternatives to popular M$ software, like OpenOffice and GIMP.) OSX is in a similar boat, offering Apple exclusives as counterparts to Microsoft software, but still falling behind on the gaming market. Also, with businesses relying on Microsoft-exclusive software to run their accounting needs (Peachtree) and whatnot, it's a hard sell for Apple. So, from the perspective of the operating system, I can see these essential pros and cons for each, on the consumer level:
-
Windows (Windows 7)
Pros: Owns the developing market and has the most market share, DirectX support, gaming, being the best Windows OS yet by being more user-friendly and even pretty to boot, advanced security features, native 64-bit support, exclusive popular and efficient Windows-platform software
Cons: Is the main target for malware authors, infamous bugs like the BSOD

OSX (Snow Leopard)
Pros: Very user-friendly and pretty to boot, security through market obscurity, exclusive popular and efficient OSX-platform software (iMovie, etc.), ability to run on any Intel-based machine, efficient with hardware (?)
Cons: Apple-exclusive, takes much effort to get installed on any machine other than an Apple, fewer choices in popular productivity software due to market obscurity, limited gaming library

Linux (Ubuntu 9.10 Karmic Koala)
Pros: Very user-friendly and pretty to boot, security through market obscurity, very efficient with hardware (?), constant updates, customization capability, free
Cons: Command-line work scares off most casual users, support is mostly user-based, limited gaming library, driver support is hit-or-miss at times
-
I am only judging the other operating systems by what I know of them, so please correct me if I'm wrong. And please note that if you are going to compare OSX to Windows, it's strictly an operating system comparison and not a rant on how your MacBook is uber-awesome because it just is. With all of this being said, it seems that I should just stick with Windows 7 to fully encompass everything I want to do with my computer, including productivity and gaming. Yes, I'm excited for Karmic Koala's release just so I can play with my beloved cube again and see that 25-second boot come into play, but at the same time, my Asus G50VT-X5 already has a Linux distro on it with ExpressGate, which boots up in 5 seconds as soon as I push that button and gives me access to e-mail, IM, Internet, media sharing, etc. What arguments do people have for multiple-booting with Windows and OSX, or Windows with Linux, or OSX with Linux, or what-have-you? Do you really use both operating systems "equally" (as in for the reasons you have multiple operating systems, e.g. Linux for secure, casual use and Windows for gaming)? Is the case of a video editor being platform-exclusive worth dual or triple-booting just so you can edit movies on one operating system but then switch to another for everything else?
  24. Haha... I just jailbroke my iPod Touch last night. Amazing potential, although it's probably like a Pandora's box. I used BlackRa1n's super-easy jailbreak executable, and lo and behold... that was easy. [Cue in Staples royalties for rayzoredge] The only pain-in-the-butt part was updating my firmware. I had 2.1.1 on my 2G, so in my research, I found out that I could update to 3.1.1 by downloading the firmware online, running the latest version of iTunes (as I ran into trouble with whatever older version I had, which I installed to get my iPod Touch recognized under Windows 7 x64), then holding down Shift before clicking Restore to update the firmware. After 3.1.1 was installed, I could access the 3.1.2 firmware update legitimately through Apple for free (as opposed to the $4.95 they wanted). Where there's a will, there's a way. After the firmware update (which wiped out everything on my iPod Touch), I ran BlackRa1n's executable, let the iPod Touch reboot, and ta-da! It was done. I restored my backup (which, strangely enough, didn't restore my music), added Cydia, and now I'm good to go. With the jailbreak, I would like to recommend even more applications now:

Backgrounder - Allows you to keep apps running in the background. I would imagine that you can leave an e-mail app or an IM app running so that you can continue to receive calls, IMs, or e-mails even with your iPod / iPhone in sleep mode. Haven't really used it quite yet, but I can tell that this can be a good thing and a bad one as well... since available RAM is going to be an issue.
SBSettings - More advanced settings to help you save power on your device, like turning off WiFi, Bluetooth, hiding icons, etc.
Categories - You have to get this if you have a million apps. Allows you to organize your apps into folders so you don't have to thumb through all of your icons.
WinterBoard - This is what WindowBlinds is to Windows XP. Customize the look of your iPod Touch / iPhone. Another must-have if you like being personal.

There are still a bunch I haven't tried but am looking to check out:

iPhoneModem / OpenSSH - Use this to tether your computer to your iPod Touch's WiFi connection. I still don't really find a use for it, but maybe someone here will. Or maybe it's more so for the iPhone's broadband capability, but I'm sure that data plan's going to rear its ugly head.
IntelliScreen - Another must-have. It makes your Lock Screen informative so that you can check stuff without having to unlock your iPod Touch and find the app that you want to check.
Navizon - Pseudo-GPS for the iPod Touch / iPhone. Not sure how well this would work since it would need WiFi to find where you are, but it's worth a shot.
DropCopy - Apparently, an easy way to copy files to your PC from your iPod Touch / iPhone.
  25. Just to help you out from the Western front... Tom's Hardware: Graphics TH has an extensive knowledge base on graphics cards, with benchmark results for tons of games and tons of information on most graphics chips. I highly recommend you take a look there before making any final choices. Tom's Hardware: Best Graphics Cards for the Money (Sept. '09) The NVIDIA 8800 GTS will run you £39, which is roughly $65 USD. You have £155, which is roughly $256, to spend after purchasing GTA IV. Let's take a look at your choices, using TH as your research base.

For comparison (all Fallout 3 numbers at 1680x1050, 4xAA, 8xAF, Very High Quality, plus 1920x1200 with no AA or AF; all Far Cry 2 numbers at 1920x1200, 8xAA, 16xAF, Very High Quality):

NVIDIA GeForce 8800 GTS (512MB), about $70 USD: Fallout 3 at 59.3 FPS (71.1 FPS at 1920x1200); Far Cry 2 at 16.5 FPS.
ATI Radeon HD 4850 (512MB), about $100 USD: Fallout 3 at 54.6 FPS (64.4 FPS at 1920x1200); Far Cry 2 at 9.6 FPS.
ATI Radeon HD 4870 (512MB), about $130 USD: Fallout 3 at 71.1 FPS (82.8 FPS at 1920x1200); Far Cry 2 at 11.1 FPS.
ATI Radeon HD 4870 (1024MB), about $150 USD: Fallout 3 at 76 FPS (85.9 FPS at 1920x1200); Far Cry 2 at 31.9 FPS.
ATI Radeon HD 4870 (2048MB), about $230 USD: Fallout 3 at 73.8 FPS (84.6 FPS at 1920x1200); Far Cry 2 at 30.4 FPS.

I took those cards straight off of the best-bang-for-your-buck list. Now we can take a look at what you'll get if you decide to deviate from the 8800 GTS. I think we can safely rule out the ATI Radeon HD 4850 because, for some reason, you'd be paying more for a performance decrease, with any gains being negligible for the price hike. Thus, we're left with the following options:

Pay $60 more for the ATI Radeon HD 4870 (512MB) for about a 25-33% increase in overall performance, and bring DirectX 10.1 to the table.
Pay $80 more for the ATI Radeon HD 4870 (1024MB) for about a 30-43% increase in overall performance, and bring DirectX 10.1 to the table.
Pay $160 more for the ATI Radeon HD 4870 (2048MB) for about a 26-38% increase in overall performance, and bring DirectX 10.1 to the table.

NVIDIA and ATI both bring their separate proprietary technologies to the table. (PhysX is one notable NVIDIA addition. I'm not too knowledgeable about ATI's. The 8800 GTS is apparently capable of PhysX, although an initial Google search makes it seem like a headache to set up.) Ultimately, it's your money and your call. On a strict frame-rate level, paying almost double for only a 25-33% increase is kind of dumb, but if you're going to keep this video card for a few years, maybe it would be a good idea to future-proof yourself so that you're not lagging in other games you want to play as soon as next year.
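To put rough numbers behind the bang-for-your-buck argument, here's a quick sketch computing frames per dollar from the Fallout 3 (1680x1050) figures and the approximate USD prices quoted above:

```python
# Performance per dollar using the Fallout 3 (1680x1050, 4xAA/8xAF,
# Very High) benchmarks and rough USD prices quoted in this post.
cards = {
    "8800 GTS 512MB": (70, 59.3),   # (price USD, FPS)
    "HD 4870 512MB": (130, 71.1),
    "HD 4870 1024MB": (150, 76.0),
    "HD 4870 2048MB": (230, 73.8),
}

for name, (price, fps) in cards.items():
    print(f"{name}: {fps / price:.2f} FPS per dollar")
```

The 8800 GTS lands around 0.85 FPS per dollar versus roughly 0.32-0.55 for the 4870s, which is the "paying almost double" point in numbers.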