xisto Community

rayzoredge

Members
  • Content Count

    1,047
  • Joined

  • Last visited

About rayzoredge

  • Rank
    That Guy Who Doesn't Know What He's Talking About
  • Birthday 09/23/1985

Contact Methods

  • Website URL
    http://www.myspace.com/rayboyington

Profile Information

  • Gender
    Male
  • Location
    New Durham, NH
  • Interests
    Computers, chat, gaming, snowboarding, paintball, web design, music

Recent Profile Visitors

15,144 profile views
  1. hey- HAPPY BIRTHDAY! where have you been???

  2. Haha... apparently when you say "trusted," people thought "not misleading people with information." Just going out on a limb, but I think Ash-Bash actually means a "trusted member," as in "more privileges."

     We already have our custom titles, our post counters, spam control, moderators, etc. Why roll out an "elite group" for a forum? To further separate members from each other? It's not like you actually get anything out of it other than maybe the gloating factor... because I'm not sure about you or anyone else, but I'll continue to look down on idiots, praise good members, applaud improvement, and treat people as they present themselves, not by a title. That works in real life, too... except I just bite my tongue when the douchebag is a superior with direct influence over my career. =)

     One thing I can add to Ash-Bash's suggestion, however, is that maybe as a "trusted member" you'd be able to edit your posts, or get something else small but useful to the user. The custom title was already a nice touch... and I like how people have Spam Control badges, which I'm sure is a form of trust in another person's judgment. Maybe if you report enough spam topics or plagiarism, you earn badges for Spam Control or Copyright Patrol... COP? =) Could be dumb, but if people are asking for statuses now, it probably fits the bill with what Ash-Bash is asking for.

     I still liked the badge thing we had back in the day... anyone remember that? It was dumb, but at the same time, it was a warm fuzzy feeling to get pats on the back for good posts, which other members could see, and you could give good posters kudos as well as give one-liners and spammers negative feedback. Maybe we can bring that back but not have as many badges (like the WTF? badge... for absurd posting?).
  3. I just wanted to point out also that this HDTV supports a number of file formats and audio and video codecs when playing off of an external USB hard drive connected to one of its two USB ports. However, watch out for movie files that have more than one audio track... I suspect they don't play nicely with the HDTV, since my copy of "Holes" won't play properly, freezing frame by frame and then stopping altogether. (That particular file has a 2-channel audio track as well as a default 5.1 audio track.) That's what I suspect, anyway. Also, playing movies with a 5.1 audio track this way DOES result in 5.1 sound when the digital output is connected to a surround sound receiver. Strange that this works and HDMI doesn't...
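     (If you want to check up front whether a file carries more than one audio track before copying it to the USB drive, a quick sketch like the following works. It assumes ffprobe from FFmpeg is installed and on your PATH; the file name is just a placeholder.)

```python
import subprocess

def list_audio_streams(path):
    """List the audio streams in a media file using ffprobe (part of FFmpeg)."""
    result = subprocess.run(
        [
            "ffprobe", "-v", "error",
            "-select_streams", "a",  # audio streams only
            "-show_entries", "stream=index,codec_name,channels",
            "-of", "csv=p=0",
            path,
        ],
        capture_output=True, text=True, check=True,
    )
    # Each output line is "index,codec,channels", e.g. "1,ac3,6" for a 5.1 track.
    return [line.split(",") for line in result.stdout.splitlines() if line]

# Hypothetical file name; a file that lists two or more audio streams here is
# the kind that gave my HDTV trouble.
for index, codec, channels in list_audio_streams("Holes.avi"):
    print(f"stream {index}: {codec}, {channels} channels")
```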
  4. I'm guessing that someone didn't even bother, thinking that it wouldn't be that big of a deal... or that person was "getting around to it." Text files? Do it right the first time and you won't suffer the bad PR that comes with a massive blow to security like this.

     Even though it was their fault for basically leaving the keys in the car, it goes to show how stupid people really are when it comes to passwords. You trade security for convenience, and if you're protecting your financial information with these kinds of passwords... well, you deserve to have your crap stolen from under your nose if you're that lazy.

     Good rule of thumb for passwords: letters, numbers, uppercase, lowercase, and symbols. Mix it up and you'll severely reduce your chances of an easy brute-force hack. I'm a lazy guy by nature myself, but I still go by that philosophy, and it's relatively simple to stick to while keeping passwords easy to remember.

     I used to use "ThisSucks11!!" for one of my workstations when I was stationed in Germany. It has two capital letters, the rest lowercase, two numbers, and two symbols to meet DOD standards for passwords, plus it was easy to remember because... well, my job sucked. See how easy that is, and how much more of a force to be reckoned with it is compared to "123456"?

     You can do the same thing to keep yourself safe. Use a phrase, or even a name. "Michael" can turn into "MichaelJames13!!" and "Password" can become something as easy as "Pa55word!!", which gives you the benefit of adding numbers and symbols while essentially staying the same password.

     Of course, this is all useless if you're the kind of guy who loves to write your passwords down on a sticky note and stick them to the side of your monitor...
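     (To put rough numbers on the "mix it up" advice, here's a back-of-the-envelope sketch of how the brute-force search space grows with character variety and length. The alphabet sizes are my own approximations, nothing official.)

```python
# Rough brute-force keyspace comparison.
# Approximate alphabet sizes: digits 10, lowercase letters 26, and a full mixed
# set of roughly 94 printable ASCII characters (upper + lower + digits + symbols).
def keyspace(alphabet_size, length):
    return alphabet_size ** length

print(f'"123456"        (digits only, 6 chars):  {keyspace(10, 6):,}')
print(f'"michael"       (lowercase, 7 chars):    {keyspace(26, 7):,}')
print(f'"Pa55word!!"    (mixed, 10 chars):       {keyspace(94, 10):,}')
print(f'"ThisSucks11!!" (mixed, 13 chars):       {keyspace(94, 13):,}')
```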
  5. Yikes... I don't know about you, but from what I'm reading, "Celeron" is synonymous with "suck" in the CPU world. Take that 3.3GHz figure and knock it back to something like 2.6GHz and you'll have more of an idea as to its "effective" clock speed. The world has been changing, and a few years ago, I thought 2GB was all you needed for just about everything short of crazy AutoCAD renders, working with Adobe software, or any other RAM-intensive application. Nowadays, I occasionally bring my laptop with 4GB of RAM to its knees while trying to work with a sizable image file in Photoshop... which makes sense. Upgrade to 4GB of RAM, then sit on that until I start saying that you'll need 8GB. (You shouldn't really ever need more than 8GB of RAM, though... I swear. And gaming doesn't show much of a difference between a machine equipped with 4GB of RAM and one equipped with 8GB... we're talking single digits of frames per second here.)

     Bluedragon is right about your limitations for most games that are GPU-intensive. 3D games, and especially fast first-person shooters, can really take a toll on your graphics card, and it's depressing to know that an upgrade isn't exactly cheap. Your card, at first glance, sounds pretty good relative to your setup, but if you're complaining about slow speeds during gaming, you might just want to look into upgrading to a whole new system. You can snag a pre-built system with a quad-core processor, 4GB of RAM, and a decent hard drive for sub-$600 prices nowadays, so I'd look into that and throw a good graphics card into it to get the best bang for your buck as easily as possible.

     But I'm sure that's not what you wanted to hear, because like 95% of the people here on Xisto (including me), you too have a budget. It's easy for anyone to recommend the obvious: upgrade your hardware, defragment your hard drive, clean this, do that. However, sometimes we just have to do what we can and suck it up. This is what I have to say as far as not spending a dime:

     Defragmentation only improves your performance by a marginal amount. What it does is basically "organize" clusters of data together so that your hard drive doesn't have to seek across the entire platter to put together the pieces to load something... it can just get all the data from one place on the platter and save you those microseconds. Yeah... not a huge difference, unless your hard drive is really, really badly fragmented. (I believe NTFS helps with that, though.)

     Cleaning out your registry with third-party tools can be good for your start-up time (since there are fewer entries for Windows to consider), but I don't know how much of a performance boost it would grant. I don't think having a clean registry will help your overall performance after boot, but I could be wrong.

     Overclocking your CPU using software could be slightly beneficial, but if your GPU is holding things up, it won't help you much in that case. Overclocking the GPU would help to an extent, but then it's the longevity of your hardware that you're trading away.

     Making sure you have enough hard drive space for your page file will keep your computer from crashing. The page file only gets used when your system runs out of RAM at any given point in time, and using your hard drive as RAM is quite possibly the slowest way to go... which is why everyone says to get more RAM. You won't need more than 4GB unless you run a lot of applications at once and multi-task.

     Limit your start-up programs to items you really need. Personally, I tell anti-virus auto-protection to take a hike, but leave the firewall up. (I use my AV program to scan manually when I know there may be a risk in running sketchy executables.) Anti-malware programs can take up quite a bit of CPU and RAM sometimes, especially if you allow background tasks.

     If you want, you can fine-tune the services that start up with Windows, although this will free up just a marginal amount of resources... so it's your call (again) whether to risk disabling a service that you may need. (Fortunately, you can just turn it back on if you accidentally disable your Internet or your ability to communicate with other computers on your network, etc.)

     Not sure what else you can do there... like I recommended above, I'd upgrade to a whole new machine if and when you can.

     EDIT: I notice that the "edit" function comes and goes... this is the second time I've seen it. Anyway, SLI (Scalable Link Interface) is nVIDIA's technology that lets you run two identical graphics cards in tandem to give you more GPU power, although it comes nowhere near doubling your performance. At best, it can give you up to a 60% boost... which can make sense economically if you can snag another 9400GT for $60 and your graphics card is indeed the bottleneck in your setup for gaming, but you aren't exactly getting the best bang for your buck. Situation-dependent, really...
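     (Going back to the page file point: a rough way to see whether you're actually running out of RAM and spilling into the page file is a couple of lines of Python. This is just a sketch and assumes the third-party psutil package is installed; it's not something built into Windows.)

```python
import psutil  # third-party: pip install psutil

mem = psutil.virtual_memory()
swap = psutil.swap_memory()  # on Windows this reflects the page file

print(f"RAM:       {mem.used / 2**30:.1f} GB of {mem.total / 2**30:.1f} GB used ({mem.percent}%)")
print(f"Page file: {swap.used / 2**30:.1f} GB of {swap.total / 2**30:.1f} GB used ({swap.percent}%)")

# If RAM sits near 100% and the page file keeps climbing, more RAM (or fewer
# running programs) will do far more for you than defragmenting or registry cleaning.
```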
  6. Apple's iPad (video) Not sure if QA let this one slip on purpose... but really, they mocked the name years ago and now they're actually calling it the iPad? A bigger iPod Touch... whoop-dee-doo. Yeah... I'll get right on that with its super-fast 1GHz CPU. What's the graphical potential on this thing? Gaming is going to be limited... kind of like how we compare the gaming potential of the Wii to the more powerful 360 and PS3. I guess those accelerometers will make it more fun though... I'll get to play my favorite iPod Touch games in full 1024x768 resolution (if I dock it...)!

     I do like the connectivity though... 3G capable of wireless anywhere, and unlocked to boot. I guess Steve Jobs learned from the massive failure that was the locked iPhone tethered to AT&T's yet-to-recover network... good to know that he's actually paying attention to consumers. The price point isn't too bad, either. You can spend $200 more than an iPod Touch 8GB 3G and snag this gadget... but in retrospect, you can spend $150 more than the iPad to get a functional computer that runs the latest games. (Talking about my laptop, of course.) Or you can spend less and be able to do all of what the iPad can do, albeit in the less "phat" package of a netbook or a mainstream laptop.

     Another thing I like about it is that you can watch up to "720p" movies (but it's limited to 640x480?!) with stereo sound. Problem is, I'm thinking this will be another gadget with potential... limited by iTunes. It, like the iPod Touch, only supports Apple's painful-to-work-with MOV files, but it does work with MP4. More container support (like the ubiquitous AVI and the superior MKV) would be nice, but we don't care about making this thing as useful as it can be, right?

     It's not bad, but it's nothing revolutionary. Apple was just slow on the ball... for good reason, to learn how the public responded to the Kindle, the Nook, netbooks, slate tablets, smartbooks, and the successes of the iPod Touch and the iPhone. Now everyone gets to pay the Apple tax to receive homage in the form of an iPad... or should we call it the iPod Touch XL?
  7. It makes sense to take out the CMOS battery to restore factory default settings... what do you have to lose? Your computer already doesn't boot. Hopefully your CMOS battery is only held in by prongs and not actually soldered to the motherboard... for convenience's sake.

     Make sure your CPU heat sink/cooler is mounted properly, too. That can also be a contributor to your machine not booting up.

     You didn't elaborate on what happens when you try to turn the machine on, either. Do you hit the power button and everything starts to run for a few seconds, then shuts off? What happens? (If you provide this sort of info, our more hardware-inclined geeks will be better able to help you out.)
  8. Problem: MPC-HC plays video just fine, but every 10 minutes or so, the audio track for any movie begins to chop/stutter like crazy until I stop playback and resume it, at which point the video plays and the audio is silent for a few seconds until it "catches up" with the timeline and plays normally.

     Setup: I have an Intel dual-core 2.13GHz processor with 4GB of DDR2 RAM. I have MPC-HC installed with the K-Lite Mega Codec Pack. I've also installed CoreAVC 2.0 Pro with CUDA support, thinking that it was a CPU-intensive problem. (CoreAVC uses CUDA to put some of the video processing load onto the GPU instead of letting the CPU suffer on its lonesome.) I usually play movies encoded with H.264, XviD, and DivX for video, with AC3 or DTS sound tracks, in AVI, MKV, or MP4 containers. I thought this was because my machine wasn't fully capable of playing HD movies, but the chopping also occurred during a standard-definition movie (700ish x 300ish XviD/DivX, MP3) last night. I set up MPC-HC by blocking both ffdshow audio and video and preferring CoreAVC and AC3Filter (3/2+SW with SPDIF on and DTS + MPEG + AC3 passthrough). I watch these movies by connecting the computer to my HDTV via HDMI cable for video, while sound output goes through my SoundBlaster Live! 24-bit External USB sound processor (out of its optical output port) directly into the receiver for surround sound. I am running Windows 7 x64 Ultimate.

     Any ideas? CPU usage is occasionally high when I play HD content, but it's never maxed out at 100%. Playback is perfectly fine until it chops, and audio playback from a pause or stop, or from just starting the movie up, results in a few seconds of silence before it plays normally again. Is it my settings in MPC-HC? Is it my settings in AC3Filter?
  9. I was just reading about using a PS3 as a computer yesterday... George Hotz has done it again and hacked the final "next-gen" console. Basically, we can now pirate games and open up the console's potential quite a bit. Further reading suggests that it may be a less-than-fruitful endeavour to create a "super-computer" out of multiple PS3s, since its processor is an 8-cell microprocessor, with each cell dedicated to specific functions and tasks of the processor as a whole. Think of it as a multi-core CPU with each core doing something different.

     Since I don't fully understand what the PS3 can and cannot do, my best guess is that putting multiple PS3s to a single use would yield a super-computer better suited to advanced mathematical computation than to anything on the home-user front, since just about every piece of software out there wouldn't know what to do with a Cell processor and thus wouldn't use it to its full potential. (The Cell processor's SPEs are the reason why it is difficult for developers to create games that take advantage of the PS3's hardware, which is also probably why it has a limited gaming library in comparison to the other next-gen consoles. And I'm not including shovelware.)

     Basically, if you decide to create a super-computer in your home, it would only be beneficial to you if you had the software to take advantage of it. Aside from bragging rights, at the moment it would be about as useful as a regular 3.2GHz, 256MB-RAM computer running Linux... but that's more useful than a PS3 that can only play Blu-Ray titles and PS3 games, right?
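     (The "software has to take advantage of it" point is the whole story for any cluster, PS3s included: unless the job can be split into independent chunks, the extra processors just sit there. A toy sketch of the idea, using ordinary Python multiprocessing rather than anything Cell-specific:)

```python
from multiprocessing import Pool

def chunk_sum(chunk):
    """One worker's share of the job; the work only speeds up if it splits cleanly like this."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    # Deal the work out to 4 independent chunks; a program that can't be carved
    # up this way gains nothing from extra cores (or extra PS3s).
    chunks = [data[i::4] for i in range(4)]
    with Pool(processes=4) as pool:
        total = sum(pool.map(chunk_sum, chunks))
    print(total)  # same answer as sum(x * x for x in data), just computed in parallel
```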
  10. I try to be in-depth when I can... it can only help others who are looking for the same answers I was looking for.

     There is a difference between Blu-Ray and DVD. BR titles have a higher pixel count and are finer (720p, 1080p) than what DVD has to offer with 480p and 576p (correct me if I'm wrong?). It's basically less fuzzy, with finer details and more vibrant colors. I can't understand how some people say that they can't tell a difference between the two; it's only understandable if you're standing very far away. As you get closer to the television screen, especially at a normal viewing distance, you'll be able to discern the finer quality of high-definition video. Most people would benefit from upgrading to 720p, since you would have to sit rather close to your HDTV to notice the differences between 720p and 1080p, but videophiles will obviously be happier knowing they spent more money to get the best quality available. Also, keep in mind that if you blow up a DVD image to fit a finer resolution, it will look bad... kind of like resizing a 100x100 image to 400x400 and wrongly expecting it to retain the same image quality. However, most DVD players nowadays can upscale DVD quality to fit a resolution larger than what would be optimal for 480p, so this statement only applies up to a point.

     The financial side of upgrading to HD is probably what keeps people from taking the plunge. With the newer LCD LED HDTVs, a complete home theater system can run you around $2000 or more. However, with the drops in prices for LCD and plasma HDTVs, you can have a pretty decent (and complete) setup for sub-$1000 prices.

     I don't really know from experience what the difference is between 5.1 and 7.1 sound systems or if it's even worth the extra two channels. There isn't a heck of a lot of source material that even supports 7.1 surround sound, although most movie titles carry DTS sound tracks (in addition to Dolby Digital) to help you appreciate the extra two channels. There are arguments for the superior quality of DTS and how it is less compressed, giving you more sound quality, but it's arguable whether you'd notice the difference in real time, either in sound quality OR in the extra two channels. Then again, it's up for debate... unless you're an audiophile.

     Note that all these images are of different resolutions too, so try stretching the already-fuzzy XviD, DVD, and 720p source images to fit a larger HDTV with a finer resolution... while you compare them with the 1080p source image that fits right at home with a finer resolution to begin with, no resizing necessary.
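     (To put rough numbers on the pixel-count difference, here's some plain arithmetic using the standard frame sizes; exact DVD dimensions vary a bit by disc, so treat these as ballpark figures.)

```python
# Pixel counts for common video resolutions (plain arithmetic, ballpark figures).
resolutions = {
    "480p (NTSC DVD)": (720, 480),
    "576p (PAL DVD)":  (720, 576),
    "720p":            (1280, 720),
    "1080p":           (1920, 1080),
}

dvd = 720 * 480
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:16} {w}x{h} = {pixels:>9,} pixels ({pixels / dvd:.1f}x NTSC DVD)")
```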
  11. The Terminator movies were, for the most part, great for sound. The first movie was a disappointment with no real use of the surround channels, although I can vouch for the 720p upscaling. It felt like I was watching a DVD, but it wasn't bad.

     The rest of the Terminator quadrilogy was pretty darn good. Terminator 2 was a nice blast from the past, with booming gunfire, tinkles of broken glass, and whatnot. There was a lot of ambient sound during the chase scenes, but most of the action was really up front. Terminator 3 was surprisingly good, especially during...

     I liked bits and parts of 3:10 to Yuma too, but haven't fully watched it quite yet.

     What I'm starting to figure out with most movies is that although there might not be much engaging action on-screen at any given moment, a lot of movies like to use the surrounds for theme or background music. Ambient sounds are rather soft too, so sometimes they're hard to notice (especially with incorrect speaker placement), but that's how it's supposed to be. A lot of the time I forget that the surround sound is there and just watch the movie, but then a well-placed sound effect reminds me from time to time that it's on and working.
  12. First off, if you get so into a game that you throw your controllers around (because you forgot the strap or your grip slipped), you might find some comfort in the fact that Nintendo reminds you to put the damn wrist strap on and to make sure that there's plenty of room around you. I don't know about you, but my kitchen, although it's a decent size, isn't a good place to play with the island in the way. Also, you're still going to swing that bat or bowl that ball the same way whether you use a full motion or a quick whipping one. I'm surprised there aren't more stories of kids beating each other up with the Wiimotes... but wait, Nintendo did warn us about ample playing area, right? The technology is Bluetooth... the advertised range is probably 30 feet, but the real range is probably closer to 6 feet. I know I had some input issues playing at the 6-foot line sitting on the couch, but the Wii isn't exactly a console you always play traditionally with little more than thumb and finger movement. You'd be making a rather dumb decision in writing off the Wii because you couldn't do something as simple as clearing playing room and wearing the wrist strap that prevents the "issues" Wii owners have.

     Secondly, I got sucked into high definition because it IS better. I don't care whether it's Blu-Ray or the dead HD-DVD... high definition is high definition. Is it worth upgrading to? In my opinion, watching movies at home should incorporate a surround sound system and a high-definition television because it actually makes watching movies even more fun. I like hearing bullets whiz "past" my head. I like explosions that I can feel (coming from the sub). I like being able to see the texture on Optimus Prime's face when he falls in the forest scene (all shot with IMAX cameras). Even watching something like Tinkerbell and the Lost Treasure was gratifying, being able to feel like you can touch the animation. It's a crazy upgrade from watching DVDs, depending on what you watch. Is it worth spending money on? If you can appreciate all of what I just described, it's probably worth getting a PS3 to get the gaming console and the Blu-Ray player in one... but I wouldn't make that decision if you can't appreciate the PS3 as a gaming console, since standalone Blu-Ray players are getting less expensive.

     The 360 might seem like the winner here, then, if you don't go for HD, but you also have to look at what it has to offer and what YOU'RE going to use it for. As we've all pointed out, the 360 and the PS3 have their faults and their strengths. Decide what you're going to use a gaming console/entertainment center for and go from there.
  13. Take a look at DocViewer too. You can use SSH to throw PDFs, MS documents, images, and whatnot onto the device, so you can basically use your iPhone or iPod Touch as a read-only PDA. There's another one called Files, but I haven't used it yet, so I can't really comment on it. Also, here seems like an excellent solution that allows you to create and edit, in addition to MS document and PDF support... check it out.
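     (For what it's worth, getting files onto the device over SSH is just an scp/SFTP copy. A minimal sketch, assuming the device is jailbroken with an SSH server running; the user name, IP address, and both paths here are placeholders, not anything specific to DocViewer.)

```python
import subprocess

# Copy a PDF to an iPhone/iPod Touch that has an SSH server running.
# "mobile", the IP address, and both file paths are hypothetical placeholders.
subprocess.run(
    ["scp", "manual.pdf", "mobile@192.168.1.50:/var/mobile/Documents/"],
    check=True,  # raise an error if the copy fails
)
```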
  14. I read that Intel leads the way in technology and is ubiquitous in just about every computer out there. It's like the case of Microsoft Windows against Linux: people know Intel, people buy Intel. Those that don't know any better probably don't even realize that they have an AMD processor in their computers when they buy them. AMD is slowly catching up on the tech side, but there seems to be more focus on keeping Intel from dominating the CPU market than on new research and development.

     It's crazy when I hear about 6-core (sex-core?) processors coming out of Intel alongside their working i7 line-up, while the only thing that stands out from the crowd for me is the tri-core processors from AMD. It would make sense that AMD is better with an on-die cache on their processors, reducing lag when communicating with the northbridge, but I think that Intel has the capital and the reputation to make deals with other companies to "optimize" and work with their technologies... and those companies tend to be a bit higher up in the food chain. Also, I'm not sure if there's more of a focus on developing ATI's graphical powerhouse or if the ATI portion of the company is just cranking out monsters like the HD 4870 that dominate the GPU floor, but if the focus is indeed more on graphics, it would kind of explain why the processor technology seems... a tad slow in coming.

     I don't think there's enough of a performance difference when measuring *BLEEP* for tat between Intel and AMD processors, but if you want the latest and greatest AND can pay for it, Intel is the way to go. However, AMD is still in the ball game with efficiency (IMO) and it would take quite a bit to dismiss it as an inferior product.

     EDIT: THE EDIT IS BACK! Anyway, the *BLEEP* is actually t-i-t. Not sure why that's censored if I'm not even using it in that context... not to mention that it is a valid word and not just slang.
  15. I need to recall one of my quasi-reviews on WALL-E's sound performance. The reason I said it wasn't that great as far as surround sound goes most likely comes down to the fact that I probably wasn't sitting down and watching it. WALL-E deserves a spot in the Recommended list.

     Sitting down to watch the movie made me realize just how the surround sound was being used, and after being abused by titles like The Taking of Pelham 123, you tend to miss nuances of sound like the swirling of sand and dust at the beginning of WALL-E, or the fact that most of the music is played through the rear channels while the action rides on the front speakers. There's a lot of ambience to be heard, and the funny thing is that you won't notice it... unless you disconnect your rear speakers. That's what surround sound is supposed to be: not knowing it's there until something actually happens "behind" you.