xisto Community

kam1405241509


Posts posted by kam1405241509


  1. I assume you removed the bluetooth adaptor completely, or that it fell off (ie it didn't break in two, leaving the USB plug still plugged into it).

     

     Does the motherboard come with any onboard diagnostics? Modern ones from Abit, Asus etc come with a nice LED that reads out error codes, which you can then look up in the manual. The codes usually aren't very detailed, but they at least tell you where in the boot process it gets to. (I had some boot problems & thought it was the board, having just removed/reinstalled it to add some fans, but then watched the codes step through the sequence & it stopped at the RAM check ... it was incompatible RAM!)

     

    3.  What do you mean your HDs don't get read?  Do you mean you can hear the cylinders inside rolling, but don't hear that "tick" sound they make shortly after you turn your computer on?  Sometimes, this happens when the RAM or the VGA aren't connected.

    1064334704[/snapback]


    I'm guessing he means he doesn't see his HDD read/write LED turn on.

  2. I am looking for a new Video Card and was wondering what the prices range from. In my current one I am getting video, but the quality is not too good. I played that new game Rome and the game works, but when they get to the parts where the animated characters talk it starts lagging, and same with age of empires III. I just want a video card that can play all the games properly and above the minimum requirements. Anyone got any ideas on what kind I should get?

    1064329474[/snapback]

    Hi Spartacus :-),

    I love Rome Total War too.

    Check out this roundup of cards (http://www.xbitlabs.com/articles/video/print/2004-27gpu2.html). It's quite old (no 7xxx's etc) but I doubt you're after a high-end expensive card ... esp since these things double in power every 6-12 months ... it's probably best to upgrade later (unless you have a specific game/app that needs the 7xxx's power ... or you could upgrade to PCI-e & SLI etc .. but that's still quite an expensive route as well).

     

     Anyway, the review includes benches for a few strategy games (Rome) & some FPSs (Doom3, HL2, Far Cry). If you just want to play Rome & Doom3 at minimum settings (low res, low effects), then just get a 5700, which will give you approx 20-23fps in both; otherwise a 6800 would be twice that performance & cost in Rome, and nearly quadruple the performance in Doom3 (because it has the hardware to handle complex shaders/lighting, the fill rate etc ... and hi-poly characters/animation)! At the highest end in Doom3 (1600x1200, FSAA/AF, etc), the 5700 only manages 5fps whilst the 6800 can reach 39-49fps (54-90fps in SLI mode)!

     

     The 6800s have up to 16 pixel pipelines (the 62xx was released with reduced functions to sell cheaply etc .. I think they had 12), 6 vertex units, Shader Model 3.0 ... up to 7200Mpixels/s fill rate (the same for texels/sec, with 1 texture unit per pixel pipeline) ... and the memory bus reaches 35GB/s (128-256MB)!
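     As a rough sanity check on those fill-rate figures (my own back-of-envelope arithmetic, not numbers from the review), peak fill rate is just core clock times pixel pipelines:

```python
# Back-of-envelope GPU fill-rate check (illustrative clocks, not official specs).
# Peak fill rate (Mpixels/s) = core clock (MHz) * number of pixel pipelines.

def fill_rate_mpixels(core_mhz, pipelines):
    """Theoretical peak fill rate in Mpixels/s."""
    return core_mhz * pipelines

# A 16-pipe part at an assumed ~450MHz core hits the 7200 figure quoted above:
print(fill_rate_mpixels(450, 16))   # 7200
# A cut-down 12-pipe part at the same clock:
print(fill_rate_mpixels(450, 12))   # 5400
```

     Real-world throughput is lower of course (memory bandwidth, shader load etc), but it shows why cutting pipelines cuts the theoretical peak proportionally.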

     

    5700's are 75USD, and 6800's (at newegg) are 150USD :-). So it all really depends on whether that extra 75USD is worth it for you if you play the most recent games with complex shaders etc.

     

     Finally, personally I prefer NV for their drivers (very good Linux, OGL & multimonitor functions ... see the above link plus this one http://forums.xisto.com/no_longer_exists/), but ATI has a strong following too due to their focus on efficiency & low heat/noise etc rather than brute force.


  3. On what OS?

     

    curare

    1064330010[/snapback]


     The 64-bit Far Cry patch is the only damn good commercial home-user demonstration of what can really be done given 64-bit processing & memory addressing. The draw distance can increase (the scene can be drawn out further into the distance), more objects can appear onscreen simultaneously, and textures can be higher resolution (this makes a big difference & I think most games developers say they are going down this route soon .. ie in 2006). Wrt games/graphics I really like the look of HDR & some of the newer shader ideas, but that's going slightly off topic ;-).

  4. I use both Windows XP 64-bit (Extremely Buggy), and Linspire 5.0

    1064330055[/snapback]


     I think the main problem with x64 is that many hardware makers haven't bothered to release 64-bit drivers at all, or at best the drivers are not very stable. I don't think the OS itself is to blame. Also I don't think MS takes it that seriously; they know full well that the problem is due to vendors' resources being focused on Vista instead, which will have a longer lifespan (and is therefore worth the effort/time/money spent on developing Vista-specific 64-bit drivers). Again though, I don't know anything for sure since I haven't seen the internals!

  5. WeaponX, I would not worry too much about Vista. It will presumably take more than 6 months until a stable version is available. Service Pack 1 should be there before you install it. But, other than that, I am not sure if you'll want Vista on your machine at all. This OS has many built-in restrictions, unnecessary alerts, and what not. It's bloated in a way you'd think M$ is in the HD-making business in the first place and not in writing SW. I can tell, because I have a beta version installed here. And while this is not the final product, you get an idea where it's headed. Forget it! It is not even tasteful anymore. It's sitting on my HD but I rarely call it up because I simply detest an OS which tells me what file to open and what not (for example mp3 files). Do you want such an OS? I don't.

     

    I installed ubuntu 5.10 here as well on the same machine, and it is so amazing, so effective and so efficient without putting any unnecessary clutter on either your desktop or your HD. Plus it is of course very stable.

    1064329998[/snapback]


     Vista brings with it a GPU-accelerated GUI (basically catching up with OSX, which has had Aqua for years .. Linux is also getting equivalent setups but they'll probably be minimalistic/efficient :-)). Eventually it'll bring with it the SQL-server-lite-enabled WinFS ... not sure I'd want a database running 24/7 behind the scenes, unless it doesn't do much beyond updates & tree restructuring etc until it is needed to find something.

     

     Current realistic specs are a bidirectional PCIe x16 GPU with 256MB, 2 CPUs (it's more threaded internally) with 2GB RAM, SATA/NCQ ... and of course an HDCP HD LCD to view BRD/HDTV/etc (legal) content!

     

    I agree, but it depends on the user. I don't want my OS doing much since for me it's just there to manage/run all my apps efficiently. For a laptop, especially, wrt one that's trying to save power, running a DB on a 2nd CPU is excessively wasteful!

     

     I too don't want my OS dictating what I can & cannot view/execute .. I will certainly not have my main PC running any protected content. There is cracked hardware out already wrt HDCP-to-DVI, but the standard allows for the blocking of specific peripherals & they can change the encryption codes etc at any time. I'm sure that eventually there will be workarounds .. or people will just get all their content from non-legit sources online instead .. in the same way that many people avoid iTunes files simply because they do not want any restrictions, so they burn their CDs & compress to an open format!

     

     Overall I think there's very little benefit for far too much cost with Vista. I'd use OSX if I wanted something pretty & with a database (and secureness), or Linux if I wanted something efficient that just works & does the job (and can look very nice if set up well etc .. without being inefficient). And I still really like W2K more than XP because it works just as well without any of the fluff! With HW virtualisation (esp in v2) there will be fewer/no reasons to run Windows as a main OS just to run a particular app .. so then people will choose their main OS purely based on the features they need (or think they need).

     

     Finally, I think MS's logic is that they rarely produce a completely rearchitected OS (a lot of the stacks in Vista are completely rewritten or new), so they won't get a chance in the near future to take advantage of all the new hardware changes that have taken/are taking place (multicore, 64-bit memory space, PCIe graphics, SATA-NCQ, etc). So they may as well focus this OS on the very latest or near-future (DDR3) hardware, and wait for the majority of users to catch up. Eventually all this will be common/cheap (even multicores on laptops now exist), and hopefully the continued focus on lowering power by Intel (and also AMD to a lesser extent) may have some decent results soon :-). This is all conjecture/guesswork though until the final final build is released.


  6. Vista, although suffering minor setbacks, is scheduled for release in the first half of 2006.  It is a 64-bit OS.  The AMD 64-bit chipset is backwards compatible with 32-bit processors because it is mostly just a zero-extended processor to handle 64-bit values, without many new 64-bit instructions (unlike the Intel, at least, unless you get a new make which has compatibility for the new Intel instructions).  Go with a 64-bit chip; even if you can't reap the benefits now, in two years, when XP is obsolete, you will not need to buy another new processor.

    1064329971[/snapback]


     The original roadmap stated Vista would be released to RTM in June & public launch in Oct 2006 .. it was never H1, that was just the later betas/RCs (http://forums.xisto.com/no_longer_exists/). Vista betas are getting delayed & now an MS exec has stated the final's been delayed too (http://forums.xisto.com/no_longer_exists/).

    They're going to release as both 32-bit and 64-bit .. it'd be pretty dumb to sideline the majority of users (even if their experience would be pretty lame without serious hardware!).

     

     Yes, IA64 is a serious set of changes, but it's basically dead thanks to much higher prices & much poorer IA32 performance!

     

     Intel's other 64-bit CPU architecture is (now*) an exact copy of AMD64 .. because MS said they would not support two versions of x86-64 CPUs just because Intel didn't like the fact that they didn't invent it, & because Intel wanted to cheat and use their dominance/power rather than compete fairly (AMD filed a recent lawsuit claiming unfair lack of competition from Intel in the supply chain)!

     

     * Intel originally released an incomplete version of AMD64 (of course they called it an Intel acronym) ... Intel had fewer, not more, x86-64 instructions initially!

     

     Now, although x86-64 is a simpler change than IA64 (obviously .. as it's a bunch of extra instructions, not a whole new ISA), it's not simply a matter of widening the eight GPRs etc (64-bit addresses to be able to address >4GB without a broken Xeon memory-bus extension). AMD doubled the eight GPRs & eight SIMD (SSE/2/3) registers, and the rest stays constant (eight x87 FPRs & the extra 120 hidden/internal registers for register renaming).

     

    Finally, AMD's K8 ISA isn't simply just 64-bit-ness. They also took the leap to HyperTransport CPU&PCIe interconnects ... and the integrated memory controller which helped scalability without losing/sharing performance .. these two only really come into their own at the high end though (2/4-way SLI, massive peripheral bandwidth etc for the former ... and SMP/ccNUMA for the latter).
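     To make the ">4GB" point concrete, here's a tiny address-space arithmetic sketch (my own illustration, nothing AMD-specific):

```python
# Address-space arithmetic: why 32-bit pointers cap out at 4GB.

bits32_limit = 2 ** 32            # bytes addressable with a 32-bit pointer
print(bits32_limit // 2 ** 30)    # 4 (GiB)

bits64_limit = 2 ** 64            # with full 64-bit addressing: 16 EiB
print(bits64_limit // 2 ** 30)    # 17179869184 (GiB)

# A byte at the 5GiB mark simply cannot be named by a 32-bit pointer:
addr = 5 * 2 ** 30
print(addr > bits32_limit - 1)    # True
```

     (Real CPUs implement fewer than 64 physical address bits, but the principle is the same: the wider registers are what let you name memory beyond 4GB without bank-switching hacks.)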


  7. i want to get a new hd for my computer as a second drive.

     

     i want to get a high capacity because im going to store captured video on it so 200gig/250gig/300gig.

     

     i live in the uk so im buying from uk dealers. i was wondering where is best to buy. ive looked on ebay and there are lots on there but i dont know anything about hardware brands or storage or anything hardware much. will the drive come formatted or will i need to do that [not a big problem as im gonna be using linux on it]. how will i install it onto my current xp.

     

     could anyone tell me any useful information about brands/prices/dealers/sizes anything useful really because im a bit wary as i dont really know what im doing  ;)

     

    thanks in advance

    1064328327[/snapback]


     I agree with Sarah81, WD SATA/PATA drives are very nice. They make the only 10k RPM SATA drives, and offer 3-5 year warranties on their top-end drives, so they must believe their drives are reliable, otherwise they'd be losing money ;-).

     

    Ditto I'd agree that you need to get as much HDD space as possible if you're gonna store tons of videos on the HDDs. However if you are just aiming at max performance on a few AVs then SCSI would be best. Maybe even a combination of the two if you're loaded!

     

     I'll probably be selling a bunch of WD 250GB SATAs on ebay soon, and I took a look at the average price a few weeks ago (to gauge what price I should sell them at) and I think it was around 80GBP for new ones.

     

    IBM have some 400GB drives out, and several other manuf's will be doing likewise (right now just a bunch of stoopeed paper launches which are now stoopeedly common in this supposedly-fast-developing but-more-like-fast-lame-@r$-marketing industry :-((). Sorry for the rant, just remembered how long I've been waiting for larger drives promised nearly a year ago .. so dumb to wait for things that aren't out yet when they keep changing their deadlines all the time!

     

    With AV files, 16MB caches were popular on SCSI drives & are now common on high-end SATA/PATA ones ... not sure it'd be that big a deal when streaming a giant AV into RAM, but obviously no harm if it's the same price :-).


  8. Another thing to look out for is the type of hard disk...

     

    SCSI, IDE or SATA.

     

     IDE is the most common, SCSI is supported by very few boards, and SATA is a little faster, and usually supports hardware RAID.

     

    1064328485[/snapback]


    Hi qwijibow,

    there are many different speeds of IDE, SATA, SCSI ...

     

     Parallel SCSI runs up to dual-channel U320 (320MB/sec), 640 is on the way soon, along with serially-attached SCSI etc.

     

     SATA is currently at 150MB/sec and will soon be at 300MB/sec, which is still slower than good old U320 SCSI!
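     For a feel of what those interface numbers mean in practice, here's the time to move a hypothetical 4GB capture file at each peak rate (these are bus limits only; real drives sustain far less):

```python
# Peak-interface transfer times for a 4GiB file (illustrative: these are
# bus ceilings, not sustained drive throughput, which is much lower).

file_mb = 4 * 1024  # 4 GiB expressed in MiB

for name, mb_per_s in [("SATA-150", 150), ("SATA-300", 300), ("U320 SCSI", 320)]:
    print(f"{name}: {file_mb / mb_per_s:.1f}s")
# U320 still edges out SATA-300 at the bus level, as noted above.
```

     In practice the drive mechanics are the bottleneck long before the bus is, which is why a fast PATA drive can still beat a slow SATA one.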

     

     If money is not an issue, a SCSI RAID array is the way to go for NLE video-editing apps, no doubt about it. However, SCSI drives and controllers are more expensive than their SATA/PATA counterparts, due to the extra complexity in the protocol (it's made for multiple devices accessing things simultaneously & is built with large queues & respective algorithms for this). The algorithms are not all aimed at server tasks; since AV is a popular use of SCSI drives, some desktop algorithms are also defined by the big disk vendors. Note you'll not get as many GBs for your GBPs if you go this route, but you'll get higher performance using fewer drives, so if this is what you value, go for it.


  9. You might consider getting one of maxtors newest drives with 16Mb cache, could speed up things a bit ;) .

     

     The RAID0 thing is also a good idea, but remember that the chance of losing your data because of a disk failure will double (2 disks can die, and if one dies, everything is lost, no way to repair the data).

    1064328492[/snapback]

    Hence RAID5!

     JBOD is just a bunch of disks. RAID0 is striping (remember to choose a large block size if you are storing/manipulating large AV files). RAID1 is mirroring. RAID2 is bit-level rather than block-level striping and is no longer popular. RAID3 is byte-level striping with a dedicated parity disk; RAID4 is ditto but block-level. RAID5 is block-level striping with parity data distributed across all the disks in the array, & it is currently the most popular method. With large numbers of disks there's RAID6, which is ditto but with the parity data stored at two locations so it can survive two disk failures!

     

     Then there are all kinds of nested RAID levels like RAID0+1 (two HDDs in a RAID0 stripe, with another two disks as a RAID1 mirror of the original), RAID10/1+0 which does a similar but opposite setup (a stripe across drives that are locally mirrored), RAID50/5+0, etc.
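     The "Hence RAID5!" point can be shown with a toy XOR-parity sketch (my own illustration; real arrays rotate the parity block across disks and work at block level): lose any one disk and the parity rebuilds it:

```python
# Toy RAID5 parity demo: parity = XOR of the data blocks, so any single
# lost block can be rebuilt by XOR-ing the survivors together.

def xor_blocks(blocks):
    """XOR equal-length byte blocks together."""
    out = bytearray(len(blocks[0]))
    for b in blocks:
        for i, byte in enumerate(b):
            out[i] ^= byte
    return bytes(out)

d0, d1, d2 = b"AAAA", b"BBBB", b"vid!"      # data blocks on three disks
parity = xor_blocks([d0, d1, d2])           # parity stored on a fourth disk

# The disk holding d1 dies; rebuild it from the survivors:
rebuilt = xor_blocks([d0, d2, parity])
print(rebuilt == d1)   # True
```

     This is also why RAID5 only survives ONE failure: with two blocks missing, the XOR equation has two unknowns, which is exactly the gap RAID6's second parity fills.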


  10. 2. Oh I see, you mean power consumption, doh, sorry. Well you could always power it up with a cheap 2nd PSU or even an external one (I use a laptop-style PSU for my ext USB convertor etc!). Also I forgot to mention that NCQ (& some other neat features) is NOT on all SATA drives, or even on all SATAII drives, which don't even have to be at 300MB/s either ... there's a whole can-o-worms on this! I think anand or ace had a good explanation a few months back. Basically SATAII is a set of recommendations, not a standard or anything .. I think it is a new name for the working group too. SATA-300 etc basically states the speed, and the manufacturers must state the features separately.


  11. Hi sparx,

     

    Question1: Will a new SATA HDD function and co-exist with an existing PATA HDD ?

    1064328559[/snapback]

     Yep, no probs. If you have the appropriate PATA & SATA controllers (either onboard or on PCI) it's just a matter of plugging them all in. The OS sees them as normal drives .. except Linux, which treats SATA drives as SCSI ones, but that's another matter!

     

    Question2: Will I need to make any changes in the power supply?

    1064328559[/snapback]

     Nope; SATA drives often have both the old Molex PATA-style power connector along with the new thin SATA power one too :-). Even if there's only the SATA one, conversion is simple (either DIY, or I think I've seen many dirt-cheap convertors online), but that's probably not necessary at all anyway!

     

    Question3: Will there be any difference in performance between a SATA and PATA HDD of the same capacity?

    1064328559[/snapback]

     It depends on the drive's model (I've seen a review of a PATA drive beating out high-end SATA ones in certain benches!) wrt how the cache algorithms are defined & how much cache there is ... and also on the usage model: if it's a server then maybe the larger queue algorithms may be handy, while for desktops the NCQ algorithms may be handy in getting the right sequence of data off the disk in terms of what's closest to the head at the moment etc. It's best to check out & compare individual models. WD Raptors are SATA-only and are rated as the highest-performing desktop drives ... but I don't have access to one so I couldn't say for sure ;-). Also, it depends on what you are going to do (video needs high bandwidth, whereas gaming often needs fast access times along with a reasonable amount of bandwidth, etc).

  12. Well the ink on the paper towels seems to me like a good thing .. at least it seems the cartridges themselves are OK and aren't dried up or something. My old-ish Epson C44UX inkjet was the same. I tried the old safety-pin/hole jobbie, and I also tried to squeeze the carts to pour a LITTLE bit of ink into where the cart's hole met a small protrusion (that pierced the carts) within the head part ... but I remember it took ages to get the thing printing properly again .. and I also remember it was very messy & frustrating (to the point where I vowed never to use it again .. and moved to a cheap HP laser which had more paper jams than any printer I've ever dealt with .. but that's cuz I was desperate to do manual duplex using el-cheapo thin/lightweight sheets!!). Damn, sorry for the divergence/rant.


  13. I thought this would be the best place to post this link that was sent to me

     

    http://www.lumenlab.com//?gtnjs=1

     

     I read through some of the forum posts on their site and was really impressed by the quality of some of the images they were producing on their own homemade projectors. For a fraction of what an actual commercial projector costs (upwards of $5000 I hear) one can produce a quality one of their own. I might think of buying the guide from the website. It says on their website that building a decent one costs $400 with new components and all.

    1064327693[/snapback]


    Hi golgothurteen,

     

    they mention: "The Lumenlab DIY 15" Projector is a REAL projector with a high quality XGA LCD panel, a powerful Metal Halide lamp and REAL projection optics. The so-called projection TV kits are simply a cheap fresnel lens that you put in front of your television; an image is projected, but not a watchable one".

     

    It's definitely not worth paying for "their" info ... I've seen drawings/schematics for these types on some forums a few years ago.

     

    I've seen the Fresnel+TV kits on an online TV tech show, and the "quality" was worthless. These guys say they use proper optics etc. What they are doing is basically projecting a bright lamp through a normal LCD (rather than one on a chip) and then using the lens to focus the image crisply on a wall/sheet.

     

    I remember the forum thread I read mentioning problems of heat being the main issue .. and one guy built a bunch of fans in his.

     

    All of these sites sell "howto's" that should really be free .. and they are just there to then sell their own wares!

     

    Other options I read about include the use of a cheap OHP with an LCD display, and both transmissive & reflective LCDs ...

     

    Kam.


  14. Yeah, I've seen loads of these, both DIY types & premade, and there are tons of different features available too. The most basic would be: 1. DIY drill/tap acrylic to mount the HDD from underneath; 2. mount rear sockets with a cheap IDE-to-USB (or eSATA) adaptor. More common/advanced ones allow both internal (p/s-ATA cage) & external (USB or eSATA) mounting. The most advanced ones have autonomous RAID1 backup etc .. but I can't see the point of those, much! There are also ext cases for 2.5" laptop drives with tiny USB & power interfaces ... and a sturdy metal case too. Not sure if it's that easy to DIY when you get to smaller drives.


  15. Confused again :) 

     

    Just to wrap this up, so PATA is similar to SATA and it's not ATA at all?  That explains the 16MB cache :)

    1064325469[/snapback]


    Hi WeaponX, sorry for the late reply again, and sorry for the long message previously .. one of my bad habits :-(.

     

    OK, to sum up, PATA=ATA=old, SATA=new. But either is just fine to me, since as you can see in that review I posted, this drive is damn good & competes well with all the new SATA drives too. So to me there's not really an issue between choosing one or the other. I'd actually prefer PATA to use in older PCs, or SATA if you have a specific need for some feature.

     

     PATA is actually ATA; it's the older parallel interface that runs at up to 133MB/s.

     SATA is the newer serial interface standard that runs from 150MB/s upwards (SATAII is a little confusing since it's a lot of marketing just to state that the drive has certain features that improve how data gets read).

     

    I think you can buy convertors to go between the two, since the protocol is still pretty much the same, just serial or parallel, and hence all the same software drivers pretty much work as is (unless you want to enable SATA/II specific features).

     

    If you have any more questions please write them here, I'm sure there must be other people who are also confused, so your questions will hopefully make me write better answers :-). Also, I will reply no matter what the question, though I can sometimes go AWOL now and then due to work deadlines, sorry :-).


  16. To me, PC systems are getting expensive to upgrade and maintain for gaming.  Personally I went to a Mac for home computing and a PS2 mini for playing games. 

     

    Systems today are running into performance problems due to heat.  The people I know that have purchased or built machines in the last two years have all been having similar problems.  Between the processors, GPU's, Ram, and other cards in smaller and smaller cases, heat is the greatest problem. 

     

     One of my friends that actually does work for a game studio says they are buying Falcon Northwest machines for high-end testing.  However, the downside is that they are rather expensive.

    1064325586[/snapback]

     To me, PCs have been getting much cheaper since they became popular with the masses a decade ago. It used to cost 10k just to get a crap PC that'd only be useful for work (about 20 years ago), then that dropped to about 3k for a machine that could play games but not as well as dedicated consoles (15 years ago), to machines that cost 2k but played games well thanks to 3dfx (about 10 years ago) .. nowadays you CAN pick up a gaming machine for 500GBP if you make many compromises & don't go for the high end all the time.

     

     As for Macs, yes I must admit I do love OSX, I think it's the best of the current-gen desktop OSs, no doubt about it :-). But I've got a lot of x86 apps that I need to run, so for me I'm waiting for the x86 versions .. hopefully they'll be as good as their recent Macs (high-end & not that much more expensive than PCs).

     

     But I'm not really into console games (beat-em-ups, non-FPS shoot-em-ups, platformers etc). I prefer flight sims & FPSs & adventures, and PCs are better for these genres at the moment. There's no technical reason why consoles couldn't run these types of games (xbox1 was basically a PC!), but they don't aim at these genres, and even on PCs flight/adventure/etc are dying breeds, since most people don't like them :-(. I have absolutely nothing against consoles per se, but the next-gen consoles don't look that appealing to me compared to an easily upgradeable PC with a fast CPU/GPU/PPU. There's no way PCs can compete with consoles in hardware costs, since MS/Sony discount the hardware (MS lost billions on XB1) and make up the cash on selling a licence per game sold etc. But if you wait a year or so after the next-gen systems launch you could get a better-spec'd PC, at a bit more cost, that will play games aimed at those consoles for a few years! But for me, a console just doesn't replace my PC gaming needs .. I wish it did, but I doubt it ever will!

     

    I don't think there's any need to spend huge amounts of money just to play games on PCs unless you want to continuously stay on the cutting edge for some reason, and maybe that'd only be necessary if you were competing in tournaments professionally/commercially or something like that, if you know what I mean.

     

     I can understand why your friend's game dev co wants a cutting-edge machine at the time of game release so they can make sure that's where they aim their top specs at (it's a bit pointless trying to sell a game no one can play yet, and that by the time they can it'd have been superseded ;-)). But they are doing it to make money, not for playing games per se. Personally, I'd only spend a lot of money on a PC IFF there was a financial payback from it (either now or in the near future), but that's just me .. I'd feel too guilty trying to justify a high price just for gaming, unless an insurance company was paying for it for me of course ;-).

     

    OMG HAHAHAHHA.  At https://www.voodoopc.com/ check out the "Digital Creation" under the desktop "Omen" HAHHA.  You can configure that thing up to $35,000 ahahhahah.  You can get dual dual core processors hahahahaa.  Dual video cards and quad 15,000 rpm hard drives ahahhaah.  They even have a 46 inch 8ms gaming LCD hahaha.  There is even an option for a $510 paint job ahahhaha.  So uncalled for...  Check it out lol.

    1064326877[/snapback]

    Er, heh, I thought I'd seen it all ... now there's a famous gaming company trying to sell a PC the price of a small car ... that's just nuts to me. I'd never spend that much (in GBP) on a car to go from A to B, let alone a PC just to play games & program code/type docs etc!!

     

     Large screens at those response times are usually low-res ones. That kinda defeats the purpose of a monitor to me .. that's more like a TV, so it shouldn't cost much more than TVs normally cost on average ;-). They say it's a Samsung with a pathetic 1366x768 (not even 1080p!) resolution yet costing 9000USD!!! You'd have to be demented to pay that much for a screen that'd be useless for watching HD movies or TV in the near future!

     

     500USD just for cosmetics ... I'd rather go down to my local garage and get them to do a metallic paint job (sure, none of this angle-colour-changing stuff .. but is it worth 500USD of your own money when that's the price of a half-decent CPU?!). It just doesn't make sense to me ... unless I was a rich kid & it was my parents' money & I had no idea of the value of money ;-). No harm in dreaming though!

     

    2000USD loading it up with 2TB of HDDs ... again there are much better ways of spending money than on loads of HDDs, and you can always simply add drives when you run out of space at which point those prices would've dropped .. so to me I'd rather start with just 1 or 2 & try to use cheap optical for bulk files like videos.

     

     Basically, there are ways to have a decent system for most people's needs whilst not breaking the bank, but it involves being ultra conservative and minimalistic .. and doing lots of small minor upgrades, but only when necessary .. rather than aiming high at the start!

     

     Again, it's my own opinion .. just my 2c's wrt what I've found myself from making some dumb mistakes along the way, buying at the high end once or twice for a few components and usually regretting it! I've never regretted spending on decent monitors, since they're worth it (you can use them for a decade), but a high-end graphics card just for 1 game definitely isn't to me, if you can manage to hold out on buying until the next versions come out so you can then buy the previous gen at half the initial price ;-). Again, if you're using them for work, then it's a totally different issue!


  17. one more thing: since you're into raytracing and renderings, I would like to ask what would mainly decrease my rendering time?

     

    a decent Processor ? or a HIGH end gfx card such as the Quadro or the FireGL series ?

     

    since a few of my friends told me that a high end graphics card would only matter in viewports not in the final render , is that true ?

    thanks .

    1064325985[/snapback]


     Actually the video card has nothing to do with rendering, only with the program running.  The more video RAM, the faster things will select on screen and move around, and the larger the number of polys that can be displayed.  Some packages will offer a sample render function that uses the vid card for a quick-and-dirty rendering without all the features.  (I haven't touched Max since 2.5)

     


    A commercial rendering app normally involves only the CPU/RAM, yes, but still there are a few different options available:

     

    1. Expensive opteron ccNUMA machine

     

    2. Cheap network cluster .. setup network rendering in your apps (to use those 12 PCs when they're not being used ;-)).

     

    3. Not quite there yet, both the hardware & the software, but it's worth keeping an eye out on these .. several recent academic ray tracers make use of programmable GPUs that support certain shader languages .. but I found they weren't any faster than a fast x86 though!

     

     4. Expensive custom hardware .. there are also commercial rendering hardware accelerators (e.g. from ART in the UK). These are PCI-X cards with 8 of their AR350 processors, and you need to recompile code to use their libs, though they did write a RenderMan interface & 3dsMax/Maya plug-ins :-) ... literally ONE PCI-X card can shrink rendering times from HOURS to MINUTES. Their hardware can accelerate ray-tracing, radiosity & more :-).
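     A minimal sketch of option 2's idea (my own illustration, tied to no particular renderer's API): split a frame range round-robin across idle machines, and wall-clock time drops roughly with the node count:

```python
# Naive network-rendering split: assign frames round-robin to render nodes.
# Purely illustrative; real render managers also handle node failures,
# uneven frame costs, and load balancing.

def assign_frames(frames, nodes):
    """Distribute frame numbers across nodes round-robin."""
    jobs = {n: [] for n in nodes}
    for i, f in enumerate(frames):
        jobs[nodes[i % len(nodes)]].append(f)
    return jobs

nodes = ["pc01", "pc02", "pc03"]            # hypothetical idle machines
jobs = assign_frames(range(1, 10), nodes)   # frames 1..9
print(jobs["pc01"])   # [1, 4, 7]
# With N similar nodes, wall-clock render time drops by roughly a factor of N
# (frames are independent, so animation rendering parallelises almost perfectly).
```

     Per-frame distribution is why even a cheap cluster beats one expensive box for animation; single-frame stills are harder to split this way.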

     

     

     

    However, rendering frames actually has to do with CPU and RAM.  Most of the newer rendering engines are tuned for Intel chips, with the exception of Mental Ray, which is tuned for Linux and AMD64. 

     

    You're better off with a lesser video card and more raw GHz, and you really will not see a large performance boost from a 64-bit processor until the code has been optimized for 64-bit and you double the RAM on your system.  If you were using 2GB on a 32-bit system, you'd better have 4GB to see a good increase in rendering times. 

     


    I still think the claim that intel is faster than AMD because apps are optimised for intel is nonsense .. I compile code using INTEL's own compiler for my AMD CPU .. and it's faster than the GNU compilers when you set SSE3 optimisation flags etc!

     

    AMD's high-end architecture allows you to scale the number of CPUs & the RAM. This IS important in rendering .. but would cost boatloads of cash :-(. It allows for 25GB/sec bandwidth when running across 4 CPUs .. and with 4GB DIMMs that'd be 64GB of system RAM!! But that's not relevant here due to the ridiculous cost of it all .. at the moment ;-).

     

    I agree though ... there's far too much playground antics amongst users & manufacturers/vendors! I couldn't care less whether intel or AMD were on top, but personally I've found that whenever I'm up for a new machine, that AMD seemed to be faster (having tried out both in my workplace with my benches, apps & code).

     

    Oh, and also the OS, drivers AND most importantly the apps must be compiled to support 64-bit ... this is why you didn't find any difference in games .. most commercial games/apps are still 32-bit. I'd still recommend AMD for having a better 64-bit implementation (I read recently that Intel are finally going to support all the AMD64 instructions) and a better multi-core design (crossbar switch, integrated memory controller etc), so it's a better investment in the platform for when the software eventually catches up (unless you use a lot of opensource apps or your own apps ;-)). I'm sure Intel will catch up eventually .. though AMD have revealed their plans for the future too, and obviously they'll try to stay ahead.

     

    Anyway, competition is good .. for both camps ;-).


  18. Hi, I bought this printer off from eBay and the seller said that it works.  The original problem I had was that it kept jamming halfway through a print job.  It said something about a cartridge stall.  So I opened up the printer and took out the cartridges and re-inserted them back in.  Presto...no more jamming.

     

    Now the problem is that the print job goes through but I see nothing on my paper (it's still plain white).  I checked the cartridges and they look ok to me.  Lexmark program said that each of the cartridges has over 50% ink (one has around 70% I think).  I tried doing a black and white copy job using this machine and the same thing happens, the copy is just a plain white paper.

     

    Does anyone know what's wrong here?  I wanted to open up this printer before (since I couldn't figure out the jamming problem) but couldn't figure out how to.  Seems like Lexmark sealed this puppy pretty tight (removed two screws inside, but it won't budge at all).

     

    Thanks.

    1064326434[/snapback]


    Hiya WeaponX,

    many printers are damn annoying these days.

     

    I'm guessing you've tried all the usual printer driver clean cycles etc? If not then, if you're in Windows, go to Control Panel -> Printers -> Printer Properties -> General Tab -> Printing Preferences -> Maintenance .. or something like that. Then click on print head/nozzle cleaning & print head alignment (I can't check a Lexmark driver .. but these are my Epson's terminology).

     

    Did it say "cartridge stall" or the message "carrier stall .. cannot communicate with the printer .. error 0000 .. press power key"? At first I thought it sounded more like the cartridge's chip (if it has one :-o) isn't recognised .. maybe it's not an "official" cart/chip etc .. but then I read the pages below & it seems the printer doesn't like the position of the cartridge carrier & is basically asking the user to move it to the leftmost position!?

     

    A bunch of people here (http://www.fixyourownprinter.com/forums/inkjet/22795) are getting that message, and the general consensus is to clean the contacts on the printer & cartridge.

     

    "The transparet is the most important thing to fix it. Using rubbing alcohol and paper towel clean the strip. this problem get started when the assembly at the carrier stop position get filled with the dropping ink over the period of time. This assembly can be cleaned by removeing it after getting aceess to it after removing two sides and backs of the printer. There are 2 screws in the back should be unscrewed and rest come sout after pulling the snaps. This is what I did to fix the jam issue and then emcountered the carrier stall becuase my dirty fingers touched the transparent encoder."

     

    I'm guessing their manual doesn't say how to take the thing apart .. but I'm sure customer services have this info, so maybe it's worth emailing them for a disassembly diagram .. though often these guys get worried & don't like to give out this kind of thing (warranty voids etc .. the same reason they make it so annoying to take apart in the first place) ;-). Still, the guy above says it's just 2 screws & a bit of a tug!

     

    Also this page (http://forums.xisto.com/no_longer_exists/) might be helpful ...

     

    ""Carrier Stall" or "the cartridges will not move" issues

    Error: Carrier Stall

    To check the cartridge carrier:

    First unplug the printer.

    Move the cartridges to the left by hand and take them out.

    Try to print without the cartridges in the carrier and check to see if you still receive a "Carrier Stall" error message.

    If not, put the cartridges back in the carrier and snap them in at the back. The message should no longer appear.

    If the power light on the printer is not blinking, this error message may be caused by a communication error. In this case:

    Remove any other devices attached to the USB/LPT port and uninstall the Lexmark printer.

    Connect the printer directly to the PC and ensure that power is being supplied to the printer.

    Reinstall the printer.

     

    Error: Carrier Stall. Power light is blinking

    Causes: The cartridge carrier may have lost its location on the print path or there may be a communication error."

     

    So this seems to say there's nothing wrong with your carts, it's just the carrier .. I hope this is the case :-).

     

    hope that helps!

    I'm kinda busy at the moment, sorry, but I'll look into it a bit more tomorrow, if these ideas don't fix it.

    Kam.


  19. Well, Intel is the most popular around the world, plus they have lots of features and products, so I think Intel is the safest choice in CPUs. :mellow:

    1064325755[/snapback]


    He he, Intel are number 1 overall since they've sold loads over the years. But look at new CPU sales on a month-by-month basis and AMD is actually winning more often recently! AMD are so popular now that they've had stock problems keeping up with demand!

     

    Anyway, popularity isn't very logical reasoning ... lemmings love to jump off cliffs; no one used to get fired for buying IBM, but now their PCs are sold by the more efficient Chinese company, Lenovo; beat-em-ups were the most popular games in the charts, but for longevity you can't beat a good RPG; STDs/AIDS are very 'popular' in Africa; slavery used to be very popular; etc ... you get the idea!!

     

    They may have more products, but if they're all underperforming, who cares! The world's first/only board with FOUR PCI-Express graphics slots is an AMD/NV-based one ... intel boards definitely don't have more features these days, at least not the feature most important to me .. performance!!

     

    Safest choice .. hmm .. I'd feel pretty depressed if I'd bought an intel setup and then got to see a decent AMD setup .. a big mistake if you are after FPU performance IMHO!


  20. But other than the Xeon, AMD's seem to blow Intel away in speed; however they aren't very stable or a very good deal. I've had quite a few AMD's set on fire from running mediocre tasks; no Intel-based CPU has ever become a candle stick on me to date. :blink:

     


    Sorry, I'm repeating myself again ... neither AMD nor Intel chips burn up these days, since both have protection circuits and sensor diodes on the chip. Intel's high-end CPUs run very hot due to the long-pipeline/high-clockrate design decision, which intel have now backed down on; they're going back to a shorter-pipeline design next year.

     

    To avoid stability/heat problems on either AMD or Intel machines, it's very simple .. just read/remember some school-level thermodynamics, follow common sense, buy some thermal sensors, buy/make decent cooling parts, and make sure you know the thermal specs/limits of all your components .. no big deal :-).
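    That school-level thermodynamics really is enough for a back-of-envelope cooling check: Q = m_dot * c_p * dT gives the airflow needed to carry a given heat load out of the case. A rough sketch below, using standard textbook air properties; the 150W load and 10C rise are example figures, not anyone's measured system:

```python
# Back-of-envelope case airflow estimate from Q = m_dot * c_p * dT.
# Air properties are standard textbook values at ~20C; the wattage and
# allowed temperature rise below are illustrative examples.

RHO_AIR = 1.2        # air density, kg/m^3
CP_AIR = 1005.0      # specific heat of air, J/(kg*K)
M3S_TO_CFM = 2118.88 # cubic metres/second -> cubic feet/minute

def required_cfm(heat_watts, max_temp_rise_c):
    """Airflow needed so the exhaust air warms by at most
    max_temp_rise_c while carrying heat_watts out of the case."""
    m3_per_s = heat_watts / (RHO_AIR * CP_AIR * max_temp_rise_c)
    return m3_per_s * M3S_TO_CFM

if __name__ == "__main__":
    # e.g. ~150W of CPU + GPU + drives, allowing a 10C rise over intake
    print(round(required_cfm(150, 10), 1), "CFM")
```

    In practice you'd want a healthy margin over this figure, since case airflow is never perfectly mixed and fans are rated at zero back-pressure.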

     

    I've also had problems, BTW, both happened to be in intel systems but that had NOTHING to do with it. They were all HDD failures due to avoidable dumbness/laziness on my part:

     

    1. a long long time ago I asked an engineering friend to help with an HDD problem, stupidly presuming he'd know more than me (he did, but only the theory, so I should've just read everything necessary myself, or found someone with practical experience of this specific issue!), and he shorted it (still, at least I now know never to let anyone else near my main machine ;-) .. DIY everything rather than delegating & then fixing the ensuing errors .. it's much less stressful in the long term!)

     

    2. the weather was getting hotter and I knew I didn't have enough fans, so I asked for some several times, but eventually gave up asking, and they never got ordered (another lesson: if you want something done, do it yourself rather than repeatedly asking .. other people will often forget trivial things like this, but the resulting damage isn't trivial at all!). Then someone turned up the heating on what was already a very hot day!! No biggie, since I didn't lose any data, but it was such a waste of money & time to sort it all out afterwards! I now do all my work on my own machines, with very decent cooling :-). At work, if I want a machine on 'their' (our!!) network, I have to give them admin rights to it, and in some cases give up admin rights on it myself (some of my colleagues aren't allowed to install software on their machines .. a problem considering we develop software in a CS lab .. doh!!).

     

    3. had another HDD failure from what turned out to be a totally crap HDD (dodgy uDrv)!

     

    I've never had problems with machines I directly control .. but unfortunately in a workplace 'they' usually don't like to give people full control unless you sign up for admin tasks etc! The best thing is to make sure you have all your main stuff (work or play etc) under your full control (my serious work machines are not on my workplace's LAN .. though when I need some data off the LAN for work I get it via another machine .. anyway!!), and make sure you learn enough to both set it up and maintain it wrt admin chores. Best to learn it when you take a holiday, or on evenings/weekends etc.

     

    I know this sounds like a hassle to some people, but once you've learnt the basics you don't have to read much new stuff, & after a while it's very easy once you get used to the terminology etc. Also, once you design a decent setup, you don't have to do any extra work for it ... ideally you should run an automatic nightly incremental backup, so you'll never lose much even if there is some failure. If you do everything yourself, you have full control and can do anything you want .. the ideal system YOU want, without having to beg for permissions or make silly compromises. The worst thing is being constrained by other people, or by greedy corporations, etc. Er, but that's another topic ;-).
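    That nightly incremental backup can be as little as a short script on a scheduler. A minimal sketch of the idea below .. it copies only new or modified files by comparing size and timestamp against the last backup (in practice you'd use something battle-tested like rsync from cron, and keep dated snapshots rather than a single mirror):

```python
# Minimal incremental-backup sketch: mirror src into dst, but copy only
# files that are new or have changed since the last run. A real setup
# would use rsync/cron and keep dated snapshots; this just shows the idea.
import os
import shutil

def incremental_backup(src, dst):
    """Copy new/modified files from src to dst; return the copied paths."""
    copied = []
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target_dir = os.path.join(dst, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target_dir, name)
            st = os.stat(s)
            # Skip files whose size and mtime match the existing backup copy.
            if os.path.exists(d):
                dt = os.stat(d)
                if dt.st_size == st.st_size and int(dt.st_mtime) == int(st.st_mtime):
                    continue
            shutil.copy2(s, d)   # copy2 preserves the timestamp for next time
            copied.append(os.path.join(rel, name))
    return copied
```

    Run it nightly and the second and later runs only touch what changed, so it stays fast even on a big home directory.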

     

    Here's my assessment:

    I really think it depends on the CPU you are buying from each brand; from experience, I have found the Intel Xeon to blow every AMD away in gaming and business-style applications.

    However, the Xeons are pretty expensive, and aren't used very often in PCs you can buy from a manufacturer like Dell or IBM.  They usually seem to use Intel Pentium 4 processors.

     

    So in short I'd definitely go with Intel, just make sure you get the Xeon if you're willing to spend the dough.

    :mellow:

    1064325852[/snapback]


    He he, you must be kidding, right? AMD completely dominates the high-end x86 field in performance. You'd only go for intel over AMD if you follow the masses without understanding the underlying computer architecture, or if you really want intel for brand/support/etc .. ie. for non-performance reasons!

     

    https://tweakers.net/reviews/442/5/dual-xeon-dual-opteron-en-quad-opteron-serververgelijking-benchmark-details-en-apachebench-scores.html shows the NUMA/bandwidth advantages (that I mentioned earlier) in numbers. As you can see (use Babelfish to translate, BTW, or just look at the figures), the dual Opterons give about TWICE the performance of the dual Xeons in ApacheBench. It gets even more lopsided at quad level and beyond. AMD's architecture is designed to scale for enterprise servers; Intel's current design isn't meant for this.


  21. Well if you go to the Tomshardware.com forums/community, almost all ppl will flame the hell out of INTEL and advise you to get an AMD

    but since they are almost all GAMERS .. it's normal they tell you so, since AMDs really show more FPS / performance in GAMES.

     

    I don't think it's that alone. I've met many people who are anti-this-or-that for no good reason at all, besides that it may be 'cool' to be outside the masses or something, or perhaps they just like bashing/moaning, I dunno! Personally I couldn't care less which company my CPU is designed by, whether intel or AMD; I only care about which performs best for my tasks at my price range, and on that AMD has won me over since the Athlon Tbird days .. though I don't claim any loyalty either way; it's simply from doing benches myself, since I usually have access to both types at work!

     

    I personally own an AMD Athlon 64 3500+ and a 3000+

    It's very very good .

    but as time goes by you realize .. for me, as a gamer and 3D modeller, AMD is perfect for gaming rigs

    however for rendering and office work, Intel shows better in benchmarks.

    so scale your needs and see what's right for you; in the end it's not going to be A FATAL performance increase or decrease .....

    1064325722[/snapback]


    He he, for word processing type things, my 586 is just fine, if I don't have a flight sim running in the background ;-). So I don't really care what CPU it is for that task, and don't base my decision on those benches, sorry.

     

    But I do run a lot of ray-tracers, so I am very interested in your views, cryptonx. Can you show me some benches, please (independent or your own, I don't mind .. just some figures will do plus some info on the test itself so I can do them here)? What apps do you run, or is it your own code, and if so what compilers/options are you using at the moment?

     

    I agree that AMD lagged in some benches, like video encoding, up until a few months ago, but now with SSE3 they seem to be matching or beating intel in every benchmark. Video isn't as important to me ... mainly I need brute computation power, and there AMD has seemed the winner for many years now for an affordable desktop (ignoring damn expensive IA64 setups!).

     

    http://forums.xisto.com/no_longer_exists/

    http://forums.xisto.com/no_longer_exists/

    http://forums.xisto.com/no_longer_exists/

    shows that AMD wins every productivity test, from office to Photoshop, and every video encoding test, which is something Intel used to dominate thanks to SSE3-optimised apps. The test is basically A64 FX vs P4EE .. gaming-type chips .. but FXs are just unlocked/fast Opteron 1xx's! They also test the X2 procs, which seem to do well too. That last URL is on 3D workstation apps, and Intel's EE proc seems not that different to AMD's FX (ignore the X2's 60% lead in Cinebench .. I'll talk about that next!).

     

    http://www.sudhian.com/showdocs.cfm?aid=672&pid=2574

    is an X2 review, but also includes Opteron 275 & Pentium D CPUs. In the ray-tracing & rendering Cinebench tasks (the first two figures), when optimised for TWO threads the X2 is 21% faster than the EE, which is faster than the Pentium D; and when running FOUR threads the 275 is about twice as fast! And these aren't even AMD's best CPUs at the moment!!

     

    Finally, sorry to repeat this, but Intel's 64-bit implementation sucks at the moment, and their dual-cores are only for their desktop CPUs, not their rendering/workstation/pro CPUs ... so to me AMD is miles ahead in the pro field ... and also in the gaming/home field. Dual-core obviously matters in rendering (as shown in the URLs above) since it's an inherently parallel computation.
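    The "inherently parallel" point is easy to see in miniature: every pixel/scanline of a render is independent, so the image splits cleanly across cores. A toy sketch below .. the "shader" is a stand-in formula, and threads stand in for cores just to show the decomposition (a real renderer like Cinebench runs one native thread per core):

```python
# Why dual/multi-core helps rendering: each scanline is independent, so
# the image can be dealt out across workers and stitched back together.
# The shade() formula is a made-up stand-in for real per-pixel ray tracing.
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 64, 48

def shade(x, y):
    # Stand-in for an expensive per-pixel ray-tracing computation.
    return (x * x + y * y) % 256

def render_rows(rows):
    """Render one worker's share of the scanlines."""
    return {y: [shade(x, y) for x in range(WIDTH)] for y in rows}

def render_parallel(workers):
    # Deal the scanlines out round-robin, then stitch the image back.
    chunks = [range(i, HEIGHT, workers) for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(render_rows, chunks)
    image = {}
    for part in parts:
        image.update(part)
    return [image[y] for y in range(HEIGHT)]
```

    Because no scanline depends on another, two cores give close to 2x and four close to 4x on this kind of workload .. which is exactly the scaling the Cinebench figures above show.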


  22. As someone else said earlier, if you're into 3D apps you need a decent 3D card. Which one depends on the apps you use or value most. For GL you don't need a pro card, since most gaming cards still support GL (though with Vista there's a major issue: GL becomes a 2nd-class lib under D3D .. seriously crap considering how many pro/academic apps are written in open OGL so they run on giant UNIX clusters .. I guess most academics etc have moved or are moving to OSX/OpenSolaris/Linux anyway, so it's not really an issue, but still, I can't see the point of the move, esp with MS making SFU a priority on their servers & giving Vista many UNIX-like features ... perhaps different people within MS have differing views, so there's no homogeneity across the rather massive/complex OS :-((). Er, sorry for the rant/tangent again ... had to get it off my chest & hope others can shed some light on the issue ...

    1064325615[/snapback]


    Hi, again, Jedipi,

    I should've mentioned that if you want to use pro 3D apps like the various CAD variants, Maya, etc .. it MAY be worth spending extra bucks on a pro card even though it has lower performance in games (though often they have more VRAM, higher bandwidth etc). But you could get most of the way there by doing some DIY mods to a gaming card to enable the pro drivers, and could overclock to get the extra bps. The only issue is memory & resolution (although hardware clipping, accurate line drawing/anti-aliasing etc might not be enabled in all hacks!). This is probably gonna be less of an issue with the new unreleased ATI X1800s etc .. and I'm sure Nvidia will answer back ;-).

     

    I can explain in more detail if you want .. just ask ... or take a look at the pro reviews online.


  23. This is my summary of how I view AMD vs Intel, currently ...

    It used to be said that AMD chips don't do as well as intel for multimedia, but now Intel are stuck with thermal problems & can't raise frequencies anymore. The most recent AMD reviews show AMD winning in all tasks, and they support SSE3 now too. I don't know any games that run better on intel. 64-bit Far Cry is a great example of what x86-64 can do .. an AMD innovation. AMD is definitely not the el-cheapo alternative anymore; they are the leaders. Intel tried to force people to move to IA64, and had to emulate 32-bit x86 in hardware .. an ugly/expensive solution compared to extending the architecture & adding registers etc.

    AMD also designed this generation for dual-core operation, unlike the underperforming HyperThreading (which was only developed to cover up the lacking performance of the deeply pipelined NetBurst architecture of the P4). Modern AMD K8 CPUs have integrated memory controllers, so there's lower latency, and you can use older/cheaper DDR1 memory! It also enabled HyperTransport to be used to link processors together, each with their own dedicated dual-channel memory bus, so it can scale until board makers hit the physical limits of track density on an n-layer motherboard. No more "adding procs reduces per-proc bandwidth on the shared FSB" problem :-). That sort of design was only on mainframes/supercomps until AMD took a leap of faith and just went for it :-)). And now AMD pro CPUs are finally #1 in recent month-by-month sales.

    I'd say Intel aim at the masses & go for what sounds great (marketing) but is the cheapest solution (so they can sell to the masses). They want volume & think huge profits will come that way. But there's only so far behind you can fall before customers decide to ignore the marketing.

    So Intel decided to focus on the platform (which they are really strong in & can beat AMD on because they are so big .. but then AMD partnered with the chipset vendors who'd been squeezed by intel dominating their space!!). Now Intel are forced not only to accept x86-64 over IA64 for the masses, but also to focus on performance per Watt wrt TDP, as AMD have all along! AMD don't seem to care much about marketing (and have quite bad marketing execs, it would seem, given their 2nd place .. though that's finally changing!); they mainly care about what's best for the customer (which, bizarrely, is the ideal I learnt in marketing class .. though it's rarely followed in practice!). They go for the best solution, no matter the cost, & are willing to take HUGE risks/gambles if they believe their design is correct.


  24. Just to add a bit more to WeaponX's point: in terms of gaming, I think almost all forums favor AMD as the "gaming chip", with Intel more the business-type multi-tasking CPU. I'm sure if you did a survey here at Xisto of what the gamers (I assume that's what you are into) have in their machines, AMD would come out on top. Countless AMD adverts are aimed directly at gamers as well, so they boast about it, and I have not seen this matched by Intel.

    So I will have to disagree on the "it's really all in the name" :mellow:

    1064316194[/snapback]


    I dunno. AMD is a great gaming chip, but I wouldn't say Intel is a good business-type multi-tasking CPU compared to AMD, when AMD has the best dual-core implementation by far and HyperThreading is next to useless in comparison (most people/vendors disable it by default, since in some tasks it's actually slower). HT was pretty good at multimedia apps a while ago, but things have changed now that AMD dual-cores are widely available in many market segments [if only the huge demand didn't outstrip AMD's supply so much .. but I'm sure they're working hard on that .. demand was just much larger than anticipated by all those market research firms ;-)] .. HT was only ever a temporary fix for the unique problems of the NetBurst architecture.