xisto Community

rayzoredge

Everything posted by rayzoredge

  1. You definitely have options here. One obvious one is to unplug your sister from the Internet. It stinks that you have DSL @ 384Kbps shared between the two of you. Someone's gotta give, since you can't exactly automate RapidShare downloads, and streaming video is mostly an on-demand thing. You may be able to set things up on your router where YOUR outbound traffic is prioritized over hers... which would mean that your HTTP requests, data, e-mails, etc. will go out long before her computer is able to make requests to download her RapidShare files or stream her videos. I am not familiar with Edimax routers, but by going through the router interface, I'm sure you could dig something up. I was trying to do this same thing with traffic shaping, but my issue turned out to be uTorrent settings, which I fixed. I'm not sure what to tell you outside of trying to find a traffic-shaping program that works over a network (which I couldn't find myself), adjusting router settings, or, really, just telling your sister not to do anything Internet-heavy during your work hours.
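For reference, the prioritization most routers offer under a "QoS" menu boils down to rate limiting, often with a token bucket. Here's a minimal Python sketch of the concept; the class and the numbers (sized for a ~384Kbps line) are purely illustrative, not any router's actual firmware:

```python
import time

class TokenBucket:
    """Toy token-bucket shaper: permits `rate` bytes/sec with bursts up to `capacity`."""
    def __init__(self, rate, capacity):
        self.rate = rate            # refill rate, bytes per second
        self.capacity = capacity    # maximum burst size, bytes
        self.tokens = capacity      # start with a full bucket
        self.last = time.monotonic()

    def allow(self, nbytes):
        # Refill tokens based on elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= nbytes:
            self.tokens -= nbytes   # spend tokens; packet may pass
            return True
        return False                # over budget; packet waits or drops

bucket = TokenBucket(rate=48_000, capacity=16_000)  # ~384 kbps line
print(bucket.allow(8_000))    # fits within the burst allowance
print(bucket.allow(16_000))   # exceeds the remaining tokens
```

A router doing priority queueing runs something like this per traffic class, giving one sibling's packets a bigger bucket than the other's.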
  2. The hornet's nest: which OS is more secure? And why? It depends on how you look at things and which perspective you approach it from. As far as I know, Linux seems the most secure to me, followed by OSX and then, by a long haul, Windows. My reasoning, based on what I've learned so far, is that Linux and OSX aren't exactly big targets for malware, not to mention the fact that Linux has a community that frequently scrutinizes the code to prevent exploits and whatnot from ever happening in the first place. Linux and Apple don't have enough market share to compete with being as omnipresent as Windows has been, and it makes sense that most malware is designed to target the Windows environment. And of course, along with these facts, you always hear Macheads smugly making snide remarks about being virus-free and whatnot... and sometimes a Linux user will chime in every now and then. So it's funny when I say now that, although Windows is more apt to be targeted by malware, OSX is actually the most vulnerable OS out there. (Apparently, the answer lies in Snow Leopard, and you'll know why in a second.) Clicky Some of us are aware of the Pwn2Own contest where a contestant successfully hacked not once, but TWICE into an Apple computer (a MacBook Pro and a MacBook Air, I believe), taking control of the machine within minutes. (Of course, the Apple community remains unfazed by this brazen act that proves OSX is not invincible, blaming it on something other than the operating system and their precious Apple devices.) These were fully-patched machines, mind you, and the point of the contest is to find bugs and attempt to exploit them to take control of the machine or execute arbitrary code, making it possible to do anything malicious, from snagging keystrokes and sensitive information to gaining complete control. And yet, a man by the name of Charlie Miller attempted and apparently did the impossible.
In all reality, security is just a way to deter, if not slow down, an attack. Keep this in mind, as anyone with the diligence and the knowledge can get into any machine he or she wants. Microsoft's Windows Vista has something called ASLR (fully implemented with SP1) that is apparently a very effective security feature for deterring such attacks. Linux has a weaker version of the same concept, but reinforced with PaX and ExecShield. Apple's OSX (Leopard) has some smatterings of ASLR in its binaries, but it will be fully introduced in Snow Leopard, the next OS release. So what the heck is ASLR? ASLR stands for Address Space Layout Randomization. It basically keeps arbitrary code from executing after a stack buffer overflow or similar attack, where a hacker injects code to be executed to gain control of a machine, download malicious code, etc. The way it works dives deep into how things run on a machine, notably the address space. The classic attack works by feeding a program an overly long input: when the program copies that input into a too-small buffer, the "leftover code" at the end spills past the buffer, overwrites a return address, and gets run. By randomizing the layout of the address space, ASLR thwarts this, because the attacker can no longer predict where his injected code (or the routines he wants to jump to) will land. The attempt usually ends with the program crashing on the user end, but it protects the OS itself from being successfully attacked. Linux has a basis of this, complemented with ExecShield and PaX, which basically do the same thing in preventing attacks of this fashion. OSX, however, does not have much of anything at all regarding ASLR, although this is apparently changing with the release of Snow Leopard. Charlie Miller stirs up the hornet's nest, I'm sure, by declaring that OSX is actually the least secure operating system compared to Linux and Windows.
However, the saving grace is that, given the market-share facts discussed above, OSX is actually the safer of the two to use in practice, simply because it is targeted far less than Windows. (He doesn't even mention much about Linux, although with the implementation of basic ASLR and its complements, plus the fact that there's not enough of a presence to make itself a target, I would personally say that Linux is the most secure OS.) What do you guys think?
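You can actually watch ASLR at work from userland. This Python sketch launches two identical child interpreters and prints the address of a freshly allocated buffer in each; on a system with ASLR enabled, the two addresses will usually differ between runs (on a system with it disabled, they tend to repeat):

```python
import subprocess
import sys

# Each child process allocates a small buffer and reports its address.
probe = (
    "import ctypes;"
    "buf = ctypes.create_string_buffer(64);"
    "print(ctypes.addressof(buf))"
)

addr1 = int(subprocess.check_output([sys.executable, "-c", probe]))
addr2 = int(subprocess.check_output([sys.executable, "-c", probe]))

# With ASLR on, an attacker can't hard-code addr1 and expect it to be
# valid on the next run -- which is exactly what defeats the
# injected-code trick described above.
print(addr1, addr2)
```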
  3. Clicky Above is a link to an article talking about the advantage (so far) of 64-bit gaming. Apparently, with newer games coming out that are very taxing on your system *coughCrysiscough*, the executables themselves are running into problems addressing more than 2GB of memory. If you throw in the sheer amount of detail, textures, and other graphical doodads that make eye candy in games glorious, you can probably see how this is going to be a problem AND a limiting factor that will force game developers to move on to 64-bit platforms, lest they stick with simpler, less-demanding applications or figure out a way to address this issue without resorting to large address requirements. What this means, basically, is that games running in 32-bit mode or natively in 32-bit are bound to crash from lack of memory address space, which is where the magical number of 2GB comes into play. In the article, some examples are explained with games as old as Command & Conquer: Generals, where I can imagine 8 players building tons of buildings, implementing tons of defenses, and training and constructing massive armies before throwing up their hands in frustration after the executable folds to the limitations of a 32-bit system and can't do anything anymore. Even with Crysis's map editor, since users are bound to create lush, large maps full of detail, it's almost a given that the address space will be trumped, and thus the developers have made it a 64-bit application, only available to be hosted on a 64-bit server. Throw this into the pro list of having a 64-bit system.
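The 2GB figure isn't arbitrary; it falls straight out of the pointer width. A quick sanity check (the 50/50 user/kernel split is 32-bit Windows' default behavior, not a hardware rule):

```python
# A 32-bit pointer can name 2^32 distinct byte addresses...
total_vas = 2 ** 32          # 4 GiB of virtual address space

# ...but 32-bit Windows reserves half of that for the kernel by default,
# so a game executable only ever sees the lower half.
user_vas = total_vas // 2

print(total_vas // 2**30, "GiB of total address space")
print(user_vas // 2**30, "GiB usable by the game")
```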
  4. I don't code or program personally, so I wouldn't really know how difficult it is to "write" software for 64-bit. This is how I understood the concept: a 32-bit piece has up to 32 "slots" that data can fill; the same concept applies to 64-bit. A 64-bit piece can fit two full 32-bit pieces, but we know that not all 32-bit pieces are actually filled to the brim, so there are usually slots left open. Therefore, 32-bit pieces can fit code more efficiently without as much open-slot bloat. Maybe I've got that school of thought mixed up... Your definition of heavy processing would be one of the main reasons why 64-bit processing trumps 32-bit processing, as I'm sure NASA and other scientific applications, as well as work with large media files, would surpass the 4-billion-integer mark that 32-bit is limited to. (But I'm sure they actually have custom software and hardware that takes advantage of 128-bit processing... yes, no?) The supposed slowdown with native 32-bit applications turned 64-bit would, I think, come from the fact that you are now dealing with larger amounts of data at a time using the "larger" architecture, which a 32-bit system would have handled more efficiently, if we apply my thinking about the 32-bit/64-bit open-slot/bloat concept. (Then again, I'm probably wrong on that whole concept to begin with.) Sending 64-bit pieces of data to RAM will fill up RAM twice as quickly as 32-bit (as each piece takes up a larger memory space), so although 64-bit can support the usage of more than 4GB of RAM, it may not use it as efficiently as 32-bit would? (i.e. with 1023 bits of RAM available, only 15 pieces of 64-bit data can be addressed to RAM as opposed to 31 pieces of 32-bit data, which makes 32-bit a wee bit more than 3% more efficient)
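One concrete place the "slot" difference really does show up is pointer width: every address in a 64-bit program costs twice the bytes of a 32-bit one, even though the data being pointed at doesn't grow. You can ask the interpreter directly:

```python
import struct

# "P" is the platform's native pointer format:
# 4 bytes on a 32-bit build, 8 bytes on a 64-bit build.
ptr_bytes = struct.calcsize("P")
print(f"{ptr_bytes * 8}-bit pointers, {ptr_bytes} bytes each")

# Doubling every pointer roughly doubles the memory spent on addresses,
# which is where the pointer-heavy "bloat" of 64-bit programs comes from.
```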
  5. Not a lot of people know about the advantages AND disadvantages of 64-bit software. With 64-bit availability becoming more widespread with Linux, Vista, Windows 7, and Leopard, I think we should discuss a bit more about what 64-bit brings to the table of computing. A lot of us "know" that 32-bit Windows can't support more than 3.5GB (or 4GB, depending on your sources) of RAM. (By "know" I mean "agree," and by "agree" I mean I-read-this-on-Wiki-and-a-ton-of-our-peers-say-so.) A search for the advantages of 64-bit usually yields this one basic advantage, but it also leaves us in the dark about everything else that 64-bit brings to the table. I dug a little bit deeper with my Google-fu and found some comments and spatterings of knowledge on the Web about the lesser-known facts about 64-bit. And yes, I'm only putting out what I "know," which really comes down to "what I agreed with" or "what I read." One perspective a lot of us don't look at is that although 64-bit operating systems allow for support of more than 4GB of RAM, we often overlook that the software we use is still 32-bit. Even with the limitation of 32-bit software not being able to address more than 4GB of RAM, it's not as "bad" as it might seem. One thing that was brought to light for me is that 32-bit software might be limited to 4GB of RAM, but that is actually not much of a limiting factor when running that 32-bit application IN a 64-bit operating system. Since the most realistic scenario for most of us involves running multiple applications at a time, including the resources reserved for the operating system, and if you take into account that each application and each process is "limited" to 4GB individually... well, you get the idea, right?
If you have 16GB of RAM in your system, and 1GB of that RAM was being used by the operating system and background processes, you would have 15GB available for your 64-bit program; or you could run a million tabs in 32-bit Firefox, take up to 4GB of RAM with just Firefox, and have 11GB left over for any other applications, 32-bit or 64-bit. Another thing that's hidden in the dark is that 64-bit software is more secure. Why? Most malware and malicious code is written for 32-bit software (i.e. Windows XP), and apparently writing code for 64-bit is a bit more difficult, considering that you have to address 64-bit integers instead of 32-bit ones. Encryption, in this sense, will be more effective. Which brings some cons into the mix... namely, that 64-bit software is also a bit more difficult to write. Also, since you are now programming with 64-bit integers instead of 32-bit, 64-bit software can actually run slower than 32-bit software: your 64-bit software now has to address 64-bit address spaces (doubling your memory pointer size from 32-bit), requiring more memory AND space to deal with twice the data being worked with. (However, this can be a good thing, as the program is now able to address more, if you look at it from that perspective.) If the 32-bit software you are running now doesn't need to touch any numbers past 4 billion or cannot utilize this potential, then there will be no software advantage for that particular program in going 64-bit. Throw in the fact that 64-bit driver support is rather lacking, and you can see why the race to go 64-bit is at a crawl. Please correct my arguments if I am wrong... I'd like to get more of an understanding as to how 64-bit computing can help AND hamper how we do things today.
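The 16GB scenario above is just subtraction, but it's worth spelling out (all numbers are the hypothetical ones from the post):

```python
total_ram = 16               # GB installed
os_overhead = 1              # GB used by the OS and background processes
per_process_cap_32bit = 4    # a single 32-bit process tops out near 4 GB

# A 64-bit program could claim everything that's left...
for_64bit_app = total_ram - os_overhead

# ...while 32-bit Firefox hits its own per-process ceiling long before
# the machine runs out of RAM, leaving the rest for other applications.
left_over = total_ram - os_overhead - per_process_cap_32bit

print(for_64bit_app, "GB for one 64-bit program")
print(left_over, "GB left after a maxed-out 32-bit Firefox")
```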
  6. What has been eating up all of the bandwidth on your sister's computer? Streaming videos? Peer-to-peer downloads and uploads? What's your bandwidth allocation from your ISP? (Do you have broadband or dial-up?) What kind of router do you have? Are you actually timing out on your HTTP requests, or is the Internet just generally slow?
  7. Man... I am a complainer! Opera also does not support GMail's signature "THUNK" sound when receiving an IM on GChat or AIM while focus is away from that tab. Pity, because I rely on that intrusive sound to keep me from forgetting about my conversations with people... I might just have to run Firefox with GMail in it and use Opera for everything else... which just seems plain dumb. Hopefully Opera snags a fix for GMail, although I'm surprised that it has any issues in the first place. Is it the way that Opera interprets what GMail serves, or the way that GMail interacts with Opera?
  8. Opera is now driving me nuts, with me being used to Firefox. I just lost everything I was typing because I double-clicked on a word to replace it, which instead brought up a context menu, and the next key I hit apparently searched for the word that I was about to overwrite. I did this same exact thing yesterday, losing an entire post because I accidentally hit Ctrl+Left instead of Ctrl+Shift+Left. Anyway, at the time of my screenshot yesterday, I was running GMail, two Xisto tabs, a NewEgg tab, a Google results tab, Craigslist, a CNet article, a Speed Dial/blank tab, and two Cracked.com articles. I think there was a possibility of a Flash ad on the CNet and Cracked.com articles, but every other page seemed to be light. Opera 10a was taking up 55MB of RAM and 94% of my CPU. And yes, I'm running the Windows version of Opera. It's too easy to blame it just on Windows... so I'll blame it on Windows.
  9. I just gave you AND linked you to information to speed up downloads AND uploads. You can only download as fast as a peer is sending you the information, capped by your bandwidth allocation. Basically, if I can upload data to you as a peer at 400KBps (my upload transfer rate), but your bandwidth only allows a 200KBps download as limited by your ISP service, then you can only get 200KBps of data, no matter what you do. The only option here would be to upgrade your service through your ISP. The same concept applies if I had a crappy connection to you as a peer. If I could only send you a maximum of 192KBps, because that's my real upload cap thanks to me being cheap and not wanting to pay more for more bandwidth, then even if you had a 400KBps download allowance, you would only get my data at anywhere from 0KBps to 192KBps, depending on any number of other factors. The best thing to do to maximize your upload rate is to upload one to a few torrents at a time and actually have decent bandwidth to do so. However, keep in mind that the closer you get to the cap on your upload rate, the less data you will be able to request (which makes Internet browsing very, very painful; see my previous post). If you are looking to improve your download/upload ratio, and if you have a good connection to the Internet, seed only a few torrents that are moderately desired, unless you REALLY have an awesome Internet connection. Think about this for a second. If you are uploading a torrent that has 100 leeches and only a few seeds, you are almost guaranteed to connect to a bunch of these leeches and seed your file at [near] YOUR bandwidth cap, because people will be wanting to snag more data to finish their download. However, if you are uploading to a torrent with 99999 seeds, don't you think that out of those 99999 people seeding, a ton of them probably have a decent if not better Internet connection than yours?
If you are "competing" to seed and you "lose," you will be stuck seeding fragments of files at lower transfer rates to complement the people whose bandwidth can afford to feed leechers large fragments of files at higher transfer rates. Remember that you are seeding FRAGMENTS/PIECES of files and not whole files. That's how the swarm works together to give everyone what they want, cooperatively.
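The peer-to-peer math above reduces to a single rule: the slower end of any link sets the pace. A sketch using the numbers from the post:

```python
def effective_rate(seeder_upload_kBps, leecher_download_kBps):
    """The slower side of a peer link always wins."""
    return min(seeder_upload_kBps, leecher_download_kBps)

# Fast seeder, slow leecher: capped by the leecher's ISP allowance.
print(effective_rate(400, 200))  # 200

# Cheap seeder, fast leecher: capped by the seeder's upload cap.
print(effective_rate(192, 400))  # 192
```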
  10. I found out why the Internet bogs down like crazy with uTorrent and how to fix it. The secret is to disable NAT-PMP and, most importantly, DHT. The sheer number of requests to connect to a ton of different peers floods the router itself and prevents any other requests from any device on the network from going through, resulting in time-outs and very slow Internet connectivity. I still haven't found any solutions for inbound traffic shaping, but this ultimately solves the problem that I've been having for the longest time. Basically: if you are running uTorrent and the Internet slows down to a standstill, but it comes back if you shut uTorrent off, disable NAT-PMP and DHT. Allowing fewer global connections works too, in conjunction with disabling these two features.
  11. Er... the OP is missing the tiny eety-beety fact that the right uTorrent settings are RELATIVE to your system and your connection speed. And obviously, you shouldn't change preferences if you have no idea what they do. What anyone should be doing is adding uTorrent as a firewall exception. You should already have this enabled anyway, because Windows Firewall will prompt you from the get-go whether you want to let uTorrent's traffic through or not. It's best if you actually leave Randomize Port unticked and specify a port of your choosing, preferably a port above 10000. When you specify a port, you have the opportunity to open that port on your router (if you have one) and thus let traffic go through as you please. And of course, setting up port forwarding is specific to each router; help on setting this up can be found at PortForward.com. UPnP stands for Universal Plug and Play. It makes recognition of devices on your network... well, easier, theoretically. I haven't missed it; therefore I've turned it off personally. I would think that this would improve performance through your router by a very, very slight bit... but adjust this at your own discretion. NAT-PMP, basically, allows for automatic port forwarding. I would trust manual settings, and therefore, if you are forwarding a port yourself, turn this off. Setting an automatic download and upload rate is not recommended because uTorrent is probably not smart enough to know your bandwidth allowance... so I would actually keep this unchecked. What you should do is test your connection at SpeedTest.net, take the numbers IN KILOBYTES (KB) and not KILOBITS (Kb), multiply those numbers by 80% (.8), then put those numbers into your upload and download limits. This way, uTorrent won't hog your bandwidth. Global connections is the total number of connections to peers that uTorrent will make. Connected per torrent is the number of peers that you will make connections to per torrent.
Making any of these numbers astronomical will net you no more performance than setting them to smaller, more realistic values. Depending on your bandwidth allocation through your ISP, I would set this to a more realistic number. See the chart and source that I have in this thread, which actually covers the same topic this thread does. Protocol Encryption is a good thing, as it encrypts your data going in and out, which means that ISPs can't analyze your P2P traffic and consequently "choke" your transfer rates. (ISPs don't like peer-to-peer.) The numbers that Ash-Bash puts out are insane. You should never have 60 torrents active, and 63 active downloads when you already specified 60 active? Whoever wrote the article that Ash-Bash quoted didn't do their math right. Refer to the aforementioned chart, or you can safely go with something low, like 5 active torrents and 4 active downloads. This will ensure decent speeds both downloading AND uploading, since you won't be struggling to download and upload from a ton of peers. The reason I say that is because, with the settings Ash-Bash posted, you would be trying to connect to 60 active torrents x 200 possible peers per torrent = 12,000 connections, but limited to the 500 global connections you specified earlier. Check your math... it makes more sense when you do, and it ensures that you're not telling uTorrent to do the impossible. Everything else, I would leave alone, except for the net.max_halfopen parameter. This allows you to make more half-open connections at once, but setting it to a high number does nothing if your TCP/IP settings won't allow it. LvlLord's Event 4226 TCPIP.sys patch lifts the connection limitations of XP and Vista (implemented by Microsoft as a measure to slow propagation in the case of infection by malware), but there's only so much you can gain by applying this patch and modifying this parameter.
Again, I want to refer you to my previous post in the other uTorrent thread for more information. There's a lot of false information out there... so common sense rules all. Remember that these settings are dependent on your bandwidth allowance. There are no magic numbers for uTorrent that will make rocket go boom.
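To put the 80% rule and the connection math in one place, here's a small sketch. The 200/50 KB/s figures are made-up SpeedTest results for illustration, not anyone's real settings:

```python
def utorrent_rate_limits(down_kBps, up_kBps, headroom=0.8):
    """Scale measured KILOBYTE/s speeds by 80% so uTorrent leaves headroom."""
    return round(down_kBps * headroom), round(up_kBps * headroom)

# e.g. a connection that SpeedTests at 200 KB/s down, 50 KB/s up
print(utorrent_rate_limits(200, 50))   # (160, 40)

# Ash-Bash's settings, sanity-checked: the global cap always wins.
torrents, peers_per_torrent, global_cap = 60, 200, 500
requested = torrents * peers_per_torrent
print(requested, "connections requested, but only",
      min(requested, global_cap), "allowed")
```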
  12. Uh-oh... dealbreaker! Opera 10 Alpha does NOT play well with GMail, in my experience. I just crashed twice with O10a trying to reply to someone, not to mention that it was actually kind of slow. Reverted back to Opera 9.64... doesn't crash with GMail (yet, anyway), but it's still slow and choppy. It's weird considering that O10a supposedly fixed issues with GMail according to the developer blog... but maybe it's still a work in progress. Edit: Double-posted for some reason within the same post... Addendum: Argh. This seems too good to be true... and I'm starting to see the dealbreaker in more light than Opera's better features. I just escaped a crawling disaster with Opera taking up 90+% of my CPU processing power for a minute at a time... with no idea what's doing it. Maybe it's GMail, since I saw before that it didn't play very nicely. Maybe it's the fact that I had 10 tabs running. Maybe a lot of things. Anyone run into any similar issues with "the fastest web browser on Earth"?
  13. Is this true? I wouldn't think that Linux DEPENDS on the swap file, but it is always a good idea to have one, especially if you're running Linux on a low-end machine. (That's probably why Linux is much more viable on more machines, as it makes more efficient use of swap as well as of the hardware.) Also, mods: please merge this into my kernel thread, as this may be a relevant topic.
  14. I just recently "acquired" a later copy of Windows 7 and burned it to DVD, thinking that my previous copy of Windows 7, downloaded from Microsoft back in the day, would not update me to build 7057 (or whichever it was... I can't remember). (The fact that the expiration was extended to March 2010 was more incentive for me to install the later version of Windows 7, not to mention that it's the supposed release candidate.) However, I hit a snag trying to install it this time around. On inserting and booting from the DVD, I went through the prompts, but then I bumped into a problem that apparently plagues Vista users too: no device drivers were found for the hardware that I had. Basically, I could boot from the DVD, but after getting into the installation program, it didn't recognize my optical drive anymore (it seemed). Ridiculous, but I supposed that there was a workaround... ... and there is, but unfortunately, the solution was to modify BIOS settings that I had no access to because I have a Dell (with their infamous locked BIOS). The solution involved changing the behavior for recognizing SATA devices... and I couldn't do anything about it because Dell makes this impossible. Anyone find a workaround to this?
  15. @bluedragon: But the portability is a large deciding factor... which, including power savings, is why I lean more toward a laptop than a desktop in my other thread (Laptop vs. Desktop). A laptop also includes a "monitor," although you are limited to the form factor of the laptop. One wonderment I've had as far as the performance argument goes: if you take a desktop machine with similar specifications and hardware configuration and compare it to a similarly-spec'd laptop, run the same programs, play the same games, and whatnot, would you draw more power using the desktop than you would the laptop? I did some dummy math and figured that a desktop-replacement laptop with a 120W power supply would draw a maximum of 120W (give or take a few watts) to power CPU- and GPU-intensive tasks like gaming and 3D modeling, whereas a 500W (or more) power supply in a desktop is able to breach that 120W to feed those same desktop-variant components. Maybe I did some bad math in my previous thread, as it doesn't make sense that a laptop with virtually the same hardware as a desktop would draw less power than its desktop variants; but then again, mobile hardware may be designed to sip less from the power grid... which also brings up the question as to why we don't use the same technologies in desktops (if mobile hardware actually used less power to do the same thing). I've seen that NVIDIA put out a sort of "green-ish" technology with its GeForce 9800 series where it would be much more selective in drawing power when it was needed (some sort of Hybrid technology... I can't remember) for desktops, but otherwise, I haven't seen much else in the way of using less to do more. I wish I had a desktop system that was similar to one of my laptops so that I could get a Kill-A-Watt and compare the two, but I don't have the moola. Can anyone tell me whether I'm wrong as far as the power consumption goes?
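One piece the dummy math above may be missing is that a power supply's rating is a ceiling, not a constant draw: a 500W desktop PSU only delivers what the components actually request. A rough sketch with purely illustrative component wattages (not measured figures for any real machine):

```python
# Hypothetical per-component draw under load, in watts.
desktop_load = {"cpu": 95, "gpu": 120, "drives": 20, "board_ram_fans": 45}
laptop_load  = {"cpu": 35, "gpu": 45, "drive": 2, "board_ram_fans": 18}

# Actual draw is the sum of what the parts pull, not the PSU rating.
desktop_draw = sum(desktop_load.values())   # well under a 500 W ceiling
laptop_draw = sum(laptop_load.values())     # near a 120 W brick's limit

print(desktop_draw, "W for the desktop under load")
print(laptop_draw, "W for the laptop under load")
```

Mobile parts are binned and clocked to pull less, which is why the laptop column is smaller even for "virtually the same hardware."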
  16. I just looked up how to get this to work... and it's one more thing that I'm finding awesome with Opera. I occasionally find that I have to switch back and forth between tabs to compare something (like specifications for hardware I'm planning on buying or researching), or when I'm trying to compare changes to a website, with one window having the old version and another having the new. What's good about how this works is that I can have just two tabs resized for comparison and leave the rest as they are, which works for me. (I often find myself with 10+ tabs open at a time, since I like to middle-click on CNet articles so that I can read them one at a time later.) Thanks for pointing this out. I haven't seen the snapshot feature in action quite yet (but I did download and am now working out of Opera 10); I would think that it's about as fast as loading most elements locally, considering that that's how it works. Browsing speed doesn't seem to be much of a contention to me, considering that the differences are a few seconds at most, but loading time is a nice factor to diminish. And Opera opens up pretty quickly compared to Firefox, which is something I do like very much so far.
  17. Right... I figured that the whole driver process was already in existence, but if that's the case, wouldn't it [the device] include a "way" of communicating with the OS and thus have direct communication with Windows without having to jump through a driver [kernel] layer, like what Linux seems to do? I'm guessing this is a much more complicated deal than I'm making it out to be... or is it? And as for Mac OSX, would it be safe to say that any non-Apple-approved upgrades, hardware, or devices would work much less efficiently than ones that have already been placed on an approved list of components? If that's the case, that would mean that OSX has a device list... which means that it doesn't differ much from Windows in that aspect, except that it would be a much shorter list. But if the whole vendor-ID concept exists as I described in my previous post, why would Windows differ from OSX in the efficiency of utilizing hardware, if both work off a device list to ensure compatibility with devices? (I can see why Windows would boot up slower than OSX, as Apple would have a much more refined list of hardware that has been "approved" for OSX... but this leaves the efficiency question.) I'm not really saying that because both operating systems utilize a modified microkernel they must be the same... but in the way that I understand how both of them work with hardware, there doesn't seem to be much difference, or at least there's a way to change things so that there isn't much of a difference in hardware management on the kernel side of things.
  18. Want the easy, short and sweet answer? A religion is defined as a set of beliefs. Therefore, atheism is a religion. Lazy link for a definition of religion. I actually remember bringing this up in another thread on Xisto... can't remember which one, exactly, but I'm sure you could find a more detailed explanation there. I would go find it now, but I'm being lazy at the moment.
  19. Posting here because this is the first Opera thread I could find. Been using Opera 9.64 for the entirety of today and so far, it's pretty darn good. For an intermediate or beginner user of the Interwebz, I highly recommend Opera. Why? Scrolling. I don't know if anyone else noticed this or even mentioned it, but scrolling pages is a pleasing action to the eye. Other browsers without specified scripting to handle soft scrolling have choppy scrolling. This alone excited me. I don't know why. Yes, I'm pathetic. Built-in features. Opera houses a number of cool features in-browser without your having to download add-ons or set them up. So far, I've discovered mouse gestures, the original Speed Dial, a skinning/theming option, widgets, and a variety of ways to customize your browsing experience. (One handy thing that I found that Firefox doesn't have is that Opera shows you your magnification when you Ctrl+Mouse Wheel to zoom in and out.) Tabbed browsing. I don't care who came up with it first... it still rocks. Saved sessions. I thought it was really cool that you have the option to save your sessions. Sure, I might not ever use it. But it's cool to know that I can open up all of my banking websites and save that as a session to open all of them in one click... and of course, my social networking sites to check on messages and such... the applications can be so very useful. Tab previews. Something that looks rather familiar next to Windows 7's "peeks" of taskbar items, and it seems pretty cool, although if you have to preview a tab instead of just selecting it and seeing it in full view, why would you even have it open in the first place? (It could be useful, I'm sure... I just don't have a personal purpose for that feature quite yet.) That's all I got for right now. I just wanted to share a happy experience with Opera. I'm not ditching Firefox... I still think it's an outstanding browser.
But Opera isn't that far from being another great alternative to Internet Explorer. (Internet Explorer 8 isn't that bad, but it isn't that good, either.)
  20. So, correct me if I'm wrong here, but wouldn't it be beneficial for Microsoft to follow the POSIX method of only figuring out which drivers should be available for the present hardware and then purging the rest? For any further devices, wouldn't Windows be better off ONLY looking up drivers in an online database when the device is first plugged in, so you wouldn't have to load every driver for every possible thing? Also, wouldn't it make sense to include some sort of ID on each device so that operating systems can tell what it is and install the driver software required from that device? (i.e. 1. Insert the device/component. 2. The computer, using an open-source protocol, "asks" the device what it is. 3. The device returns a hardware ID tag along with the necessary driver to get it to work. 4. After the computer receives the ID tag, it identifies the hardware, then prompts the user if he or she wants to install the incoming driver from the device. 5. After the driver is installed, normal operability of the device begins.) As for the previous suggestion, we do have the means to embed a small bit of information on a small memory chip on every hardware component, I'm sure... even on the smallest USB nano-receiver. I think that if that were the case, every single operating system could benefit from this sort of technological implementation. Speaking of drivers, I think my question still stands on the kernel part of the discussion: is Linux the most efficient at working with its hardware, considering that the monolithic kernel design enables the operating system to communicate directly with the hardware, as opposed to working through a driver layer as Windows does?
(I believe OSX does everything in userspace, relaying any hardware input/output through IPC, which tells me that there's still a layer to work through, but with the advantage that multiple incoming and outgoing processes can pass through the IPC layer at one time. If I understand the kernel layout correctly, wouldn't OSX trump Linux in hardware efficiency, being able to send output out as input comes in?)
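The five-step plug-in handshake proposed above can be sketched as code. To be clear, this is a minimal toy sketch of the idea, not a real OS API; every name here (HardwareDevice, on_device_plugged, the ID string) is hypothetical and invented for illustration.

```python
# A minimal sketch of the proposed plug-and-play handshake.
# All names here are hypothetical -- this is the five steps from the
# post expressed as code, not any real operating-system interface.

class HardwareDevice:
    """A device that carries its own ID tag and driver payload on-chip."""
    def __init__(self, id_tag, driver_blob):
        self.id_tag = id_tag
        self.driver_blob = driver_blob

    def identify(self):
        # Step 3: the device answers the OS with its ID tag and driver.
        return self.id_tag, self.driver_blob


def on_device_plugged(device, user_approves):
    """Steps 2-5: query the device, prompt the user, install the driver."""
    id_tag, driver = device.identify()     # step 2: the OS "asks" the device
    print(f"Detected hardware: {id_tag}")  # step 4: identify the hardware
    if not user_approves(id_tag):          # step 4: prompt the user
        return None
    # Step 5: "install" the driver (here, just record it) and begin use.
    return {"id": id_tag, "driver": driver}


# Usage: plug in a pretend USB receiver and auto-install its driver.
mouse = HardwareDevice("USB\\VID_1234&PID_5678", b"<driver bytes>")
result = on_device_plugged(mouse, user_approves=lambda tag: True)
```

Real systems already do the identification half of this (USB devices report vendor/product IDs); the novel part of the proposal is shipping the driver itself on the device.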
  21. Out of curiosity, how much power does a hard drive draw? How about a CPU? Graphics card? These questions came to my mind during my Laptop vs. Desktop argument, in which I somehow convinced myself not to build the desktop machine that I've been planning on building for years. After posting 3 unique posts and receiving no replies, I let the topic lie... ... until now. I'm looking to purchase a Hitachi 320GB Travelstar 2.5" hard drive and I'm planning on coupling it with an external enclosure that does not require an external power supply. Inadvertently, on a separate venture to look up backwards-compatibility of SATA 3.0Gb/s with SATA 1.5Gb/s, I saw that USB 2.0 offers 2.5W @ 5V to connected devices. I then looked up the Hitachi TS7K320 datasheet... which states that the hard drive takes 1.8W of power during a read/write operation, and idles at 0.2W. (It's interesting to know this as I have two 3.5" 500GB Samsung Spinpoint T Series @ 7200RPM consuming 10-10.6W on seek/read/write, 8.2W on idle, and just over half a watt during sleep or standby.) Amazing how a USB port can supply enough power to a 320GB hard drive... Anyway, I did some more digging in the power department and found out that Joe Schmoe's Intel E6600 Core 2 Duo draws shy of 120W under load and his Intel Q6600 quad core draws about 202W under load. Source Other components, I would assume, would draw a maximum wattage of the voltage rating of the device multiplied by the amperage (i.e. a +12V, 1.4A DVD burner would consume up to 16.8W at peak usage). Are my numbers skewed in thinking, therefore my laptop cost-of-ownership argument is incorrect, or are these numbers accurate? Would anyone like to shed some light on this? (And for those of you who think that this is tech jargon that doesn't apply to you... think about the power consumption of your computers and add up how much it costs to run your machines.)
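The arithmetic above can be sketched in a few lines: peak draw is just volts times amps, and running cost is watts converted to kilowatt-hours times your electricity rate. The $0.12/kWh rate below is an assumption for illustration, not a figure from the post; check your own utility bill.

```python
# Back-of-the-envelope power math: peak draw = volts x amps, and
# running cost = kilowatt-hours x electricity rate.
# The $0.12/kWh rate is an assumed example, not a universal figure.

def peak_watts(volts, amps):
    """Peak draw of a component rated at `volts` and `amps`."""
    return volts * amps

def monthly_cost(watts, hours_per_day, rate_per_kwh=0.12):
    """Cost to run a `watts` load for `hours_per_day`, over 30 days."""
    kwh = watts / 1000 * hours_per_day * 30
    return kwh * rate_per_kwh

# The +12V, 1.4A DVD burner from the post:
print(round(peak_watts(12, 1.4), 1))   # -> 16.8 (watts at peak)

# A desktop drawing ~200W under load, 8 hours a day:
print(round(monthly_cost(200, 8), 2))  # -> 5.76 (dollars/month at $0.12/kWh)
```

This is why the laptop argument has teeth: a laptop idling at 20-30W costs a fraction of what a 200W desktop does over a year of daily use.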
  22. Personally, I'm used to just opening up a saved draft in Gmail, then pasting any tidbits, web addresses, notes, or any other information into the e-mail and either sending it to myself or just keeping it saved as a draft. Why sign up for another service or install a program or use another application? Hey, it works.
  23. I am definitely responsible for the deaths of many on this grand day. My condolences to their families. Also, I don't feel bad in the slightest. I post this knowing that I may kill another. To statistics!
  24. Every child develops differently and at their own pace, as HH pointed out. My fiancee's kids all started talking at about the year mark; a few months before, a few months after. I think that kids are more apt to start speaking sooner with peers around; they have someone to talk to and develop a vocabulary with. If a child doesn't start speaking by about a year and a half, there is the option of turning to a professional for speech therapy. One of our friends has an almost-3-year-old who still refuses to talk. He is an only child and it's not like he hasn't been encouraged to speak, but he's going through speech therapy now, and hopefully it will make a positive difference in his speech behavior (which now mostly consists of screeching).
  25. I have to say that I'm picky about sound, but not enough to be an audiophile. With that being said, I'm sure most people would be happy with what I have for myself. First off, I highly, highly, highly recommend VModa's Bass Freq earbuds if you want some personal listening time. They are less than $20, offer more than the best bang for your buck, and are of very, very high sound quality; CNET offers the same praise. They also offer noise isolation thanks to the design... once fit snugly into the ears, it's hard to hear anything but the sound you want to hear. You will NOT be disappointed owning one or two pairs of these awesome earbuds. The only gripes (or small drawbacks to what you get, really) that I have with these earbuds are that the cable feels very, very fragile and is shorter than the cord on most headphones. That and they're not kid-proof: the two-year-old got ahold of them and chewed the plastic earpieces to... well, pieces. (Did I mention that these came with earpieces to customize the fit to your ear canal?) Secondly, for speakers, I am very happy with the Altec Lansing 2.1 speaker set that I snagged at Wally World (Walmart) for $25. It's insane how much power is behind a seemingly-puny and cheap 2.1 speaker set, and although the two satellites and one subwoofer don't seem like much, it's another best-bang-for-the-buck that I've enjoyed for the longest time. If you can't afford (financially, logistically, or aesthetically) to have a full-blown surround sound system, I would highly recommend this set. I can't find them anywhere anymore, but they're still giving me the same amount of audio enjoyment they have for the past few years. On my computer I am still running a cheap, $50 external USB SoundBlaster Live! 24-bit sound processor which feeds my sound to the speaker system.
I suppose I can attribute some additional sound quality to it, but nowadays onboard sound processors are getting better and better. I'm sure that most people would get much better results with a desktop running a dedicated sound card, or they might not notice the difference at all.