docduke
Everything posted by docduke
-
I would be very interested to learn the final answer to this. I believe I have had a similar problem. I was using a USB drive to transfer files from a Windows machine to SuSE 9.1. I was using shell scripts, and I'm certain I umount'ed the drive, but the FAT32 partition table was corrupted. I know how to recover now, but I didn't then. Fortunately, I had copied what I needed, so I just reformatted the drive. The ideal answer would be to compare the source code for umount with the source code for bringing sysinfo:/ up in Konqueror, right-clicking on a drive, and selecting "Unmount." That is beyond my skill set. It is also probable that such problems depend on the USB device drivers, and are very hardware-dependent. There are many issues with computers for which there is no final answer. Successful computer users learn to minimize wasted time. If there's a bug you don't understand, find a workaround, and stay away from the bug. In my case, the workaround was learning how to get a partially-functional computer onto the network by booting a live Linux CD on it, and transferring the files through the network instead of using a USB hard drive. It actually turned out to be faster!
-
You asked. I have 7 (currently running) computers. Three of them do 90% of my work. The oldest has Windows 95. The next-oldest has 7 personalities (bootable partitions) and spends most of its time in SuSE Linux 8.1 as a MediaWiki server with math extensions. That's where I learned to love ImageMagick. Adding the other 5 computers gets me to over 30 personalities, including Win2K, Win XP, Win Vista and many flavors of SuSE and other Linuces.

Now, to the subject of this topic. Windows 2000 Pro SP4 has services like:
Remote Access Auto Connection Manager
Telnet -- Allows a remote user to log on to the system and run console programs using the command line.
but it does not appear to have remote desktop capability natively in Windows.

Windows XP Home has:
Terminal Services -- Allows multiple users to be connected interactively to a machine as well as the display of desktops and applications to remote computers.
Remote Access Auto Connection Manager -- Creates a connection to a remote network whenever a program references a remote DNS or NetBIOS name or address.
Remote Access Connection Manager -- Creates a network connection.
Remote Desktop Help Session Manager -- which I believe is primarily for technician access to your computer.
Remote Access Connection Manager and Telnet are on by default.

Windows Vista Home Basic has:
Terminal Services Configuration
Terminal Services -- Allows users to connect interactively to a remote computer ...
Remote Access Auto Connection Manager -- Creates a connection to a remote network whenever a program references a remote DNS or NetBIOS name or address.
Remote Access Connection Manager -- Creates a network connection.
Remote Access is off by default, Terminal Services on. It is also worth noting that other services, such as Software Licensing, are on by default in Vista.

In summary, it appears that Win2K has command-mode remote access, while XP and Vista have GUI remote access; it is on by default in XP and off by default in Vista (since it is a potential security vulnerability). If you think Microsoft is sensitive to security issues in Vista, consider that Remote Procedure Call (RPC) is on in all three systems. That is a major security issue, but Microsoft appears to use it in so much of its networking software that it can't function without it.
-
One of the powerful graphics manipulation programs that Xisto provides is listed on its home page, in the right column under "Our Free Web Hosting offer: ImageMagick Support." I have not seen ImageMagick mentioned in any other free, or nearly free, web service. This is one of the things that attracted me to Xisto.

ImageMagick is very powerful. As its website describes its capabilities: "ImageMagick® is a software suite to create, edit, and compose bitmap images. It can read, convert and write images in a variety of formats (over 100). ImageMagick is free software delivered as a ready-to-run binary distribution or as source code that you may freely use, copy, modify, and distribute. Its license is compatible with the GPL. It runs on all major operating systems."

Personally, I am interested in it because it is a necessary tool in MediaWiki. If one wishes to typeset mathematical equations in MediaWiki (as is done in Wikipedia), ImageMagick is a required component of the extension that provides this capability. A while ago, I had trouble finding it on gamma.xisto.com. This tutorial shows how to test for it, and verify that it is functional. Since the primary focus of Xisto is web hosting, the tutorial is in PHP. This illustrates how ImageMagick can be used in any website that uses PHP scripts.

Here is a PHP program to test for the presence of ImageMagick, and print out its version number (a sketch of it appears at the end of this post). Create a file in your public_html folder or a subfolder, paste the script into it, and name it something like test4im.php, where the ".php" ending is necessary to get the PHP interpreter to process it. Point a browser at this file (e.g., http://[your website]/test4im.php). If you have access to "convert," you will get a return code of 0 plus two or three lines of text, reporting the version and location of ImageMagick. Strictly speaking, you do not need to create the alist function, since it is only called once here, but if you wish to test other things, it will save repetitive typing. (This is, after all, a tutorial on PHP.)

There is one more important step to make sure you can use ImageMagick. Almost everything you do with it will involve creating a file. Therefore, you need to be sure you can give it an argument that correctly identifies the location you want, and provides the appropriate permissions. Suppose you want the output in a folder called "temp" in your public_html folder. First you must create the folder and set its permissions. Use cPanel to create a folder "public_html/temp". Use the "permissions" tool to set "temp" to "777", which means anyone (including the "user" that runs an html task) can write to the "temp" folder.

Now add two more lines to your "test4im.php" program, just before the closing tag (they appear as the last two lines of the sketch below). The first line creates an ImageMagick logo image and stores it in the "temp" folder. The second prints the return code. Save this modified file and reload the browser page. If you got a version number before, and now get a non-zero return code, you probably didn't create the "temp" folder in the right place, or set its permissions correctly. If you get a return code of 0, examine the "imlogo.gif" file using the "View" tool in the cPanel File Manager. You should see the ImageMagick logo.
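Here is a minimal sketch of the finished test4im.php, assuming exec() is enabled on the server and that "convert" is on the PATH. The first two exec() calls report the version and location; the last one exercises file creation in "temp":

    <?php
    // Print each line of an array -- saves repetitive typing in later tests.
    function alist($lines) {
        foreach ($lines as $line) {
            echo $line . "<br>\n";
        }
    }

    // Test for ImageMagick: ask "convert" for its version ...
    exec("convert -version", $output, $rc);
    echo "Return code: $rc<br>\n";
    alist($output);

    // ... and ask the shell where it lives.
    exec("which convert", $location);
    alist($location);

    // The two lines added in the last step of the tutorial: create the
    // ImageMagick logo image in the "temp" folder, then print the return code.
    exec("convert logo: temp/imlogo.gif", $unused, $rc2);
    echo "Return code: $rc2<br>\n";
    ?>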
-
How To Copy Files & Folders From Linux To Windows?
docduke replied to kanade's topic in Websites and Web Designing
You need to clarify how Linux and Windows are connected. If you have a dual-boot machine and you're after files from one to the other, do it from the Linux side. Linux can read anything (unencrypted) that Windows can write. It is easiest to keep track of things if you create a separate data partition that is accessible to both operating systems. That reduces the chance that you may accidentally damage something in the Windows operating system.

The tips mentioned above are important. FAT32 has major problems with really large files. NTFS is (unfortunately) the only way to go with big (>2GB) files in Windows. A recent version of Linux should be able to write NTFS. If not, it will probably give you a warning. If you "must" stay in FAT32, the option on the Windows side is HJ-Split. The corresponding functions on Linux are split and cat, but the command-line invocations involve dd. They are pretty arcane, and dd can do really nasty things if it is fed the wrong parameters. If you decide to stay with FAT32 and split files, you are really better off doing it in Windows.

If your Linux and Windows machines are running simultaneously on a network, your best bet is Samba. There is a learning curve, but again, most modern versions of Linux have good defaults built in. With a properly configured Samba, Linux can read and write files on the Windows machine, and Windows can read and write files on the Linux machine, depending of course on the permissions both machines have set. In this case there is no concern about correctly writing NTFS files on the Windows machine, because Windows' native networking commands are performing the operations.

Most of my Linux work is on openSuSE. There, if you know how to interpret and use the /etc/fstab file, then it is an easy step to using the /etc/samba/smbfstab file. It lets you define partitions on the Windows machine so that they are accessible on bootup from the Linux machine, just like any Linux file (see the sketch below). The samba files may be elsewhere on other Linuces.
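For illustration, here is the shape of a standard /etc/fstab entry for a Windows share; the server, share, and option values are hypothetical, and it is my assumption that smbfstab entries follow this same fstab-like layout:

    //winbox/shared  /mnt/winshare  cifs  credentials=/etc/samba/cred.winbox,uid=1000  0 0

The credentials file holds the Windows username and password, which keeps them out of the world-readable fstab.
-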
I do most of my Windows stuff on Win 2000 Pro SP4. It has that much (up to Services), but it does not have "Terminal Services." Instead, it has "Telnet." That provides primitive access (directory list, cd, put, get), but you can't run programs through it. I believe I read somewhere that Vista has full remote desktop access, but I haven't seen it. The folks at PC Stats have a page titled Beginners Guides: Remote Access to Computers, which appears to address the problem pretty thoroughly. Note that Virtual Network Computing is unencrypted!

One Linux "equivalent" is Desktop Sharing, though there are a lot of alternatives in the Linux world. Desktop Sharing is secure (encrypted). The link to alternatives is for openSuSE, which is where I do most of my Linux work. I have set two computers up with a shared desktop, and found both keyboards and mice live. Things don't happen quite as fast on the "remote" desktop, but it's really very close to being on the home machine.

If you don't need the full capabilities of a shared desktop, SSH is a secure shell login (very common), and ssh -X lets you run X-windows (GUI) software on the master computer, and have it displayed graphically on the remote machine. This mode is both encrypted and compressed, so the graphical data transfer is fast. If you read documentation on X-windows, be aware that the terms "client" and "server" are reversed from their usage in the rest of the computer world.

Also be aware that if you want to access your computer from a truly remote location, you need to know its IP address, and you may also have to tweak your firewall (you have one, right?) to get it to accept incoming traffic. In the Linux world, that is an integral part of the encryption preliminaries.
-
Mysql - So Hard -- Come in here if you think MySQL is soo hard
docduke replied to Cain1405241557's topic in Programming
SQL is actually a very interesting, if old, computer language design. Borrowing from Wikipedia: note the word shared. If you ever need an environment with really large-bandwidth servers, you will be glad you use something based on SQL. It is designed to minimize the data transfer between the database and the requesting computer (see the sketch below).

In 1989, my wife was working on a Master's in Computer Science, and she jumped at the opportunity to go to a seminar given by Codd. He had just come out with a new book. He was famous in her circles for Codd's 26 Rules of relational database design. She heard he had added a few rules. She wondered, "Was it 30, maybe 36?" It turned out to be 126, and the talk was one of the more opaque she attended. (No connection to OpaQue!)

Anyway, the bottom-line message is that SQL has been thoroughly studied, and optimized in an environment where data transfer is costly. It is worth using, and MySQL makes it free and easy!
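To illustrate that point, here is a minimal PHP sketch (with hypothetical credentials and table names): the WHERE clause is evaluated on the database server, so only the matching rows travel over the connection, not the whole table.

    <?php
    // Hypothetical credentials and schema, for illustration only.
    $db = mysqli_connect("localhost", "myuser", "mypassword", "mydb");
    if (!$db) {
        die("Connect failed: " . mysqli_connect_error());
    }
    // The server does the filtering; only matching rows are transferred.
    $result = mysqli_query($db, "SELECT name, balance FROM accounts WHERE balance > 1000");
    while ($row = mysqli_fetch_assoc($result)) {
        echo $row["name"] . ": " . $row["balance"] . "\n";
    }
    mysqli_close($db);
    ?>
-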
Free Web Host Or Our Own Web Server? -- Good & Bad
docduke replied to kanade's topic in Websites and Web Designing
Add to that: beg for bandwidth. If you seriously want to make money from a website, you need bandwidth. The reality is that you need to put a lot of work into making your website appealing, in order to get traffic. If you get venture capital, you can hire a bunch of people to make that happen fast. If you are more typical, you will do it yourself to minimize cost -- that means an investment of time. Maybe weeks, more likely many months.

I'm here because Xisto offers the best of both worlds: Xisto & Friends when you are working on a shoestring, giving you "almost" free hosting that you pay for by sharing what you have learned. Then, when the bandwidth (and hopefully money) starts to build up, you can move your website to paid hosting and handle the additional bandwidth. In the meantime, you can develop at your own pace without having money bleed away.

The one expense for this (besides computer and internet access) is your own domain name. Last time I checked, GoDaddy sold .info names for $2/yr. That way, you have control of how people find your website, wherever it is hosted! And if your DNS entry is busted, I have never seen any place respond faster than Xisto!

In summary: my opinion is that free hosting is better while you are developing; then move to paid hosting when you need it, and hopefully can afford it!
-
Yes -- and no. There are a handful of complete, and a lot of partial, analyses of human genomes. A recent article based on genetics concluded that there was one "Eve" who can be identified as the progenitor of all humans. A method was used which identifies female forebears, not male. Further studies concluded that the human race spread out, and then during an ice age contracted to as few as 2,000 women, then spread out again. The headline was that "Early man near vanished in Africa." There is a good, very readable book on what has been learned from genomic studies, called Before the Dawn -- Recovering the Lost History of Our Ancestors. It shows what can be determined from these studies.

There are also an increasing number of analyses of the genetics of animals. They show many similarities, but also many differences. The science of biology has classified the differences over many centuries. Before the microscope, there were only animals and plants. By 1977, the classifications had evolved to six kingdoms. The conclusion is that there are at least six dramatically different structures that support life as we know it.

One big argument in the U.S. is over whether "evolution" explains these variations, or whether God, or some other intelligence, can explain it. Many of the people who attempt to argue such issues fail to define their terms. There is tremendous variation in the human genome. Nevertheless, people are basically very similar, biologically. Taking a step into greater variation, humans and apes are very similar, when contrasted with birds, worms and bacteria. Taking a further step, humans and bacteria are similar in makeup, when contrasted with archaebacteria and eubacteria. Some of these strange organisms don't even have the same basic methods of functioning. These real strangers include the life that thrives at the bottom of the ocean near hydrothermal vents, using a very different form of body chemistry.

The fact is that no one has successfully explained how these highly varied life forms could have come from, for example, the chemical residues of a lightning strike, though progress is being made in that direction. The fundamental question, which is seldom addressed in such discussions, is: Where did life begin? Science has not yet made a persuasive case that it actually arose "from the muck." It could have come from God. It could have come from some extra-terrestrial species, moving on long before anything we have yet examined. It may have come from the muck, but we cannot yet make a persuasive case for that.

In conclusion: it is clear that humans and apes are closely related; very close when compared with other species. However, there is nothing that conclusively demonstrates ancestry, and much that shows there are great varieties of life on Earth that clearly cannot be related by ancestry.
-
I guess I should consider myself lucky. (The harder I work, the luckier I get!) If you're on gamma, you should have 3 ways to get to cPanel (securely):

(1) site:2083/
(2) http://forums.xisto.com/no_longer_exists/
(3) http://forums.xisto.com/no_longer_exists/

In each case, you MAY get a complaint or two about a questionable certificate. That is because Shree rolls his own. He don't need no stupid "authority" to tell him his certificate is OK. After you accept it (for one time, or more), you will next be presented with an (insecure) popup ID/Password box. Close it. If you're going to the trouble to use encryption, you don't need to send your password in the clear. Finally, you are presented with a secure cPanel login page. Give it your account name (the username after the ~ in the "home" designation), and the corresponding password, and you should be in. That's what cPanel needs to sort out which Linux "user" you are trying to get to, on a shared server.

(2) and (3) worked for me as soon as the DNSs had the new address for gamma. The first took longer because I didn't read the instructions carefully. I was pinging on support instead of sales. (April 30: Update on Server Migration: If still your site is not online, immediately send an email to the sales department with your domain, server ip and name in the billing record.) I finally noticed that detail and sent an email to sales at about 10 PM (U.S. Mountain Time) on Saturday, long after "sales" was supposed to be closed. That was what, 4 AM in India? The response was instantaneous, from Shree. That guy is amazing! Six hours later, my domain was recognized with the new IP in the DNSs.

Hope this works for you!
-
Two quick questions: (1) Where is ImageMagick? (2) How does one tell PHP where it is? This query is being posted here because this thread is one of the few places ImageMagick is discussed at Xisto.

I have uploaded the PHP scripts for MediaWiki, phpBB and WordPress. Unzipped, the sizes are MediaWiki 24.5 MB, phpBB 7.7 MB and WordPress 4.3 MB. So all of them fit comfortably on the servers here. I have previously installed MediaWiki and phpBB multiple times on various versions of SuSE Linux, but always with root access to the server. Does it appear that the installation will go easily via cPanel? Yes! The new File Manager makes it very easy to upload and unpack the scripts. All three responded to browser access without even being installed.

Of the three, phpBB provides the most information about the server environment before installation. That is not surprising, since the developers have spent about 3 years rewriting it from the ground up! It actually finds significantly more of the necessary resources already available in the servers here than it does on a current installation of openSuSE 10.2 on my home network. OpaQue gets high praise from me for the care he has taken in provisioning his servers!

There is, however, one thing the phpBB installer does not find. That is ImageMagick. phpBB reports: "Cannot determine location. If you know ImageMagick is installed, you may specify the location later within your administration control panel." I know from the Xisto home page that ImageMagick Support is here, but it appears that its location is not in the PATH environment variable, or for some other reason, PHP cannot find it. I know MediaWiki also needs ImageMagick for equation display, so I really want to find it.

This is where I would normally use shell access. Directory listings of the several common locations of executables would probably answer that question. On my SuSE installation, for example, the relevant executable is at /usr/bin/convert. Within the constraints of cPanel, a PHP script may be able to do the same probing; a sketch is at the end of this post. If someone can confirm that approach, or provide the actual location, it would be very helpful.

Along a similar line, MediaWiki uses a program called texvc to decode LaTeX scripts, and feed ImageMagick. The recommendation of MediaWiki is to get OCaml, and compile it on your computer. Again, this would require shell access. I have compiled it on my computer, but I have no idea whether it is safe to copy the executable from SuSE Linux 10.2 to the Xisto server and put it where MediaWiki can execute it. I certainly do not want to do something that might freeze the server. Any suggestions would be most welcome!
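Here is a minimal PHP sketch of that probe, assuming exec() and is_executable() are not disabled on the server; the candidate paths are just common guesses:

    <?php
    // Common guesses for where the ImageMagick "convert" binary might live.
    $candidates = array("/usr/bin/convert",
                        "/usr/local/bin/convert",
                        "/usr/X11R6/bin/convert");
    foreach ($candidates as $path) {
        if (@is_executable($path)) {
            $out = array();
            exec("$path -version", $out, $rc);
            echo "Found: $path (return code $rc)<br>\n";
            if (isset($out[0])) {
                echo $out[0] . "<br>\n";
            }
        } else {
            echo "Not at: $path<br>\n";
        }
    }
    // If none of the guesses hit, "which" may still find it via the PATH.
    exec("which convert", $loc, $rc);
    echo ($rc == 0 && isset($loc[0])) ? "PATH says: $loc[0]" : "convert is not on the PATH";
    ?>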
-
Knoppix Linux can access NTFS partitions. So can recent versions of SuSE. At the most basic level, the Linux solution for writing to NTFS (you need to write your registry backup to the main registry) is to use the NTFS-3G driver. If you already have it in your Linux, great! If not, here are the details. You are obviously better off doing it in Windows if you can, but a live Linux CD has solved many a problem like yours. (Gee, I wonder why Microsoft doesn't want to distribute live CDs of Windows? Piracy, perhaps.)
-
I suspect that of being a photoshop job. Note, in particular, the angle of attack (the angle that the wing tips up relative to the horizontal) and the height above the water. The angle of attack is high enough to be unstable, and the altitude implies 20 minutes at least of climbing -- not likely. I was at the "Transpo" event held at Dulles Airport in the 1970's when one participant flew a hang-glider towed behind a pickup truck. He would tip the nose of the glider up and down to show the audience he had control. The day after I was there, he tipped it up a bit too much, and it stalled. As Rush Limbaugh would put it, he "assumed room temperature" that day. I don't doubt they can be built, and flown. I do doubt that picture.
-
I tried using uTorrent when Knoppix 5.3.1 first came out. I hadn't used torrents before, but 5.3.1 was only available as a DVD at 4.1 GB or so. It seemed like a good time to use something that was reputed to reduce the load on servers. It took a while to get firewalls, routers, etc. configured so that uTorrent had both inbound and outbound communication, but I got it working.

It found P2P sources, and started grinding. It informed me it had a time estimate for completing the download: 3-1/2 weeks! That estimate proved academic, because it crashed after about 5 minutes. I restarted several times, and it never stayed up for more than about 10 minutes.

It turns out the problem is not uTorrent, but Comcast, my broadband connection. Comcast apparently has hardware to kill torrent connections. So if you're planning on using any torrent software, first make sure your connection doesn't go through Comcast!
-
It depends on what you want to do with them. Many people use server software packages such as MediaWiki (to make a clone of Wikipedia), or phpBB for bulletin boards. In each case, these programs use databases to store and retrieve the contents of their web pages. As a result, they take care of using the database themselves. Usually, to set up such a program, you first need to create a database account on the server, then tell the MediaWiki or phpBB software what the userid and password are for that account (see the sketch below). Many "free" or low-cost hosting services limit users to very few database accounts. Xisto does not. That is one of the reasons I am here.

If you really want to use one of these databases "barefoot," you need to study SQL. I really do not recommend that, as you will see here and here. If you are determined, have at it! Otherwise, find an application that is already built, and close to what you want. In Windows, look at MS Access, Writer's Project Organizer or Album Shaper. In Linux, look at Beagle or GnuCash, which you may find more of a challenge to install. On most modern operating systems, the "database engine" understands SQL commands, but the typical user finds an application that generates the SQL commands for him.
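As an illustration of that hand-off, here is the database section of a MediaWiki LocalSettings.php; the values are hypothetical, and on a cPanel host the database and user names are typically prefixed with your account name:

    <?php
    // Database settings (hypothetical values) -- MediaWiki reads these
    // and handles all the SQL itself.
    $wgDBtype     = "mysql";
    $wgDBserver   = "localhost";
    $wgDBname     = "myacct_wikidb";    // database created via cPanel
    $wgDBuser     = "myacct_wikiuser";  // database account created via cPanel
    $wgDBpassword = "secret";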
-
I have been using Microsoft software for more than 20 years, Linux for 15. Both have undergone tremendous changes. If you want to use commercial application programs, there is almost no alternative to Windows. I make daily (often hourly) use of two stock market programs: Telechart and Tradestation. Together, they cost $130/month, so the cost of the operating system (and even the computer!) becomes small compared to their annual rental. However, they earn their keep in the tools they provide for analyzing stock, commodity and futures markets. Neither of these programs runs in anything other than Windows -- not even Mac. Each of these platforms has its own proprietary programming language(s). Each also has a "Software Development Kit" which allows the skilled programmer to access the innards of the platform in C++, Python or his language of choice, given an even higher skill level.

If you insist on "open source" programs, you cannot take advantage of the tens of man-years of effort that have gone into developing these programs. On the other hand, there is really excellent open source software developed for more general purposes. K3b for burning CDs and DVDs, gimp for image manipulation and Linux itself come immediately to mind. If you want a more specialized application, you may have to write it yourself.

Fortunately, there is a convergence. It is via "virtualization." Linux, Windows or Mac can be a "host," and it can run virtualized copies of Linux and Windows, simultaneously, inside the host. That means you can, for example, have a Linux, open-source host running itself, one or more copies of Windows, and even other copies of Linux, and probably (though I have not seen this) Mac "guests." Several years ago, a research project in England (Xen) bore fruit in the design of hardware that could run virtualized software at near-native speed. That is, the "guest" operating system could run applications almost as fast as if the "host" operating system were not present. About a year ago, both Intel and AMD added these hardware instructions to their CPUs and the motherboard support chips.

It takes a fairly powerful computer to run multiple virtualized operating systems. In particular, each virtual OS and the host all have to have their own dedicated part of RAM. However, once a system is virtualized, it can very easily be moved from one computer to another. How easy was demonstrated by Novell about a year ago. (I just went looking for the web link and couldn't find it.) In their demo, they had a web server running on one virtual machine, serving streaming video to a number of users. They moved the server software from one physical computer to another, without losing any video streams!

I find that very compelling because I have moved a partition containing many interdependent applications from one physical computer to another, because the hardware in the first computer was dying, at least 6 times in the past 10 years. Each time, it has taken from a day to a week to locate and integrate all the different hardware "drivers" for the new computer, and get them working with my software.

In my study of virtualization, I have learned that Windows NT, 2000, XP and Vista all have a "hardware abstraction layer," HAL.DLL, above the specialized hardware drivers. Virtualization software replaces what is below the HAL, making the move from one virtual environment to another (once you have gotten there in the first place) very easy. I just checked SuSE, using the simple command "ps -ef | grep -i hal", and found that it has a "hald", which is probably a HAL daemon, so the same principle probably works in Linux.

My conclusion is that you will probably be able to run whatever you want in a single computer soon. Whatever is easiest for you to use can be run virtualized in a single computer.
-
I'm not sure what kind of search you did, but you missed something! I brought up your post, and the box beneath it showed 14 "related" posts, many of which had PayPal in the title! Take a look down there the next time you look at this thread!

One thing I really like about Xisto and its IP.Board software is the "related" stuff it organizes for us!
-
I have been a fan for many years of SuSE Linux. I believe I started with version 4 or 5. Back in the early days, I fried one monitor and another video card with it, but liked it enough that I persevered. I have SuSE 8.1 running on an ex-Windows 95 machine. It has been running for years, and is functioning as a server for my notes, in a wiki.

I just looked at the retail boxes for 8.1 and 9.1. Both are about 3-1/2 inches thick. They have excellent, detailed manuals inside. The box for SuSE 10.1 is under 2 inches thick, with a skimpy manual. That's the last version I bought. Oh, well, I guess I need to switch to online documentation. I now have openSuSE 10.1 and 10.2 on 2 different machines, and just installed 10.3 on another drive in the 10.1 machine. So much for online documentation: it doesn't work in any of the openSuSE 10 series. I'm also finding new and worse problems with their software. K3b (their CD burner) hangs about 25% of the time if I ask it to verify its burn. I then can't even reboot SuSE within KDE. I get a message "Logout canceled by K3b." I have to go to superuser and issue a shutdown command to kill it. On the positive side, SuSE's YaST installer is rock-solid!

Now, I'm looking for an alternative. I just spent several hours today installing Ubuntu 7.10 on another partition in the 10.3 drive. The reason it took hours is the Grub installer. Linux is going through a renaming mess. A year or two ago, there were 2 kinds of drives: hd and sd. The sd drives were SCSI (Small Computer System Interface) and the hd's were everything else. More recently, the hd's were renamed pATA (parallel Advanced Technology Attachment), and new sATA (serial ATA) drives were added. For some reason, the Linux gurus decided to use the same "sd" designation for both the SCSI and sATA drives.

It shows in the openSuSE evolution. I have a 1-1/2-year-old computer with 2 sATA and 2 pATA drives. SuSE 9.1 called them all hd's. SuSE 10.1 called the pATA drives 'hd', and the sATA drives 'sd'. SuSE 10.3 calls them all 'sd'. To tell them apart, SuSE 10.3 uses the long character-string name of the drive (e.g. scsi-SATA_Maxtor_6L080MO_L269PHVG-part6) instead of /dev/sda6, and Ubuntu uses a 32-byte UUID (Universal Unique IDentifier). The openSuSE folks apparently understand what they are doing (thus YaST's reliable installs), while either the Grub or the Ubuntu folks do not. Grub uses HD0, HD1, HD2, HD3 to identify the drives. Ubuntu calls them /dev/sda, /dev/sdb, /dev/hda, /dev/hdb, not in the same order! If I did not have the working Grub control files (device.map and menu.lst) from SuSE to analyze, I would have spent a lot longer figuring out how to fix the Grub misconfiguration in Ubuntu. It turns out that the "correct" sequence for Grub is: sda, hda, sdb, hdb. Whether that is unique to my specific hardware, I have no way of knowing.

I finally got Ubuntu to boot. I thought I was finished. Nope. It told me my userid/password pair was invalid. I'm positive about what I put into the installation fields, so I have no idea why it wouldn't accept it. However, that turned out to be simple to fix (from a security perspective, maybe it is too simple). Just pick the second option in the Grub boot screen (recovery), and it will boot you into text mode as root. Then type passwd <userid>, and Ubuntu will reset the password for you. Reboot with shutdown -r 0, and you can log in with the new password.

Anyway, after these problems, this leaves me still looking for an alternative to openSuSE.
-
Mordent, your skills will still be in demand. It has been said that the last person who knew most of everything worth knowing was Leonardo da Vinci. In today's world, really useful things are done by groups of people. J.C. Bell's invention will require lots of electronics to make it work.

I am interested in the potential for scaling it down rather than up, to (1) compress timescales and (2) make it harder for governments to regulate. If there is one thing that has been obvious since the first oil shock in the 1970's, it is that we need to diversify our energy sources, yet government is still standing in the way of nuclear and other potential sources. I read recently that there are of order 25 applications for new coal plants, yet most were put on hold by the power companies when Congress dropped a "carbon sequestration" bill in the hopper. Those coal plants, much less nuclear plants, require 5-10 years to build. I could (in principle) put up a fermentation plant in my back yard in a week. Developing a uranium mine takes 3-5 years. Doubling the number of microbes in a fermentation tank is an overnight process.

Big business, especially big energy business, is often in the crosshairs of Congress. We feel their bite in things like our cars, when they are built by big businesses. If each community or even each household had slightly different versions of energy production plants, it would make mischievous regulation much more difficult. It would also allow designers to quickly determine which of the different designs was most productive.
-
If I can rephrase the objective of this topic, consider "unlimited energy" as an objective. I just ran across the description of a method that is currently being scaled up to produce a lot of oil products (gasoline and diesel) from organic waste. I find it interesting because of the possibility to scale it down and provide household, or individual community, sources of fuel.

The article I ran across has the headline: Anything that grows 'can convert into oil'. The inventor is J.C. Bell. He has a website under construction. (Just the home page.) His invention is very simple, yet profound. Cows can eat almost anything that comes from plant life. From it, they generate a lot of methane. In fact, they generate so much methane that it contributes significantly to the hydrocarbons in the atmosphere. The implication of the article is that he has engineered stomach bacteria from cows so that they produce chain hydrocarbons. The objective is that one can take waste food, grass clippings, agricultural residue such as cornstalks, and any other organic residue, and persuade the bacteria to convert these things into oil products.

His view is that he should be able to scale this technology up to provide for the oil needs of the U.S. To me, what makes even more sense is to scale the technology down so that every home, or if the chemical processing is complex, every community can have a facility that will be able to convert waste products into useful fuel. It can even reduce the amount of waste that goes into the local dump.

From the environmental point of view, it is carbon-neutral. It neither adds nor removes carbon from the environment. It simply converts waste into fuel. From an energy point of view, it converts sunlight (which grew the plants) into chain hydrocarbons, which can be used for whatever oil is used for. Sounds like a win-win idea. The energy isn't free, because it requires work to do the processing, but it offers the prospect of producing usable fuel at low cost, and recycling waste in the process!
-
Final notes on Windows cleanup of files with read errors. Most people do not have multiple bootable Windows partitions on their hard drive, so here are a few clues to do the work in the affected partition.

Previous folks here have said just delete the file. The details I am providing here are for the case in which you do not know which file is the problem. My backup program simply reports getting a read error somewhere in the partition and quits. These methods will work when you don't know the name or location of the offending file.

First, of course, is to use the Windows distribution CD, if you have it. It has a recovery mode, and that is almost certain to have chkdsk. Since the OS on the hard drive is not running when you boot from the CD, that should work. Second, as many people have mentioned in this thread, go into "Safe Mode." There is a somewhat more powerful option, which I have just checked out. It is the "Recovery Console." If you want to be prepared for problems, you can install it into the Windows system on the hard disk. Then, when you boot up, you can be presented with the option of running either the normal system or the recovery console. In the recovery console, I have verified that you can run "chkdsk /r" on the partition with the running OS, so that is probably another way to deal with read errors.

But again, this is Microsoft software. When you boot up in the Recovery Console, you have access to only a limited set of commands. (Type HELP to see them.) Chkdsk can be run, but "dir" does not show it in the WINNT folder. Going (cd system32) to the System32 folder, you will find that "dir chk*.*" brings up two files: chkdsk.exe and chkntfs.exe. The former is presumably what runs when you try executing it. Even though "dir" sees chkntfs, the Recovery Console won't let you run it. If you are running from a CD, you may be able to run it, though that may create more problems than it solves, since the system you are running on may be incompatible with the one you are repairing (e.g., different Service Releases).

Anyway, that completes what I have learned (and not learned) about cleaning up read errors in NTFS files. I sincerely hope that (1) you never have to use this information, but if you do, (2) these notes speed the process for you!
-
Having just spent two days figuring out how to delete a corrupt file in Windows NTFS, I can tell you more than you want to know about it. This is on Windows 2000 Pro, so your mileage may vary with other OSs. First, the GUI Checkdisk will not touch it. It will tell you the disk is fine. (I knew it was not, because my backup program quit with a "read error" message.) The defrag program will treat it as untouchable, and defrag everything else around it.

As others have said, there are many warnings against using Linux to alter NTFS filesystems. The folks who are trying to change this are at Linux-NTFS, and think they have fixed it. Specifically, they say: "We just have released ntfsprogs 2.0.0 with full read/write support!" So if you have this version, it may be able to do the cleanup.

However, back to doing it in Windows. It turns out that Microsoft has a Knowledge Base article that relates to this issue. One learns there that the DOS version of chkdsk can do things the GUI version cannot. However, the story is not yet over. If you try to run the DOS chkdsk in the Windows partition you are trying to repair, it will tell you "Cannot lock current drive .. the volume is in use by another" and refuse to run. You must have a multi-boot system with 2 (or more) Windows partitions, and run chkdsk D: /f /r with "D:" replaced by the drive assigned to the partition with the bad file. Actually, it turns out that chkdsk D: /r will repair it, even though it is not supposed to make any changes without the /f switch. I tried omitting the /f switch because I wanted to test some other options, but the bad sector was removed by that operation.

It's amazing to me that the GUI Checkdisk gives a disk with read errors a clean bill of health, but don't forget that this software is from Microsoft.
-
Nuclear power is a limited resource, but the limit is very high. Wikipedia puts it at 1500 years. It presents a chart of it here, and cites an internationally-recognized study (pdf). Solar cells are also becoming much more efficient. The Department of Energy last December reported 40%-efficient solar cells. The trick is to "tune" individual components of the solar cell for different wavelengths in the solar flux. Nanotech allows dense packing of these tuned components.
-
There is a Live CD version of Rainbow Tables, called OPHcrack. It is discussed in DistroWatch, which is where I first heard of it. It is embedded in a copy of Slackware Linux. I tried it on Windows XP, on a system which had 4 user accounts. It cracked only one of them, which had an all-uppercase 8-character alphabetic password. This is neither a testimonial nor a complaint. I had never before heard of Rainbow Tables, and was curious what they could do. If you wish to try them out, a Live CD is certainly a simple way to do it. In praise of OPHcrack, I booted it on a computer that has 4 hard drives. It correctly identified the 4 Windows partitions, and let me tell it which one to attack.
-
Man Dies From Playing World Of Warcraft -- Gaming? Harmful?
docduke replied to half1405241509's topic in Science and Technology
This thread has a lot of staying power! A while back in it, Abhiram asked for references. I don't have one for the title event, but I have one with well-documented problems for teens. It is Boys Adrift -- The Five Factors Driving the Growing Epidemic of Unmotivated Boys and Underachieving Young Men. The author, Leonard Sax, has spent many years studying children and motivation. His second "factor" is video games. In his opinion, many teens become so addicted to video games that their school performance, social lives and general well-being all suffer from neglect. He gives numerous examples.

I have one teen in college, and a second one in high school. In looking over colleges, I have been surprised to learn that U.S. colleges now have a substantial majority of women students. Boys seem to be losing interest in studies, and to hear Dr. Sax tell it, in doing much of anything. This may be more of a problem in the U.S. than elsewhere, but it certainly seems to be something of major concern here!
-
Thanks for the information on BitTorrent. I have never successfully used it before (I tried once about 2 years ago), and I just spent a frustrating day repeating that experience. I got it installed, but after some hesitation, it quit every time I tried to download my target. Knoppix, after more than a year, has a new version out, 5.3.0, and it is only available by torrent. I wanted to get it.

I have a router which enforces Zone Alarm (ZA) usage on my network, and multiple computers, each running ZA. I picked one old Win 98 computer, and reconfigured ZA on it so that none of the other computers could be reached from it. (Call me paranoid!) Then I installed BitTorrent on it, and tried to download the file. I got a diagnostic telling me that it could not accept incoming connections -- I should fix my firewall. After a few minutes, it quit. I tried many things in both ZA and the router, but nothing worked. I have browsed through the BitTorrent website and the LifeHacker stuff, but haven't yet found any troubleshooting information.

Anyway, I appreciate the links, and if I find some answers, I'll report back here.