xisto Community

travstatesmen

Members
  • Content Count: 119
  • Joined
  • Last visited

1 Follower

About travstatesmen

  • Rank
    Advanced Member

Contact Methods

  • Website URL
    http://travstatesmen.trap17.com

Profile Information

  • Location
    Auckland, New Zealand
  • Interests
This forum account is dual-managed by a husband and wife team. Statesman of Travian: I play the Travian MMOG on the .com servers, and I run the Statesmen of Travian clan. Radio Chick: I am into radio and have been a breakfast show host; I enjoy meeting people and talking intelligently on a wide range of subjects.
  1. Thanks again velma, and Saint_Michael for helping me understand this. My concern, as I said to SM in the Shoutbox, is that there was talk on the forums about a possible cutoff date for when accounts would need to be transferred across to the new system, but the "front-end rules and regulations" are not in place yet, so anyone who goes across to the new system is currently having to accept the old rules, which are plainly not meant for a "free" web hosting service. Sorry to be so anal about this, but the way that your current Xisto - Web Hosting TOS is worded means that you have the right to pursue me for payment for my free web hosting, and that is not something that I am willing to sign my name to. I would rather wait until the new TOS is written before I change from the current Xisto hosting to the Xisto - Web Hosting plan with Credit System v3.0. As I said to SM on the Shoutbox, if I miss out on a few MyCents on the way, so be it.
  2. Thanks for the replies, OpaQue and velma. Also, Saint_Michael replied to me on the Shoutbox, giving further information. I am puzzled about a couple of things though. Firstly... Now, the first post in this thread lists a link to https://support.xisto.com/ which, in turn, lists a link where we can Register, as recommended in the post by OpaQue. When I click on the Register link, I am confronted with a form, as below... ...which clearly shows that the personal details, including telephone number, are required in order to register. There was no such requirement for all these details when I registered on Xisto using this form. Secondly... This sounds similar to what Saint_Michael was trying to explain to me. However, in order to register using the Register link that I mentioned above, I also have to accept the Xisto - Web Hosting TOS. I have not as yet seen any other "DIFFERENT CUSTOMIZED rules" that only apply to MyCents users. I am keen to see this resolved, but I will not be accepting the current Xisto - Web Hosting TOS, as it is not as good, in my opinion, as the current TOS and AUP that I accepted for Xisto membership.
  3. Hmm, I am not too happy with the new offer, to be honest. Firstly, when I signed up to Xisto there was no requirement to give out my personal details. Now, under the new Xisto - Web Hosting billing account they are demanding full disclosure of personal details, including my physical address and telephone number. I see no need for such details to be given for what is touted as a free hosting account. Speaking of which, I hardly think that this new scheme is really free at all. Sure, you can earn virtual cents and real American Dollars by posting in the forums, but I see in the Terms and Conditions for Xisto - Web Hosting that there are such clauses as §1.3 Default and Cure, §1.4 Charges, §1.5 Payment, §8.3 Choice of Law and Forum, §8.7 No Waiver, §8.9 Survival, etc., which all pertain to payment issues, and thus the services offered are no longer free. What I am seeing is that, if we fail to post on the forums to keep our hosting account going, then not only will our account be suspended (as it would have been under the old Credit System 2.0) but also we could still be liable for outstanding payment, and I'm sure that if such occurs, then they won't be seeking MyCents to pay the outstanding amount, but will be looking for real Dollars. The provisions under §8.3 and §8.9, for instance, I'm sure could not be negotiated away with a payment in MyCents. I don't think the Federal or State courts located in California, for instance, would accept MyCents should negotiations break down and parties end up in court. If you compare the Xisto TOS and AUP with the Xisto - Web Hosting Terms and Conditions you will see some glaring inconsistencies. For instance... Xisto: Xisto.com reserves the right to amend its policies at any time. All accounts are required to comply with any changes made to these policies. Notification of any major changes to the policies will be emailed to all account holders. 
Xisto - Web Hosting: Xisto - Web Hosting reserves the right to add, delete, or modify any provision of its Terms and Conditions, Acceptable Usage Policy at any time without notice.
Xisto: All the accounts which are inactive over a period of time as decided by Xisto in its sole discretion would be terminated without notice. Hosting Account Holders who do not have any approved extension period and are inactive over a period of 30 days would be terminated without notice.
Xisto - Web Hosting: All provisions of this Agreement relating to your warranties, intellectual property rights, limitation and exclusion of liability, your indemnification obligations and payment obligations shall survive the termination or expiration of this Agreement.
I am sure there are other examples if I look hard enough. In a nutshell, they can now change the rules on us without letting us know, and if we don't meet the new amended criteria (that we were not notified about) and end up defaulting on their new terms of use, then not only will they suspend our accounts but they will pursue us to the grave for payments owing. No, this is no longer free hosting. I'm sorry, but it is time for me to move on. I'm heading off to find a free hosting site, and one that doesn't unnecessarily demand personal contact information. I'm surprised that the application form for Xisto - Web Hosting doesn't include a place for you to put in your Social Security number.
  4. Well, at one stage there I was reinstalling Windows on a daily basis. I have a home network of quite a few computers, as I use it as a testbed for work. Somebody in this thread mentioned the magazine disks that contain shovelware (they just shovel it onto a CD). To be honest, the junk that Microsoft sends out as Beta software is often just as bad as those. I remember a particular program called Microsoft Voice (the M$ answer to Dragon Dictate); I don't think it ever made it past Beta stage, and it crashed my network big time! But as to how to reinstall, I do things differently. Sure, I used to use Norton Ghost, ages ago, even before it got its shiny new grey and blue interface (yeah, it's gone fully Windows GUI now, but it used to be grey/blue, and even before that it was command-line only). I used it before Symantec bought it, before it was called Norton Ghost, when it used to just be called Ghost. More recently I make use of RIS, the Remote Installation Services provided as part of the Microsoft Intellimirror technologies. I have a copy of the installation CABs for the workstation operating systems sitting on my server, and using a network card that has both WOL and BOOTP capabilities, I can get the server to wake up a workstation on my network by sending a UDP datagram addressed to the MAC address of the client. When the workstation boots up, it uses BOOTP to obtain a DHCP lease from my DHCP server and can then use RIS to download the OS from the server. This is how I populate a new workstation, even if it doesn't already have an OS installed. All I need to know is the MAC address of the Ethernet NIC, and that is usually written on a sticky label on the NIC itself. Using RIS I can reinstall my workstations as often as I like, sometimes even more than once daily! 
The installation procedure is fully automated, with an unattended install script, and once the OS is installed, I use another Intellimirror technology called Software Installation and Management (SIM) to push the applications onto the workstation depending on who I first log onto the workstation as. I have published groups of software configurations on my server, and using Group Policy Objects, the server is able to determine which group of software is meant to be installed on a workstation depending on which user is logged on at the workstation. I have a different username for different groups of applications. If I want a workstation to be specialized for programming, and have Microsoft Visual Studio on it, I just log onto the workstation with that user, and it automatically gets that software installed from the server. Of course, all of this is history at the moment for me, as I have trashed my Windows 2000 server computer for now. I'm currently playing with Linux, and trying to establish a similar network environment under Linux. It's a bit of a steep learning curve for me though! Linux is quite different to Windows. SP2 for Windows 2000 Server? We're up to SP4 these days matey, plus there's a new rolled-up update package somewhere too, which contains a lot of the post-SP4 updates all in one package.
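The wake-on-LAN step described above — sending a UDP datagram addressed to the client's MAC address — can be sketched in Python. This is a minimal illustration, not the RIS tooling itself; the MAC and broadcast addresses are hypothetical placeholders:

```python
import socket

def make_magic_packet(mac: str) -> bytes:
    # A WOL "magic packet" is 6 bytes of 0xFF followed by the
    # target MAC address repeated 16 times (102 bytes total).
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must be 6 bytes")
    return b"\xff" * 6 + mac_bytes * 16

def send_wol(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    # Send the packet as a UDP broadcast; the sleeping NIC matches on
    # the MAC payload, so the IP destination only needs to reach the LAN.
    packet = make_magic_packet(mac)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(packet, (broadcast, port))

# Example (hypothetical MAC address):
# send_wol("00:11:22:33:44:55")
```

Port 9 (discard) is a conventional choice for WOL; port 7 or any other UDP port also works, since the NIC ignores everything above the Ethernet payload.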
  5. This is possible. I remember from my early networking days, we used to make a "crossover cable" by swapping over a couple of wires before crimping the RJ45 plugs on, and this allowed for two computers to be tethered without the need for a hub or router. Have a look at this wikipedia entry for how to make one yourself, or you can buy a crossover adapter, which is a short length of CAT5 with one male and one female RJ45 on the ends, and the crossover is done on this, so you can just add the crossover adapter to any normal CAT5/CAT6 cable to turn it into a crossover cable. Have a look at these nifty small loopback adapters from ThinkGeek that don't use any length of CAT5 cable, they're just a male and female RJ45 end in a single unit. You can even keep one on your keychain! Nifty! For a really secure home network, you could possibly put two PCI network cards into each computer and run these crossover cables between them all in series, forming a loop (a ring of "daisy-chained" machines). The computer on the left would plug into the first network card, then the second network card would be plugged into the computer on the right, and so on right around the loop. Then you could use routing to forward the Ethernet packets between the two network cards, and could even implement such cool security features as NAT and a firewall between each network card, making each crossover cable its own subnet. Sorry, I'm getting a little carried away here, but it is a use for crossover cables, and a way to have multiple computers in a network without a hub or router device. Normally a crossover cable will only connect two computers with a single network card in each computer. This has brought back some memories, I tell you! 
It reminded me of when I used to sit at my old workshop desk making LapLink-style transfer cables (also called Fast Link or FastLynx cables, and often loosely called null-modem cables, though strictly a null-modem cable is the serial equivalent), and connecting two computers together with them, plugging into the parallel ports (the big DB25 port where the old-style printers used to plug in). Then we would use a piece of software called "FastLynx" to transfer data between the two computers over the cable. This was before Windows 95 came out. FastLynx couldn't handle the long filenames natively, and you had to do something extra to resolve the LFN data.
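For reference, the "swapping over a couple of wires" in the crossover cable mentioned above means crossing the transmit and receive pairs. For 10/100BASE-T, which only uses pins 1, 2, 3, and 6, the mapping looks roughly like this (sketch; gigabit crossover also crosses the remaining two pairs):

```
Straight-through (same pinout both ends)    Crossover (pairs swapped)
Pin 1 (TX+) ---- Pin 1                      Pin 1 (TX+) ---- Pin 3 (RX+)
Pin 2 (TX-) ---- Pin 2                      Pin 2 (TX-) ---- Pin 6 (RX-)
Pin 3 (RX+) ---- Pin 3                      Pin 3 (RX+) ---- Pin 1 (TX+)
Pin 6 (RX-) ---- Pin 6                      Pin 6 (RX-) ---- Pin 2 (TX-)
```

In practice this is the same as crimping one end to T568A and the other to T568B. Modern NICs with Auto MDI-X make the distinction largely moot.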
  6. That is awesome! Thanks for the quick reply truefusion. I put the question up on the KDE forums as well, but you were the first to reply to me. I will get on to reading through it all and then implementing it, and I'll let you know how I got on. I will probably look at getting more of my support infrastructure in place first, such as setting up a local DNS server and a local FTP server, as I would like the sources.list files on my clients to point to an ftp site on my network rather than a local file. I still have a lot of learning and implementing to do before I can say that my GNU/Linux installation is equal to or better than my old Windows 2000 Server-based home network, but I'm getting there slowly.
  7. I really like the software distribution model that GNU/Linux uses. With a universal updater for all software, everything, whether it be the operating system itself or the applications, can be centrally administered and updated through programs like Adept Updater. Adept just checks automatically for new updates for all software that you have installed and lets you know if a particular program has an update. You can then choose to apply the selected updates and it will download and install the updates for you. Brilliant! But here is my question. If I have a lot of computers on a LAN all running the same version of the same GNU/Linux distro and with the same GUI, can I get them to look at a local source first for updates? For instance, I would like to have one machine (the LAN source) downloading the common updates with Adept, and then making the downloaded packages available for other computers (the LAN clients) on the LAN to download the updates locally from, rather than each machine using my Internet bandwidth to download the same updates. Then, for anything else that is not available from the LAN source the workstations can access the Internet directly to download their own needs. Different hardware configurations, for instance, would require different driver downloads. I would imagine that for such a thing to work smoothly there would need to be a schedule maintained, so that the LAN source would perform its own download first, and then the other LAN clients would poll the LAN source for updates, otherwise the LAN clients would not find the necessary updates available on the LAN source. Does anybody know if this is possible to configure under Kubuntu (Ubuntu v8.04 with KDE v3.5.9) and using Adept Updater v2.1 Cruiser?
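This kind of LAN-local update source is possible. One approach from that era is a caching proxy such as apt-cacher (or apt-proxy) running on the "LAN source" machine, with the clients pointed through it in their sources.list: cache hits are served from the LAN, and anything not yet cached (e.g. machine-specific drivers) is fetched from the Internet on demand, which also removes the need for a strict download schedule. A minimal sketch of a client's configuration — the hostname "lansource" is a hypothetical placeholder, and 3142 is a commonly used apt-cacher port that should match however the proxy is actually configured:

```
# /etc/apt/sources.list on a LAN client (sketch)
# "lansource" is a hypothetical hostname for the caching machine.
deb http://lansource:3142/archive.ubuntu.com/ubuntu hardy main restricted universe multiverse
deb http://lansource:3142/archive.ubuntu.com/ubuntu hardy-updates main restricted universe multiverse
deb http://lansource:3142/security.ubuntu.com/ubuntu hardy-security main restricted universe multiverse
```

Since Adept is a frontend to apt, once sources.list points at the proxy, Adept Updater should use it transparently.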
  8. As some of you may remember, I have been (albeit rather slowly) undertaking a course of online study into the field of Cyberpsychology. I have recently been looking at the whole issue of online presence and identity, and I would like to get some input from others on this issue. I have been online in one form or another for quite a long time now. In fact, I would probably first have "gotten connected" back in the late 1970's, although that was with BBS's and FidoNet. I think my first exposure to the Internet for real was when I was working for Telecom New Zealand back in the mid 1980's and they rolled out an internal network called "Myriad" which, for us, ran on the sole Intel 286-based computer in the office at the time and was connected through a gateway to the global Internet. Back then, I was known online as "Wampus", and I was quite active in various online communities, including many usenet groups. There is nothing left of Myriad now. A Google search for it doesn't reveal anything, no matter what keywords I try. (If anyone has better luck at searching for it than I did, please let me know). "Wampus" died along with Myriad. Then there came a time in my online life that I am not too proud of now, and I never reveal what my online identities were back in the late 80's to mid 90's. Suffice to say that I am well acquainted with particular online communities that even today are legendary names among the 1337. But I have lost contact with those whom I knew back then. Many of them are either still in prison, or have gone on to form the backbone of the world's cybercrime prevention and computer forensics community, putting their skills that were once used for nefarious purposes to a much better use. To cover my past online indiscretions, I have gone on to create a string of throw-away identities, normally only lasting no more than 12 months each. At first I used to be fanatical about using random proxies to assist in obfuscating my identity. 
These days I am a lot less paranoid: there are many more people online in New Zealand now than there were back in the 80's; and also, I have changed ISPs so many times now that even I couldn't trace myself by IP address any more! I have been active in a wide variety of online communities since the turn of the millennium, with all sorts of different things holding my interest for diverse time spans. The counter-exploit brigade still pulls me back time and again as it is my online "roots", so to speak, but I never go back as the same identity that I was on a previous visit. I try to avoid Usenet, IRC, and anything that is going to tie me up too much these days. My wife already says that I spend too much time on the Internet, so IRC would not be a good place for me to hang out these days. But my favorite haunts are online communities, such as we have here on Xisto. I have met some really interesting people in my time here so far. I flitter around the Internet like a ghost, hither and thither, taking on a different appearance each time. I have even been introduced to myself once! That was a scream! On one Philosophical website I was known as Episteme1 and an acquaintance there, Aristotle1, wanted me to meet a friend of theirs on another Philosophical website that they were on under a different nick. It was then that I found out that I knew this person's other nick as well, and the person that they wanted me to meet was someone known as Phronesis1, which was my nick on the second Philosophical website! So I had to dutifully make a post introducing myself to myself. I never did let on who I was! It was really quite funny. Using such tools as KeePass Password Safe, I am able to easily keep track of all of my online identities, including usernames, passwords, and related URLs. One thing that I found rather annoying at first though is this incessant need that webmasters seem to have of ensuring activity from their users. Why do webmasters always do that? 
If I do not visit their site for a certain period of time then they will delete my account for inactivity. For instance, how long does an MSN email account last if you don't log in regularly? The explanation is given that deleting inactive accounts makes room in their database for more active users. But what about someone such as our Saint_Michael, for instance, who has given his time and contributed to the Xisto forums for over four years now, and who must have posts in just about every sub forum on the board, if not in every thread? Should such an identity not be immortalized, at least for the lifetime of the online service itself? As I say, this used to annoy the heck out of me at first, but now I just roll with it. Creating online identities has become second nature to me now. While some people say that finding a suitable unique nick these days is getting harder due to the increasing online population, I still think that identifying yourself is only limited by your imagination. Making use of disposable online identities is much easier, I think, than trying to identify yourself as yourself. It is also much safer with the proliferation of online identity thefts being reported these days. There are moves afoot to standardize the way that we identify ourselves online. Microsoft's Passport was one such early innovation, and the concept has been extended as the Microsoft Live ID. Also, there is OpenID, which is another such unified online identification methodology. Back in 2005, Microsoft released a document entitled Microsoft’s Vision for an Identity Metasystem which outlined what they had learned from developing their Passport service and how they would like to implement such technology in the future. 
Should such a Metasystem be introduced, together with the tinfoil hat brigade's predicted iPatriot Act (which presumably would steamroll over online civil liberties just as the Patriot Act did in the real world), there is the potential for enforced real-name identification of all Internet users. Imagine having to supply your Social Security Number every time you go online! I could go on, but I'll open this up to further discussion. How do you manage your online identity? Who are you online?
---
1 Names have been changed to protect the innocent.
  9. I used to be a Microsoft OEM System Builder, and a Microsoft VAR, selling both the Microsoft RBP and MOLP licenses. I am an MCP and an MCT, having lectured in both CompTIA A+ Certification and Microsoft Windows desktop operating systems. I have been using Microsoft Windows since Win3.0 and have installed and used every version of both Windows and Windows NT since then (only minimally with Windows Millennium, and no experience yet with Vista). I am currently not working in the computer industry per se. I am working for a large multinational corporation in a service industry. This is like an overseas experience for me! I'm working as an end-user instead of as an administrator. I am doing this purposefully in order to gain experience as an end-user in a corporate network environment. I am gaining useful insights into the problems experienced by corporate end-users. I have access to IP telephony, online helpdesk, an intranet, Exchange Server, Proxy Server, roaming profiles, group policies, SIM, and I daily use applications delivered through Terminal Services / Citrix. Much of this type of technology I have installed and administered in the past, but this is the first time that I have used much of it as an end-user in a production environment. I am learning what some of the hassles are that end-users face, so that when I later rejoin the ranks of the administrators I will be more understanding of my users. This is something that I have done for myself, to advance my own career. I have been looking at Linux for quite some time now, dipping my toe into the water every now and then. As you can see from the explanation above, I am very conscientious about advancing my career in the computer industry, and I happen to believe that experience with GNU/Linux will also help me in the future. I have finally "taken the bull by the horns" and started converting my home computer network to GNU/Linux in an effort to achieve this. 
This is just another string to my bow, and something that I am doing to advance myself. I am not likely to rush out and become a Linux convert, never to touch a Windows computer again. I am balancing my Windows and Linux experience in the hope that having knowledge of each will be beneficial. I have also done some Novell Netware work, and I treat that experience exactly the same way. By contrast, I don't expect my future work in the computer industry to bring me much into contact with the Mac OS, so I have stayed well clear of it.
  10. I now have two of my home network computers running Ubuntu! The second one is dual-booting with Windows XP. I have successfully gotten my 500GB external USB hard drive to work, but I had to temporarily put it back on the WinXP machine as the last time I used it I didn't "safely remove" it and GNU/Linux complained about that. Apparently there is a way to overcome this problem in GNU/Linux from the Command Console, but I'm not confident enough with that yet. So I put the external drive back on the WinXP machine, turned it on, and then safely removed it. When I put it back on the Ubuntu machine it worked perfectly. My next task is to take control of my Samba network neighborhood. I want all of my machines, whether they are running Windows or GNU/Linux, to be in the same workgroup. Eventually, they will be in the same domain instead of a workgroup, and will be using LDAP for authentication. I may even start looking into RADIUS for authentication, but that will come a lot later on. Centralized authentication, centralized logon, and roaming profiles are things that I am used to under Windows 2000 Server, and I want to see if I can get similar functionality under GNU/Linux. Anybody want to give some advice on how best to achieve this? I'm really enjoying delving into GNU/Linux now. I spent the day yesterday building a useful set of Internet bookmarks of sites that contain information to help me. I will be taking one of those online training courses soon to give me a head start with learning GNU/Linux. Oh, by the way, if you're wondering why I keep referring to "GNU/Linux", have a look at this website.
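On the workgroup question: Samba's workgroup name is set in smb.conf, so a fragment along these lines should put a GNU/Linux box in the same workgroup as the Windows machines. This is a sketch; the workgroup and NetBIOS names are hypothetical placeholders to be replaced with your own:

```
# /etc/samba/smb.conf -- [global] section (sketch)
[global]
   workgroup = HOMENET      # match the workgroup your Windows machines use
   netbios name = UBUNTU1   # how this box appears in Network Neighborhood
   security = user
```

After editing, restarting the Samba daemons (on Ubuntu 8.04, something like sudo /etc/init.d/samba restart) should make the machine show up under the new workgroup.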
  11. I hate to wade into this discussion again, as I know that Saint_Michael has adequately addressed this problem in a submission for Suggestions For Version 3 Of The Credit System. However, I do find it ironic, as with freeflashclocks' example here, that it not only applies to first-time hosting accounts, but also to those who upgrade from the Hosting 1 package to Hosting 2. One thing that someone explained to me at the time that I first encountered this problem was, as galexcd said, "non-hosted credits are not worth as much as hosted credits because when you are hosted your credits go down one per day, so it is much harder to collect credits". But in the example here that was not the issue, as freeflashclocks had worked hard to earn those credits while being hosted, so this statement doesn't apply. I do hope that they get this sorted out soon. Any such fix won't affect me, or freeflashclocks, or anyone else who has been duped by this already, but I'm sure that people (both Xisto staff and member volunteers) are getting tired of answering for this problem.
  12. I only have two tutorials published on the Xisto forums, and both are about subjects that I have at least a couple of years of experience with and could answer questions about them if asked. I wouldn't presume to write a tutorial about networking with Linux after only two or three days of playing with it. I'm sure that there are plenty of people out there with much more experience than I have in this field, so I will gladly leave the cultivation of Credits for publishing such a tutorial to someone else. I wouldn't want the responsibility of putting you or anyone else wrong!
  13. I haven't seen this movie yet, not even the first one, but I am intrigued by the concept that you have outlined in this thread, adriantc. The current turbulent times in the American financial market no doubt make postulating on such a theory very opportune. Tonight on television here in New Zealand we saw an article about the major supermarkets of Australasia, including Coles and Woolworths, and how they are freezing out competition, even to the detriment of their own suppliers, by demanding ever higher profits. I remember how this worked when I was still working in the computer industry: as a retailer myself, I could buy, for instance, an HP inkjet printer cartridge cheaper by going to a large retailer than I could by going to a wholesaler. It is all about the economies of scale. If a large retailer, such as *BLEEP* Smith Electronics here in Australasia, or, for instance, Circuit City in the USA, buys up 1,000 units or whatever, they get a better price per unit than I would if I am only buying 100 units. Therefore I can take advantage of the better margin that the wholesaler gives to Circuit City by buying from them rather than buying direct from the wholesaler. And it is all money in the back pocket of big business. Eventually supply lines get so tied up that the large retailers demand such a lot of stock that the wholesaler cannot supply anybody else other than the large retailers and there is no stock left over for the smaller competitors. This is what the television show tonight was talking about: how the primary producers of fresh fruit and vegetables are getting so tied up with the large food chains that they cannot extract themselves. Eventually the large retailers are re-branding the manufacturer's goods with their own brand names and selling the same goods cheaper (at retail rates) than the wholesalers are prepared to sell to anyone else. And yet it is apparently not anti-competitive! 
I think in light of such problems with capitalism there must be a better solution. I will be interested to see this documentary and see what they have to say.
  14. Thanks for the input, truefusion. You may well be right about the amount of RAM that I am consuming using the full KDE environment. I haven't even worked out simple things like how to see the amount of free and committed RAM yet, so I still have a lot to learn. When it stops running because I filled up the RAM then I may well consider your suggestion. In the meantime, I got everything back under control last night. I reinstalled Ubuntu v8.04 LTS Server Edition from scratch, and reinstalled Kubuntu, the KDE Desktop for Ubuntu. Nothing stopped half way through this time and it all seems to be working. I managed to get a driver working for my Riva TNT2 graphics card, so now I am not stuck on 800x600 resolution any more. My current struggle is trying to set a static IP address. This machine is going to be a server on my network, so I cannot very well have it getting a new dynamic IP address every time the lease expires or every time that I reboot. However, while the networking seems to work quite fine with a DHCP lease, whenever I assign a static IP address I can no longer connect to the Internet. I have specified my default gateway and DNS servers manually, exactly as they appear in the DHCP lease, and I have the IP address set well outside of the DHCP scope so that I don't cause an addressing conflict, but I cannot reach the Internet with a static IP address specified, only when I have the DHCP client enabled. Any ideas? This is key to me making further use of Linux as a server platform. As for apps, I have a copy of KeePassX installed now, which is the Linux port of my favorite free password manager. An absolute essential as far as I am concerned! I have also added some system tools using Adept Manager. I quite like the software distribution idea within Linux, and it definitely beats the Windows software distribution model. 
I have added apps to help me manage the network services that I intend to run from this box, including the DNS and DHCP servers, the print server, etc. I can see and access the shares on my Windows-based computers, but I am currently in a different workgroup to them, and haven't yet worked out how to change my workgroup under Ubuntu. Any advice? By the end of this I would like to be able to see all the computers in my network, irrespective of platform, all together from any other computer in the network. I would also like to have them all hanging off the same local DNS server, so that I can ping them all using an FQDN that is resolved by the local DNS server on the Ubuntu machine. Currently all of the Windows-based machines on my network get a DHCP lease from my WiFi router, and they can all browse the Internet happily. I have added the (intended) IP address of the Ubuntu machine as a third DNS server in the DHCP options that the WiFi router hands out, in preparation for this change. I have a multiple NAT router configuration on my network, with many NAT routers cascaded in series, as per the example from Gibson Research. I don't think that this configuration will hamper my intended plans, as all NAT activity is translated at the NAT router and is transparent for the clients. I don't need to do anything special to the NAT clients to make them NAT clients other than point them to their local NAT router as the default gateway and say "go for it"! Then on the WAN side of that NAT router it points to the LAN side of the next NAT router in the series as well, and it all happens transparently to the client. Eventually the final NAT router in the series, my ADSL router, spits everything out onto the Internet via my ISP's default gateway, and the internal network is fairly well protected. I do not normally use a proxy, but have been known to occasionally. 
This gives a brief overview of my network, to aid in advising me on how to overcome any problems that I encounter with my Ubuntu networking. Again, thanks to anyone who offers any advice and suggestions for this project.
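For the static IP struggle described above, the Debian-style configuration looks roughly like this. This is a sketch only; every address here is a hypothetical placeholder that needs to match your own LAN and nearest NAT router:

```
# /etc/network/interfaces (sketch; all addresses hypothetical)
auto eth0
iface eth0 inet static
    address 192.168.1.10      # chosen outside the router's DHCP scope
    netmask 255.255.255.0
    gateway 192.168.1.1       # LAN-side address of the nearest NAT router
```

One common cause of the "static works on the LAN but not the Internet" symptom on Ubuntu 8.04 is name resolution: the DHCP client writes the DNS servers into /etc/resolv.conf, and switching to a static address can leave that file empty or stale, so pinging a raw IP address works while browsing by name fails. It is worth checking that /etc/resolv.conf still contains the expected nameserver lines after the change, and that an address like 72.14.207.99 is reachable even when a hostname is not.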
  15. Thanks for that input Saint_Michael. Since my last posting I have added the extra 128MB of SD-RAM to the machine, giving me 256MB of system memory. My motherboard doesn't support more than 1GB of RAM, and the best I can get is two 512MB SD-RAM modules, which would cost an arm and a leg these days! I have also removed the 20GB drive that has Win2K Pro installed on it and replaced it with the 40GB drive. That was one of the drives previously in my old Win2K Server computer, which had a Dynamic partition that had to be destroyed. Fortunately I was able to recover all the data from it first, and now have it stored on my external 500GB drive instead.

With the Linux server system now prepared, I have started installing Ubuntu 8.04 LTS Server Edition. I spent time verifying the system memory and verifying the distro CD first, just to make sure that everything was properly prepared, then I began the installation. You were right about the partitioning utility, Saint_Michael. One of the options for partitioning my drive with Ubuntu was to set up LVM (Logical Volume Manager), which I chose. I have no idea yet what it does, but I am assuming that it allows me to repartition my drive on the fly. I'm still nervous about that after my experience with Windows Dynamic Partitions, but I'll see how it goes. During the installation procedure I selected all of the server services, including the DNS, LAMP, Print Server, and File Server options. It will be interesting to get this system talking via USB to my Brother HL-1430 laser printer and sharing it on the network for my Win2K / XP clients to print to. I'd also like to see how Ubuntu handles my external, USB-based 500GB hard drive. The installation appears to be complete, and I'm staring at what looks like a DOS prompt of sorts, in the form of username@host:~$ and I guess this is where it all begins.
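For what it's worth, the guess about LVM is roughly right: it pools partitions into volume groups from which logical volumes can be carved out and later grown without repartitioning. A hedged sketch of the underlying commands (the device and volume names here are hypothetical, for illustration only):

```shell
# Turn a partition into an LVM physical volume (device name is hypothetical)
sudo pvcreate /dev/sda2

# Group physical volumes into a volume group named "ubuntu"
sudo vgcreate ubuntu /dev/sda2

# Carve out a 10GB logical volume for the root filesystem
sudo lvcreate -L 10G -n root ubuntu

# Later, a logical volume can be grown on the fly, then the
# filesystem resized to match (ext3 on Ubuntu 8.04)
sudo lvextend -L +5G /dev/ubuntu/root
sudo resize2fs /dev/ubuntu/root
```

Unlike Windows Dynamic Disks, growing a logical volume doesn't touch the partition table, which is why it can be done on a live system.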
While I am familiar with DOS and some Linux commands, I prefer the familiarity of a GUI with a mouse pointer. I recall the names GNOME and KDE from my limited past experience with Linux, and I guess I need to find out how to start the GUI for this Linux installation. Any suggestions? Help! Does anyone know how to start the GUI for Ubuntu? I'm stuck at the DOS-like prompt!

Okay, panic over: I have found the answer to that problem on The (k)ubuntu Community website, which tells me that the Server Edition of Ubuntu 8.04 LTS does not install a desktop by default, so to add one I have to use one of the following commands from the DOS-like prompt: sudo apt-get install ubuntu-desktop (for the GNOME desktop interface), or sudo apt-get install kubuntu-desktop (for the KDE desktop interface). So, I have decided to use KDE, and it is currently extracting the files. Even though I put the distro CD back into the drive, it looks like it is downloading the packages from a website via my ADSL connection, which is quite impressive considering that I haven't done anything to configure the LAN setup yet beyond what was done during the installation procedure. I haven't even specified a TCP/IP address yet! It must have grabbed a DHCP lease all by itself. Smart Linux! I like it so far.

Things didn't go as well as I thought they would. Half-way through the installation of KDE (I think it was up to extracting the ttf font files) it suddenly decided that I had a "Read-Only File System" and would go no further. Now whenever I try to boot the Ubuntu Linux server, it comes up with messages saying that the "root contains a file system with errors, check forced", that "Inodes that were part of a corrupted link list found", and that there is an "UNEXPECTED INCONSISTENCY; RUN fsck MANUALLY" because "fsck died with exit status 4". Damn!
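Those boot messages mean the root filesystem is corrupted and the automatic check gave up, so fsck has to be run by hand from the maintenance shell. A hedged sketch (the device name /dev/sda1 is a stand-in for whatever the root partition actually is on this machine):

```shell
# From the maintenance/root shell, check and repair the root filesystem.
# -y answers "yes" to every repair prompt; the device name is hypothetical.
fsck -y /dev/sda1

# Then reboot cleanly
reboot
```

Running fsck against a mounted, writable filesystem can make things worse, which is why the system drops to this read-only maintenance shell first.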
I don't know what half of this means, but I guess from my Windows experience that there are some lost clusters and that scandisk could not fix them in non-interactive mode. My guess anyway. It then puts me into a sort of "Safe Mode", with the DOS-like prompt having changed from username@host:~$ to root@host:~# and I guess that the hash mark on the end of the prompt is somehow important. It says that I can exit this safe mode with CTRL+D, but when I do it just reboots the machine and I get the same messages over and over again. Looks like I have successfully killed my first installation of Ubuntu in under an hour, just by trying to install the GUI. Not a great start! Any help or advice offered would be great. As you can see, I'm not doing too well!

Now my choice is to try to recover from the aborted installation of KDE, or to start again from scratch, reinstalling Ubuntu from the distro CD. For now I will go with the first option. I did as suggested and ran fsck from the DOS-like prompt, fixing all the problems that this utility found along the way, and then rebooted (I also found out along the way that CTRL+ALT+DEL, the three-fingered salute, doesn't reboot a Linux installation). I am now back at the DOS-like prompt and have logged in again under my username. I decided to try re-doing the KDE installation using the command sudo apt-get install kubuntu-desktop that I found earlier on The (k)ubuntu Community website. However, this time it recognized that a previous installation had failed, and demanded that I manually run the command dpkg --configure -a to correct the problem. When I tried to do that it complained that I needed "superuser privileges" to run that command. I guess this is where I have to change from my username to the "root" user. Any idea how to do that? The second option, to reinstall Ubuntu from scratch, is looking more and more inviting!
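On the superuser question: Ubuntu disables the root password by default and expects sudo instead, so the failing command just needs a sudo prefix. A quick sketch:

```shell
# Run the failed package-configuration step with superuser privileges
sudo dpkg --configure -a

# Alternatively, open a full root shell first; the prompt changes
# from $ to #, which is what the hash mark signifies
sudo -i
```

The first user created during installation is granted sudo rights automatically, which is why no separate root login is needed.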
I have now been using Linux for about 3 hours, and have not seen anything other than a gross DOS-like prompt. This is why I kept dabbling with Linux in the past but never got into it properly. I am starting to hunt for my original Windows 2000 Server installation CD. At least I know with Windows that I can get a GUI within about 30-45 minutes!

I have fixed it! I booted from the distro CD and used the recovery option to "Rescue a broken system". When I got to the shell prompt I entered the command dpkg --configure -a and it finished the extraction and installation of the KDE desktop and applications, including those pesky ttf font files. Then I started the GUI using startx, and I am now looking at my first Linux desktop since I played with Corel Linux about three years ago. Time to go playing! I want to find out how to have KDE start automatically when I boot the server, and also start configuring some of my hardware. Any further advice or suggestions would be appreciated.

Latest update: I'm back to the start, reinstalling Ubuntu from scratch! It turns out that the first aborted installation of KDE somehow knocked out all sorts of things, most of all the network settings. All sorts of network-related services were no longer running, and with my lack of knowledge about Linux network configuration as yet, I decided that it would be faster to just reinstall everything from scratch rather than try to fix the problems. The OS is installing now, and I should soon be back at that now-familiar DOS-like prompt, ready to have another go with KDE.
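On having KDE start automatically at boot: installing kubuntu-desktop normally pulls in the KDM login manager, which starts the graphical login on every boot; if it's missing, it can be added and enabled by hand. A hedged sketch for Ubuntu 8.04:

```shell
# Install the KDE display manager if kubuntu-desktop didn't bring it in
sudo apt-get install kdm

# Its init script should already be linked into the default runlevels;
# if not, enable it explicitly
sudo update-rc.d kdm defaults

# Start it now without rebooting
sudo /etc/init.d/kdm start
```

With KDM running, there's no need to log in at the console and run startx; the graphical login greets you at boot instead.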