Everything posted by evought

  1. You may also want to take a look at LifeNet http://thelifenetwork.org/. They are developing software to build ad-hoc networks using Wi-Fi and Bluetooth on portable devices (e.g. Android smartphones), but they can also make use of wireless access points if those are available and set up correctly. In a rural area, cell phone service can be fragile, and a few well-placed access points could allow someone to route an emergency call, especially if the Wi-Fi has a battery backup. You seem to have little problem installing custom software on your router, so it may be something to play with. In our case, we are moving to a setup somewhat similar to yours. We have a few wired systems and an internal wireless network that is severely degraded by the walls of the house (chicken wire in the old plaster in places around here). Anything important internally goes over SSL/SSH anyway. We are putting a second access point with high-gain antennas on the roof, firewalled from the local network and powered off of our small solar (and soon to be wind) R/E system, so it will continue to be available in a power outage such as another regional ice storm (we've had two in the last four years and one EF5 tornado nearby). We are playing with the early LifeNet software in conjunction with local Neighborhood Watch efforts. We can also use a Wi-Fi PTT app on our smartphones on or near the farm. As a side benefit, I am loading a whole bunch of documentation onto a webserver which will be accessible on the long-range Wi-Fi: all that stuff people wish they had downloaded before an emergency, like how to correctly wire a generator and not fry your linemen. With a good antenna and favorable terrain, we have been making Wi-Fi connections at 8 miles or so.
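Incidentally, if you just want a directory of reference documents visible over the Wi-Fi without configuring a full webserver, a one-liner will do in a pinch. This is only a sketch (Python 2.x, which ships with most current distros; the directory name is made up):

cd ~/emergency-docs && python -m SimpleHTTPServer 8080

Anything in that directory is then reachable from other machines on the network at http://<machine-ip>:8080/.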
  2. All of a sudden, I cannot edit my posts after they have been submitted. I have been on this site for quite a while and have routinely done so in the past, often making small corrections years later (e.g. broken or moved links) when I go back to use my own tutorial :-) I have posted two parts of a tutorial recently, need to go back and make some corrections before putting up the next part, and wanted to link them together once they are all up (to make it easy to go from one part to the next). The problem is, there does not appear to be any button or option to do so... unless I am losing my mind. I am running Firefox 7.0.1 on Linux FC 15 with NoScript enabled, but I have white-listed Xisto and the various ancillary domains it loads scripts from, so it should not be a script issue. Is this a feature change, a glitch, or... I also cannot seem to post in the "Hosted Members" section...
  3. Panasonic Toughbook CF-29 and CF-30 Fedora Linux install; Part II

This is a draft. I need to make an editorial pass and fix the links I have stubbed out.

With recent Linux versions, there are no special flags needed for Toughbooks to boot and run the install. Just boot your chosen media (CD/DVD) and install as usual. If you are simply deleting Windows, you can keep things fairly simple by letting the installer erase the disk and lay out partitions for you. Otherwise, you need to delete the second partition and put together a basic layout.

Since this is a laptop, thought needs to be given to security. Laptops are meant to be portable. That makes it easy for you, but it also makes it easy for someone else to pick it up and go. The cost of the lost hardware is one thing (and maybe you have insurance which will cover it), but criminals don't tend to take privacy very seriously either, and you probably have data you don't want them to have: banking information, emails about where your child goes after school, passwords, and so forth. If you are using a Toughbook in the first place, in a law enforcement, medical, or disaster relief context, you probably have to worry about protecting other people's data as well. So a good bit of this tutorial is going to cover some ways you can set these machines up to be secure--- or as reasonably secure as can be expected. I am going to assume you have basic knowledge of Linux installations in general (outside of Toughbooks).

Full Disk Encryption

The first step is going to be Full Disk Encryption (FDE). Linux makes it very easy to encrypt the entire contents of the hard drive. This serves two basic purposes: 1) you do not need to selectively figure out which things are "important enough" to be encrypted and then miss something obvious; 2) you ensure that an attacker cannot easily misuse incidental data such as system logs, the contents of swap space, or file names to figure out what you are protecting and how to get at it. Whenever your laptop is off, its contents are effectively useless to criminals of modest resources. The important thing to realize is that whenever your computer is left on, like when you get up from the table to get your coffee order, FDE does you no good whatsoever, even if your screen is locked. A criminal who takes your laptop while it is on or suspended can bypass the encryption. Sophisticated criminals can recover information from RAM even if the machine was only very recently on.

Start by creating a /boot partition. It has to be unencrypted and big enough to hold the kernel and libraries needed to bring the system up and decrypt the rest of the hard drive. My /boot partitions end up being about 500 MB just as a nice round number. Next create an LVM Physical Volume and let it fill the rest of the disk, checking the box to encrypt the volume. Now create a Logical Volume Group in the Physical Volume. Create a swap partition (usually about the same size as your RAM, so for the CF-29, I create a swap of about 768 MB). Create a root partition ("/") for your system files. Since I have plenty of HD space, I set these at ~50 GB to make sure I have plenty of room to install packages and applications. Create a home directory for your data and documents. Do not check "encrypt" for these partitions since the container, the "Physical Volume" they are in, is already encrypted. Link to Fedora LVM chapter.
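If you are curious what the installer is doing under the hood, it is roughly the following shell sequence. This is only a sketch: the device name /dev/sda3 and the volume group name vg_main are assumptions, and the installer also takes care of details like /etc/crypttab and /etc/fstab for you.

# Encrypt the big partition with LUKS (AES by default) and open it
sudo cryptsetup luksFormat /dev/sda3
sudo cryptsetup luksOpen /dev/sda3 cryptpv

# Build the LVM stack inside the encrypted container
sudo pvcreate /dev/mapper/cryptpv
sudo vgcreate vg_main /dev/mapper/cryptpv
sudo lvcreate -L 768M -n swap vg_main
sudo lvcreate -L 50G -n root vg_main
sudo lvcreate -L 50G -n home vg_main

# Make the file systems; the rest of the volume group stays unallocated
sudo mkswap /dev/vg_main/swap
sudo mkfs.ext4 /dev/vg_main/root
sudo mkfs.ext4 /dev/vg_main/home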
I tend not to let the home directory fill the entire disk: 50 GB is more than adequate to start out, and the Linux Logical Volume Manager lets you add space to a file system later on. (It is a bit more tricky to shrink one on the fly.) By leaving unallocated free space, you have something to play with if you run out of room somewhere or want to create a new partition.

If you continue the install from here, you will be asked for a password for the encrypted volume. This password will be required every time you boot and every time you recover from hibernation, so it has to be something you can type and remember, but it should also be a good password: probably 8+ characters, mixed punctuation, not a dictionary word, and with digits or symbols thrown in. It is possible to brute-force a bad password, and then the rest of your protection can be unravelled. The final layout looks like this:

NTFS (Windows)                  60 GB (or whatever)
/boot                           500 MB
[Physical Volume (encrypted)]
    swap                        768 MB
    /                           50 GB
    /home                       50 GB
    Free Space                  XXX GB

This leaves you with an unencrypted /boot and an encrypted swap, root filesystem, and home directory. An attacker who takes your laptop powered off will have access to your Windows partition and /boot. They cannot readily access your LVM physical volume (AES encryption by default and reasonably hard to crack if you have a good password). Your swap space, which might contain encryption keys for the physical volume, is also protected. System logs, which might reveal information about your documents and habits, the systems you connect to, and so forth, are encrypted along with everything else.

If your data is important enough, or if you have pissed off a computer nerd, someone could modify your /boot data, install a keylogger, and record your encryption password (or something equally sneaky). The Toughbooks have quite a few options for locking the laptop at the BIOS level. The easiest way is to simply turn on the BIOS password and require it at boot and at access to the BIOS. You can also set a Supervisor password to lock the BIOS and system setup menus and a separate User password to boot the laptop (say, if you set up the computer but someone else uses it). This keeps someone from easily booting up, tampering with your software, and shutting back down. Your nemesis could remove your hard drive and try to get at your data that way (it is protected by the encryption) or tamper with /boot, but they are not going to do that while you pick up your coffee at the Starbucks counter. The CF-30s have a TPM processor, a system-security chip which can be used to thwart even that: telling you at bootup that your system software has not been tampered with, or encrypting the entire contents of your drive with the internal crypto system. I may cover TPM configuration in a later post, but it should be mentioned that there is a simpler method: put /boot on a USB dongle and carry it with you. This is inconvenient with the CF-29 and its single USB connector, but easy with the CF-30. Insert link to article on removable /boot and TPM. The DOD method I have witnessed is also low-tech and effective: pull the removable hard drive and drop it in a safe when not in use. This is the reason Toughbooks have easily removable hard drives.

The rest of the install

After all of that, the rest of the install proper is pretty simple. I selected the Desktop and Software Development package categories. We will need the software development tools to build the utilities that set up the touchscreen properly, which I will cover in the post-install. You might as well just check the box now and get it out of the way.
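If you do forget a package group here, it is not fatal; groups can be added after the fact from the command line. A quick sketch--- the exact group names can vary between releases, so list them first:

yum grouplist
sudo yum groupinstall "Development Tools"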
Oh, and by the way, if you choose to set up the network during the install, remember to flip the CF-30's wireless cut-off switch to "On" before you spend thirty minutes trying to figure out why the network isn't working. A green LED will light up to tell you it is activated.

Post install

Fedora Linux will bring you into a setup program on your first successful boot. Here you will set up the network (if you have not already), set a root password, and create a non-root user. The root password will be used very rarely, only for critical system maintenance or if you hose your normal account. Make it a good password and then write it down somewhere safe (like in your key safe or a document safe). Your user account password will be used for everyday logins and, through the "sudo" tool, for routine system maintenance. Make at least one of your user accounts an "administrator" so that this will work properly. If you are truly paranoid, you will have a normal user account you use every day and another with the administrator flag checked. That way, a malicious web site which takes over your browser cannot also use your administrative rights to install something like a boot-sector virus. I'll leave that decision up to you and will present another option further on. I often set the administrator flag on a new system and, once I have it set up the way I like it, remove the flag from my normal user and create a separate administrator account for routine maintenance.

Updates

After you finish the setup and get to the login screen for the first time, you will want to run the Software Update program to get all of the security patches and updates. Since FC-15 has been out for a while, expect to spend some time even with a good network connection. There seems to be a common bug in FC-15 where you get an error message involving "fedora-release-rawhide" in your first update. I solved this by opening a terminal and running "sudo yum update fedora-release-rawhide" and, when that finished, "sudo yum update" to do the updates manually (which is always an alternative to the GUI Software Update program). Your mileage may vary.

Gnome

If you have not used Gnome 3 before (the graphical desktop used by FC-15), it may take some getting used to. I have found that I like it OK on the CF-30 and it is reasonably snappy. On the CF-29, in Gnome's "fallback mode" for those of us in technological backwaters, it is actually quite painful, since most of the new ways to get to the applications and settings you need are not there and the old ways are not there either. Reading a short Gnome 3 tutorial is worth the time; things will not make much sense if you don't. On the CF-29, I install a package called "Gnome Do". This allows me to hit Command-Spacebar and pop up a search prompt. I can then start typing the file or program I want, hit Tab when it comes up with the right answer, and then run or open it. So, I can start typing "Terminal"; about when I get to the r, it pops up the correct program, I hit Tab and Enter, and I have a terminal window--- much faster than trying to get through the menus, especially before I get the mouse working properly. I intend to figure out how to get WindowMaker (another desktop option) working on FC-15 but have not yet done so.

Touchpad/Touchscreen

On the CF-29, you will notice that the touchpad is painfully slow. It may take you about a thousand years to move the mouse cursor from one corner to the other. The touchscreen won't work at this point, either.
If you tap on a spot on the screen, the mouse cursor will jump but won't go quite where you want it. On the CF-30, the touchscreen will malfunction the same as on the CF-29, but the touchpad will work. If the CF-30 touchpad is slow for you, you can adjust it from System Settings->Mouse and Touchpad. System Settings can be gotten to a few different ways, one of which is by going to your account name in the upper right corner of the screen and clicking to pull down that menu.

The CF-29 touchpad cannot be adjusted so easily. After much frustration and trial and error, I fixed it by adding the following line to a file called .xinputrc in my home directory:

~/bin/fixtouchpad

Create this file, add that line, and then save it. Make it executable ("chmod u+x ~/.xinputrc"). Now create a directory called ~/bin, and in it create a file called "fixtouchpad" with the following:

#!/bin/bash
xinput --set-ptr-feedback 10 1.5 10 1

Make this one executable as well. I tend to add ~/bin to my path in my .bashrc and put various little tools in it.

Now, in case this does not work for you, let me explain what it does. "xinput" is a tool for tweaking the behavior of various X Windows input devices: mice, trackpads, trackballs, etc. This line sets the acceleration (ptr-feedback) for a device numbered 10, which happens to be the builtin trackpad for me. It sets the acceleration to 1.5 starting after 10 pixels of movement and counting by 1 px at a time thereafter. Play with these numbers to find what works for you. According to the documentation, "1.5" should not work--- it is supposed to be a whole number--- but it does work and seems to behave better than using "2" for me. You can run xinput in a terminal with different settings as many times as you need and then copy the numbers into fixtouchpad. "xinput --get-feedbacks 10" might also be useful to you.

When you log out and back in, your .xinputrc should be run automatically and the touchpad should work. If this does not work at all for you, you may need to figure out the device number. xinput has a manpage ("man xinput"), and "xinput --list" will give you information on all the input devices; you are looking for the "PS/2 Touchpad".

.xinputrc will run fixtouchpad for you every time you log in, but your trackpad settings will go away when you hibernate and thaw. That is why I put the settings in a separate script. When I come back from hibernate, I run fixtouchpad and go. (Actually, I type "fixt|TAB|", the shell autocompletes the command, I hit Enter, and go. I'm lazy. As soon as I find out how to run this automatically every time I come back from hibernate, I will.)

OK, so now the system is relatively usable. Let's fix the touchscreen. As it turns out, the touchscreen works fine, unlike in previous versions of Linux. The problem is that each touchscreen is just a little different from every other one, and it needs to be calibrated using a tool called xinput_calibrator. The normal Gnome tool will not work. Download the "source tarball" of xinput_calibrator from Find link. Unpack it, configure, compile, and install, e.g.:

mkdir build
cd build
tar -xzf ~/downloads/xinput_calibrator.whatever.the.filename.is
cd xinput_calibrator
./configure
make
sudo make install

Now you can run "xinput_calibrator --output-type xorg.conf.d". It will ask you to tap in various places (best done with the stylus so it is precise) and then will spit out the settings you need. Cut and paste them into a file called "/etc/X11/xorg.conf.d/99-calibration".
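The xorg.conf.d directory and the file in it are owned by root, so create them with sudo. A sketch (any editor will do; nano is just an example):

sudo mkdir -p /etc/X11/xorg.conf.d
sudo nano /etc/X11/xorg.conf.d/99-calibration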
Mine looks like this:

# Calibration values generated by xinput_calibrator for CF-29 Touchscreen
Section "InputClass"
    Identifier "calibration"
    MatchProduct "LBPS/2 Fujitsu Lifebook TouchScreen"
    Option "Calibration" "271 3995 317 3869"
EndSection

Log out and restart. Your touchscreen should now work. It is fine-grained enough in Linux to do some handwriting recognition and sketching using the stylus, though the screen is awkward to write on. Stay tuned for Part III with even more Toughbook-tweaking madness!
  4. Panasonic Toughbook CF-29 and CF-30 Fedora Linux install; Part I

These are notes from installing Fedora Core 15 Linux on a Panasonic CF-29 and a CF-30 Toughbook. The Toughbooks are a line of rugged and semi-rugged laptops which are often used for military and law enforcement applications but can be useful for farms, fieldwork, factories, disaster relief, or other situations where equipment has to deal with a punishing environment; used laptops can be had for decent prices off-lease. The CF-29 is a dated model based on Intel Pentium Mobile processors, rated to Mil-Spec 810F for shock and vibration, with niceties like a builtin hard-drive heater for extreme cold; the CF-30, with Intel Core Duo processors, is more recent, but the 29 is still popular with law enforcement in our area. There are actually a number of pages on the Internet about these laptops and setting them up under Linux, but they are outdated and therefore confusing. This tutorial is an attempt to put all the information I needed, up to date and in one place. It will be posted in a few pieces because it is long; as I figure more things out, I intend to post updates as well.

The CF-29

The CF-29 has a Pentium Mobile processor of up to 1.6 GHz depending on the exact sub-model or "mark". The letters in the model number after the 29 tell you what hardware the laptop came with (some of which may have changed if you got it used):
  5. No, it refuses to work if there are not four functioning cartridges. We have gotten an Epson Workforce 840, also a multifunction printer, and I will probably have a review posted at some point. The HP still works, after a fashion. It is currently not refusing to print, but this has happened before, where I have had to go in, reset it to factory default settings, and set it up again. Overall, the printer is just not worth the hassle and is rapidly becoming e-waste.
  6. That depends on the country. Here, the least expensive products are from abroad because local producers pay high taxes, have to comply with expensive regulations, and have a high cost of living. Many of the countries which export products to us can get away with paying their workers a fraction of what is paid here. That is even starting to be true of food, even though we have some of the best farmland in the world. So, it is not that we can't produce local goods but that the system has been set up in such a way that the local economy has been destroyed. But it is very interesting to hear the perspective from another country. Thank you.
  7. As some posters have mentioned, any text editor of any kind, including Notepad, vi, etc., will work. There is a definite benefit to having automatic brace-matching and some syntax highlighting to cue you into problems, though. Flexible indentation control is a big convenience, but I find that even the best tools don't indent the way I want it done; you indent code to help someone understand it, and doing it well would require a tool which itself understood the code (and read your mind). Being able to collapse or hide some of the markup is also a really good thing when writing heavily marked-up documents such as DocBook XML. At some point, the XML syntax buries the actual meaning of the page unless it can be gotten out of the way. I often work on more than one type of machine (usually Linux and Mac OS X) and mix multiple kinds of markup, or markup and code, in one project, so I've gravitated toward editors that handle that well on more than one OS. I also look for editors that work smoothly with version control systems like svn so I can track my changes as I go. Two editors that any serious designer should look at are JEdit ( http://jedit.org/ ) and Emacs ( http://forums.xisto.com/no_longer_exists/ ). JEdit is a free Java-based environment which runs well anywhere you need it and supports all kinds of plugins for specialized tasks (e.g. validating, transforming, and previewing various XML dialects). It has plugin support for both CVS and SVN version control (among others) and, even without the plugin, recognizes and deals smoothly with changes to the files on disk (from committing/reverting changes, for instance). If you add a lot of plugins, it does hog a bit of memory, but it is quite fast on today's machines. You can easily customize the GUI to your work preferences: which buttons appear in the toolbars, where tools display, moving or collapsing the different information displays, and so forth, and it works very well with keyboard shortcuts. It also has support for editing remote files over ftp, ssh, or whatever. It works great for simple HTML/CSS sites or as a full-blown DocBook production environment. You can run into some trouble with too many plugins, as some of them can interfere with one another. Emacs is one of the early UNIX editing environments, which has grown not only a large user base and a lot of customization but practically its own religion. I used it constantly in college years ago for everything from coding to system administration to writing papers. It takes work to get used to, as it has its own way of doing things, but many times its ways make sense. You can strip Emacs down to run quickly from the command line over a network connection or run it as a full-blown GUI environment. There are several ports or adaptations of it that have their own unique features (e.g. http://aquamacs.org/ for OS X). One of Emacs' best features is that pretty much *everything* can be done without your hands leaving the keyboard. When writing heavily marked-up documents, this can be a big thing. I have found that a lot of Emacs' handling of international characters can be clunky if you are used to the way the OS handles them (such as dead keys and compose characters).
  8. When I say "sitemap" here, I mean an expanded index page the reader can use to navigate quickly to other parts of the website. Instead of having to go from A to B to C to D, they just go from A to the index to D, where they want to end up. If you have such a page on your site, I am not seeing it. But if you have one and it is easy to find, you reduce the number of links on your leaf pages. When I read complex sites like that, I usually have the map/index/trunk page open in one tab and open leaf pages in separate tabs so I don't need to keep going back and forth. Many web sites are designed around a linear route through the site. Many technical users open tabs and navigate in a more parallel fashion, the same as we would do with a book and a good index.

An image map is an image where clicking on different parts of it takes you to different places. So, for instance, I have a diagram of the nervous system, can move my mouse over the brain, click, and go to the brain page. With status bar feedback, the reader can see where they are going as they hover the mouse over the image, so you can have fairly fine-grained navigation. There is a quick page on how to do it here: http://www.pageresource.com/html/imap1.htm . Notice the status bar change (in Firefox, at least) when you hover the mouse over "Tables" and "Frames" in the example at the top of the page. They are not too hard to do, but you need to test thoroughly in a couple of browsers.

I should have said "rel" attribute, technically. "rel" is an attribute on the anchor (<a>) and link (<link>) tags that tells the browser (or the search engine) what the purpose of a link is. You put link tags in your page header (with the title and so forth). Example:

<head>
  <title>Nervous System</title>
  <link rel="contents" href="contents.html">
  <link rel="prev" href="muscular.html">
  <link rel="next" href="respiratory.html">
</head>

and...

<ul>
  <li><a rel="chapter" href="muscular.html">Muscular System</a></li>
  <li><a rel="chapter" href="nervous.html">Nervous System</a></li>
  <li><a rel="chapter" href="respiratory.html">Respiratory System</a></li>
</ul>

The rel attribute tells the spider what the repeating links mean so it can figure out whether to pay attention to them or not. It also allows the browser to give clues to the user (some do, some don't; it is slowly getting better). Some details here: http://www.w3.org/TR/html401/struct/links.html#h-12.1.2 . As browsers get better, the result will be that the browser creates the sidebar from the link relationships instead of relying on clunky CSS; this is especially important on new devices like phones and tablets where the CSS might flat-out not work anyway.

They are basically an ebook-publishing service. You can post your ebooks with them, they provide search and navigation, and users can broadcast ("readcast") what they are reading or what interests them and build collections of books which they can share and recommend with others. So it provides the word-of-mouth advertising which is effective at getting your work in front of people. They let you choose whether to allow readers free access or to preview and buy before reading and downloading your work. You can even control how many or which pages the reader can preview. You can also choose to use DRM to restrict their ability to copy or cut and paste from the document and so forth. I prefer not to, because I think DRM just annoys customers and makes them less likely to come back and buy more.
When a user buys a book, it gets credited to your account minus their commission, and they cut you periodic checks (minimum of $100 in your account before they start writing checks). There are several services like it now. Scribd is just, in my opinion, one of the better ones, and they have a lot of readers. It would probably not be hard to put a PDF of your site up on there (or a similar service) and, if it does not sell, no loss--- you still have your web site and ad revenue.

NoScript is a Firefox extension, and a fairly popular one ( https://noscript.net/ ), which blocks execution of Flash, JavaScript, Silverlight, certain kinds of redirects or iframes, and similar content. People use it to protect themselves from malicious scripts, particularly malicious scripts in ads. It has very sophisticated control of which scripts can run and which cannot. For instance, I can choose to allow scripts from Xisto.com but not scripts from anywhere else that this web page might load (from external content or ads). I may trust a website not to have bad content, but I don't necessarily trust everybody else they load scripts from, and I know that the webmaster cannot possibly check advertisements served to them by 3rd parties such as Google. I don't mind ads, necessarily, but I don't want them executing scripts or Flash--- they are a serious source of malware and a source of cross-site scripting attacks. People also use NoScript to prevent cookies and scripts from being misused for tracking purposes (e.g. with Google Analytics). "Default deny" means that any website I have not explicitly said I trust gets no scripts or Flash at all until I say otherwise. That is how most people use the tool. What this means to you is that when I go to your website from a search engine, I won't see any dynamic content at all. Only if I decide that I like your site and trust you will I turn on script support for it, and if I cannot see enough without scripts to make it worth my while, I won't do it. I will often check the site's reputation with services like WOT ( https://www.mywot.com/ ) to see if they have had malware problems. There is an extension for Chrome that does something similar to NoScript now, too. I would imagine it is going to become a standard thing over time.
  9. The other interesting thing PayPal does, which we have gotten bitten by several times now, is that their checkout process will occasionally glitch and charge either the wrong amount or the wrong bank account, racking up huge amounts of banking fees. One time, between the payment and confirmation page, PayPal multiplied the amount by 100: one form page was using integer cents and the next was using a float value for the amount. I caught it at the last moment before it charged several thousand dollars to an account which did not have the money. Another one we have gotten hit by twice is when you specifically select the payment to come from one account but it is actually charged to another (with attendant bank fees if it overdrafts). The first time this happened, I just happened to print a copy of the confirmation screen and was therefore able to show that the confirmation screen and the email did not match, so the bank and PayPal between them resolved the bank fees. But I do not do that every time, and recently it did it again just before we left on a vacation. The account it tried to draft the money from is kept deliberately low in order to protect our main funds from fraud. We only transfer money into the account when we need it. PayPal tried twice to take the money out of the wrong account. Partway through, I figured out what was going on and called their support. They said they could not cancel a transaction in progress from their end and that, "Don't worry, it will automatically try the correct account once it fails." It did. PayPal got their money, but by the time everything was said and done, a PayPal transaction of like $8 ended up costing us $200 in overdraft fees. Liberty Bank and PayPal pointed fingers at each other and basically said they were not interested in working anything out. We closed our account with Liberty Bank and will close our account with PayPal as soon as we find any alternative or, if we don't find one soon, will just close it anyway. The problem is that PayPal has the ability to electronically draft your money. If you authorize any transaction, that authorizes every transaction. When they screw up, they can cause devastation, not just in fees but in hours of trying to straighten things out. When something like that goes wrong, they don't care, especially the bank, which makes quick money on the fees. Liberty Bank claimed to be a small-town, customer-oriented bank, and they were until they were recently bought. Now they are just another business out to screw you over, just like PayPal. Does anyone have any decent alternative? I have spent quite a bit of time looking, but most of the best options seem to either no longer exist or not be offered in the US. We had an account with e-gold, but since they got screwed over by the US government, their system is painful to use, they claim the right to charge you whatever they want pretty much on a whim, and they don't operate in Missouri at all. I could accept payment with e-gold, but apparently I could never withdraw the money. So what is the option, both for personal payments and for offering website checkout?
  10. It can, if they see where they want to go more quickly than they can hit the "back" button. So there has to be a balance between enough links to go where they want and not so many that it becomes cluttered. CSS menus and especially Javascript menus are not necessarily SEO-friendly, no, but neither is any other alternative, really. There are at least several other alternatives for the kind of navigation you are talking about, each with their own problems:

1) Have a "Site Map" page with all of the links expanded, or possibly with an image map in your case. That means any page is at most two clicks away. I would work very hard to make that page accessible and convenient, with text links (possibly a CSS drop-down that will fail gracefully if CSS isn't doing what it should) to back up the image map, and careful use of alt attributes and such to let screen-readers and so forth work well. I would probably do this regardless and use a "rel" tag to tie it to the main page (for the handful of browsers that know what to do with it) and so spiders know what it is for. Just explode your "Quick Navigation" menu or something.

2) Do a TOC in a separate frame, pane, or inclusion. This is not SEO-friendly either and causes problems with bookmarking, etc. Frames are a real problem (again, look at the Ozark Herbal site I linked to as an example of what not to do). HTML5 is providing some better ways to do this and to tell search engines/browsers what it is for so they can process it properly... but HTML5 support is not yet universal either in spiders or browsers. Quite a few spiders have some idea what to do with rel tags, at least to the point of being able to ignore external content they do not care about, so if the links are all in another page they can ignore, there are potential advantages there.

3) Put more of your text content into single pages with anchors for the sections. This replaces cross-page navigation with internal page navigation. You can put a lot of text in an HTML file without increasing its size and download time markedly. I am playing with a new version of the Herbal where almost all of it is in one file. Then the problem is to reduce the size of the images which are initially loaded. You can do this with thumbnails (with CSS overlays if the browser supports it), with incremental JPGs, or other tricks, all of which the search engine will just ignore. You actually have a lot of text and detail outside of the diagrams, so you might be able to remove a level of HTML-chunking and still do well. Good internal cross-references are a must to avoid scrolling all over, but if someone is navigating within the local structure, it will speed up page loads significantly. It will also allow quick browser searches within more content.

4) A combination of #1 and #2, letting CSS or HTML5 tricks display the site map for browsers that can do it properly, with single-layer fallback/noscript links in the pages themselves (in case it does not work). Search engines should just ignore it.

Your site has a lot of rich technical content that an interested reader might want to sit and sift through. Something else you might seriously consider is presenting a PDF-download option so that people can go through it locally, either on screen or on the increasingly popular e-readers. That then gives them bookmarking, navigation, faster following of internal links, etc. Let people access the advertising-supported online content or pay for an ebook.
You can either set something up to take payment and allow download or use something like Scribd ( https://www.scribd.com/ ) to do it for you (you can either post content free or charge-to-view/download, and the preview is customizable). That is the direction I am trying to go with the 2nd edition of my Herbal (very similar highly technical content), so take a look at how Scribd allows people to preview mine: https://www.scribd.com/doc/51838055/Pages-From-An-Ozark-Herbal-2nd-Edition-Draft . Constructive feedback is appreciated as well. Calibre ( calibre-ebook.com ) is a free book-organizer/e-reader application which has built-in conversion from website to ebook. I find it either works really well or not at all depending on the website, and it costs you very little to see what yours looks like as an epub (a free format supported by most portable readers). LyX ( http://www.lyx.org/ ) also has some HTML-import capability (same thing: either it works or it doesn't) and can then produce e-reader-friendly PDFs as output (supposedly it can be made to produce epubs, but I have never figured out how--- the PDFs look nice on my Nook). Anyway, hope some of that helps. I like your site and will probably go through it more at some point. We home-school, so we are always looking for in-depth educational sites, and so are the folks in our organization. By the way, you should probably know that, with NoScript running in default-deny mode (how I see most sites), I don't see any ads on your site at all. The navigation links and CSS work fine in Firefox 4 (OS X), but no ads are visible and no in-site search. I don't mind that, but you might :-)
  11. It might be worth removing the ads for several days to gather statistics, though. It will at least give you an idea whether you have room to improve elsewhere. Then you can figure out whether you need to make adjustments to your main content or better target your ads. One of the things which makes me click away from pages without even reading them is when the ads cause the page to load very slowly. I have satellite, though, so my link is temperamental--- high bandwidth but very high latency; it takes a while to establish lots of connections for individual content elements and for certain communication-intensive scripts to run. Either that or I run with NoScript and a lot of the ads simply do not display.
  12. As some people have mentioned, Mac OS X has built-in parental controls. You can create a new account for your child and then, from the System Preferences > Accounts pane, check "Enable Parental Controls" and click the button for "Manage Parental Controls". That will let you select which applications the child can access from their account, gives you a couple of ways to control which websites they can access (a specific whitelist/blacklist or one of the popular rating services), and lets you have activity logged. You can even filter what they can see in the Dictionary app if you really want to. We are using OS X 10.5.8 and it is reasonably flexible. One annoying aspect is that when you are working with your kid and run into something they are not allowed to access but should be, you need to switch to an Admin user to make a lot of the changes. But it is not too bad, and supposedly the newer Mac OS X versions are easier. The other option is to go to an environment designed from the ground up for a child. We are just starting to play with an environment called "Sugar" ( https://en.wikipedia.org/wiki/Sugar_(desktop_environment) ). It is a complete operating system designed as a learning environment and works well with touchscreens. It should run on any recent Mac or PC laptop, I think. It is free; you can download it, burn it to a CD to try it, and decide whether you like it. There is no substitute, however, for being with your child or in the room much of the time they are using the computer, not just to monitor what they are doing, but to help them and see how they are using the software, what interests them, and what they might need to use it better. Also, to limit the number of hours they spend staring at the thing. Having the PC they use in a public area of the house is not a bad idea. If you want something they can read/work on privately in their room or whatever, think about an e-reader with much more limited functionality. Also remember that young kids and expensive electronics are sometimes not a good combination.
  13. One way is to have good navigation aids prominent on the page. If your page contains, as you say, quite specific information, maybe too specific for the search which brought the reader there, letting them easily navigate back up to the general topic or to closely related topics may keep them within the site instead of bouncing back to the Google (or other engine) search page. I know I have this problem with some of my technically oriented content. Pages From An Ozark Herbal ( http://forums.xisto.com/no_longer_exists/ ) is a prime example--- when a reader comes into the page for a specific plant from a search engine, they often do not get the frames-based navigation. Conversely, if they navigate to a page through the frames, it is hard to bookmark the specific page. So, yes, high bounce-back rates. In this case, it is a limitation of the tool I used to create the content and export it as HTML, so I am working on changing toolsets and doing a version which will be better organized in HTML, PDF, and print formats, partly to reconcile this. So take that site as an example of what not to do. The new version will be done with LyX 2.x, which will then generate the different formats and do all the cross-linking. I have also worked with DocBook XML ( http://docbook.org/ ) for this kind of thing, but the DocBook tool chains tend to be either very buggy or very expensive, sometimes both (e.g. Oxygen - http://www.oxygenxml.com/xml_editor/docbook_editor.html - mainly problems with bad stylesheets). LyX ( http://www.lyx.org/ ) is a free and fairly stable LaTeX-based toolchain, runs on Mac/Linux/Windows, and they are redoing and improving their HTML export in the recent releases. In a setup like that, you essentially write your content as a book and create the website from that, adding semantic information like cross references, glossary definitions, index topics, and so forth. You can see a preview of what the new ebook/printed version looks like at https://www.scribd.com/doc/51838055/Pages-From-An-Ozark-Herbal-2nd-Edition-Draft . I would be happy to show people the LyX source so they can see how it is put together, to help them with their own document production. Hopefully I will have a snapshot of the new HTML up in the next week or so. If you have the option, just using a simple server-side include with navigation links in each page would help, and you can do that in most hosting setups with simple PHP. That will let you improve the site quickly while looking for a long-term solution.
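As a rough illustration of the server-side include idea (the file name nav.inc and the links are made up; any shared snippet works the same way): keep the navigation in one file,

<!-- nav.inc: shared navigation links, maintained in one place -->
<ul>
  <li><a href="index.html">Home</a></li>
  <li><a href="plants.html">Plants</a></li>
  <li><a href="glossary.html">Glossary</a></li>
</ul>

and then pull it into each page where the menu should appear:

<?php include("nav.inc"); ?>

Change nav.inc once and every page picks up the new links.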
  14. I will third the idea of testing under a virtual machine and add the suggestion that most distros have live-DVD distributions, making it really easy to boot one up and see how it works. The size of most hard drives these days, plus Logical Volume Management (LVM) support, makes it pretty easy to actually install several distros on the same hard drive and throw away what you do not like. As far as Fedora vs. Ubuntu specifically, I have found in the last few years that Ubuntu/Kubuntu have been much less stable on what I would term less-mainstream hardware. Trying to install Ubuntu/Kubuntu on older hardware, including my CF-29 Toughbook, was an absolute nightmare with constant crashes even after I got it "working". Fedora 13 did much better with tweaks to get the touchscreen and DVD-burner working; Fedora 14 installed and worked right out of the box. Fedora packages seem to enjoy more testing on a wider array of machines, especially if you stay one major release behind. When it works, I think I like Ubuntu better, but when it doesn't work... fixing it is HARD. You say "server", but that does not necessarily mean "no graphics". If you will still sit at the console to perform maintenance, you want a good desktop and admin tools. Even over the network, you can run graphical apps remotely using the X Windows protocol. Fedora seems to be decent at letting you get at the guts from the command line or the GUI, the GUI being better and more stable in Gnome than KDE currently (in my experience). I used to use WindowMaker ( http://windowmaker.org/ ) often, especially on servers, because it is light and fast (and pretty--- I always liked NeXTStep), but Fedora 13's packages for it were HORRIBLE and very buggy. I have not yet tried it under FC 14. Fedora does have a very steady stream of package updates, which is good and bad. We have satellite, which is slow and bandwidth-metered, and every time I log in with Fedora it wants to install and update something new. I have not had any real problems with updates breaking existing functionality, though.
  15. I just got an Alpha AWUS036H 1000 mW Hi-Gain WIFI Adapter through Amazon ( http://www.amazon.com/gp/product/B002BFMZR8 ), mainly to be able to get access to our farm WIFI network from a Panasonic Toughbook in the field running Linux, and also to be able to use the laptop as a WIFI hotspot in emergencies/disaster response. Specifically, we want to be able to use WIFI to stitch together Rural Neighborhood Watch activities when the cellphone/phone networks might be down. We are replacing an old Netgear 802.11b/g PCMCIA card. The package I got cost $39.99 and came with two removable antennas, one 9 dBi and one 5 dBi, both omnidirectional. Overall, I am flabbergasted by this product: it has exceeded my expectations in every way. Various statements by the manufacturer told me to expect reception from up to a mile or more. I have made connections at almost twelve miles (perfect terrain) and sustained them at well over three.

OS Support

One of the reasons I went with Alpha is that they claimed to support Mac OS X and Linux out of the box, and this was true. I simply plugged in the cable on the Linux laptop and I had a connection; the same went for an older MacBook. No drivers, hassle, or configuration. The Linux laptop was a Panasonic CF-29 Toughbook (mil-spec rugged laptop) running Fedora Core 13 (just upgraded to FC 14). I use the Gnome Desktop's network management applet. The MacBook is running Macintosh OS X 10.5.8. We do not have any Windows machines to test on.

Installation

This is a USB-based network adapter. The product is a small device (the size/weight of a deck of cards, roughly) with a detachable USB cable, a detachable rubber-duck antenna, and a clip-mount for attaching it to your laptop or a vehicle window. To install, you attach one of the two antennas, slide the NIC into the clip mount, and plug in the USB cable. The angle of the antenna is adjustable, so you can set it on a table next to you or suction-cup the device to the window of a vehicle or building.

Wireless Support

The device supports 802.11b/g (up to 54 or 108 Mbps depending on the access point). It does not support Wireless N. I did not find this much of a limitation.

Testing and Use

Again, the device powered up and established a network connection immediately. The access point here is a recent Cisco/Linksys Wireless N router (which also supports 802.11b/g), and we have a Linksys WRT-54G 802.11b/g router we use as a repeater (to extend the network coverage). We have lath-and-plaster and lath-and-wire walls in the old farmhouse which often degraded the signal from the older PCMCIA wireless card. Alpha's adapter shows full signal strength with the smaller, portable antenna everywhere inside and over most of the 10-acre property. Signal is significantly degraded inside the tin-sided pole barn, but that is no real surprise. What was a surprise is what happened when I suction-cupped the larger 9 dBi antenna to the outside of the passenger window on the truck and went for a drive up the road. The included USB cable was nearly too short for this use; the manufacturer recommends using a USB extension cable to add a bit of reach for better positioning of the antenna. The metal shell of the truck does degrade the signal, so it is better to have the antenna outside. When I did this, we could get a signal from the Cisco/Linksys router with its dinky internal antenna from nearly as far as the nearby Interstate.
When we got to high ground on the Interstate, I immediately picked up 23 separate networks from this rural area, one of the routers I know to be at least 12 miles from that point, and I connected for a short bit to a router from a hotel in a neighboring town (perhaps 8 miles). It was very difficult to maintain a solid connection while moving (and I was reluctant to leave the antenna suction-cupped at highway speeds...), but when well-positioned on good terrain, the device is amazing. The external antennas provided are compatible with the WRT-54G's removable antennas, so I am in the process of ordering two more of the high-gain antennas with magnetic mounts to put on the WIFI repeater mounted on the farmhouse tin roof. I expect to be able to establish connections with neighbors between three and five miles away for Neighborhood Watch purposes. Our cell phones are WIFI-capable and have an Android Push-To-Talk program called VirtualWalkieTalkie ( http://forums.xisto.com/no_longer_exists/ ). I can run a local Java-based server for this from the Toughbook, and then the phones can connect to each other even if the cell network and local (satellite) Internet connection are down (but we usually use handheld FM radios for local voice communication). Both the access point and the Toughbook will readily run off of 12-volt power from our battery bank, so we can access emergency documents and transfer photos (of, say, a damaged building or a suspicious vehicle) over a several-mile radius, with the Toughbook running off of vehicle power and providing a local access point/repeater. The Alpha hi-gain adapter basically provides inexpensive glue for an ad-hoc rural network which would otherwise require much more expensive equipment. Because WIFI is industry standard, we can tailor the software on top of it to our needs. Battery usage is noticeably increased when using this device, though not as bad as I thought it would be. Typical life for me has gone from 5-6 hours per charge to closer to 4-5. It does, however, seem to suck power even after putting the laptop to sleep or hibernation, so I have gotten used to unplugging the USB cord when I do not need it.

Caveats

Two words of warning, however:

First, you can connect to an access point from much farther away than you can meaningfully transfer data. I have connected to some distant access points in testing at two-three bars of signal strength (several miles away) and had very bad transfer rates, I think because of dropped packets. It seems to be related to the quality of the antenna/receiver on the access point you are connecting to. If you have a crappy antenna on one end, the Alpha can only do so much, especially with intervening obstacles. Good terrain, few obstacles, and a decent access point on the other end give consistent transfers even miles away.

Second, being able to access the network from a long distance away does not do away with traffic congestion. If your network is suddenly accessible over a large area, you can overload it with too many simultaneous users. It therefore works better for our applications in a rural area, where you have a small number of users spread over a large distance, as opposed to an urban area where you have dozens or hundreds of users within a mile or two.

Conclusion

In any case, I would definitely recommend this product and second the opinions of some other reviewers that the more powerful 2000 mW version is simply not necessary. This one has all the transmitting power for the job and there is no need to waste battery life with more.
The difference between the 5 dBi and the 9 dBi antennas, or from better antenna positioning, seems to be much greater than the difference from adjusting transmitter power at either the adapter or the access point. Hopefully, I will convince folks in the county to buy a number of these little devices.
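If you want to put numbers on what the adapter is seeing from a given spot, the standard Linux wireless tools work with it. A quick sketch (the interface name wlan0 is an assumption; check yours with "ifconfig -a"):

# Show link quality and signal level for the current connection
iwconfig wlan0

# Survey nearby access points and their signal strength
sudo iwlist wlan0 scan | grep -E 'ESSID|Quality'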
  16. I use Solution Calculator for the same purpose ( http://forums.xisto.com/no_longer_exists/ ), also an Android app. It handles equivalents as well (for acid-base reactions), which I find handy for doing dilutions in either soap-making or dye baths. We raise sheep for wool and do some of our own dyeing; many common dye baths require an acid wash, and the recipe may specify amounts in one acid, say 5% white vinegar, while we may be using muriatic acid for a particular dye job, or vice-versa. Not to discourage your efforts; the basic concept is definitely handy. One useful feature would be an internal database of common commercial compounds at common dilutions.
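For anyone implementing this, the acid substitution boils down to matching equivalents:

N1 x V1 = N2 x V2

where N is the normality (equivalents per liter) of each acid. As a rough worked example--- the concentrations here are typical figures, not gospel, so check your own labels: 5% white vinegar is about 0.8 N acetic acid, and hardware-store muriatic acid (roughly 31% HCl) is about 10 N, so 100 mL of vinegar would be replaced by about (0.8 x 100) / 10 = 8 mL of muriatic acid.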
  17. They used to. The phone providers themselves maintained the SMS gateways; you just had to know where they were and which service the receiver used. The receiver paid for the message as usual. I used to use this for system admin tasks, paging me when long-running jobs were finished (like loading a backup from a tape robot) or when a filesystem was full. For most phone systems there were email gateways which could be used from command-line or scripts. That is all this plugin did. Now the providers have gotten greedy and make you pay at both ends. There are still working plugins for pay-per-message servers like http://smsflatrate.net/ and http://site25.way2sms.com/. With smartphones, it is usually less hassle to just use email.
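For the curious, the whole mechanism was about one line of shell. Something like this (the gateway address here is only an example of the pattern--- each carrier had its own domain, and "run_backup" stands in for whatever long job you are watching):

# Page a phone through the carrier's email-to-SMS gateway when the job ends
run_backup && echo "backup complete" | mail -s "job done" 5551234567@sms.example.net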
  18. You should also be aware that running a secure browser from an insecure computer, especially a public terminal at a library or whatever, does not make you safe. You might be making your requests through Tor, but there are a gazillion ways for someone who had control of that computer to track what you were doing, such as key-loggers or Trojan horses (though, as a friend of mine likes to point out, the actual horse was Greek). It is also very difficult to delete individual files securely from a USB drive. There is some good detail on that problem here: http://forums.xisto.com/no_longer_exists/ or here: http://forums.xisto.com/no_longer_exists/ . It is probably better to use a laptop or smartphone with Tor and HTTPS over wireless (or a VPN) than a USB-based "secure" browser on a public computer.
  19. A lot of interesting discussion here. A lot of the big media companies push copyright much farther than it was ever intended to go. I don't use YouTube much myself (little as a submitter, more as a consumer), but I use similar sites for writing and graphics, such as Scribd ( https://www.scribd.com/ ), fanfiction.net, the Wikimedia Commons ( https://commons.wikimedia.org/wiki/Main_Page ), and so forth.

Limited Time

First of all, copyright was intended to protect for a limited time, and that time has become effectively unlimited in the days of the Sonny Bono Copyright Extension Act and the like ( https://en.wikipedia.org/wiki/Copyright_Term_Extension_Act ). The idea that, say, J.R.R. Tolkien should be allowed to make money off his works (if someone wants to buy them) is fine. As a freelance writer, I like that idea. The idea that his grandchildren still need to be making money off of his writings decades after he is dead is a bit ridiculous (barring the posthumous editing and compiling efforts by Chris Tolkien). The idea that the Tolkien descendants can say who does and does not have the right to make a film based on the works or publish fan-fiction goes much farther, detracting from the arts rather than promoting them. If you don't like Jim Bob's adaptation of the Lord of the Rings (or The Lion, the Witch, and the Wardrobe) for film, don't watch it, but having to get permission to make it from people who were not even alive when the original was written makes no sense whatsoever. What would this have done to Shakespeare's work if he could not adapt the tales of Julius Caesar for his own use? Most of what he wrote was based on older works; most of everything written today is likewise based on things which have already been done. Now, current works are a different matter. I wrote a fanfiction based around the Stargate Atlantis franchise ( https://www.scribd.com/document/51672613/Last-Straw ). I know I cannot make any money off of that, ever, and that is fine; it was still fun to write, and maybe someone who likes it will consider some of the other things I wrote. Somewhere, we have to make some kind of reasonable compromise on what 'limited time' really means, and that is something the Supreme Court continually sidesteps.

Quoting In Commentary

Another place where the bounds are seriously overstepped is in commentary. It is absolutely always fair use to quote someone's work as part of commentary and criticism. A lot of the automated DMCA drones don't care about that. They just flag anything which looks even remotely like something a media company claims, even similar titles some of the time. Every so often they do that to the wrong party, like the law professor who got a DMCA takedown notice for home barbecue footage (if I recall correctly), but most of the time, stuff gets taken down or accounts get banned and the target has no real recourse. This chills speech across the board. I am just not going to put the same effort into something if I know it's going to get taken down or I am going to get in a big fight over it.

Copy Protection Hurts the Market

The DMCA is supposed to require the copyright holder to put in a good-faith effort to ensure that they actually own the content and that it really is infringing. Does this make it hard for copyright holders to protect their media against mass uploading? Sure. But it does not really matter. The point of copyright in the first place was not to guarantee artists (and especially big media companies who prey on artists) a living wage.
It was to promote the progress of science and the useful arts. The law is "copy right": it is copying which is the right, not prohibiting people from copying. The cost to society, and even to artists themselves, of draconian measures to stop copying is much higher than any possible gain. The freeloaders will never net an artist a profit anyway: as one poster said, these parasites don't want to (and won't) pay for anything; they are not a lost sale. The goal of an artist is always to get their work in front of someone who wants to pay.

Baen Books went a long way toward recognizing this with their free ebook library ( http://www.baen.com/categories/free-library.html ), requiring and encouraging their authors to freely release at least one title. A satisfied viewer/reader is the best advertiser there is. iTunes Plus and similar offerings are also a good enticement: being copy-protection free, I get a guarantee that I can use content I purchase on future devices (such as my phone). That is worth paying a little extra for, and it is something I look for when buying music online. I won't buy from audible.com anymore because I am sick of dealing with their buggy copy protection.

I myself don't bother putting copy protection on my online works, and anyone buying CDs of, say, my Herbal ( https://www.scribd.com/doc/51838055/Pages-From-An-Ozark-Herbal-2nd-Edition-Draft ) could certainly copy and upload them somewhere. Actually, most of my work ends up licensed under one of the Creative Commons licenses ( https://creativecommons.org/licenses/ ) anyway, so people could do so legally in many cases, and major chunks of my herbal have been free online for years ( http://forums.xisto.com/no_longer_exists/ ); some early drafts of articles and research were posted here on Xisto ( http://forums.xisto.com/topic/89374-topic/?findpost=1064342399 ). I usually bar commercial reuse (without permission) because I want the opportunity to negotiate or deny resale if I want to, but it is on the honor system. I usually also post drafts of my works because I often get good and useful feedback.

Big Media Is a Dinosaur

A big part of this is that the big media sales model is beneficial neither to the customer nor to the artist. Artists used to give the lion's share of their revenue to the media companies, big publishers, etc., because the publisher really did something for the money: they sought out talent and read through the 'slushpiles' of submissions, absorbed the capital costs of publication and distribution, took on financial risk, and got the artist's work in front of potential customers. Now they have largely gotten out of the habit of doing anything for the money, and technology has made it easier to just go around them. There is a really interesting blog post/discussion by two self-publishing writers on this subject ( http://forums.xisto.com/no_longer_exists/ - long but really worth reading through). A big publisher wants to give me about 11% of revenue for the privilege of publishing my ebook, putting it under customer-annoying copy protection, and screwing up the marketing. Scribd gives me closer to 70%, and I have the option of copy protection if I really want it (which I don't). Lulu ( https://www.lulu.com/ ) and so forth give me much better options if I want a paper-bound book and still want to preserve future residual income. There just is not a place anymore for the big media companies, and they don't like being locked out of the process. Cry me a river.
So, if you want to support my work and pay me for it, please do, but I will not support heavy-handed approaches to forcing people to support my work--- or anyone else's. Find a business model that works, find a patron, take the traditional route of the artist and starve (or get a day job), or whatever, but the law is not there to serve as an endowment for the arts. If people pay me, I'll produce more work, faster. If people don't, I'll still write, because it is something I have to do, but it will obviously take a back seat to other priorities.
  20. I believe history will show that we never had a "recovery" at all. Gold and silver are still bouncing around north of $1,500/oz and $40/oz respectively, and pressure on (US) bond interest rates has not eased. In all likelihood, the UK's economy is healthier right now than ours in real terms, but I think you folks will find that your economy, and much of the world's, is too tightly chained to our sinking ship. The disaster in Japan really did not help things either. But I think the real disaster setting things on a downward spiral is the recent and continuing travesty over the US budget. We fought back and forth, tooth and nail, for several weeks over a 2011 continuing resolution which cut at most $32 billion out of a $1.3 trillion deficit (and at worst was estimated to have cut only a couple of hundred million USD). We are heading down the same road with 2012: not arguing over how to actually reduce our debt, or even reduce the deficit in real terms, but merely over whether to reduce the growth in the deficit by a minuscule fraction while exempting most of the budget from any cuts at all. This just tells our creditors that we are not capable of taking any action to protect their money and their investment, which is what S&P's murmurings are about. US bonds were a good (or at least 'safe') investment as long as there was some hope we would get around to providing a return on that investment someday. I think the world markets are a good bit less clear on that right now, and the only things keeping us afloat are the Federal Reserve's price support on bonds and the fact that investors are reluctant to drive the commodities providing an alternative to US bonds (e.g. silver, gold, wheat, oil, sugar) any higher.
  21. We are having the same problem now: the drive won't work over FireWire but will over USB. We are looking for a newer external burner with FireWire/USB/Mac/Linux support but are not getting very far. I would like to be able to burn Blu-ray discs just to take advantage of the higher capacity and do larger backups.
  22. We got a letter at some point about the settlement of the HP lawsuit dealing with some of these matters, such as fraudulent ink-usage claims and cartridges reporting "empty" almost as soon as you install them: https://www.hpinkjetprintersettlement.com/ . We are seriously steamed. It looks like we can get about $9 in COUPONS toward more HP products as the result of having spent perhaps a few hundred dollars on cartridges we should not have had to buy, not to mention the personal cost of putting together the paperwork to get our portion. So, we are taking the easier route out--- never buying another HP printer as long as we live. We will keep using this one for a while but are looking for a more economical (non-HP) printer to replace it down the road. What I want is one of these: http://www.notebookreview.com/news/eco-ink-made-from-old-coffee-grounds/ , a hand-cranked printer using coffee grounds for its ink supply--- unfortunately only a prototype with no commercial production that I have yet found, but it would be great to pack with the Toughbook for use in the field.
  23. Some updates on the original review:

I have occasional complete lockups on the OS X desktop which I have narrowed down to the tablet driver. This requires a log-out and restart, occasionally with the logout hanging and requiring a hard reboot (or an SSH connection and a remote shutdown). It seems to happen when RAM gets extremely low (such as after doing batch photo processing); the tablet driver does not deal gracefully with the situation and becomes a zombie. So maybe this is similar to what a user reports above under Windows.

Now that I have the plasma screen mounted on the wall and an LCD on the desk, the erratic multitouch mouse behavior has pretty much gone away. With the new desktop rearrangement, though, I also have more room for a normal mouse, so it has become less important.

I should also mention that I have successfully used the Bamboo tablet with Fedora 13 Linux on a Panasonic CF-29 Toughbook and am playing with it under Fedora 14 on the same laptop. The laptop itself has touchscreen and stylus support, but the tablet works better, mainly because I can use it in a more natural position.
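For anyone hitting the same hang, here is a rough sketch of the SSH recovery I mentioned, rather than a hard power-off. It assumes SSH (Remote Login) is already enabled on the stuck machine; the user name, hostname, and process search pattern below are placeholders:

    # From another machine on the LAN ('me' and 'mybox.local' are examples):
    ssh me@mybox.local

    # Optionally confirm the tablet driver is wedged (the daemon's name
    # varies by driver version, so adjust the search pattern):
    ps aux | grep -i tablet

    # Reboot cleanly instead of holding the power button:
    sudo shutdown -r now

You will likely still lose unsaved work in the hung session, but this at least avoids the filesystem damage a hard reset can cause.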
  24. I have not tried to scan to the laptop over wireless. I have had occasional trouble scanning to the desktop from the front panel of the printer, which is connected by Ethernet to the wireless hub. My work-around for this has been to initiate the scan from the desktop rather than from the printer. If you click on the printer icon (usually open in the Dock) to pull up the print queue, you can select "Scanner" from the top row of buttons. An "HP Scan Pro" window opens, and "New Scan" will be a button in the top left. This will let you pre-scan your image, correct it, then do the final scan and save the document without using the printer's front panel. If your printer is not close to your computer, this means running back and forth to perform the scan (what networking was supposed to avoid in the first place), but with a laptop you can at least carry it with you to the printer.
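On the Linux side, a rough command-line equivalent is to drive the scanner from the desktop with SANE. This is a sketch only: the hpaio: device URI and IP address below are illustrative and depend on your HPLIP setup, so run scanimage -L first to get the real name:

    # List available scanners; HP network devices usually show an hpaio: URI
    scanimage -L

    # Pull a scan across the network from the desktop (substitute the device
    # URI that scanimage -L actually reports for your printer):
    scanimage -d "hpaio:/net/Officejet_Pro?ip=192.168.1.50" \
        --resolution 300 --format=tiff > scan.tiff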