xisto Community

twoq

Members
  • Content Count

    58

About twoq

  • Rank
    Member [Level 1]

  1. Mine looks like this: http://img.photobucket.com/albums/v159/Gazmuyloco/desk.png
  2. I'm looking into RFID for some stuff. Is anyone here familiar with it? It would be nice if there were some kind of package for it. Let's say you have a bunch of tags: when you get a new customer or a new item, you create a new ID for it and write that ID to a tag. You give the tag to the customer or attach it to the item, and the next time you just scan the tag to retrieve the ID. Something like this exists for barcodes: a set of hardware together with source code and/or an API. Is there a similar package for RFID? I'd appreciate any help on this.
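A minimal sketch of the issue-then-scan workflow described above. The `TagRegistry` class, the uuid-based IDs, and the in-memory dict are illustrative assumptions; a real system would write the ID to the tag through the reader vendor's API and keep records in a database.

```python
import uuid

class TagRegistry:
    """Issue an ID for a new customer or item (the ID is what you would
    write to the blank tag), then later map a scanned ID back to its
    record. The dict is a stand-in for a real database; the RFID reader
    hardware is abstracted away entirely."""

    def __init__(self):
        self._records = {}

    def issue_tag(self, record):
        # Create a new unique ID; in a real setup this is the value the
        # RFID writer would burn onto the tag.
        tag_id = uuid.uuid4().hex
        self._records[tag_id] = record
        return tag_id

    def scan(self, tag_id):
        # A scan just yields the ID; look up whatever it identifies.
        return self._records.get(tag_id)

registry = TagRegistry()
tag = registry.issue_tag({"type": "customer", "name": "Alice"})
assert registry.scan(tag) == {"type": "customer", "name": "Alice"}
```

The same registry works unchanged for items; only the record contents differ.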
  3. Why can't I upload files larger than 16KB? If I upload a 200KB file, the server only receives 16KB (in PASV mode), but servers that support PORT mode work fine. My setup: Win2k + SP4, FlashFXP 3.0 build 1039, Kaspersky Antivirus 5.0.
  4. One way I've seen this done is to use a splash page for the sound. When people hit the site, they get an index page that contains only the sounds and an "Enter" link. When the link is clicked, the main site loads as a popup, so the sounds keep playing as long as the splash page stays in the background. However, so many users have popup blockers that this may not be any better than frames now.
  5. OK, I'm working with CSS for just the font, but since the font is applied to a link, it reverts back to blue and underlined. What am I doing wrong here? I think the problem is with the selector. I want my links to show only the font, color, and size that I choose, with no underlining.

In the head:

<title>2 G's Productions</title>
<style type="text/css">
<!--
P {
  font-family: verdana;
  font-size: 14px;
  font-style: normal;
  color: #000000;
}
-->
</style>
</head>

In the body:

<p><a href="company.htm">OUR COMPANY</a></p>
<p><a href="merch.htm">MERCHANDISE</a></p>
<p><a href="bios.htm">BIOS</a></p>
<p><a href="studio.htm">STUDIO</a></p>
<p><a href="events.htm">EVENTS</a></p>
<p><a href="contact.htm">CONTACT</a></p>
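One likely fix, sketched below: the paragraph rule doesn't override the browser's default link styles (blue, underlined), so the anchor elements need their own rule. The values mirror those already in the post; this is a suggestion, not the only approach.

```css
/* Style the links themselves; the P rule alone doesn't remove the
   browser's default link color and underline. */
a {
  font-family: verdana;
  font-size: 14px;
  color: #000000;
  text-decoration: none; /* removes the underline */
}
```

If visited or hovered links should look the same, the single `a` selector above already covers them; separate `a:visited` or `a:hover` rules are only needed for distinct states.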
  6. [ This TOPIC has been Found to be copied from SITE POINT FORUMS and has breached the TOS of Xisto. The User has faced termination of his web hosting account and also been removed from HOSTED Group ] - OpaQue, Xisto Management.
  7. Google is the Web's most popular search engine, powering not only the popular Google.com Website, but also Yahoo! and AOL. Being listed in Google is very important, and being listed highly in Google can bring great benefit to your site. However, there are many myths about how Google works and, while fairly harmless in themselves, these myths tend to allow people to draw incorrect conclusions about how Google works. The purpose of this article is to correct the most popular Google myths.

Myth #1: The Higher Your Google PageRank (PR), the Higher You'll be in the Search Results Listing

This myth is common, and is the source of many complaints. People often notice that a site with a lower PageRank than theirs is listed above them, and get upset. While pages with a higher PageRank do tend to rank better, it is perfectly normal for a site to appear higher in the results listings even though it has a lower PageRank than competing pages.

To explain this concept without going into too much technical detail, it is best to think of PageRank as being composed of two different values. One value, which we'll call "General PageRank," is nothing more than the weighting given to the links on your page. This is also the value shown in the Google Toolbar. This value is used to calculate the weighting of the links leaving your page, not your search position.

The other value we'll call "Specific PageRank." If PageRank equated to search engine results rank, then Yahoo!, the site with the highest PR, would be listed #1 for every search result. Obviously, that wouldn't be useful, so what Google does is examine the context of your incoming links, and only those links that relate to the specific keyword being searched on will help you achieve a higher ranking for that keyword.

It's very possible for a site with a lower PageRank to in fact have more on-topic incoming links than a site with a higher PageRank, in which case the site with the lower PageRank will be listed above its competitor in the search results for that term. PageRank aside, there are also other factors that contribute to Google search results -- though PageRank remains the dominant one.

Myth #2: The Google Toolbar will List Your Actual PageRank

When Google created their toolbar, it was a boon for many Webmasters, as this was the first time we got to see any value related to our PageRank. However, the toolbar has also caused some confusion.

The toolbar does not show your actual PageRank, only an approximation of it. It gives you an integer rank on a scale from 1-10. We do not know exactly what the various integers correspond to, but we're sure that the curve is roughly exponential, with each new "plateau" being harder to reach than the last. I have personally done some research into this, and so far the results point to an exponential base of 4. So a PR of 6 is 4 times as difficult to attain as a PR of 5.

The exponential base is important because it illustrates how broad a range of pages can be assigned a particular PR value. The difference between a high PR of 6 and a low PR of 6 could be hundreds or thousands of links. So if your PR as reported by the toolbar increases or drops, it's important to remember that it could be the result of a small change or a large change. Additionally, it's possible to lose or gain links and see no change in your reported PageRank.

The other issue with the toolbar is that sometimes the PageRank it displays is only a guess. People will often notice pages on Geocities or another free hosting provider having a high PageRank. This is because when Google hasn't spidered a page, but has spidered the root domain, the toolbar will guess a PageRank based on the value of the root domain.
Therefore it's common to see pages on Geocities with a PR of 6 or 7. This PageRank does not equate in any way to a high Google listing; in fact, in this case it indicates the opposite: that the page isn't even in Google. Once Google spiders the page, it will be assigned a more appropriate (and usually lower) PageRank.

Myth #3: PageRank is a Value Based on the Number of Incoming Links to Your Site

This myth is a frequent source of incorrect assumptions about Google. People will often see that a site with fewer incoming links than their own site has a higher PageRank, and assume that PageRank is not based on incoming links.

The fact is that PageRank is based on incoming links, but not just on the number of them. Instead, PageRank is based on the value of your incoming links. To find the value of an incoming link, look at the PR of the source page and divide it by the number of links on that page. It's very possible to get a PR of 6 or 7 from only a handful of incoming links if your links are "weighty" enough.

Also remember that for PageRank calculations, every page is an island. Google does not calculate PageRank on a site-wide basis, so internal links between your pages do count. This is very important, as instituting a proper structure for your internal links can drastically improve your rankings.

Myth #4: Searching for Incoming Links on Google Using "link:" will Show you all Your Backwards Links

Similar to Myth #3, people will sometimes look for backwards links to a site on Google and find none, even though the site has a PR listed and is in Google's cache, so they know the toolbar isn't just guessing.

The reason for this is that Google does not list all the links it knows about, only those that contribute above a certain amount of PageRank. This is especially evident with a brand new site. By default, all pages in Google have a minimum PR. So even a page without any incoming links has a PR value, albeit a small one.

If you have a brand new site with 20 or 30 pages, all of which Google has spidered, but you have no incoming links from other sites, then your pages will still have a PageRank resulting from these internal links. As your home page is likely linked to from every page on your site, it might even get a PageRank of up to 1 or 2 from all these little boosts. However, in this situation searching for incoming links will likely yield 0 results.

You can also see this happening on pages that have been around for a while. For instance, a page may have 0 incoming links listed in Google, yet have a PageRank of 3. We can see that Google has spidered it by checking its cache, so the PageRank is not a guess. Therefore, we can be sure that Google knows of at least one link to the page in question, both from its listed PR and from the fact that Google has spidered a page that links to it.

However, if you look at the DMOZ.org page that links to it with the Google Toolbar installed, you'll notice the page has a PR of 0, which is very low. Furthermore, if you count the number of links on the page, you'll notice it has over 20. So you're dividing a very low PR among over 20 links. Thus each link carries very little weight, so Google doesn't list these links when you search for them. However, Google does count the links, which is why the page in question has a PR listed.

It's very important to remember how Google lists incoming links. Often, people see their number of incoming links drop, and they think they have lost those links. In reality, the linking page could have lost some weight and, consequently, the links might have dropped below the value threshold that's required in order for links to be listed. Or the linking page could have added more links, causing each link's share of the weight to be lower, and again causing the link to drop below the value threshold.
In either case the link is still counted; it just isn't listed. Why does Google do this? Perhaps the answer has to do with technical limitations. If the average number of links per page is 20, then Google would have to deal with over 60 billion links, which might create an index too large to be publicly searchable.

Myth #5: Being Listed in the Open Directory Project Gives you a Special PageRank Bonus

Google uses the Open Directory Project (DMOZ.org) to power its directory. Coupling that fact with the observation that sites listed in DMOZ often get decent and inexplicable PageRank boosts has led many to conclude that Google gives a special bonus to sites listed in DMOZ. This is simply not true.

The only bonus gained from being in DMOZ is the same bonus a site would achieve from being linked to by any other site. However, DMOZ data is used by hundreds of sites. The biggest user of DMOZ data is Google, but it is also used by many other sites. The links from these sites are often too weak to be listed in a link search, but Google does crawl them, and the links do count. So if you're listed in DMOZ, you're actually gaining the benefits of hundreds of lightly-weighted incoming links, and when you add all those up, the total can amount to a decent PageRank boost.

There are two other benefits you can gain by being listed in DMOZ. For one, your directory description will appear with Google search result listings, which may increase the likelihood of someone clicking on your link. The other benefit is that, as Google does crawl DMOZ, being listed there will ensure that you're also listed in Google. However, as it's so easy to be listed in Google, this benefit is slight at best.

Myth #6: Being Listed in Yahoo! Gives you a Special PageRank Bonus

This myth evolved in much the same way as Myth #5. Google has been partnered with Yahoo! for a number of years by providing secondary search results, and just recently (Fall, 2002), Yahoo! started using Google to provide primary search results.

Because Yahoo! uses Google, many have assumed that Google also uses Yahoo!, which is not the case. The only PageRank you will gain from being listed in Yahoo! is the same as the PR you'd gain from any other site of equivalent weight. However, some people achieve a larger-than-normal boost from their listing in Yahoo!, which again leads to this incorrect conclusion.

The fact is that being listed in Yahoo!'s main directory will often get you into regional directories, so, much like DMOZ, one Yahoo! listing can result in multiple links. These links are often weak in nature, so they may not show up in a link search, but they are there -- and Google knows about them.

Additionally, once you're listed in any search engine or directory, you have an increased chance of someone finding your site, liking it, and adding a link to it from their own site. As such, being listed in Yahoo! could result in you receiving links from elsewhere: links whose weight is too low to list, but which do contribute to your PageRank.

Myth #7: Google Uses Meta Tags to Rank Your Site

This myth is left over from the days when most search engines used meta tags. However, Google has never used them. This fact may be contested by some people, so I won't state it without proof.

To prove to yourself that Google doesn't use meta tags, put words into your meta tags that do not appear elsewhere on your page. Then, using an advanced search, search for those words while limiting the results to your domain only. You can try this on any search engine: if results appear, you'll know that engine uses meta tags. If no results are displayed, then you know meta tags are not used. It is important, though, that the words appear only in your meta tags and nowhere else on your page.

Google can sometimes use the meta description tag to create an abstract for your site, so it may be useful if your home page is primarily composed of graphics. However, do not expect it to increase your rank.

Myth #8: Google Will Not Index Dynamic Pages

Some search engines have, in the past, had problems with dynamic pages, that is, pages that use a query string. This was not due to any technical limitation, but rather because search engines knew that it was possible to create an infinite set of dynamic pages, or an endless loop, and they did not want their crawlers caught spidering endless numbers of dynamically generated pages.

Google is a newer search engine, and has never had a problem with query strings. However, some dynamic pages can still throw Google for a loop.

Some shopping carts or forums store session information in the URL when cookies cannot be written. This effectively kills search engines like Google, because search engines key their indexes on URLs, and when you put session information in the URL, that URL changes constantly. This is especially true as Google uses multiple IP addresses to crawl the Web, so each crawler will see a different URL on your site, which basically results in those pages not being listed. If you use such software, it is important to amend it so that when cookies cannot be written, the software simply does not track session information.

So, you don't need to use search engine-friendly URLs to be listed in Google. However, these URLs do have other benefits, such as hiding which server-side technology you use (so that you may change it seamlessly later), and they are more people-friendly. Additionally, while Google can spider dynamic pages, it may limit the number of dynamic pages it spiders from one particular site.
Your best bet for a good ranking is to use search engine-friendly URLs.

Myth #9: Google Will Not List Your Site, or Penalize it, if you use Popups

This is a relatively minor myth, but it still pops up (pun intended) every once in a while. Google has an advertising program called AdWords, and one of its policies is that sites that use popup windows are not allowed to participate in the program.

This policy exists only for the Google AdWords program, but either through hearsay, or through people hearing that Google has a policy against popups and incorrectly assuming that it covers Google's main index, this myth has flourished. The suggestion that you won't be listed in Google if you use popups is simply not true: many sites that use popups, including SitePoint, are well ranked on Google. In fact, it is doubtful that Google even understands all the JavaScript that can create a popup.

Myth #10: Google will Penalize you if You're Linked to by a Link Farm

Google has policies against the use of artificial means to increase your PageRank, which specifically include things like joining a link farm. There are sites and services out there that set up automatic link exchanges to increase your PageRank. The links are usually hidden from people through the use of CSS, either by making the text the same color as the background, or by putting the links in an invisible layer. As search engines don't render CSS, they will still see the hidden links and thus count them when calculating your link popularity.

However, despite all this, Google will not penalize you for being linked to by a link farm. After all, you have no control over which sites link to you, so it wouldn't be fair to penalize site owners on this basis. Additionally, link farms often have low PageRanks and a high number of outgoing links, so each link will contribute only a very small amount to your total PageRank; thus this method of abuse is not very effective.

Even so, Google can punish you if you link to a link farm from your site, or otherwise put hidden links in your pages. So the simple truth is that you can be punished for what you do on your own site, but not for being linked to by another site.
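The PageRank arithmetic described in Myths #3 and #4 can be sketched in a few lines. This is a toy model, not Google's implementation: the damping factor of 0.85, the iteration count, and the example link graph are all illustrative assumptions. It shows a page's rank being divided among its outgoing links, every page keeping a small minimum rank, and one "weighty" inbound link beating several weak ones.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: `links` maps each page to the pages it links to.
    Every page keeps a small minimum rank (the 1 - damping term), and
    each page passes its rank on, divided evenly among its outgoing
    links, exactly as the article describes."""
    rank = {page: 1.0 for page in links}
    for _ in range(iterations):
        new_rank = {page: 1.0 - damping for page in links}  # the minimum PR
        for page, outgoing in links.items():
            if not outgoing:
                continue
            share = damping * rank[page] / len(outgoing)  # PR divided by link count
            for target in outgoing:
                if target in new_rank:
                    new_rank[target] += share
        rank = new_rank
    return rank

# "portal" is a weighty page (eight pages link to it; it links only to "a"),
# while "b" gets three links from pages nobody links to.
graph = {f"s{i}": ["portal"] for i in range(8)}
graph.update({f"t{i}": ["b"] for i in range(3)})
graph.update({"portal": ["a"], "a": [], "b": []})

ranks = pagerank(graph)
assert ranks["a"] > ranks["b"]  # one weighty link beats several weak ones
```

Note that `a` outranks `b` despite having a single inbound link versus three, which is exactly the point of Myth #3.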
  8. The mighty (and now public) juggernaut that is Google is emerging as the most formidable potential competitor in the Web hosting space. Who will suffer the most when Google enters Website hosting? Yahoo!

"No way!" you say. "Google doesn't even offer Web hosting." You mean Google doesn't even offer Web hosting yet. Let's look at the facts and see exactly why Google will defeat Yahoo! in the Web hosting war.

Why Will Google Launch a Web Hosting Product?

Web hosting is a (still) highly fragmented marketplace. Even the very largest hosting providers garner no more than 4-5% of the market. While there are easily more than 100 sizable (more than 25,000 active shared accounts) Web hosting companies, there may be more than 10,000 small to mid-sized (less than 10,000 active shared accounts) hosting competitors. Why Google will get in: with the millions in extra cash that they have lying around, Google can easily acquire even the most premier hosting company and use it as a base to build a business around. Alternatively, Google could simply buy up tons of customers from smaller hosts. Of course, they could always start from scratch and be a top 25 hosting company in about 3 months.

The barriers to entry are extremely low. In fact, anyone with a Web connection can simply resell any of a dozen fairly competitive packages. Why Google will get in: last I heard, Google still had a Web connection.

Arguably, the single most important element in gaining new customers is online advertising. A survey of the top slots in the pay-per-click search engines tells the story: hosting clicks are in very high demand, and the top players are willing to pay handsomely to get them. (A recent glance at Overture's max bids illustrates the point.) Why Google will get in: while competing with their customers may never have crossed Google's mind, the simple economics of Web hosting will necessitate a different approach. Google will give up the number 1 advertising slot to market a Google Web hosting package. This will not even dent the vast ad revenues generated by the hosting industry. Additionally, Google could simply create a new ad unit that rests at the very top of the page, preserving all existing advertising slots. Adding thousands of new hosting accounts each month will quickly generate millions in recurring revenues for Google. Oh yeah, one other thing I forgot: Yahoo! did this a while ago, and they're making bank.

Web hosting requires tons of servers, data centers, and skilled technical support staff. A great hosting company needs to stave off DDoS attacks, worms, viruses, and a litany of other hacker-induced mayhem. It also needs super-fat redundant pipes to handle all the traffic, plus the additional mandatory goodies like physical security, fire suppression, and backup generators to maintain 100% uptime. Why Google will get in: it's not a stretch to say that some of the Web's best technical minds are currently employed in Mountain View, and they've been protecting a premier Web property from hackers for the last 6 years. I think they also know a thing or two about data centers and redundant connectivity.

Email and Web hosting go hand in hand. Many customers register a domain name and set up a hosting account mainly to have the email address that they want, such as you@yourcompany.com. The ability to process and manage a high volume of email while excluding spam, viruses, and other security exploits is a mandatory skill set for hosting. Why Google will get in: one word: Gmail! Okay, so they messed up a little with the whole privacy/big brother issue. They'll get past that. The point is that Google knows email. Even with the throttled launch and invitation-only system of Gmail, they probably already have more email customers than most hosting companies. This gives Google two more clear reasons to get into hosting: they can handle email as well as anybody, and they have a huge email list to which they can market their service.

Many businesses and consumers will only do business with a brand that they trust. If a no-name company were to attempt to enter the Web hosting arena today, they would face an uphill battle to overcome the branding efforts of the large hosting companies that got into the game early. Why Google will get in: no-brainer: it's Google, for goodness' sake! They may have the most trusted and well-known brand online. The moment Google enters the Web hosting space, it will attract a new, untapped audience that will overcome its reservations about having its own Website, simply because Google says it's okay to do so.

Why Will Google's Web Hosting Product Reign Supreme?

It's pretty much a given that Google will launch a Web hosting product at some point. But how will they crush Yahoo! (and every other Web host out there)? Here are a few points on which Google trumps its nearest rival:

 * Google brand versus Yahoo! brand. Winner = Google. Google is now, hip, and exciting. Yahoo! is yesterday, tired, and old-school.
 * Google's execution versus Yahoo!'s execution. Winner = Google. Google's awesome tactic of launching full-blown, fully developed products as 'beta' tests overcomes the pressure and scrutiny of conventional product launches. Plus, they're just really good at executing Web-based technology.
 * Google's reach versus Yahoo!'s reach. Winner = Google. By some accounts, Google now handles 70% of all Web searches each day. Unless it stumbles, Google will keep growing at Yahoo!'s (and everybody else's) expense.
 * Google's synergies versus Yahoo!'s synergies. Winner = Google. Google bought Blogger, and it already knows and understands how to attract, market to, and partner with content builders. Google's AdWords and AdSense products are already in use by well over 150,000 individual Websites. Google has made a science out of knowing which markets are searching for what, and where they go to find it.

Conclusion

Poor Yahoo! Poor, poor Yahoo! First, they had their butt kicked by Google in the search space. Then, they played catch-up and once again got bested by Google in the pay-per-click marketing space, as Yahoo!'s Overture property lags behind Google's AdWords in both luster and market penetration. Now, Google seems poised to crush Yahoo!'s fledgling Web hosting effort. If it's any consolation, at least Yahoo! has been leading the way, blazing a clear profit trail for Google to follow.
  9. The Google update of the 17th-20th February 2004 (nicknamed 'Brandy' by WebmasterWorld) resulted in major changes in the results the search engine returns.

The 'Brandy' update seems to have incorporated some pre-'Florida' results ('Florida' being another major update that occurred at the end of 2003), mixed with numerous new factors. Google stores its index on a number of data centres around the world. Since 'Florida', some of the old data centres were taken offline, and pundits believe that Google has kept the old SERPs (Search Engine Results Pages) in a preserved state for the last few months. Indeed, Google brought these data centres back at the same time that Yahoo! broke from Google in favour of its new Inktomi-based results. Consequently, I don't think this is the last of the major changes we'll see in Google, but it does seem that Google is getting closer to what it aims to achieve.

Five Changes

Sergey Brin, one of the founders of Google, recently said, "Google has made five significant changes to its algorithmic formulas in the last two weeks." (Associated Press (AP), Feb 17th 2004) While we can only guess at what those changes were, the following are probably a good bet.

1. Increase in Index Size. Google's spider, Googlebot, has had a busy few weeks: at the time of the update, Google announced that it had massively increased the size of its index. This move was probably made to ensure Google made headlines at the same time as Yahoo! (for example, in a report in the BBC News, Feb 18th 2004). However, in order to increase the index size, Google may have had to re-include some of the pre-Florida results that had previously been dropped.

2. Latent Semantic Indexing (LSI). This is a very significant new technology that Google has always been interested in, and the incorporation of LSI has been on the cards for some time. If you are an insomniac, then Yu et al.'s paper is quite helpful in explaining the concept, but, in short, LSI is about using close semantic matches to put your page into the correct topical context. It's all about synonyms. LSI may see Google effectively remove all instances of the search keyword when analysing your page, in favour of a close analysis of other words. For example, consider the search term 'travel insurance'. LSI-based algorithms will look for words and links that pertain to related topics, such as skiing, holidays, medical, backpacking, and airports.

3. Links and Anchor Text. Links have always been the essence of Google, but the engine is steadily altering its focus. The importance of PageRank (PR), Google's unique ranking system, is being steadily downgraded in favour of the nature, quality, and quantity of inbound and outbound link anchor text. If PR is downgraded, and the wording of inbound links is boosted, this may explain, to a large degree, the position in which many sites currently find themselves. For example, most people will link to a site's homepage. In the past, due to internal linking structures, PR was spread around and other pages benefited. Now, it is more important for Webmasters to attract links that point directly to the relevant pages of their sites, using anchor text that's relevant to those specific pages. Furthermore, Google seems to be using outbound links to determine how useful and authoritative a site is. For example, the directories that are doing well are those that link directly to sites, rather than via dynamic URLs.

4. Neighbourhoods. Now, more than ever, the question of who links to your site has become critical. Links must be from sites on related topics (the higher the PR the better); those links are seen to define your 'neighbourhood'. If we again consider the example of travel insurance, big insurance companies might buy links on holiday-related sites in order to boost their ranking. These businesses will actively invest in gaining targeted inbound links from a broad mix of sites. Consequently, their neighbourhoods appear tightly focused to Google.

5. Downgrading of Traditional Tag-Based Optimisation. Clever use of the title, h1, h2, bold, and italics tags, and CSS, is no longer as important to a site's ranking as it once was. It is very interesting to listen to Sergey (co-founder of Google) talk about this, because he's the one usually quoted about the ways in which people manipulate his index. Google has taken big steps to downgrade standard SEO techniques in favour of LSI and linking, which are far less manipulable by the masses.

The Impact of Brandy

These changes make for sober reading if you're a Webmaster: optimising your site successfully for Google has become a lot more difficult. Nevertheless, there are a number of practical steps that can be taken to promote your ranking in the short and long term.

1. Synonyms. As LSI appears to be so significant, it is important to start looking carefully at the information architecture of each major section of your site, and to increase the use of related words. It is also important to re-examine your title tags with this concept in mind; good title tags use synonyms and avoid repetition of the key phrase.

2. Outbound Links. Link to authority sites on your subject. In the travel insurance example, these authority sites could include places like the State Department, major skiing directories, etc. Not only will this help with LSI, it also allows Google to define the neighbourhood more easily. Furthermore, you could engage in link swaps with other companies so that you gain the benefit of an on-topic, LSI-friendly link.

3. Inbound Links and Link-to-Us Pages. Based on what we have just said, sites need to formulate a link development strategy. A budget needs to be set aside to buy links and develop mini-sites. Look to set up links with university sites (.edu or .ac.uk), as these seem to be valuable given Google's informational bias. Each section of a site should have its own link-to-us page. For example, HotScripts, the major computer script directory, has a great link-to-us page. By providing people with creatives and cut-and-paste HTML, you can vastly improve your chances of attracting reciprocal links to your site. You'll need a separate page for each section, to maximise on-topic inbound links.

4. Mini-Sites. It is important to develop separate mini-sites (also known as satellite sites) for each key subject of your Website. This is a useful tactic that improves your chances of appearing in the SERPs for your keywords. Furthermore, as the last three Google updates have shaken things up so much, having more than one site reduces the likelihood that your business will be disrupted by the engine's updates. However, Google is likely to view satellite sites as spam, so you must take some steps to reduce the chances of being blacklisted on this basis. First, make it as hard as possible for Google to detect host affiliation between your main site and its mini-sites. Google may define sites as owned by the same person if the first 3 octets of the sites' IP addresses are the same (e.g. 123.123.123.xxx). Therefore, if you're going to run mini-sites, put them on different Web hosts. Secondly, use different domain names for your mini-sites, rather than sub-domains of your main site. In the past, Google has not penalised sub-domains, but the early results from the Brandy update show a considerable reduction in the presence of sub-domains in the SERPs. Finally, be very careful with the linking strategy you use between mini-sites: Google will look at the linking structure very critically. Don't plaster each of your sites with links to the others, and don't reciprocate links between the sites.

Mini-sites make it easier to create on-topic neighbourhoods and experiment with LSI techniques. Creating a large network can be a means to boost your main site's rank, but make sure you're well aware of the risks involved before you embark.

Use Brandy to your Advantage!

Google optimisation is now a lot harder than it used to be. However, the index is still manipulable. Success involves hard work, and potentially the expenditure of funds to develop a good mini-site network and buy links on relevant pages.
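The "related words" idea behind LSI, as described in point 2 of the article above, can be illustrated with a toy scorer. This is not Google's algorithm: real LSI derives term relationships statistically (via singular value decomposition over a term-document matrix), whereas the hand-picked term list below is an illustrative assumption taken from the article's travel-insurance example. The sketch only shows why a page rich in related words scores better than one that repeats the keyword.

```python
# Related terms for 'travel insurance', per the article's example. In real
# LSI these associations would be learned from a corpus, not hand-written.
RELATED_TERMS = {"skiing", "holidays", "medical", "backpacking", "airports"}

def topical_score(text):
    """Count how many distinct related terms appear in the text."""
    words = {w.strip(".,").lower() for w in text.split()}
    return len(words & RELATED_TERMS)

on_topic = "Travel insurance for skiing holidays, with medical cover at airports."
stuffed = "Travel insurance travel insurance best travel insurance deals."

# The page using related vocabulary scores higher than the keyword-stuffed one.
assert topical_score(on_topic) > topical_score(stuffed)
```

The keyword-stuffed page scores zero here, mirroring the article's point that repeating the search phrase matters less than using topically related vocabulary.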
  10. In this third and last article of our vector graphic series, you'll use your knowledge of faux 3D vector graphics as we walk step-by-step through a real-life, practical application that I think you'll find extremely useful. You'll learn how to make your own Windows XP-style artwork, which you may end up using for Website graphics, software application icons, or in other projects.

Note: I'll be making my diagrams and commands using Adobe Illustrator 8, but those who use Freehand should be able to follow along just as easily.

Let's start by deconstructing a familiar XP icon -- the flat-screen monitor shown here:

[Image: 1418_1]

First, create a rectangular shape for the screen using the Pen tool. It's helpful if you understand a little bit about how to draw with perspective (see the links at the end of this article if you want to learn more).

[Image: 1418_2]

Fill the shape with a radial, white-to-aqua gradient.

[Image: 1418_3]

Using the gradient tool, click and hold near the upper middle edge of the shape and drag down, releasing the mouse button just before you reach the bottom of the shape.

[Image: 1418_4]

Now, we'll create a rounded-edged shape around the screen, which will act as the frame of the monitor. Set the fill of the shape to "none" to help you see what you're doing, then use the pen tool to draw an edge around the screen with rounded corners.

[Image: 1418_5]

Once the shape looks the way you like it, fill it with a grey-to-purple gradient.

[Image: 1418_6]

Right-click on the shape and choose Arrange > Send to Back. This will allow the blue screen shape to lie on top.

[Image: 1418_7]

Now, use the Pen tool to create a rounded "L" shape along the top and left edges of the screen and fill it with a purple-to-white gradient. Use the gradient tool to fill the shape with the white coming from the top left corner, to make it look like the light is coming from that direction.

[Image: 1418_8]

Create a highlighted edge along the bottom of the screen using the Pen tool as well. Don't be afraid to zoom in and use the white arrow tool to adjust the edges. We also filled this with the same white-to-purple gradient.

[Image: 1418_9]

The screen part is done! Now, for the stand...

The stand is essentially made up of two shapes that form the front surface and the side surface. First, create the shape for the front surface. I filled it with the grey-to-purple gradient.

[Image: 1418_10]

Next, create the side shape and fill it with a purple-to-white gradient. The gradient changes from dark to light, top to bottom, which makes it look as if the light is hitting the monitor from that side, but the top part is slightly shaded by the screen.

[Image: 1418_11]

Finally, create a "border" for the shape. This new shape juts out slightly on the left and bottom of the stand, but aligns with the right border of the stand.

[Image: 1418_12b]

When you fill this shape (with a dark purple color, or purple-to-grey gradient), it will hide the pair of shapes that make up the stand.

[Image: 1418_12]

Again, right-click on the shape and go to Arrange > Send to Back to send the "border" behind the shapes of the stand.

[Image: 1418_13]

Finally, using the ellipse tool, create the base of the stand, filled with a light white-to-lavender fill.

[Image: 1418_14]

Again, "Send to Back" to place the base behind all the other objects.

[Image: 1418_15]

Choose the Scale tool.

[Image: 1418_16b]

With the base still selected, click and hold the mouse button to scale it down very slightly. Hold the Alt key (Option on Mac) while you drag slightly inward, then let go of the mouse button. This scales down a copy of the shape. Now you should have two ellipses, one inside the other.

[Image: 1418_16]

Select the outer shape and fill it with the darker grey-to-purple gradient colors.

[Image: 1418_17]

And there you have it: a vector version of a flat-screen monitor that looks disturbingly like the Windows XP monitor graphic, built from nine individual shapes and lots of gradients.

[Image: 1418_19]

Now, let's put this shape into Photoshop and add a drop shadow to make it look "officially" XP-ish. Select the entire group of objects and copy.

[Image: 1418_18]

Paste the shape into a Photoshop document ("as pixels"), resize it as needed, then go to Layer > Layer Style > Drop Shadow.

[Image: 1418_20]

Set the angle of the drop shadow to 135. Set the Distance and Size to 2 or 3 pixels and adjust them to suit the size of your graphic.

[Image: 1418_21]

And there you have it: an XP-style computer monitor!

[Image: 1418_22]

Now, creating your own Windows XP-style graphics will take a bit more artistic skill. In particular, you have to be comfortable with drawing in perspective, and you must be able to take the object you want to draw and "simplify" it so that the icon or graphic will work at small or large sizes.

The first step is to get familiar with Microsoft's style and design guide for creating Windows XP icons. This gives you some guidelines on what the "look" is all about ("fun, color, and energy"), provides a general color palette (duplicated below), and gives examples of how objects should be angled and grouped.

[Image: 1418_gif -- The color palette used in Windows XP icons.]

Did you enjoy reading through the style guide? Then let's get started with this last step-by-step example, where I'll show you how I make a dog house graphic in the Windows XP style.

In The Dog House!

First, I make a rough sketch of what I want the icon to look like, keeping in mind the angle guidelines provided by Microsoft.

[Image: 1418_23]

Then, I create a grid of lines that match the perspective grid from the Microsoft style guide, and use the pen tool to create the front of the doghouse. (You may download an Illustrator file with the same grid.)

[Image: 1418_24]

Once I have the front of the doghouse, I hide the grid (View > Hide Guides) and use the pen tool to draw the side of the dog house:

[Image: 1418_25]

I fill the side and the front with a red-to-darker-red gradient:

[Image: 1418_26]

Then I draw two more shapes for the roof and fill them with a darker red-brown gradient. The second shape (the right diagram) is the back side of the roof; it's what the shape would be if you could "see through" the house. It looks a bit odd in the diagram because it's lying on top of all the other pieces of the dog house.

[Image: 1418_27]

So, I right-click the second shape (the "back" roof) and choose Arrange > Send to Back.

[Image: 1418_28]

This gives me a nice dog house shape.

[Image: 1418_29]

I use the pen tool again to create a door shape and fill it with the darker gradient colors.

[Image: 1418_30]

Then, I use the pen tool -- set to "no fill" -- to draw an outline around the entire shape. I fill it with an even darker red-brown gradient to get a "border" for the shape.

[Image: 1418_31]

Since this new shape is lying on top of everything else (and hiding the other parts of the image), I send it to the back as before.

[Image: 1418_32]

My finished vector graphic:

[Image: 1418_33]

After I copy it into Photoshop and add a drop shadow, the graphic is complete!
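Nearly every fill in this tutorial is a simple two-stop gradient (white-to-aqua, grey-to-purple, red-to-darker-red). If you ever need to reproduce one of these stop colours outside Illustrator, a two-stop gradient is just linear interpolation between two RGB endpoints. A minimal sketch, where the colour values are illustrative stand-ins rather than the official XP palette values:

```python
def lerp_color(start, end, t):
    """Linearly interpolate between two RGB colours.
    t = 0.0 returns start; t = 1.0 returns end."""
    return tuple(round(s + (e - s) * t) for s, e in zip(start, end))

WHITE = (255, 255, 255)
AQUA = (0, 255, 255)  # stand-in for the screen's aqua stop

# Sample five evenly spaced stops of a white-to-aqua gradient.
ramp = [lerp_color(WHITE, AQUA, i / 4) for i in range(5)]
print(ramp)
```

The same helper works for any of the gradient pairs mentioned above; only the two endpoint colours change.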
  11. http://archive.phong.com/tutorials/ http://planetphotoshop.com/ At those sites you can find some handy and helpful Photoshop tutorials to get started. The more you work through them, the better you'll understand Photoshop, and the closer you'll be to webmaster-guru status.