xisto Community

Posts posted by TheBoutique-


  1. Hi,

    What type of logo were you looking for?

    Here are a few types:

    Iconic/Symbolic - Icons and symbols are compelling yet uncomplicated images that are emblematic of a particular company or product. They use imagery that conveys a literal or abstract representation of your organization. Symbols are less direct than straight text, leaving room for broader interpretation of what the organization represents. An example of this type of logo is the Nike swoosh.
    Logotype/Wordmark - A logotype, commonly known in the design industry as a "word mark", incorporates your company or brand name into a uniquely styled type font treatment. Type fonts come in thousands of possible variations, shapes, sizes, and styles, each conveying a slightly different impression upon your intended audience. Script fonts imply a sense of formality and refinement. Thick fonts proclaim strength and power, whereas slanted type fonts impart a sense of motion or movement. Type font treatments can also include hand-drawn letters, characters or symbols that have been rendered in such a way as to intrigue the eye and capture the interest. Images can also be integrated into a logotype, often to great visual effect. Of prime consideration when selecting a logotype or wordmark are legibility and ease of recognition, even when reduced to the size required for printing your business cards. An example of this is the FedEx logo.

    Combination Marks - Combination Marks are graphics with both text and a symbol/icon that signifies the brand image that you wish to project for your company or organization. Concise text can complement an icon or symbol, providing supplemental clarity as to what your enterprise is all about.

    Notice from jeigh:


    There are integrated and stand alone combination marks. For instance, Starbucks logo has the text with the graphic integrated, whereas the AT&T logo has the icon separate from the text.

    Examples of these are McDonald's, Starbucks and Pringles.

    Does this help?

    If you reply with the type you want, I can help.

  2. Hi, I have my own business and I was looking for somewhere to sell my products on a site with a shopping cart, but I wanted it to be free! I then found Vstore, and it's great, but there aren't many templates to choose from and it isn't very customizable! AND IT HAD ADS! Has anyone else had this problem?


  3. Hi,
    I have noticed there are a lot of people interested in all aspects of web design. As I have done for many other topics, I have searched through the internet looking for information. I have found some information on web design which I think will help many of you and answer the questions you have all been asking. I do hope this helps, but if it doesn't, just let me know and I will be happy to answer any questions you may have. I too am a huge fan of web design and all its aspects, and this helped me understand it greatly. Thank you!

    Web design is the process of conceptualization, planning, modeling, and execution of electronic media delivery via the Internet, in the form of markup languages suitable for interpretation by a web browser and display as a graphical user interface (GUI).

    The intent of web design is to create a web site -- a collection of electronic files that reside on a web server or servers and present content and interactive features/interfaces to the end user in the form of web pages once requested. Elements such as text, bit-mapped images (GIFs, JPEGs, PNGs) and forms can be placed on the page using HTML/XHTML/XML tags. Displaying more complex media (vector graphics, animations, videos, sounds) requires plug-ins such as Flash, QuickTime, the Java run-time environment, etc. Plug-ins are also embedded into a web page using HTML/XHTML tags.
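    To make this concrete, here is a minimal sketch of the kind of page described above. It is only an illustration: the file name, image path and form target are invented for the example, not taken from any real site.

    <!DOCTYPE html>
    <html>
    <head>
      <title>Example page</title>
    </head>
    <body>
      <!-- plain text marked up with structural tags -->
      <h1>Welcome</h1>
      <p>This paragraph is ordinary text.</p>

      <!-- a bit-mapped image; logo.png is a placeholder path -->
      <img src="logo.png" alt="Company logo">

      <!-- a simple form sent back to the server when submitted -->
      <form action="/contact" method="post">
        <input type="text" name="email">
        <input type="submit" value="Send">
      </form>
    </body>
    </html>

    A browser requests a file like this from the web server and renders the tags as a graphical page; more complex media would be added with plug-in tags as described above.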

    Improvements in browsers' compliance with W3C standards prompted widespread acceptance and usage of XHTML/XML in conjunction with Cascading Style Sheets (CSS) to position and manipulate web page elements and objects. The latest standards and proposals aim to give browsers the ability to deliver a wide variety of media and accessibility options to the client without employing plug-ins.

    Typically web pages are classified as static or dynamic.

    Static pages don't change content and layout with every request unless a human (web master/programmer) manually updates the page.

    Dynamic pages adapt their content and/or appearance depending on the end-user's input/interaction or changes in the computing environment (user, time, database modifications, etc.). Content can be changed on the client side (the end-user's computer) by using client-side scripting languages (JavaScript, JScript, ActionScript, etc.) to alter DOM elements (DHTML). Dynamic content is often compiled on the server using server-side scripting languages (Perl, PHP, ASP, JSP, ColdFusion, etc.). Both approaches are usually used in complex applications.
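    As a rough sketch of the client-side approach, the page below rewrites part of its own content with JavaScript after it loads; the element id and the greeting logic are assumptions made up for this example. A server-side language such as PHP would instead generate the HTML before it is sent to the browser.

    <!DOCTYPE html>
    <html>
    <body>
      <p id="greeting">Loading...</p>

      <!-- client-side script that alters a DOM element after the page loads -->
      <script type="text/javascript">
        var hour = new Date().getHours();   // current hour on the visitor's computer
        var text = (hour < 12) ? "Good morning" : "Good afternoon";
        document.getElementById("greeting").innerHTML = text;
      </script>
    </body>
    </html>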

    With growing specialization in the information technology field there is a strong tendency to draw a clear line between web design and web development.




    History
    Tim Berners-Lee, the inventor of the World Wide Web, published a website in August 1991.[1] Berners-Lee was the first to combine Internet communication (which had been carrying email and the Usenet for decades) with hypertext (which had also been around for decades, but limited to browsing information stored on a single computer, such as interactive CD-ROM design).

    Websites are written in a markup language called HTML, and early versions of HTML were very basic, only giving websites basic structure (headings and paragraphs), and the ability to link using hypertext. This was new and different to existing forms of communication - users could easily navigate to other pages by following hyperlinks from page to page.

    As the Web and web design progressed, the markup language used to make it became more complex and flexible, giving the ability to add objects like images and tables to a page. Features like tables, which were originally intended to be used to display tabular information, were soon subverted for use as invisible layout devices. With the advent of Cascading Style Sheets (CSS), table-based layout is increasingly regarded as outdated. Database integration technologies such as server-side scripting and design standards like CSS further changed and enhanced the way the Web is made.

    The introduction of Macromedia Flash (now Adobe Flash) into an already interactivity-ready scene has further changed the face of the Web, giving new power to designers and media creators, and offering new interactivity features to users, often at the expense of usability for persons with disabilities, search engine visibility and browser functions available to HTML.


    Web site design
    A Web site is a collection of information about a particular topic or subject. Designing a website is defined as the arrangement and creation of Web pages that in turn make up a website. A Web page consists of information for which the Web site is developed. A website might be compared to a book, where each page of the book is a web page.

    There are many aspects (design concerns) in this process, and due to the rapid development of the Internet, new aspects may emerge. For typical commercial Web sites, the basic aspects of design are:

    The content: The substance and information on the site should be relevant to the site and should target the area of the public that the website is concerned with.
    The usability: The site should be user-friendly, with the interface and navigation simple and reliable.
    The appearance: The graphics and text should include a single style that flows throughout, to show consistency. The style should be professional, appealing and relevant.
    The visibility: The site must also be easy to find via most, if not all, major search engines and advertisement media.
    A Web site typically consists of text and images. The first page of a website is known as the Home page or Index. Some websites use what is commonly called a Splash Page. Splash pages might include a welcome message, language/region selection, or disclaimer. Each web page within a Web site is an HTML file which has its own URL. After each Web page is created, they are typically linked together using a navigation menu composed of hyperlinks. Faster browsing speeds have led to shorter attention spans and more demanding online visitors and this has resulted in less use of Splash Pages, particularly where commercial websites are concerned.
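    A typical navigation menu, as mentioned above, is simply a list of hyperlinks to the other HTML files of the site; the page names below are placeholders for illustration.

    <!-- shared navigation menu linking the pages of the site together -->
    <ul>
      <li><a href="index.html">Home</a></li>
      <li><a href="products.html">Products</a></li>
      <li><a href="about.html">About us</a></li>
      <li><a href="contact.html">Contact</a></li>
    </ul>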

    Once a Web site is completed, it must be published or uploaded in order to be viewable to the public over the internet. This may be done using an FTP client. Once published, the Web master may use a variety of techniques to increase the traffic, or hits, that the website receives. This may include submitting the Web site to a search engine such as Google or Yahoo, exchanging links with other Web sites, creating affiliations with similar Web sites, etc.


    Multidisciplinary requirements
    Web site design crosses multiple disciplines of information systems, information technology and communication design. The website is an information system whose components are sometimes classified as front-end and back-end. The observable content (e.g. page layout, user interface, graphics, text, audio) is known as the front-end. The back-end comprises the organization and efficiency of the source code, invisible scripted functions, and the server-side components that process the output from the front-end. Depending on the size of a Web development project, it may be carried out by a multi-skilled individual (sometimes called a web master), or a project manager may oversee collaborative design between group members with specialized skills.


    Issues
    As in most collaborative designs, there are conflicts between the differing goals and methods of web site design. These are a few of the ongoing ones.


    Lack of collaboration in design
    In the early stages of the web, there wasn't as much collaboration between web design and larger advertising campaigns, customer transactions, social networking, intranets and extranets as there is now. Web pages were mainly static online brochures disconnected from the larger projects.

    Many web pages are still disconnected from larger projects. Special design considerations are necessary for use within these larger projects. These design considerations are often overlooked, especially in cases where there is a lack of leadership, understanding or concern for the larger project to facilitate collaboration. This often results in unhealthy competition or compromise between departments, and less than optimal use of web pages.


    Liquid versus fixed layouts
    On the web the designer has no control over several factors, including the size of the browser window, the web browser used, the input devices used (mouse, touch screen, voice command, text, cell phone number pad, etc.) and the size and characteristics of available fonts.

    Some designers choose to control the appearance of the elements on the screen by using specific width designations. This control may be achieved through the use of an HTML table-based design, or through the use of CSS. Whenever the text, images, and layout of a design do not change as the browser changes, this is referred to as a fixed width design. Proponents of fixed width design prefer the control over the look and feel of the site and the precision placement of objects on the page. Other designers choose a liquid design. A liquid design is one, like Wikipedia's, where the design moves to flow content into the whole screen, or a portion of the screen, no matter what the size of the browser window. Proponents of liquid design prefer to use all the screen space available. Liquid design can be achieved through the use of CSS, by avoiding styling the page altogether, or by using HTML tables set to a percentage of the page. Both liquid and fixed design developers must make decisions about how the design should degrade on higher and lower screen resolutions. Sometimes the pragmatic choice is made to flow the design between a minimum and a maximum width. This allows the designer to avoid coding for the browser choices making up the long tail, while still using all available screen space.
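    The difference can be sketched in a few CSS rules; the widths and id names below are arbitrary example values, not recommendations.

    <style type="text/css">
      /* fixed layout: the content area is always 760 pixels wide,
         whatever the size of the browser window */
      #content-fixed { width: 760px; }

      /* liquid layout: the content area flows to 90% of the window width */
      #content-liquid { width: 90%; }

      /* the pragmatic middle ground mentioned above: flow between
         a minimum and a maximum width */
      #content-bounded { width: 90%; min-width: 600px; max-width: 1200px; }
    </style>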

    Similar to liquid layout is the optional fit to window feature with Adobe Flash content. This is a fixed layout that optimally scales the content of the page without changing the arrangement or text wrapping when the browser is resized.


    Flash
    Adobe Flash (formerly Macromedia Flash) is a proprietary, robust graphics animation/application development program used to create and deliver dynamic content, media (such as sound and video), and interactive applications over the web via the browser.

    Flash is not a standard produced by a vendor-neutral standards organization, unlike most of the core protocols and formats on the Internet. Flash is much more restrictive than the open HTML format, requiring a proprietary plug-in to be seen, and it does not integrate with most web browser UI features such as the "Back" button unless a hyperlink in the Flash file is programmed to load a new HTML page, in which case the animation on the previous page is reset. However, those restrictions may be irrelevant depending on the goals of the web site design.

    According to a study [2], 98% of US Web users have the Flash Player installed [3], with 45%-56%[4] (depending on region) having the latest version. Numbers vary depending on the detection scheme and research demographics[5].

    Many graphic artists use Flash because it gives them exact control over every part of the design, and anything can be animated and generally "jazzed up". Some application designers enjoy Flash because it lets them create applications that don't have to be refreshed or go to a new web page every time an action occurs. Flash can use embedded fonts instead of the standard fonts installed on most computers. There are many sites which forego HTML entirely for Flash. Other sites may use Flash content combined with HTML as conservatively as gifs or jpegs would be used, but with smaller vector file sizes and the option of faster loading animations. Flash may also be used to protect content from unauthorized duplication or searching.

    Flash detractors claim that Flash websites tend to be poorly designed, and often use confusing and non-standard user interfaces. Until recently, search engines have been unable to index Flash objects, which has prevented sites from having their contents easily found. This is because many search engine crawlers rely on text to index websites. It is possible to specify alternate content to be displayed for browsers that do not support Flash. Using alternate content also helps search engines to understand the page, and can result in much better visibility for the page. However, the vast majority of Flash websites are not disability accessible (for screen readers, for example) or Section 508 compliant. An additional issue is that sites which serve different content to search engines than to their human visitors are usually judged to be spamming the search engines and can be banned automatically.
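    For legitimate use, one common way to provide that alternate content is to nest it inside the tag that embeds the Flash movie, so browsers (and crawlers) without the plug-in fall back to ordinary HTML. The file name and dimensions below are placeholders for the sketch.

    <object type="application/x-shockwave-flash" data="movie.swf" width="550" height="400">
      <param name="movie" value="movie.swf" />
      <!-- shown only when the Flash plug-in is not available -->
      <p>Alternate HTML content describing what the animation shows,
         with ordinary text and links that search engines can index.</p>
    </object>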

    The most recent incarnation of Flash's scripting language (called "ActionScript", which is an ECMA language similar to JavaScript) incorporates long-awaited usability features, such as respecting the browser's font size and allowing blind users to use screen readers. ActionScript 2.0 is an object-oriented language, allowing the use of CSS, XML, and the design of class-based web applications.


    CSS versus tables
    For more details on this topic, see Tableless web design.
    Back when Netscape Navigator 4 dominated the browser market, the popular solution available for designers to lay out a Web page was by using tables. Often even simple designs for a page would require dozens of tables nested in each other. Many web templates in Dreamweaver and other WYSIWYG editors still use this technique today. Navigator 4 didn't support CSS to a useful degree, so it simply wasn't used.

    After the browser wars were over, and Internet Explorer dominated the market, designers started turning toward CSS as an alternate means of laying out their pages. CSS proponents say that tables should be used only for tabular data, not for layout. Using CSS instead of tables also returns HTML to a semantic markup, which helps bots and search engines understand what's going on in a web page. All modern Web browsers support CSS with different degrees of limitations.

    However, one of the main points against CSS is that by relying on it exclusively, control is essentially relinquished as each browser has its own quirks which result in a slightly different page display. This is especially a problem as not every browser supports the same subset of CSS rules. For designers who are used to table-based layouts, developing Web sites in CSS often becomes a matter of trying to replicate what can be done with tables, leading some to find CSS design rather cumbersome due to lack of familiarity. For example, at one time it was rather difficult to produce certain design elements, such as vertical positioning, and full-length footers in a design using absolute positions. With the abundance of CSS resources available online today, though, designing with reasonable adherence to standards involves little more than applying CSS 2.1 or CSS 3 to properly structured markup.
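    As a rough illustration of the two approaches discussed here, both snippets below produce a simple two-column page; the id names and percentages are invented for the example.

    <!-- old table-based layout: one layout table with two cells -->
    <table width="100%">
      <tr>
        <td width="25%">navigation links here</td>
        <td>main content here</td>
      </tr>
    </table>

    <!-- CSS-based layout: semantic divs positioned by style rules
         (the rules would normally live in the page head or a .css file) -->
    <style type="text/css">
      #nav  { float: left; width: 25%; }
      #main { margin-left: 27%; }
    </style>
    <div id="nav">navigation links here</div>
    <div id="main">main content here</div>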

    These days most modern browsers have solved most of these quirks in CSS rendering and this has made many different CSS layouts possible. However, some people continue to use old browsers, and designers need to keep this in mind, and allow for graceful degrading of pages in older browsers. Most notable among these old browsers are Internet Explorer 5 and 5.5, which, according to some web designers, are becoming the new Netscape Navigator 4 — a block that holds the World Wide Web back from converting to CSS design. However, the W3 Consortium has made CSS in combination with XHTML the standard for web design.


    How it Looks vs. How it Works
    Some web developers have a graphic arts background and may pay more attention to how a page looks than considering other issues such as how visitors are going to find the page via a search engine. Some might rely more on advertising than search engines to attract visitors to the site. On the other side of the issue, search engine optimization consultants (SEOs) obsess about how well a web site works technically and textually: how much traffic it generates via search engines, and how many sales it makes, assuming looks don't contribute to the sales. As a result, the designers and SEOs often end up in disputes where the designer wants more 'pretty' graphics, and the SEO wants lots of 'ugly' keyword-rich text, bullet lists, and text links. One could argue that this is a false dichotomy due to the possibility that a web design may integrate the two disciplines for a collaborative and synergistic solution. Because some graphics serve communication purposes in addition to aesthetics, how well a site works may depend on the graphic designer's visual communication ideas as well as the SEO considerations.

    Another problem when using lots of graphics on a page is that download times can be greatly lengthened, often irritating the user. This has become less of a problem as the internet has evolved with high-speed internet and the use of vector graphics. This is an engineering challenge to increase bandwidth in addition to an artistic challenge to minimize graphics and graphic file sizes. This is an on-going challenge as increased bandwidth invites increased amounts of content.


    Accessible Web design
    Main article: Web accessibility
    Accessible Web design is the art of creating webpages that are accessible to everyone, using any device. It is especially important so that people with disabilities - whether due to accident, disease or old age - can access the information in Web pages and be able to navigate through the website.

    To be accessible, web pages and sites must conform to certain accessibility principles. These can be grouped into the following main areas (a short markup sketch follows the list):

    use semantic markup that provides a meaningful structure to the document (i.e. web page)
    Semantic markup also refers to semantically organizing the web page structure and publishing web service descriptions accordingly so that they can be recognised by other web services on different web pages. Standards for the semantic web are set by the W3C.
    use a valid markup language that conforms to a published DTD or Schema
    provide text equivalents for any non-text components (e.g. images, multimedia)
    use hyperlinks that make sense when read out of context. (e.g. avoid "Click Here.")
    don't use frames
    use CSS rather than HTML Tables for layout.
    author the page so that when the source code is read line-by-line by user agents (such as screen readers) it remains intelligible. (Using tables for design will often result in a reading order that is not.)
    However, W3C permits an exception where tables for layout either make sense when linearized or an alternate version (perhaps linearized) is made available.
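    A short sketch of markup that follows several of these principles (the file names and text are invented for the example):

    <!-- text equivalent for a non-text element -->
    <img src="sales-chart.png" alt="Bar chart of monthly sales, January to June">

    <!-- a hyperlink that makes sense when read out of context -->
    <a href="pricelist.html">Download the current price list</a>
    <!-- rather than: <a href="pricelist.html">Click here</a> -->

    <!-- semantic structure: real headings and lists instead of styled text -->
    <h2>Opening hours</h2>
    <ul>
      <li>Monday to Friday: 9am to 5pm</li>
      <li>Saturday: 10am to 2pm</li>
    </ul>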


    Website Planning
    Before creating and uploading a website, it is important to take the time to plan exactly what is needed in the website. Thoroughly considering the audience or target market, as well as defining the purpose and deciding what content will be developed are extremely important.


    Purpose
    It is essential to define the purpose of the website as one of the first steps in the planning process. A purpose statement should show focus based on what the website will accomplish and what the users will get from it. A clearly defined purpose will help the rest of the planning process as the audience is identified and the content of the site is developed. Setting short and long term goals for the website will help make the purpose clear and plan for the future when expansion, modification, and improvement will take place. Also, goal-setting practices and measurable objectives should be identified to track the progress of the site and determine success.


    Audience
    Defining the audience is a key step in the website planning process. The audience is the group of people who are expected to visit your website – the market being targeted. These people will be viewing the website for a specific reason and it is important to know exactly what they are looking for when they visit the site. A clearly defined purpose or goal of the site, as well as an understanding of what visitors want to do or feel when they come to your site, will help to identify the target audience. Upon considering who is most likely to need or use the content, a list of characteristics common to the users can be drawn up, such as:

    Audience Characteristics
    Information Preferences
    Computer Specifications
    Web Experience
    Taking into account the characteristics of the audience will allow an effective website to be created that will deliver the desired content to the target audience.


    Content
    Content evaluation and organization requires that the purpose of the website be clearly defined. Collecting a list of the necessary content then organizing it according to the audience's needs is a key step in website planning. In the process of gathering the content being offered, any items that do not support the defined purpose or accomplish target audience objectives should be removed. It is a good idea to test the content and purpose on a focus group and compare the offerings to the audience needs. The next step is to organize the basic information structure by categorizing the content and organizing it according to user needs. Each category should be named with a concise and descriptive title that will become a link on the website. Planning for the site's content ensures that the wants/needs of the target audience and the purpose of the site will be fulfilled.


    Compatibility and restrictions
    Because of the varying market share of modern browsers (depending on your target market), the compatibility of your website with its viewers is restricted. For instance, a website designed for the majority of web surfers will be limited to the use of valid XHTML 1.0 Strict or older, Cascading Style Sheets Level 1, and a 1024x768 display resolution. This is because Internet Explorer is not fully W3C standards compliant with the modularity of XHTML 1.1 and the majority of CSS beyond level 1. A target market with more users of alternative browsers (e.g. Firefox and Opera) allows for more W3C compliance and thus a greater range of options for a web designer.
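    For reference, a minimal page aimed at that baseline would start with the XHTML 1.0 Strict document type declaration and keep to simple CSS level 1 rules; the title and style values here are only examples.

    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
      "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml">
      <head>
        <title>Compatibility baseline</title>
        <!-- CSS level 1 only, for the widest browser support -->
        <style type="text/css">
          body { font-family: Arial, sans-serif; color: #333333; }
        </style>
      </head>
      <body>
        <p>Content aimed at the widest possible range of browsers.</p>
      </body>
    </html>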

    Another restriction on webpage design is the use of different Image file formats. The majority of users can support GIF, JPEG, and PNG (with restrictions). Again Internet Explorer is the major restriction here, not fully supporting PNG's advanced transparency features, resulting in the GIF format still being the most widely used graphic file format for transparent images.

    Many website incompatibilities go unnoticed by the designer and unreported by the users. The only way to be certain a website will work on a particular platform is to test it on that platform.


    Planning documentation
    Documentation is used to visually plan the site while taking into account the purpose, audience and content, to design the site structure, content and interactions that are most suitable for the website. Documentation may be considered a prototype for the website – a model which allows the website layout to be reviewed, resulting in suggested changes, improvements and/or enhancements. This review process increases the likelihood of success of the website.

    First, the content is categorized and the information structure is formulated. The information structure is used to develop a document or visual diagram called a site map. This creates a visual of how the web pages will be interconnected, which helps in deciding what content will be placed on what pages. There are three main ways of diagramming the website structure:

    Linear Website Diagrams will allow the users to move in a predetermined sequence;
    Hierarchical structures (of Tree Design Website Diagrams) provide more than one path for users to take to their destination;
    Branch Design Website Diagrams allow for many interconnections between web pages such as hyperlinks within sentences.
    In addition to planning the structure, the layout and interface of individual pages may be planned using a storyboard. In the process of storyboarding, a record is made of the description, purpose and title of each page in the site, and they are linked together according to the most effective and logical diagram type. Depending on the number of pages required for the website, documentation methods may include using pieces of paper and drawing lines to connect them, or creating the storyboard using computer software.

    Some or all of the individual pages may be designed in greater detail as a website wireframe, a mock up model or comprehensive layout of what the page will actually look like. This is often done in a graphic program, or layout design program. The wireframe has no working functionality, only planning.


    See also
    ASP.NET
    Color tool
    Content management
    Faceted navigation
    Information architecture
    Interaction design
    Java
    Knowledge visualization
    PHP
    Server-side scripting
    Streaming Media
    Style sheet (web development)
    User interface design
    Web 2.0
    Web colors
    Web indexing
    Web integration
    Web usage mining
    Website architecture
    Website builder



  4. Hi!
    I have noticed many people on here are involved with business and making money, and I have read many questions on e-commerce and how it actually works. As I did with databases, I have looked through the internet and found some information on e-commerce, which is below. I hope this helps you understand e-commerce; if you don't understand it or have any further questions, please ask me! I will be happy to look for the answer for you! Thank you.

    Electronic commerce, commonly known as e-commerce or eCommerce, consists of the buying and selling of products or services over electronic systems such as the Internet and other computer networks. The amount of trade conducted electronically has grown dramatically since the wide introduction of the Internet. A wide variety of commerce is conducted in this way, including electronic funds transfer, supply chain management, e-marketing, online marketing, online transaction processing, electronic data interchange (EDI), automated inventory management systems, and automated data collection systems. Modern electronic commerce typically uses the World Wide Web at least at some point in the transaction's lifecycle, although it can encompass a wide range of technologies such as e-mail as well.

    A small percentage of electronic commerce is conducted entirely electronically for "virtual" items such as access to premium content on a website, but most electronic commerce eventually involves physical items and their transportation in at least some way.




    History
    The meaning of the term "electronic commerce" has changed over the last 30 years. Originally, "electronic commerce" meant the facilitation of commercial transactions electronically, usually using technology like Electronic Data Interchange (EDI) and Electronic Funds Transfer (EFT), both introduced in the late 1970s, for example to send commercial documents like purchase orders or invoices electronically.

    The 'electronic' or 'e' in e-commerce refers to the technology/systems; the 'commerce' refers to traditional business models. E-commerce is the complete set of processes that support commercial business activities on a network. In the 1970s and 1980s, this would also have involved information analysis. The growth and acceptance of credit cards, automated teller machines (ATMs) and telephone banking in the 1980s were also forms of e-commerce. However, from the 1990s onwards, this would include enterprise resource planning systems (ERP), data mining and data warehousing.

    In the dot com era, it came to include activities more precisely termed "Web commerce" -- the purchase of goods and services over the World Wide Web, usually with secure connections (HTTPS, a special server protocol that encrypts confidential ordering data for customer protection) with e-shopping carts and with electronic payment services, like credit card payment authorizations.

    Today, it encompasses a very wide range of business activities and processes, from e-banking to offshore manufacturing to e-logistics. The ever growing dependence of modern industries on electronically enabled business processes gave impetus to the growth and development of supporting systems, including backend systems, applications and middleware. Examples are broadband and fibre-optic networks, supply-chain management software, customer relationship management software, inventory control systems and financial accounting software.

    When the Web first became well-known among the general public in 1994, many journalists and pundits forecast that e-commerce would soon become a major economic sector. However, it took about four years for security protocols (like HTTPS) to become sufficiently developed and widely deployed. Subsequently, between 1998 and 2000, a substantial number of businesses in the United States and Western Europe developed rudimentary web sites.

    Although a large number of "pure e-commerce" companies disappeared during the dot-com collapse in 2000 and 2001, many "brick-and-mortar" retailers recognized that such companies had identified valuable niche markets and began to add e-commerce capabilities to their Web sites. For example, after the collapse of online grocer Webvan, two traditional supermarket chains, Albertsons and Safeway, both started e-commerce subsidiaries through which consumers could order groceries online.

    The emergence of e-commerce also significantly lowered barriers to entry in the selling of many types of goods; accordingly many small home-based proprietors are able to use the internet to sell goods. Often, small sellers use online auction sites such as EBay, or sell via large corporate websites like Amazon.com, in order to take advantage of the exposure and setup convenience of such sites.


    Success factors
    In many cases, an e-commerce company will survive not only based on its product, but by having a competent management team, good post-sales services, a well-organized business structure, network infrastructure and a secure, well-designed website. A company that wants to succeed has to get two things right: the technical and organizational aspects, and the customer experience. The following factors can help a company succeed in e-commerce:


    Technical and organizational aspects
    Sufficient work done in market research and analysis. E-commerce is not exempt from good business planning and the fundamental laws of supply and demand. Business failure is as much a reality in e-commerce as in any other form of business.
    A good management team armed with information technology strategy. A company's IT strategy should be a part of the business re-design process.
    Providing an easy and secured way for customers to effect transactions. Credit cards are the most popular means of sending payments on the internet, accounting for 90% of online purchases. In the past, card numbers were transferred securely between the customer and merchant through independent payment gateways. Such independent payment gateways are still used by most small and home businesses. Most merchants today process credit card transactions on site through arrangements made with commercial banks or credit cards companies.
    Providing reliability and security. Parallel servers, hardware redundancy, fail-safe technology, information encryption, and firewalls can enhance this requirement.
    Providing a 360-degree view of the customer relationship, defined as ensuring that all employees, suppliers, and partners have a complete view, and the same view, of the customer. However, customers may not appreciate the big brother experience.
    Constructing a commercially sound business model.
    Engineering an electronic value chain in which one focuses on a "limited" number of core competencies -- the opposite of a one-stop shop. (Electronic stores can appear either specialist or generalist if properly programmed.)
    Operating on or near the cutting edge of technology and staying there as technology changes (but remembering that the fundamentals of commerce remain indifferent to technology).
    Setting up an organization of sufficient alertness and agility to respond quickly to any changes in the economic, social and physical environment.
    Providing an attractive website. The tasteful use of colour, graphics, animation, photographs, fonts, and white-space percentage may aid success in this respect.
    Streamlining business processes, possibly through re-engineering and information technologies.
    Providing complete understanding of the products or services offered, which not only includes complete product information, but also sound advisors and selectors.
    Naturally, the e-commerce vendor must also perform such mundane tasks as being truthful about its product and its availability, shipping reliably, and handling complaints promptly and effectively. A unique property of the Internet environment is that individual customers have access to far more information about the seller than they would find in a brick-and-mortar situation. (Of course, customers can, and occasionally do, research a brick-and-mortar store online before visiting it, so this distinction does not hold water in every case.)


    Customer experience
    A successful e-commerce organization must also provide an enjoyable and rewarding experience to its customers. Many factors go into making this possible. Such factors include:

    Providing value to customers. Vendors can achieve this by offering a product or product-line that attracts potential customers at a competitive price, as in non-electronic commerce.
    Providing service and performance. Offering a responsive, user-friendly purchasing experience, just like a flesh-and-blood retailer, may go some way to achieving these goals.
    Providing an incentive for customers to buy and to return. Sales promotions to this end can involve coupons, special offers, and discounts. Cross-linked websites and advertising affiliate programs can also help.
    Providing personal attention. Personalized web sites, purchase suggestions, and personalized special offers may go some of the way to substituting for the face-to-face human interaction found at a traditional point of sale.
    Providing a sense of community. Chat rooms, discussion boards, soliciting customer input and loyalty programs (sometimes called affinity programs) can help in this respect.
    Owning the customer's total experience. E-tailers foster this by treating any contacts with a customer as part of a total experience, an experience that becomes synonymous with the brand.
    Letting customers help themselves. Provision of a self-serve site, easy to use without assistance, can help in this respect. This implies making available all product information, cross-sell information, advice on product alternatives, and supplies and accessory selectors.
    Helping customers do their job of consuming. E-tailers and online shopping directories can provide such help through ample comparative information and good search facilities. Provision of component information and safety-and-health comments may assist e-tailers to define the customers' job.

    Problems
    Even if a provider of E-commerce goods and services rigorously follows these "key factors" to devise an exemplary e-commerce strategy, problems can still arise. Sources of such problems include:

    Failure to understand customers, why they buy and how they buy. Even a product with a sound value proposition can fail if producers and retailers do not understand customer habits, expectations, and motivations. E-commerce could potentially mitigate this potential problem with proactive and focused marketing research, just as traditional retailers may do.
    Failure to consider the competitive situation. One may have the will to construct a viable book e-tailing business model, but lack the capability to compete with Amazon.com.
    Inability to predict environmental reaction. What will competitors do? Will they introduce competitive brands or competitive web sites? Will they supplement their service offerings? Will they try to sabotage a competitor's site? Will price wars break out? What will the government do? Research into competitors, industries and markets may mitigate some consequences here, just as in non-electronic commerce.
    Over-estimation of resource competence. Can staff, hardware, software, and processes handle the proposed strategy? Have e-tailers failed to develop employee and management skills? These issues may call for thorough resource planning and employee training.
    Failure to coordinate. If existing reporting and control relationships do not suffice, one can move towards a flat, accountable, and flexible organizational structure, which may or may not aid coordination.
    Failure to obtain senior management commitment. This often results in a failure to gain sufficient corporate resources to accomplish a task. It may help to get top management involved right from the start.
    Failure to obtain employee commitment. If planners do not explain their strategy well to employees, or fail to give employees the whole picture, then training and setting up incentives for workers to embrace the strategy may assist.
    Under-estimation of time requirements. Setting up an e-commerce venture can take considerable time and money, and failure to understand the timing and sequencing of tasks can lead to significant cost overruns. Basic project planning, critical path, critical chain, or PERT analysis may mitigate such failings. Profitability may have to wait for the achievement of market share.
    Failure to follow a plan. Poor follow-through after the initial planning, and insufficient tracking of progress against a plan can result in problems. One may mitigate such problems with standard tools: benchmarking, milestones, variance tracking, and penalties and rewards for variances.
    Becoming the victim of organized crime. Many syndicates have caught on to the potential of the Internet as a new revenue stream. Two main methods are as follows: (1) Using identity theft techniques like phishing to order expensive goods and bill them to some innocent person, then liquidating the goods for quick cash; (2) Extortion by using a network of compromised "zombie" computers to engage in distributed denial of service attacks against the target Web site until it starts paying protection money.
    Failure to expect the unexpected. Too often new businesses do not take into account the amount of time, money or resources needed to complete a project and often find themselves without the necessary components to become successful.

    Product suitability
    Certain products or services appear more suitable for online sales; others remain more suitable for offline sales. While credit cards are currently the most popular means of paying for online goods and services, alternative online payments will account for 26% of e-commerce volume by 2009 according to Celent.[1]

    Many successful purely virtual companies deal with digital products (including information storage, retrieval, and modification), music, movies, office supplies, education, communication, software, photography, and financial transactions. Examples of this type of company include Google, eBay and PayPal. Other successful marketers use drop shipping or affiliate marketing techniques to facilitate transactions of tangible goods without maintaining real inventory. Examples include numerous sellers on eBay.

    Virtual marketers can sell some non-digital products and services successfully. Such products generally have a high value-to-weight ratio, they may involve embarrassing purchases, they may typically go to people in remote locations, and they may have shut-ins as their typical purchasers. Items which can fit through a standard letterbox — such as music CDs, DVDs and books — are particularly suitable for a virtual marketer, and indeed Amazon.com, one of the few enduring dot-com companies, has historically concentrated on this field.

    Products such as spare parts, both for consumer items like washing machines and for industrial equipment like centrifugal pumps, also seem good candidates for selling online. Retailers often need to order spare parts specially, since they typically do not stock them at consumer outlets -- in such cases, e-commerce solutions in spares do not compete with retail stores, only with other ordering systems. A factor for success in this niche can consist of providing customers with exact, reliable information about which part number their particular version of a product needs, for example by providing parts lists keyed by serial number.

    Purchases of pornography and of other sex-related products and services fulfill the requirements of both virtuality (or if non-virtual, generally high-value) and potential embarrassment; unsurprisingly, provision of such services has become the most profitable segment of e-commerce.

    There are also many disadvantages of e-commerce; one of the main ones is fraud. This is where your details (name, bank card number, age, national insurance number) are entered into what looks to be a safe site but really is not. These details can then be used to steal money from you, or to buy things online that you are completely unaware of until it is too late. If this information leaks into the wrong hands, people are able to steal your identity and commit further fraud under your name. Finally, e-commerce suffers from many of the same problems already listed above, such as failure to understand customers, failure to consider the competitive situation, and over-estimation of resource competence.


    Products less suitable for e-commerce include products that have a low value-to-weight ratio, products that have a smell, taste, or touch component, products that need trial fittings — most notably clothing — and products where colour integrity appears important. Nonetheless, Tesco.com has had success delivering groceries in the UK, albeit that many of its goods are of a generic quality, and clothing sold through the internet is big business in the U.S. Also, the recycling program Cheapcycle sells goods over the internet, but avoids the low value-to-weight ratio problem by creating different groups for various regions, so that shipping costs remain low.


    Acceptance
    Consumers have accepted the e-commerce business model less readily than its proponents originally expected. Even in product categories suitable for e-commerce, electronic shopping has developed only slowly. Several reasons might account for the slow uptake, including:

    Concerns about security. Many people will not use credit cards over the Internet due to concerns about theft and credit card fraud.
    Lack of instant gratification with most e-purchases (non-digital purchases). Much of a consumer's reward for purchasing a product lies in the instant gratification of using and displaying that product. This reward does not exist when one's purchase does not arrive for days or weeks.
    The problem of access to web commerce, mainly for poor households and for developing countries. Low penetration rates of Internet access in some sectors greatly reduces the potential for e-commerce.
    The social aspect of shopping. Some people enjoy talking to sales staff, to other shoppers, or to their cohorts: this social reward side of retail therapy does not exist to the same extent in online shopping.
    Poorly designed, bug-infested e-Commerce web sites that frustrate online shoppers and drive them away.
    Inconsistent return policies among e-tailers or difficulties in exchange/return.

    See also
    Drop shipping

    References
    Celent Report: According to figures published by Celent, 25 May 2006.
    Chaudhury, Abijit; Jean-Pierre Kuilboer (2002). e-Business and e-Commerce Infrastructure. McGraw-Hill. ISBN 0-07-247875-6.
    Frieden, Jonathan D. & Sean Patrick Roche (2006-12-19), "E-Commerce: Legal Issues of the Online Retailer in Virginia", Richmond Journal of Law & Technology 13 (2)
    Kessler, M. (2003). More shoppers proceed to checkout online. Retrieved January 13, 2004
    Nissanoff, Daniel (2006). FutureShop: How the New Auction Culture Will Revolutionize the Way We Buy, Sell and Get the Things We Really Want, Hardcover, The Penguin Press, 246 pages. ISBN 1-59420-077-7.
    Seybold, Pat (2001). Customers.com. Crown Business Books (Random House). ISBN 0-609-60772-3.



  5. Hi,
    I noticed there were a lot of questions in all of the database topics, so I went around the internet using different sources, and I have some information that will answer your questions and help you understand databases. I hope this helps, and if you have any other questions which aren't answered below, please send a message asking. Thank you.

    In computing, a database can be defined as a structured collection of records or data that is stored in a computer so that a program can consult it to answer queries. The records retrieved in answer to queries become information that can be used to make decisions. The computer program used to manage and query a database is known as a database management system (DBMS). The properties and design of database systems are included in the study of information science.

    The term "database" originated within the computing discipline. Although its meaning has been broadened by popular use, even to include non-electronic databases, this article is about computer databases. Database-like records have been in existence since well before the Industrial Revolution in the form of ledgers, sales receipts and other business-related collections of data.

    The central concept of a database is that of a collection of records, or pieces of information. Typically, for a given database, there is a structural description of the type of facts held in that database: this description is known as a schema. The schema describes the objects that are represented in the database, and the relationships among them. There are a number of different ways of organizing a schema, that is, of modeling the database structure: these are known as database models (or data models). The model in most common use today is the relational model, which in layman's terms represents all information in the form of multiple related tables each consisting of rows and columns (the true definition uses mathematical terminology). This model represents relationships by the use of values common to more than one table. Other models such as the hierarchical model and the network model use a more explicit representation of relationships.

    The term database refers to the collection of related records, and the software should be referred to as the database management system or DBMS. When the context is unambiguous, however, many database administrators and programmers use the term database to cover both meanings.

    Many professionals consider a collection of data to constitute a database only if it has certain properties: for example, if the data is managed to ensure its integrity and quality, if it allows shared access by a community of users, if it has a schema, or if it supports a query language. However, there is no definition of these properties that is universally agreed upon.

    Database management systems are usually categorized according to the data model that they support: relational, object-relational, network, and so on. The data model will tend to determine the query languages that are available to access the database. A great deal of the internal engineering of a DBMS, however, is independent of the data model, and is concerned with managing factors such as performance, concurrency, integrity, and recovery from hardware failures. In these areas there are large differences between products.




    History
    The earliest known use of the term 'data bases' was in November 1963, when the System Development Corporation sponsored a symposium under the title Development and Management of a Computer-centered Data Base[1]. Database as a single word became common in Europe in the early 1970s and by the end of the decade it was being used in major American newspapers. (Databank, a comparable term, had been used in the Washington Post newspaper as early as 1966.)

    The first database management systems were developed in the 1960s. A pioneer in the field was Charles Bachman. Bachman's early papers show that his aim was to make more effective use of the new direct access storage devices becoming available: until then, data processing had been based on punched cards and magnetic tape, so that serial processing was the dominant activity. Two key data models arose at this time: CODASYL developed the network model based on Bachman's ideas, and (apparently independently) the hierarchical model was used in a system developed by North American Rockwell, later adopted by IBM as the cornerstone of their IMS product. While IMS along with the CODASYL IDMS were the big, high-visibility databases developed in the 1960s, several others were also born in that decade, some of which have a significant installed base today. Two worthy of mention are the PICK and MUMPS databases, with the former developed originally as an operating system with an embedded database and the latter as a programming language and database for the development of data-based software.

    The relational model was proposed by E. F. Codd in 1970. He criticized existing models for confusing the abstract description of information structure with descriptions of physical access mechanisms. For a long while, however, the relational model remained of academic interest only. While CODASYL products (IDMS) and hierarchical model products (IMS) were conceived as practical engineering solutions taking account of the technology as it existed at the time, the relational model took a much more theoretical perspective, arguing (correctly) that hardware and software technology would catch up in time. Among the first implementations were Michael Stonebraker's Ingres at Berkeley, and the System R project at IBM. Both of these were research prototypes, announced during 1976. The first commercial products, Oracle and DB2, did not appear until around 1980. The first successful database product for microcomputers was dBASE for the CP/M and PC-DOS/MS-DOS operating systems.

    During the 1980s, research activity focused on distributed database systems and database machines, but these developments had little effect on the market. Another important theoretical idea was the Functional Data Model, but apart from some specialized applications in genetics, molecular biology, and fraud investigation, the world took little notice.

    In the 1990s, attention shifted to object-oriented databases. These had some success in fields where it was necessary to handle more complex data than relational systems could easily cope with, such as spatial databases, engineering data (including software engineering repositories), and multimedia data. Some of these ideas were adopted by the relational vendors, who integrated new features into their products as a result. The 1990s also saw the spread of Open Source databases, such as PostgreSQL and MySQL.

    In the 2000s, the fashionable area for innovation is the XML database. As with object databases, this has spawned a new collection of startup companies, but at the same time the key ideas are being integrated into the established relational products. XML databases aim to remove the traditional divide between documents and data, allowing all of an organization's information resources to be held in one place, whether they are highly structured or not.


    Database models
    Main article: Database models
    Various techniques are used to model data structure.

    Most database systems are built around one particular data model, although it is increasingly common for products to offer support for more than one model. For any one logical model various physical implementations may be possible, and most products will offer the user some level of control in tuning the physical implementation, since the choices that are made have a significant effect on performance. An example is the relational model: all serious implementations of the relational model allow the creation of indexes which provide fast access to rows in a table if the values of certain columns are known.


    Flat model
    This may not strictly qualify as a data model, as defined above. The flat (or table) model consists of a single, two-dimensional array of data elements, where all members of a given column are assumed to be similar values, and all members of a row are assumed to be related to one another.


    Hierarchical model
    In a hierarchical model, data is organized into a tree-like structure, implying a single upward link in each record to describe the nesting, and a sort field to keep the records in a particular order in each same-level list.
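
    As a loose illustration of this structure (a Python sketch, not any particular product's record format; the names are invented for the example), each record carries one upward link and a sort field:

        from dataclasses import dataclass, field
        from typing import List, Optional

        @dataclass
        class Record:
            # One record in a hierarchical database: a single upward (parent) link
            # plus a sort field that orders records within the same level.
            name: str
            sort_key: int
            parent: Optional["Record"] = None
            children: List["Record"] = field(default_factory=list)

            def add_child(self, child: "Record") -> None:
                child.parent = self                              # the single upward link
                self.children.append(child)
                self.children.sort(key=lambda r: r.sort_key)     # keep same-level order

        root = Record("Sales", 0)
        root.add_child(Record("Bob", 2))
        root.add_child(Record("Alice", 1))
        print([c.name for c in root.children])                   # ['Alice', 'Bob']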


    Relational model
    Three key terms are used extensively in relational database models: relations, attributes, and domains. A relation is a table with columns and rows. The named columns of the relation are called attributes, and the domain is the set of values the attributes are allowed to take.

    The basic data structure of the relational model is the table, where information about a particular entity (say, an employee) is represented in columns and rows (also called tuples). Thus, the "relation" in "relational database" refers to the various tables in the database; a relation is a set of tuples. The columns enumerate the various attributes of the entity (the employee's name, address or phone number, for example), and a row is an actual instance of the entity (a specific employee) that is represented by the relation. As a result, each tuple of the employee table represents various attributes of a single employee.

    All relations (and, thus, tables) in a relational database have to adhere to some basic rules to qualify as relations. First, the ordering of columns is immaterial in a table. Second, there cannot be identical tuples or rows in a table. And third, each tuple will contain a single value for each of its attributes.

    A relational database contains multiple tables, each similar to the one in the "flat" database model. One of the strengths of the relational model is that, in principle, any value occurring in two different records (belonging to the same table or to different tables) implies a relationship between those two records. Yet, in order to enforce explicit integrity constraints, relationships between records in tables can also be defined explicitly, by identifying or non-identifying parent-child relationships characterized by assigning cardinality (1:1, (0)1:M, M:M). Tables can also have a designated single attribute or a set of attributes that can act as a "key", which can be used to uniquely identify each tuple in the table.

    A key that can be used to uniquely identify a row in a table is called a primary key. Keys are commonly used to join or combine data from two or more tables. For example, an Employee table may contain a column named Location which contains a value that matches the key of a Location table. Keys are also critical in the creation of indices, which facilitate fast retrieval of data from large tables. Any column can be a key, or multiple columns can be grouped together into a compound key. It is not necessary to define all the keys in advance; a column can be used as a key even if it was not originally intended to be one.
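
    As a rough sketch of the Employee/Location example above, using Python's built-in sqlite3 module as a stand-in relational engine (the column names other than Location are invented for the illustration):

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE Location (
                LocationId INTEGER PRIMARY KEY,                     -- primary key
                City       TEXT NOT NULL
            );
            CREATE TABLE Employee (
                EmployeeId INTEGER PRIMARY KEY,                     -- primary key
                Name       TEXT NOT NULL,
                Location   INTEGER REFERENCES Location(LocationId)  -- matches a key of Location
            );
        """)
        conn.execute("INSERT INTO Location VALUES (1, 'London')")
        conn.execute("INSERT INTO Employee VALUES (10, 'Alice', 1)")
        conn.commit()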


    Relational operations
    Users (or programs) request data from a relational database by sending it a query that is written in a special language, usually a dialect of SQL. Although SQL was originally intended for end-users, it is much more common for SQL queries to be embedded into software that provides an easier user interface. Many web sites, such as Wikipedia, perform SQL queries when generating pages.

    In response to a query, the database returns a result set, which is just a list of rows containing the answers. The simplest query is just to return all the rows from a table, but more often, the rows are filtered in some way to return just the answer wanted. Often, data from multiple tables are combined into one, by doing a join. There are a number of relational operations in addition to join.
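
    Continuing the connection and tables from the sketch in the previous section, a join query and the result set it returns:

        # Reusing conn and the Employee/Location tables defined above:
        # the result set of a join query is simply a list of answer rows.
        rows = conn.execute("""
            SELECT Employee.Name, Location.City
            FROM   Employee
            JOIN   Location ON Employee.Location = Location.LocationId
        """).fetchall()
        print(rows)                                               # [('Alice', 'London')]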


    Normal Forms
    Main article: Database normalization
    Relations are classified based upon the types of anomalies to which they're vulnerable. A database that's in the first normal form is vulnerable to all types of anomalies, while a database that's in the domain/key normal form has no modification anomalies. Normal forms are hierarchical in nature. That is, the lowest level is the first normal form, and the database cannot meet the requirements for higher-level normal forms without first having met all the requirements of the lower normal forms.
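
    As a small illustration of the lowest step (first normal form), again using sqlite3 with invented table names: the first design packs a repeating group into one attribute, while the second decomposes it so that every attribute of every tuple holds a single value:

        import sqlite3

        conn = sqlite3.connect(":memory:")

        # Not in first normal form: Phones packs a repeating group into one attribute.
        conn.execute("CREATE TABLE EmployeeUnnormalized (Name TEXT, Phones TEXT)")
        conn.execute("INSERT INTO EmployeeUnnormalized VALUES ('Alice', '555-1234, 555-9876')")

        # First normal form: the repeating group becomes rows of a separate table.
        conn.executescript("""
            CREATE TABLE Employee      (EmployeeId INTEGER PRIMARY KEY, Name TEXT);
            CREATE TABLE EmployeePhone (EmployeeId INTEGER, Phone TEXT);
            INSERT INTO Employee      VALUES (1, 'Alice');
            INSERT INTO EmployeePhone VALUES (1, '555-1234'), (1, '555-9876');
        """)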


    Object database models
    In recent years, the object-oriented paradigm has been applied to database technology, creating a new programming model known as object databases. These databases attempt to bring the database world and the application programming world closer together, in particular by ensuring that the database uses the same type system as the application program. This aims to avoid the overhead (sometimes referred to as the impedance mismatch) of converting information between its representation in the database (for example as rows in tables) and its representation in the application program (typically as objects). At the same time, object databases attempt to introduce the key ideas of object programming, such as encapsulation and polymorphism, into the world of databases.

    A variety of ways have been tried for storing objects in a database. Some products have approached the problem from the application programming end, by making the objects manipulated by the program persistent. This also typically requires the addition of some kind of query language, since conventional programming languages do not have the ability to find objects based on their information content. Others have attacked the problem from the database end, by defining an object-oriented data model for the database, and defining a database programming language that allows full programming capabilities as well as traditional query facilities.
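
    As a very loose sketch of the "persistence from the programming end" approach (not any object database's actual API), Python's standard shelve module persists ordinary program objects, and a comprehension stands in for the content-based query facility such products add:

        import shelve
        from dataclasses import dataclass

        @dataclass
        class Employee:
            name: str
            city: str

        # Persist ordinary program objects, keeping the application's own type system.
        with shelve.open("staff_example") as store:
            store["10"] = Employee("Alice", "London")
            store["11"] = Employee("Bob", "Paris")

        # A stand-in for the added query facility: find objects by their content.
        with shelve.open("staff_example") as store:
            in_london = [e for e in store.values() if e.city == "London"]
            print([e.name for e in in_london])                   # ['Alice']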


    Post-relational database models
    Several products have been identified as post-relational because their data model incorporates relations but is not constrained by the Information Principle, which requires that all information be represented by data values in relations. Products using a post-relational data model typically employ a model that actually pre-dates the relational model; such a model might be described as a directed graph with trees on the nodes.

    Examples of models that could be classified as post-relational are PICK (also known as MultiValue) and MUMPS.


    Database internals

    Storage and Physical Database Design
    Main article: Database storage structures

    Database tables and indexes are typically stored in memory or on hard disk in one of many forms: ordered or unordered flat files, ISAM, heaps, hash buckets, or B+ trees. These have various advantages and disadvantages discussed further in the main article on this topic. The most commonly used are B+ trees and ISAM.

    Other important design choices relate to the clustering of data by category (such as grouping data by month or location), creating pre-computed views known as materialized views, and partitioning data by range or hash. Memory management and storage topology can also be important design choices for database designers. Just as normalization is used to reduce storage requirements and improve the extensibility of the database, denormalization is often used to reduce join complexity and the execution time of queries.[2]
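
    The contrast between these storage structures can be sketched in a few lines of Python (a toy illustration only, not how any engine lays out pages on disk): a hash bucket gives fast exact-key lookup, while a sorted structure, standing in for ISAM or B+ tree leaves, also supports range scans:

        import bisect

        rows = [(3, "Carol"), (1, "Alice"), (2, "Bob")]            # (key, value) pairs

        # Hash bucket: constant-time lookup by exact key, but no useful ordering.
        hash_index = {key: value for key, value in rows}
        print(hash_index[2])                                       # 'Bob'

        # Sorted storage (a stand-in for ISAM / B+ tree leaves): supports range scans.
        sorted_rows = sorted(rows)
        keys = [k for k, _ in sorted_rows]
        first = bisect.bisect_left(keys, 2)                        # first key >= 2
        print(sorted_rows[first:])                                 # [(2, 'Bob'), (3, 'Carol')]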


    Indexing
    All of these databases can take advantage of indexing to increase their speed, and this technology has advanced tremendously since its early uses in the 1960s and 1970s. The most common kind of index is a sorted list of the contents of some particular table column, with pointers to the row associated with the value. An index allows a set of table rows matching some criterion to be located quickly. Typically, indexes are also stored in the various forms of data structure mentioned above (such as B-trees, hashes, and linked lists). Usually, the database designer chooses a specific technique to suit the type of index required and to increase efficiency.

    Relational DBMSs have the advantage that indexes can be created or dropped without changing the existing applications that make use of the database. The database chooses between many different strategies based on which one it estimates will run the fastest. In other words, indexes are transparent to the application or end user querying the database; while they affect performance, any SQL command will run with or without indexes existing in the database.
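
    The same transparency can be seen with SQLite from Python: the query text does not change when an index is added, only the plan the engine chooses (shown with EXPLAIN QUERY PLAN; the exact output wording varies by SQLite version):

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE Employee (EmployeeId INTEGER, Name TEXT, City TEXT)")
        query = "SELECT Name FROM Employee WHERE City = 'London'"

        # Without an index the plan is a full table scan...
        print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

        # ...after the index exists, the identical query text can use it.
        conn.execute("CREATE INDEX idx_employee_city ON Employee (City)")
        print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())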

    Relational DBMSs utilize many different algorithms to compute the result of an SQL statement. The RDBMS will produce a plan of how to execute the query, which is generated by analyzing the run times of the different algorithms and selecting the quickest. Some of the key algorithms that deal with joins are the nested loop join, the sort-merge join, and the hash join. Which of these is chosen depends on whether an index exists, what type it is, and its cardinality.
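
    The ideas behind two of these algorithms can be sketched as toy in-memory versions (illustrative only, not a real query executor):

        employees = [(1, "Alice"), (2, "Bob"), (3, "Carol")]        # (location key, name)
        locations = [(1, "London"), (2, "Paris")]                   # (key, city)

        def nested_loop_join(left, right):
            # Compare every pair of rows: simple, needs no index, cost grows as n * m.
            return [(lval, rval) for lkey, lval in left
                                 for rkey, rval in right if lkey == rkey]

        def hash_join(left, right):
            # Build a hash table on one input, then probe it with the other.
            table = {}
            for rkey, rval in right:
                table.setdefault(rkey, []).append(rval)
            return [(lval, rval) for lkey, lval in left
                                 for rval in table.get(lkey, [])]

        print(nested_loop_join(employees, locations))   # [('Alice', 'London'), ('Bob', 'Paris')]
        print(hash_join(employees, locations))          # same rows, different cost profile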


    Transactions and concurrency
    In addition to their data model, most practical databases ("transactional databases") attempt to enforce database transactions. Ideally, the database software should enforce the ACID rules, summarized here:

    Atomicity: Either all the tasks in a transaction must be done, or none of them. The transaction must be completed, or else it must be undone (rolled back).
    Consistency: Every transaction must preserve the integrity constraints — the declared consistency rules — of the database. It cannot place the data in a contradictory state.
    Isolation: Two simultaneous transactions cannot interfere with one another. Intermediate results within a transaction are not visible to other transactions.
    Durability: Completed transactions cannot be aborted later or their results discarded. They must persist through (for instance) restarts of the DBMS after crashes.
    A cascading rollback occurs in database systems when a transaction (T1) causes a failure and a rollback must be performed. Other transactions dependent on T1's actions must also be rolled back due to T1's failure, thus causing a cascading effect.
    In practice, many DBMSs allow most of these rules to be selectively relaxed for better performance.
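
    A minimal sketch of atomicity and rollback, using SQLite and an invented Accounts table (real engines behave the same way in outline): the failing debit causes the whole transaction, including the credit that already ran, to be undone:

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE Accounts (Name TEXT PRIMARY KEY, Balance INTEGER NOT NULL CHECK (Balance >= 0))")
        conn.execute("INSERT INTO Accounts VALUES ('Alice', 100), ('Bob', 0)")
        conn.commit()

        try:
            with conn:   # one transaction: both updates happen, or neither does
                conn.execute("UPDATE Accounts SET Balance = Balance + 150 WHERE Name = 'Bob'")
                conn.execute("UPDATE Accounts SET Balance = Balance - 150 WHERE Name = 'Alice'")
        except sqlite3.IntegrityError:
            pass         # Alice cannot go negative, so the whole transaction rolls back

        print(conn.execute("SELECT * FROM Accounts ORDER BY Name").fetchall())
        # [('Alice', 100), ('Bob', 0)] -- Bob's credit was undone along with the failure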

    Concurrency control is a method used to ensure that transactions are executed in a safe manner and follow the ACID rules. The DBMS must be able to ensure that only serializable, recoverable schedules are allowed, and that no actions of committed transactions are lost while undoing aborted transactions.


    Replication
    Replication of databases is closely related to transactions. If a database can log its individual actions, it is possible to create a duplicate of the data in real time. The duplicate can be used to improve performance or availability of the whole database system. Common replication concepts include:

    Master/Slave Replication: All write requests are performed on the master and then replicated to the slaves.
    Quorum: The results of read and write requests are calculated by querying a "majority" of replicas (a toy sketch of this idea follows after this list).
    Multimaster: Two or more replicas synchronize with each other via a transaction identifier.
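
    A toy sketch of the quorum idea (invented names, not any product's protocol): because any two majorities of the replicas overlap, a read that consults a majority always sees the latest successfully written version:

        # Three in-memory "replicas", each holding a value and a version number.
        replicas = [{"version": 0, "value": None} for _ in range(3)]
        MAJORITY = len(replicas) // 2 + 1                           # 2 of 3

        def quorum_write(version, value):
            # Write to any majority of replicas (here simply the first two).
            for replica in replicas[:MAJORITY]:
                replica["version"], replica["value"] = version, value

        def quorum_read():
            # Read a (possibly different) majority; any two majorities must overlap,
            # so the highest version seen is the latest committed write.
            answers = replicas[-MAJORITY:]
            newest = max(answers, key=lambda r: r["version"])
            return newest["value"]

        quorum_write(1, "hello")
        print(quorum_read())                                        # 'hello'
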
    Parallel synchronous replication of databases enables transactions to be replicated on multiple servers simultaneously, which provides a method for backup and security as well as data availability. The first parallel synchronous replication systems were deployed by Parallel Computers Technology, Inc. (for SQL Server databases) using patented technology developed by a team of parallel computing specialists.


    Security
    Database security is the system, processes, and procedures that protect a database from unintended activity.

    In the United Kingdom, legislation protecting the public from unauthorized disclosure of personal information held in databases falls under the Office of the Information Commissioner. United Kingdom-based organizations holding personal data in electronic format (databases, for example) are required to register with the Information Commissioner. (reference: [1])


    Applications of databases
    Databases are used in many applications, spanning virtually the entire range of computer software. Databases are the preferred method of storage for large multiuser applications, where coordination between many users is needed. Even individual users find them convenient, and many electronic mail programs and personal organizers are based on standard database technology. Software database drivers are available for most database platforms so that application software can use a common application programming interface (API) to retrieve the information stored in a database. Two commonly used database APIs are JDBC and ODBC.
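
    Python's DB-API 2.0 plays a similar role to JDBC and ODBC: application code is written against one interface, and the driver behind it can be swapped (shown here with the standard sqlite3 driver; placeholder style and connect() arguments differ between drivers):

        import sqlite3   # any DB-API 2.0 driver exposes the same connect/cursor/execute shape

        conn = sqlite3.connect(":memory:")
        cur = conn.cursor()
        cur.execute("CREATE TABLE Contacts (Name TEXT, Email TEXT)")
        cur.execute("INSERT INTO Contacts VALUES (?, ?)", ("Alice", "alice@example.org"))
        cur.execute("SELECT Name, Email FROM Contacts")
        print(cur.fetchall())
        conn.close()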


    Database development platforms
    4D
    Alpha Five
    Apache Derby (Java, also known as IBM Cloudscape and Sun Java DB)
    BerkeleyDB
    dBase
    FileMaker
    Firebird (database server)
    HSQLDB (Java)
    IBM DB2
    Informix
    Ingres
    Interbase
    MaxDB (formerly SapDB)
    Microsoft Access
    Microsoft SQL Server (derived from Sybase)
    MySQL
    Oracle Database
    Paradox (database)
    PostgreSQL
    Sybase
    Visual FoxPro
    Trackvia

    Notes
    1. Swanson, Kenneth (1963-11-08). Development and Management of a Computer-Centered Database. dtic.mil. Retrieved on 2007-07-20.
    2. S. Lightstone, T. Teorey, T. Nadeau, Physical Database Design: the database professional's guide to exploiting indexes, views, storage, and more, Morgan Kaufmann Press, 2007. ISBN 0123693896.


    References
    C. J. Date, An Introduction to Database Systems, Eighth Edition, Addison Wesley, 2003.
    J. Gray, A. Reuter, Transaction Processing: Concepts and Techniques, 1st edition, Morgan Kaufmann Publishers, 1992.
    David M. Kroenke, Database Processing: Fundamentals, Design, and Implementation, Prentice-Hall, Inc., 1997, pages 130-144.
    S. Lightstone, T. Teorey, T. Nadeau, Physical Database Design: the database professional's guide to exploiting indexes, views, storage, and more, Morgan Kaufmann Press, 2007. ISBN 0123693896.
    T. Teorey, S. Lightstone, T. Nadeau, Database Modeling & Design: Logical Design, 4th edition, Morgan Kaufmann Press, 2005. ISBN 0-12-685352-5.
    J. Shih, "Why Synchronous Parallel Transaction Replication is Hard, But Inevitable?", white paper, 2007.

