CaptainRon
Members
Content Count: 235
Everything posted by CaptainRon
-
Windows Vista Discussions
CaptainRon replied to philbrennan537's topic in Websites and Web Designing
Where can I get the Vista beta from? I tried searching on P2P... any other methods?

By the way, I believe MS is planning its next big jump, and that will be a natural human-computer interface. They are planning to build an AI-based OS that can talk to the user and carry out tasks for them. This is what Bill Gates calls the next big evolution in interfaces.

I'm actually doing my final-year project on human-computer interfaces, so the MS proposition really got me thinking. If MS does come out with a truly functional natural human-computer interface, I think it will keep the "casual user" base it holds today for many more years to come. To counter such a threat, the Linux world will have to make a move right now.

Since I am building my project on .NET (which evidently compiles on Mono 1.0, I tested it), I have a project idea. I will post it on Antilost.com.
-
A website that I visit pretty regularly, Sitepoint.com, today published an excellent introduction to Ruby on Rails (RoR). Like many web developers, I have been terribly curious about this almost "magical, no fuss" web development framework, so the timing was perfect for Sitepoint to come out with the article. Danny's article on Sitepoint gives a brief introduction, but mostly stresses the "ease of development" that RoR brings along.

We've witnessed almost three decades of "hero worshipping" OOP techniques in software programming, and for a brief period, with the onset of PHP5, we've witnessed the same in web programming. With the introduction of RoR, this can only increase... increase exponentially. =) And it is a good thing, this OOP, it is good!

Now that my interest in RoR has surfaced, I visited Wikipedia to see what they have to say about this new magical, utopian web programming framework. And I must say the blokes at Wikipedia have done an excellent job maintaining the entry for RoR. It is definitely a must-read for anybody even remotely interested.

But what really caught my eye was the philosophy of Ruby on Rails. It adheres to the DRY principle: Don't Repeat Yourself. Something I yearned for in PHP/Perl/ASP/ColdFusion but, like nirvana, never could find. If RoR can even remotely make DRY a practical principle, I will be the first to drop everything and start "practicing the RoR religion".

Another defining principle of RoR is Convention over Configuration, which Wikipedia graciously explains as, and I quote:

> "Convention over Configuration" means that the programmer
> only needs to define configuration which is unconventional.
>
> For example, if there is a Post class in model, the corresponding
> table in the database is posts, but if the table is unconventional
> (e.g. blogposts), it must be specified manually (set_table_name
> "blogposts").

Eh, doesn't sound too bad for a lazy, inefficient web developer like myself, does it? ;-)

Now that I'm all excited about RoR, I've finally decided to try it out on my little localhost tonight. Taking the plunge, metaphorically. I do hope RoR lives up to all the hype it is surrounded by and that I've indulged in.
-
Who Is Your Favourite Free Host? What's your fav free host?
CaptainRon replied to teartrack_sos's topic in General Discussion
I have tried many, and I never landed on anything as great as Xisto.com. They not only provide a great Pro package, but have a truly wonderful credit system!

The best hosting I have had apart from Xisto was Aboho and Aushost. I was using an Aboho free account (no posts required), but it was only 5 MB and 500 MB bandwidth. Then I bought a domain, so I went for Aushost instead, which gives free hosting to domain owners: 50 MB and 500 MB bandwidth. It's only good if you have a homepage, not a portal.

No host provided enough features to host a portal, except :-) Xisto. I bet whoever comes to Xisto once won't leave it... also, the forum is a great place to socialize.
-
Apart from that, I don't see them displaying relevant ads either! I have a military aviation website, and what they show is "Don't Divorce" ads etc... It makes it harder to generate clicks.
-
Wow, that's great. By the way, what are the criteria for being deemed a good-quality website?
-
Yordan, dude, Disk Management starts with the simple command diskmgmt.msc. Secondly, the method I suggested assumes that the MBR is gone (all partitions gone). More importantly, marretas needs the files on the hard disk.
-
Here is a solution: use diskmgmt to delete and then recreate all the partitions you had earlier, in the exact same sizes. For example, if you had two 10 GB partitions, make two 10 GB partitions. Do not format the partitions; if you do that, the data is gone. Then use software like Recover4All (any tool that can recover deleted files) and make it scan the newly created drives on your damaged HD. Theoretically it should recover most of your files.
-
Google Pigeonrank: The Secret Of Their Search
CaptainRon replied to CaptainRon's topic in Search Engines
Did you have to give it away? Hey, come on, I posted it here on 1st April too, expecting at least a few victims! -
I will quote the article from Google's own site: https://www.google.com/technology/pigeonrank.html
-
Rel="nofollow" in Links prevents Pagerank from improving
CaptainRon replied to ruben1405241511's topic in Search Engines
Uh... sorry for the garbled post. Initially I got the concept wrong; now I have it straight. Like you said, it can be abused to discredit legitimate links, and I find your argument correct. For that reason I suggest that Google observe a rel="nofollow" link for a length of time before giving due credit. What I mean is that rel="nofollow" shouldn't mean the credit is never given at all, rather that it is suspended. That way Google can wait for the website owner to remove spam links, and still give due credit to a legitimate link after a period of time, once it notices the link still exists. I have made a similar suggestion on the Wikipedia discussion page. See, the tech is needed for sure... the question is how to prevent its misuse. -
Hey guys! Now here is something even more interesting... I don't know why it has NEVER been discussed here! Google uses PigeonRank technology to find results so quickly. Can you believe it? It uses real pigeons, and flashes thousands of pages in front of them until they peck on one. Someone did mention this technology during the Google OS discussion, but I never paid attention. I will create a new topic in the Google section. Check it there.
-
Following the open-source model, ProgrammerAssist.com thrives on the free-world concept. It's an absolutely minimal website, no graphics as a matter of fact, with most of the features you'd want in a question-answer system. Register a free account and get started right away asking and answering questions. It's a very new website as of now; I got my .htaccess problem answered over there. I think people should check out this fine website and also get registered there. I had a talk with the owner, Srirangan (a staunch open-source advocate); he says it's an ongoing effort to create a free rival to the paid website Experts-Exchange.com. There are very few participants as of yet, but together we can make it a success.
-
Yes, that's the only thing that can challenge MSN Search: Google implementing neural networks too. But don't you think it's simply an overwhelming thought to convert the whole of Google's present PageRank database and web pages to a neural-network system? Or perhaps they could give searchers the option of "Traditional Search" or "Smart Search"... Plus consider the time taken to train the neural network. Either way, I say MSN should first get a top-level domain name for its search engine.
-
Microsoft Hates OpenOffice
CaptainRon replied to nightfox1405241487's topic in Websites and Web Designing
I don't think Microsoft Office XP provides anything really revolutionary, or the so-called 2006-era features. OpenOffice is great for your day-to-day editing. -
Now ignore it, or read it with an open mind. MSN Search has the most powerful and promising search technology. It is based on neural networks and NOT on an algorithm (like Yahoo and Google).

The difference between an algorithm, an AI algorithm, and an artificial neural network:

1) An algorithm is flow-controlled logic that works "perfectly" IF implemented "perfectly". It cannot adapt, and it rests on the human brain to develop the logic.

2) An AI algorithm is one that can perform intelligent actions based on situations and conditions. The overall flow-controlled logic is always the same. It uses task-completion techniques like hill climbing to accomplish a task.

3) An artificial neural network is based on the architecture of the human brain, implementing neurons. Neural networks do not use algorithms but generate results on the basis of the inputs fed to the network. The network is made of interconnected neurons with weighted links; it is a system that learns from the input it is provided.
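To make point 3 concrete, here is a minimal sketch (in Python, just for illustration) of a single artificial neuron, a perceptron, that adjusts its weighted links from the examples it is fed instead of following hand-written decision logic. The toy data, learning rate and epoch count are all made up for the example; this is obviously nothing like what MSN actually runs.

    # A single artificial neuron ("perceptron") that learns its weighted links
    # from example inputs, rather than following a fixed, hand-coded rule.
    def train_perceptron(samples, epochs=20, learning_rate=0.1):
        """samples: list of (inputs, expected_output) pairs."""
        n = len(samples[0][0])
        weights = [0.0] * n      # the weighted links
        bias = 0.0
        for _ in range(epochs):
            for inputs, expected in samples:
                # weighted sum of the inputs, then a simple threshold activation
                activation = sum(w * x for w, x in zip(weights, inputs)) + bias
                output = 1 if activation > 0 else 0
                error = expected - output
                # the "learning": nudge each weight in proportion to its input
                weights = [w + learning_rate * error * x for w, x in zip(weights, inputs)]
                bias += learning_rate * error
        return weights, bias

    # Toy example: the network learns the logical AND function from examples alone.
    data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
    weights, bias = train_perceptron(data)
    print(weights, bias)   # weights discovered from the inputs, not coded by hand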
-
Rel="nofollow" in Links prevents Pagerank from improving
CaptainRon replied to ruben1405241511's topic in Search Engines
I have been shouting for long enough at all the idiotic Google lovers that the moment someone gets power, they try to exercise a monopoly. Yes, this tag can create a problem for "legitimate" websites, but I think that if the Google spider checks whether the rel="nofollow" link points to a page on the same site, then it won't be that much of a problem. In short, the rel tag should be read only for a link to the webmaster's own pages. For example, I can block my guestbook entry links from being spidered, but I can't stop a link to, say, http://www.someone.com/show.php being linked via http://mypage.com/article.php. Google can compare the domains in this case; in other cases, relative paths can be compared. I support this technique, provided the rel tag is checked against the link it is used for.
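Just to show the kind of comparison I mean, here is a rough Python sketch: honour rel="nofollow" only when the link points back to the linking site itself. The URLs and the rule are purely my illustration of the suggestion above, not how Google actually treats the attribute.

    # Sketch of the suggestion above: apply the nofollow hint only when the link
    # points back to the webmaster's own site. Illustrative only.
    from urllib.parse import urlparse

    def should_honour_nofollow(linking_page, link_target):
        source_host = urlparse(linking_page).netloc.lower()
        target_host = urlparse(link_target).netloc.lower()
        return source_host == target_host   # only the webmaster's own pages

    print(should_honour_nofollow("http://mypage.com/article.php",
                                 "http://www.someone.com/show.php"))   # False: external link
    print(should_honour_nofollow("http://mypage.com/guestbook.php",
                                 "http://mypage.com/article.php"))     # True: same site

-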
I suppose you should go in for other third-party virus removal tools... I like McAfee Stinger the most: one small app for the most outrageous viruses.
-
A New Probable Algorithm For A Search Engine
CaptainRon replied to CaptainRon's topic in Search Engines
See, what you are overlooking is how "content", or data, is handled here. I just gave a brief scrap of what came to my mind. If I get serious with this technique, I will make a more complex implementation.

To give a small explanation: I will create a tree structure just for a single page. When I say I will give more importance to the title tag, it means it will be the ROOT of the tree. The H1 tags (or, to be precise, any bold HTML that shows up before plain text) will come in as nodes, and the content they discuss will come in as children of those nodes. To simplify look-up, the content is broken up into keywords which have a proper construct (the way the MS Word grammar check does). These keywords are put into an index table (just for that particular page, and in that specific sub-node) along with their occurrence frequencies. Since I index only those keywords which follow a proper construct, this stops spammers from repeatedly writing the same keyword over and over again.

After that I create a diversity factor. Previously, a spammer could rewrite a sentence with the same keywords many times over. To cut that out, the diversity factor is calculated as a function of the words in a sentence construct. It also includes non-keywords (is, that, the, them, their, etc.), so a unique paragraph with meaningful text gets proper credit. This, along with the frequency table, makes up the index table.

This index table is then generated for the whole page and belongs to the tree structure. Such a tree structure is generated for each and every page that is submitted, and in the end these trees become part of the giant tree called the webspace. The way a page tree enters the webspace is that it is stored categorically. Categories are created on the basis of keywords, and a page tree can belong to several keywords (of course), but they are linked with weighted nodes, where the weight of the node tells how prominent that keyword is in the page tree. Remember that the keyword weight is a function of where it appears in the page, plus the frequency, plus the diversity factor. It can all become a complex mathematical equation if I sit down to seriously work on it.

But the point is... in a world dominated by Google, it's impossible to outperform it. Look at Accoona... a really fine search engine with little future.
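To pin the idea down a little, here is a minimal Python sketch of the per-page index described above: keywords weighted by where they appear (title as the root, headings as nodes, body text as children), scaled by frequency and a crude diversity factor. The section weights, the stop-word list and the diversity formula are made-up placeholders, not a serious implementation of the equation I have in mind.

    # Toy page-level index: section weight * frequency, scaled by a diversity factor.
    from collections import Counter

    SECTION_WEIGHT = {"title": 3.0, "heading": 2.0, "body": 1.0}   # title = root of the page tree
    STOP_WORDS = {"is", "that", "the", "them", "their", "a", "an", "and", "of"}

    def index_page(sections):
        """sections: list of (section_type, text) pairs for a single page."""
        scores = Counter()
        total_words = 0
        unique_words = set()
        for section_type, text in sections:
            words = [w.lower().strip(".,!?") for w in text.split()]
            total_words += len(words)
            unique_words.update(words)
            for word in words:
                if word and word not in STOP_WORDS:
                    scores[word] += SECTION_WEIGHT[section_type]
        # crude diversity factor: repeating the same few words over and over lowers every score
        diversity = len(unique_words) / max(total_words, 1)
        return {word: round(score * diversity, 3) for word, score in scores.items()}

    page = [
        ("title",   "Military Aviation History"),
        ("heading", "Jet fighters of the Cold War"),
        ("body",    "The history of jet fighters is the history of aviation itself."),
    ]
    print(index_page(page))   # per-page keyword weights that would feed the category tree

-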
Yeah, of course I put the file in my public_html... The browser returns a blank page, since it's not supposed to return anything. They say to create a blank page with a given name and place it in your web space; I did that. Then they say the web server has a security issue with the way it handles error pages. Maybe you submitted your page before this flaw was detected. They say that the error page header should return 404, not 200. Usually a page that is reached without any inconvenience is marked 200; they probably don't want this, for security reasons. They also mention that they use a HEAD request (not GET), which means the content of the page doesn't come into it at all. It needs some web server configuration for sure.

Hey!! I guess I figured out the problem... My pages are not displaying error pages at all! Maybe it's due to the stupid Mambo content management system that I have installed. And I think that in turn is due to the .htaccess file that I modified for SEO-friendly URLs. Well, now who can set this straight? When I type an incorrect URL it takes me to my home page, whereas it should take me to the error page. How do I get this straight?

OK, I figured out the solution too... The problem was with the .htaccess file. My .htaccess looked like:

    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^(.*) index.php

Now this last line was the one creating the error:

    RewriteRule ^(.*) index.php

It means that any file request matching .* (read: dot all) gets redirected to index.php. So no wonder the Google check was getting a 200 in the page header. I removed that line, and now it's all working fine. I got my site verified. Thanks to this awesome support site: http://programmerassist.com/

One more thing: all Mambo users should note that they should put that line back the moment Google has verified their site, otherwise Mambo will stop rewriting URLs properly.
-
Damn!!! India lost to China by 1,050,000,000 to 1,650,000,000. Anyway, it scores way above Pakistan (as if it matters :lol: )
-
IE7 Beta - Usual Microsoft Comedy Of Errors? another crap of an update?
CaptainRon replied to shiv's topic in Software
You will understand my reaction by re-reading the way Shiv criticizes MS. Abhiram, you are right, and I even mentioned explicitly that nearly everyone on this forum began with MS, unless they were born after 1998. The fact is, I respect those because of whom I am what I am.

I can assure you that had I begun with Linux, I would have given up all hope of becoming a software developer and chosen another field instead, probably the Air Force. And being an Indian, you will know how much the IT sector matters...

As I mentioned above, there are reasons why I find Linux developer-unfriendly. Anyhow, when we talk of servers, Linux is the BEST. I also explicitly stated that Linux contributed nothing to me until I came into engineering, and I came into engineering because of MS.

Being in engineering, I have realised how wonderful the concept of open source is. As a matter of fact, I am an open-source developer myself; I have released each and every creation of mine as open source. Apart from that, whatever we study as a subject can only be practically understood under Linux. Take the kernel, for example: I don't know how the Windows kernel works, and never will, but I can read each and every line of the Linux kernel's code. Likewise, when I study the FTP protocol, I can look at a Linux FTP implementation with all its code and fix the practical concepts in my mind.

Linux is great and so is open source... but MS isn't that bad either! It definitely doesn't deserve the kind of bashing it receives! You will know what I mean once you start programming on both Windows and Linux and see where you excel more.
-
This is the error that Google gives me while I try to verify my site:

NOT VERIFIED
We've detected that your 404 (file not found) error page returns a status of 200 (OK) in the header.

The explanation for this error at Google is:

This configuration presents a security risk for site verification and therefore, we can't verify your site. If your web server is configured to return a status of 200 in the header of 404 pages, and we enabled you to verify your site with this configuration, others would be able to take advantage of this and verify your site as well. This would allow others to see your site statistics. To ensure that no one can take advantage of this configuration to view statistics to sites they don't own, we only verify sites that return a status of 404 in the header of 404 pages. Please modify your web server configuration to return a status of 404 in the header of 404 pages. Note that we do a HEAD request (and not a GET request) when we check for this. Once your web server is configured correctly, try to verify the site again. If your web server is configured this way and you receive this error, click Check Status again and we'll recheck your configuration.

So I suppose none other than the admins can help me with this...
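For anyone hitting the same error: a quick way to see what status code your server actually sends for a missing page is to issue a HEAD request yourself, the same request type Google says it uses. Here is a minimal sketch with Python's standard library; the host and path are just placeholders, point them at your own site.

    # Check what status a server returns for a page that does not exist,
    # using a HEAD request (the request type Google says it uses).
    from http.client import HTTPConnection

    conn = HTTPConnection("www.example.com")           # placeholder host
    conn.request("HEAD", "/this-page-does-not-exist.html")
    response = conn.getresponse()
    print(response.status)   # 404 is what Google wants to see; 200 means verification keeps failing
    conn.close()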
-
To use it like a Gmail drive, all you need to do is contact the owner of the Gmail Drive software and request that he reprogram it to do so. It's pretty much the same concept: sending the file as an attachment with specialized headers. What will matter is how large an attachment they allow. By the way, how do you use Gmail for site hosting?

Hey guys, check this! I came across this, and it's a warning to all the idiots using GDrive or GoogleFS: http://blogoscoped.com/forum/22209.html

Someone please send me a 30gigs invite at neodimension at gmail dot com. boetaw, can you?