I'm currently working on a website whose main function is to search a relatively large database of companies. After submitting their search parameters, people are redirected to a list of companies matching whatever they searched for, and from there they can see the details of any specific company by clicking on the result.
The website's splash page is a simple form to execute this search and nothing else.
As far as I know, crawlers will not try to submit forms with random text, so I'm afraid everything behind my form will never be indexed.
I obviously want to avoid all kinds of black-hat techniques. That said, how can I make sure my inner content gets indexed?
Also, if you know, is there any good reading material around the web about indexing websites?
Thanks in advance.
My suggestions would be:
1) Generate an XML sitemap from your company list, with each company's detail page listed as an individual URL, as well as a dynamic HTML-based sitemap with a small link to it in the footer.
Manually Creating Sitemaps: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=34657
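A minimal sketch of generating such a sitemap with Python's standard library (the URLs and the `build_sitemap` helper are made up for illustration; in practice you'd pull the URLs from your companies table):

```python
# Hypothetical sketch: build a minimal sitemap.xml from a list of
# company detail-page URLs using only the standard library.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    # The sitemaps.org namespace is required by the protocol.
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = url  # one <url>/<loc> per page
    return ET.tostring(urlset, encoding="unicode")

# Example: one entry per company detail page (example.com is a placeholder).
sitemap = build_sitemap([
    "http://example.com/company/1",
    "http://example.com/company/2",
])
```

You'd write the result to a `sitemap.xml` file at your site root and submit it through Google's webmaster tools.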
or (my personal preference)
2) Create an RSS feed (plus an icon) that lets Google crawl the list of companies, and that you or your clients can subscribe to in order to see companies as they sign up. Submit the RSS feed to Google and voila. Make sure you have an icon or RSS link on the front page, though.
Submit an RSS Feed: http://www.google.com/submityourcontent/content_type/rss.html
Create an RSS Feed: http://www.wikihow.com/Create-an-RSS-Feed
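As a rough sketch of what the feed generation might look like (the feed title, site URL, and company data are all placeholders; the wikiHow link above covers the format itself):

```python
# Hypothetical sketch: build a minimal RSS 2.0 feed of newly added
# companies using only the standard library.
import xml.etree.ElementTree as ET

def build_rss(companies):
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    # <title>, <link>, <description> are the required channel elements.
    ET.SubElement(channel, "title").text = "New Companies"
    ET.SubElement(channel, "link").text = "http://example.com/"
    ET.SubElement(channel, "description").text = "Companies as they sign up"
    for name, url in companies:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = name
        ET.SubElement(item, "link").text = url
    return ET.tostring(rss, encoding="unicode")

feed = build_rss([("Acme Corp", "http://example.com/company/acme")])
```

Regenerate the feed whenever a company is added, and crawlers that have discovered it will pick up the new entries.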
Create a page with an index of all the companies and link to it from the search page. If you have a ton of companies, separating them by letter will be helpful. This lets search engines see all of your content and has the added benefit of helping users as well.
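The by-letter grouping could be sketched like this (the `group_by_letter` helper and sample names are made up; you'd feed it your real company list when rendering the index page):

```python
# Hypothetical sketch: group company names by first letter for an A-Z index.
from collections import defaultdict

def group_by_letter(names):
    groups = defaultdict(list)
    # Case-insensitive sort so "apex" lands next to "Acme".
    for name in sorted(names, key=str.casefold):
        groups[name[0].upper()].append(name)
    return dict(groups)

index = group_by_letter(["Acme", "apex", "Bravo"])
# index == {"A": ["Acme", "apex"], "B": ["Bravo"]}
```

Each letter then becomes a heading (or its own page) of plain links to the company detail pages, which crawlers can follow without ever touching the search form.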