Optimizing a large MySQL table

I have a MyISAM table with 60,000,000 rows. It gets a lot of INSERTs and some SELECTs (no UPDATEs).

Running INSERTs blocks reads, which is expected since MyISAM uses table-level locking.

I only ever read about 1% of the rows; the other 99% will never be read.

I want to optimize this. I think a good approach would be to create a duplicate table containing only the popular rows and use it just for SELECTs.

Is there any ready-made solution for this? I'm working with PHP and MySQL.

Use a cron job to move inactive rows into another table with INSERT ... SELECT followed by DELETE, provided you have a way to identify those inactive rows, e.g. if they fall within a certain timestamp range. That lets you archive them without affecting the performance of most other queries. A minimal sketch is below.
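
A minimal sketch of what the cron job could run, assuming a main table called events with a created_at timestamp column and an archive table events_archive (both names, and the 30-day cutoff, are hypothetical):

    -- Create the archive table once, with the same structure as the main table.
    CREATE TABLE IF NOT EXISTS events_archive LIKE events;

    -- Compute one fixed cutoff so both statements use exactly the same condition.
    SET @cutoff = NOW() - INTERVAL 30 DAY;

    -- Copy the inactive rows into the archive table...
    INSERT INTO events_archive
    SELECT * FROM events
    WHERE created_at < @cutoff;

    -- ...then remove them from the main table.
    DELETE FROM events
    WHERE created_at < @cutoff;

Using a single fixed cutoff (rather than calling NOW() separately in each statement) ensures the DELETE cannot remove rows that the INSERT ... SELECT never copied.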