I would like to know if there is a way to get the most recent posts from a website, including an image if the post has one.
For example, could I get the last 20 BBC News business posts with images and display them on my website?
Find the URL of the RSS feed for the site, then take a look at the PHP feed library SimplePie. It converts the feed at the URL you give it into an easy-to-use PHP object.
For your example, one of BBC England's news feeds is http://feeds.bbci.co.uk/news/england/rss.xml
First, grab the SimplePie library source: http://www.simplepie.org/downloads/
To consume the feed in PHP and display it to the user, do something like this:
require_once('../simplepie.inc'); // explicitly include the SimplePie library

$feed = new SimplePie(); // create your feed object
$feed->set_feed_url('http://feeds.bbci.co.uk/news/england/rss.xml'); // set the feed URL to read
$feed->init(); // start consuming the feed!

// the newly initialized feed object has accessors for its name, description, etc.
echo "Feed URL: ".$feed->get_permalink();
echo "Feed Title: ".$feed->get_title();
echo "Feed Description: ".$feed->get_description();

$count = 0;
// now, run through the feed's posts, stopping at 20
foreach ($feed->get_items() as $item) {
    if ($count >= 20) {
        break;
    }
    $count++;

    echo "Link to original post: ".$item->get_permalink();
    echo "Title of post: ".$item->get_title();
    echo "Description of post: ".$item->get_description();
    echo "Date posted: ".$item->get_date('j F Y | g:i a');
}
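Since you also want an image with each post: the BBC items carry one as a media thumbnail/enclosure, which SimplePie exposes through the item's get_enclosure() method. Here is a minimal sketch of how that could look; get_items(0, 20) limits the loop to the first 20 items, the HTML markup is only an example, and you should check the enclosure methods against the SimplePie version you download:

foreach ($feed->get_items(0, 20) as $item) {
    $imageUrl = null;
    $enclosure = $item->get_enclosure(); // SimplePie_Enclosure object, or null if the item has none
    if ($enclosure !== null) {
        // prefer an explicit thumbnail; otherwise fall back to the enclosure link itself
        $imageUrl = $enclosure->get_thumbnail();
        if ($imageUrl === null) {
            $imageUrl = $enclosure->get_link();
        }
    }

    echo "<h3><a href=\"".$item->get_permalink()."\">".$item->get_title()."</a></h3>";
    if ($imageUrl !== null) {
        echo "<img src=\"".htmlspecialchars($imageUrl)."\" alt=\"\" />";
    }
    echo "<p>".$item->get_description()."</p>";
}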
Some websites have an API, i.e. direct access to their content, which you can retrieve in XML or JSON format.
You will have to use something like SimpleXML, DomXML, or json_decode() in PHP to query their API and cache the results in your database.
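For example, if a site offered a JSON endpoint, you could fetch and decode it roughly like the sketch below; the endpoint URL, the field names, and the save_to_cache() helper are all hypothetical stand-ins for whatever the real API and your own database layer provide:

// fetch the raw JSON from a hypothetical API endpoint
$json = file_get_contents('https://api.example.com/posts?limit=20');
if ($json !== false) {
    $posts = json_decode($json, true); // decode into an associative array
    foreach ($posts as $post) {
        // 'title', 'url' and 'image' are hypothetical field names;
        // save_to_cache() stands in for your own caching/database code
        save_to_cache($post['title'], $post['url'], isset($post['image']) ? $post['image'] : null);
    }
}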