Optimizing a PHP MySQL query with multiple joins and GROUP_CONCAT

I have multiple sets of joins in my query, because of the way the tables are structured to get what I want. I've separated the main joins into three sets, as commented below. With only one set, the query is pretty fast. With two sets active, it takes around 2 minutes. With all three sets active, as shown below, it takes too long to be usable.

Any help on optimizing this query would be appreciated.

$query  = "SELECT   `Databases`.*, 
                    `DatabaseDescriptors`.*, 
                    `DatabaseContents`.*, 
                    `DatabaseAccessLevels`.*, 
                    `Providers`.*, 
                    GROUP_CONCAT(`Descriptors`.DescriptorName SEPARATOR ', ') as DescriptorNames, 
                    GROUP_CONCAT(`Contents`.ContentName SEPARATOR ', ') as ContentNames, 
                    GROUP_CONCAT(`AccessLevels`.AccessLevelName SEPARATOR ', ') as AccessLevelNames ";

$query .= "FROM `Databases` ";

// SET 1

$query .= "JOIN `DatabaseDescriptors` 
                ON `DatabaseDescriptors`.DatabaseID = `Databases`.DatabaseID ";

$query .= "JOIN `Descriptors` 
                ON `Descriptors`.DescriptorID = `DatabaseDescriptors`.DescriptorID ";

//SET 2

$query .= "JOIN `DatabaseContents`
                ON `DatabaseContents`.DatabaseID = `Databases`.DatabaseID ";

$query .= "JOIN `Contents`
                ON `Contents`.ContentID = `DatabaseContents`.ContentID ";

//SET 3

$query .= "JOIN `DatabaseAccessLevels`
                ON `DatabaseAccessLevels`.DatabaseID = `Databases`.DatabaseID ";

$query .= "JOIN `AccessLevels`
                ON `AccessLevels`.AccessLevelID = `DatabaseAccessLevels`.AccessLevelID ";

$query .= "JOIN `Providers`
                ON `Providers`.ProviderID = `Databases`.ProviderID ";

$query .= "AND `Databases`.DatabaseID = 47";

The reason for the performance problem is that a "database" has multiple descriptors, access levels, and contents. Say a database has 10 of each. The query ends up turning this into 10*10*10 = 1000 rows for processing.
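
If you want to see that multiplication for yourself, a quick row count over just the link tables makes it visible (a sketch only, reusing the table and column names from your query):

-- Count the intermediate rows produced by the three link-table joins
-- for the one database; with 10 descriptors, 10 contents and 10 access
-- levels this returns 1000.
SELECT COUNT(*)
FROM `Databases`
JOIN `DatabaseDescriptors`  ON `DatabaseDescriptors`.DatabaseID  = `Databases`.DatabaseID
JOIN `DatabaseContents`     ON `DatabaseContents`.DatabaseID     = `Databases`.DatabaseID
JOIN `DatabaseAccessLevels` ON `DatabaseAccessLevels`.DatabaseID = `Databases`.DatabaseID
WHERE `Databases`.DatabaseID = 47;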

The solution is to do the aggregation before doing the join. For instance, instead of this:

$query .= "JOIN `DatabaseDescriptors` 
                ON `DatabaseDescriptors`.DatabaseID = `Databases`.DatabaseID ";

$query .= "JOIN `Descriptors` 
                ON `Descriptors`.DescriptorID = `DatabaseDescriptors`.DescriptorID ";

You would have:

join (select dd.DatabaseId,
             GROUP_CONCAT(d.DescriptorName SEPARATOR ', ') as DescriptorNames
      from DatabaseDescriptors dd join
           Descriptors d
           on dd.DescriptorID = d.DescriptorID
      group by dd.DatabaseId
     ) dd
     on Databases.DatabaseId = dd.DatabaseId

(You can replace the group by dd.DatabaseId with where dd.DatabaseId = 47 to just handle your one case. My guess is that you might want this information for all databases. If so, add a group by DatabaseId to the outer query.)

You then need to repeat this for all three of the group_concat() columns.
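Putting it together, the whole statement could look something like this. This is only a sketch: the table and column names come from your question, the aliases are my own, and the link-table columns are dropped from the select list because they no longer make sense once the rows are collapsed.

select d.*,
       p.*,
       dd.DescriptorNames,
       dc.ContentNames,
       da.AccessLevelNames
from Databases d
join Providers p
     on p.ProviderID = d.ProviderID
join (select dd.DatabaseID,
             GROUP_CONCAT(de.DescriptorName SEPARATOR ', ') as DescriptorNames
      from DatabaseDescriptors dd
      join Descriptors de on de.DescriptorID = dd.DescriptorID
      group by dd.DatabaseID
     ) dd on dd.DatabaseID = d.DatabaseID
join (select dc.DatabaseID,
             GROUP_CONCAT(c.ContentName SEPARATOR ', ') as ContentNames
      from DatabaseContents dc
      join Contents c on c.ContentID = dc.ContentID
      group by dc.DatabaseID
     ) dc on dc.DatabaseID = d.DatabaseID
join (select dal.DatabaseID,
             GROUP_CONCAT(al.AccessLevelName SEPARATOR ', ') as AccessLevelNames
      from DatabaseAccessLevels dal
      join AccessLevels al on al.AccessLevelID = dal.AccessLevelID
      group by dal.DatabaseID
     ) da on da.DatabaseID = d.DatabaseID
where d.DatabaseID = 47;

Each derived table collapses one of the one-to-many relationships down to a single row per DatabaseID before the join, so the outer query sees one row per database instead of the cross product.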

Note: You are pulling all the fields from all the tables. However, only one row is returned, because you have an aggregation query with no group by. I'm guessing that you are also getting duplicates in the group_concat()-created lists. This approach will also fix that problem.
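
As an aside, if the duplicated names were the only thing bothering you, adding DISTINCT inside the aggregate would also clean up the lists in your original query, e.g.:

GROUP_CONCAT(DISTINCT `Descriptors`.DescriptorName SEPARATOR ', ') as DescriptorNames

That only deduplicates the output, though; it does nothing about the number of joined rows, so it won't help the run time the way pre-aggregating does.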