Never having looked at all the backup options offered by BlueHost/cPanel, I can't really comment on how they have changed. The only ones I ever used were the ones that backed up the definitions of my email accounts, and I only ran those once, when I first finished setting the accounts up. What I can offer instead is the way I handle backups myself, which doesn't rely on the tools provided and also doesn't cost anything.
I would never rely on any third-party backups of my site. Handling your own backups is really simple, and you don't need any of the cPanel tools to do it.
The master copy of the static part of my site is stored on my own computer and is copied up to the hosting as each change is made. The copy on the hosting doesn't need backing up, because it is only a copy: if anything gets messed up I can simply re-copy it from the master.
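Any transfer method will do for that re-copy. As one hypothetical example (assuming the hosting account allows SSH - the paths and hostname here are made up, not mine), a single rsync command run from the local machine would bring the hosted copy back in line with the master:

rsync -av --delete ~/mysite/ example@example.com:public_html/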
The dynamic part of my site (the database content) is backed up automatically by cron jobs that I set up on the hosting for that specific purpose. They use mysqldump to create gzipped dumps of specific tables in the database, on intervals I chose based on how frequently those particular tables get updated. The files are then emailed to me so that I can use the import option in phpMyAdmin running on my own computer to bring that part of the master copy of my site up to date.

Apart from setting the userid, password, and database name at the start of the script, only three statements are needed to run each backup: one to build the backup file name, one to build the command that will do the backup, and a third that makes the system() call to run the command. I have the one job back up groups of tables separately so that the backup files it creates are not too big. That job saves the files in a specific folder of my hosting, from where any method could be used to retrieve them (e.g. SFTP); I just prefer to have a second job run a few minutes later that sends them to me by email, which saves me the trouble of having to connect in and download them.
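The emailing job could be as simple as something along these lines - this is only a sketch of the idea, with the address, folder name and MIME plumbing being my illustration rather than the actual script:

$backupDir = '/home/example/backup/';
$to = 'me@example.com';
foreach (glob($backupDir . '*.gzip') as $file) {
    $name = basename($file);
    $boundary = md5(uniqid('', true));
    // declare a multipart message so the dump can ride along as an attachment
    $headers = "MIME-Version: 1.0\r\n";
    $headers .= "Content-Type: multipart/mixed; boundary=\"$boundary\"";
    $body = "--$boundary\r\n";
    $body .= "Content-Type: text/plain; charset=utf-8\r\n\r\n";
    $body .= "Database backup $name attached.\r\n";
    $body .= "--$boundary\r\n";
    $body .= "Content-Type: application/gzip; name=\"$name\"\r\n";
    $body .= "Content-Transfer-Encoding: base64\r\n";
    $body .= "Content-Disposition: attachment; filename=\"$name\"\r\n\r\n";
    $body .= chunk_split(base64_encode(file_get_contents($file)));
    $body .= "--$boundary--\r\n";
    mail($to, "Backup: $name", $body, $headers);
}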
The following in a cron job would make a complete backup of a database into a dated gzip file:

// $dbuser, $dbpass and $dbase are set at the start of the script, as described above
$backupFile = '/home/example/backup/' . $dbase . date("Y-m-d") . '.gzip';
$command = "mysqldump --opt -hlocalhost -u$dbuser -p$dbpass $dbase | gzip > $backupFile";
system($command);

BlueHost doesn't allow large databases (a truly large database would need multiple dedicated database servers to handle it), and I am not even sure you could host a small database of a few hundred GB there. But if you have a tiny database rather than a microscopic one, you can update those statements to process the tables separately or in groups, so as to keep the files small enough to reimport directly with the import command in phpMyAdmin without going over its file size limits. The other alternative is to use the mysql command from a command prompt (or from within a script) to do the restore for you - see http://php.about.com/od/learnmysql/s...l_backup_2.htm for how that works.
The backup and restore options in mysql itself are so simple that they almost make the import/export options in phpMyAdmin look complicated by comparison (after all, phpMyAdmin only really caters for databases in the microscopic-to-tiny size range).
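For example, restoring one of those gzipped dump files from a command prompt is a single pipeline (the file, user and database names here are placeholders):

gunzip < example_db2024-01-01.gzip | mysql -hlocalhost -uexample_user -p example_db

The -p on its own makes mysql prompt for the password rather than having it appear in the command line.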