Common Hosting Security Mistakes (And How to Avoid Them)
If you’ve been running a website in 2025, you’ve probably noticed some strange activity in your server logs. User agents like GPTBot, ClaudeBot, and a dozen other AI crawlers have been hammering sites across the web, sometimes at rates exceeding 39,000 requests per minute. Small sites have been knocked offline entirely. Bandwidth bills have spiked out of nowhere. A lot of site owners found themselves scrambling to figure out what was happening.
That chaos was a wake-up call for many, but aggressive bots are just one piece of the puzzle. The bigger risks are usually quieter. Default passwords that never got changed. Software updates that kept getting postponed. Admin accounts with no two-factor authentication. These aren’t sophisticated attacks. They’re simple oversights that add up over time, and they’re responsible for far more breaches than most people realize.
1. Default Credentials Are Still Everywhere
This one is embarrassingly common. When you set up a new server, CMS, or control panel, it often comes with default login credentials like admin/admin or root/password. These combinations are documented in every installation guide on the internet, and attackers run automated scans looking for servers that never changed them.
The usual culprits include WordPress admin accounts left at “admin,” phpMyAdmin installations with default credentials, cPanel accounts that never got secured, and test accounts created during setup that nobody remembered to delete. Sample applications are another problem since many hosting environments come with demo apps that have known vulnerabilities.
The fix takes about ten minutes. Change every default credential immediately after installation. Delete test accounts and sample applications. Use a password manager to generate strong, unique passwords for each service.
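If you have shell access, generating a strong replacement takes one command. A minimal sketch assuming the `openssl` CLI is available (it ships with most Linux servers); any password manager's generator works just as well:

```shell
# Generate 24 random bytes, base64-encoded (~32 characters).
# Adjust the byte count for longer or shorter passwords.
openssl rand -base64 24
```

Paste the result into your password manager rather than reusing it anywhere else.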
2. Unpatched Software Is an Open Invitation
Most successful breaches don’t exploit unknown zero-day vulnerabilities. They exploit known issues that have already been patched, sometimes for months. The WannaCry ransomware attack that crippled hospitals across the UK exploited a Windows vulnerability Microsoft had patched two months earlier. The organizations that got hit simply hadn’t applied the update.
People delay updates for understandable reasons. They’re worried about breaking something, they don’t have a staging environment, or they figure they’ll get to it later. But the window between “patch available” and “actively exploited” keeps shrinking.
If your hosting platform supports automatic security updates, enable them. If not, schedule a monthly reminder to check for patches. The mild inconvenience of occasional compatibility issues beats running known-vulnerable software.
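On Debian and Ubuntu, for example, unattended security updates are a two-line configuration once the stock `unattended-upgrades` package is installed. A sketch of the standard APT file (other distros have their own equivalents):

```
# /etc/apt/apt.conf.d/20auto-upgrades
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
```

By default this applies only packages from the security repositories, which keeps the risk of surprise breakage low.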
3. Weak Passwords and Missing Two-Factor Authentication
You’d think this wouldn’t need saying in 2025, but weak passwords remain a top cause of security breaches. “password123” still shows up in credential dumps, along with “admin,” “letmein,” and company names followed by the current year.
The other half of this problem is missing two-factor authentication. Even a strong password can be compromised through phishing or database breaches at other services where you reused the same credentials. Two-factor adds a second layer so that even if someone gets your password, they can’t log in without your phone or authentication app.
Every admin account and hosting control panel should have two-factor enabled. Use a password manager to generate unique passwords for each service. Stop reusing credentials across sites.
4. Unnecessary Services and Open Ports
A fresh server installation typically comes with more services running than you actually need. FTP might be enabled even though you only use SFTP. Database ports might be open to the internet when they should only accept local connections. And SSH sits on its default port 22, where bots hammer it around the clock with credential-guessing attempts.
Every open port and running service is a potential entry point. Start by auditing what’s running on your server and disable anything you’re not using. FTP transmits credentials in plain text and has no business being enabled anymore. If you’re running a database, make sure it’s only accessible from localhost or specific IP addresses, not the entire internet.
Configure your firewall to deny all incoming traffic by default and only allow what’s explicitly needed.
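As a concrete sketch, here's what that default-deny posture looks like with `ufw` on Ubuntu, assuming SSH and HTTPS are the only services you need exposed (swap in your own ports):

```
# First, see what is actually listening and on which ports
sudo ss -tlnp

# Deny all inbound by default, allow outbound, open only what you use
sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw allow 22/tcp     # SSH
sudo ufw allow 443/tcp    # HTTPS
sudo ufw enable
```

If you find a listener in the `ss` output you can't account for, that's exactly the kind of thing worth investigating before an attacker does.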
5. No Visibility Into What’s Happening
You can’t protect what you can’t see. A lot of site owners have no idea what’s actually happening on their servers until something breaks. They don’t check logs, don’t monitor traffic patterns, and have no baseline for what “normal” looks like.
This lack of visibility is how small problems become big ones. An unused admin account gets compromised but nobody notices. A service gets misconfigured but doesn't cause obvious issues until an attacker exploits it months later. The concepts behind data security posture management apply here: understand where sensitive data lives, which services interact with it, and how access is controlled. That visibility turns security from guesswork into informed oversight.
You don’t need enterprise tools to get started. Check your server logs regularly. Look for failed login attempts, unusual traffic spikes, and requests to paths that shouldn’t exist. Know what normal looks like so you can recognize when something’s off.
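Getting that baseline can be a one-liner over your auth log. The sketch below runs against three inline sample lines (the IPs come from the documentation ranges, not real traffic); on a Debian-family server you'd point the same pipeline at `/var/log/auth.log` instead:

```shell
# Count failed SSH logins per source IP. The sample lines mimic
# the format sshd writes to the auth log.
printf '%s\n' \
  'Failed password for root from 203.0.113.5 port 4242 ssh2' \
  'Failed password for admin from 203.0.113.5 port 4243 ssh2' \
  'Failed password for root from 198.51.100.7 port 2222 ssh2' \
| awk '/Failed password/ {print $(NF-3)}' | sort | uniq -c | sort -rn
```

The output ranks source IPs by failure count, with the noisiest offender first, which makes repeat brute-force sources easy to spot.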
6. Untested Backups
Having backups isn’t the same as having a backup strategy. Plenty of site owners have automated backups running but have never tested whether they actually work. They’ve never tried restoring one. They don’t know if the backup includes everything they’d need to recover.
Small issues compound over time. A misconfigured setting here, a corrupted file there. The result is often gradual instability that affects your site before it becomes an obvious crisis. Good backups give you a way to roll back when things drift in the wrong direction.
Automate your backups, store copies offsite, and test restores at least once a quarter. The time to discover your backup process is broken is not when you desperately need it.
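Even the restore test itself can be scripted. A minimal sketch: archive a directory, restore it into a scratch location, and diff the two (a real restore test should also cover your database dumps):

```shell
set -e
src=$(mktemp -d); dst=$(mktemp -d)
echo "site content" > "$src/index.html"

# "Back up" the source directory, then restore it elsewhere
tar -czf "$src.tar.gz" -C "$src" .
tar -xzf "$src.tar.gz" -C "$dst"

# Fail loudly if the restored copy differs from the original
diff -r "$src" "$dst" && echo "restore OK"
```

Run something like this from cron and alert on failure, and a silently broken backup job stops being a surprise you discover during an outage.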
7. Unmanaged Bot Traffic
This brings us back to where we started. AI crawlers have been aggressively crawling the web throughout 2025, consuming bandwidth and server resources at unprecedented rates. Some site owners saw traffic spike ten or twenty times normal levels overnight.

The instinct is to add rules to robots.txt, but that only helps with crawlers that respect the standard. Malicious bots ignore it entirely. Real bot management requires rate limiting to restrict how many requests a single source can make, services like Cloudflare to identify and block suspicious traffic, and hosts that offer built-in protection against aggressive crawlers.
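As one concrete option, nginx has rate limiting built in. This sketch caps each client IP at 10 requests per second with a small burst allowance; the zone name and limits are illustrative, so tune them to your real traffic:

```
# In the http {} block: track clients by IP, allow 10 req/s each
limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

server {
    location / {
        # Queue up to 20 excess requests; reject the rest (503 by default)
        limit_req zone=perip burst=20;
    }
}
```

This won't distinguish a well-behaved crawler from a malicious one, but it does stop any single source from monopolizing your server.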
At minimum, know what’s hitting your server. Check your logs, look at user agent strings, understand your traffic sources. If something’s consuming disproportionate resources, you need to know about it.
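If your access log uses the common combined format, where the user agent is the sixth quote-delimited field, summarizing user agents is another one-liner. The lines below are inline samples; in practice you'd feed in the log file itself:

```shell
# Rank user agents by request count. Sample lines mimic the
# nginx/Apache combined log format.
printf '%s\n' \
  '203.0.113.5 - - [01/Oct/2025:00:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "GPTBot/1.2"' \
  '203.0.113.5 - - [01/Oct/2025:00:00:01 +0000] "GET /docs HTTP/1.1" 200 512 "-" "GPTBot/1.2"' \
  '198.51.100.7 - - [01/Oct/2025:00:00:02 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"' \
| awk -F'"' '{print $6}' | sort | uniq -c | sort -rn
```

If a single bot dominates the top of that list, you have your answer about where the bandwidth is going.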
Final Thoughts
None of these are advanced threats requiring sophisticated defenses. They’re basic oversights that become dangerous when they accumulate. Default credentials, unpatched software, weak passwords, open ports, poor visibility, untested backups, unmanaged bots. Each one is easy to fix on its own.
Security isn’t a one-time setup. It’s an ongoing process of checking, updating, and adjusting as things change. If you’re just getting started, our guide on how to host a website covers the fundamentals, and our roundup of website security tools can help you find the right protection options.
Fix the basics first. The fancy stuff can come later.
Frequently Asked Questions
What’s the most common hosting security mistake?
Default credentials left unchanged after setup. Attackers run automated scans looking for servers still using admin/admin or similar combinations, and it remains one of the easiest ways to get compromised.
Do I really need two-factor authentication?
Yes. Even strong passwords can be compromised through phishing or data breaches at other services. Two-factor authentication adds a second layer that stops attackers even if they have your password.
How do I block AI crawlers like GPTBot and ClaudeBot?
Start with robots.txt rules for crawlers that respect the standard. For more aggressive protection, use rate limiting, Cloudflare’s bot management, or similar services. Check your server logs regularly to see what’s actually hitting your site.
How often should I test my backups?
At least once a quarter. Spin up a test environment, restore a backup, and verify everything works. The worst time to discover your backup process is broken is when you desperately need it.