If you’ve been following along, last week I wrote a tutorial on how to roll your own backups using a custom bash script. At the end of that article, you were left with a script that squeezed your entire WordPress site into a tidy .zip archive, ready to store anywhere. Now it’s time to do just that!
Prerequisites
Rclone
Some kind of cloud storage (Google Drive, Dropbox, Backblaze, etc)
sudo privileges on your web host server
For this tutorial, we’re going to be using an incredible tool called Rclone. If you’ve been following along for a while, you’ll know this isn’t the first time I’ve referenced Rclone on my blog. A while back I wrote about using Rclone to mount web servers like a hard drive, and it’s really an amazing and versatile tool.
Setup
First things first, we’ll need to make sure all your packages are up to date. I’m using an Ubuntu box, so it’s just a quick command, but if you’re on a different distro, use your appropriate package manager.
sudo apt update && sudo apt upgrade
With that out of the way, it’s time to install Rclone. Rclone has its own installation script which you can just pipe directly into bash. Just be aware, it’s not safe to pipe random scripts directly into bash (especially if you don’t know what the script is doing or where it’s from) but since Rclone is a trusted source, we can go ahead and run the install script provided in the official documentation.
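At the time of writing, the one-liner from the official install docs looks like this (double-check the link below for the current command before running it):
curl https://rclone.org/install.sh | sudo bash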
If this doesn’t work for you, or if you’re running a different operating system, you can always check out the installation instructions in the official Rclone docs. https://rclone.org/install/
Once your installation is complete, you should be able to run man rclone and see the manual page for Rclone. If not, try reinstalling, or try one of the alternative installation methods from the docs.
Connecting Your Cloud Storage
These steps will vary pretty wildly depending on which storage provider you’re with and what they require in order to get set up with their platform. For this tutorial, I’m using a Backblaze B2 bucket, but you can follow the dedicated configuration instructions for your cloud provider. Again, all of this information and more can be found in the Rclone docs.
The first step for everyone, regardless of provider, is actually starting the configuration process with the rclone config command.
Once you’re in, you will be prompted to choose your cloud storage provider. As you’ll see when you run it yourself, there are TONS of options, and the list scrolls well past a single screen.
In my case, since I’m setting up Backblaze B2, I’m going to choose its entry from the list (the exact number varies between Rclone versions; it was 6 for me), but again, you will choose whichever provider you like.
Next, you’ll be prompted to name your configuration. This is important to keep in mind, as we’ll be using this name in our command every time we want to interact with this cloud storage. That said, I like to keep my names as short and descriptive as possible to keep typing to a minimum and prevent typos from giving me grief.
Once my provider is selected, I provide my account ID and then my application key. With those two pieces of information entered, I’m asked a handful of follow-up questions, such as whether or not I want to be able to “hard delete” from this application. The default answers are fine for all of these.
A Note About The Rclone Command Format
At this point, you should be good to go! If you’ve configured your cloud storage correctly, typing rclone config should now show the name of the storage you just set up. Great work! Now we’re ready to get familiar with a few of the basic Rclone commands and the way that Rclone works with Backblaze.
For our purposes, every part of the command will essentially stay the same except for the subcommand (see the example after this list for the general format). There are a handful of subcommands you may find useful, such as:
ls = List everything
lsd = List only the directories
delete = Delete everything in the following path
deletefile = Only delete this one specific file
copy = Copy everything in this path to the next provided path, skipping identical files
copyto = Copy just this one file
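Every one of these follows the same basic shape: rclone, then the subcommand, then the name you gave your remote during rclone config, a colon, and an optional bucket/path. A couple of sketches, where b2 is the name I gave my remote and my-bucket is a placeholder bucket name:
# list the buckets/directories on the remote named "b2"
rclone lsd b2:
# list every file under a placeholder bucket and path
rclone ls b2:my-bucket/backups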
Awesome, Now What?
Now that you have Rclone installed and configured to be used with your favorite cloud storage solution, we’re ready to move that .zip archive off to be stored! To do this, we’ll be using the last subcommand from the list above.
copy is really cool in that you can use it to recursively copy large directories from one place to another (very useful), but in our specific case, if you were following the previous tutorial, we only need to copy a single .zip archive up to cloud storage for safekeeping. To do this, we’ll be using the copyto command. The format of the copyto command is like so:
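Here’s a sketch of that format, with the local path, remote name, and bucket as placeholders from my setup:
rclone copyto /opt/backups/example-backup.zip b2:my-bucket/backups/example-backup.zip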
For testing, if you’re just wanting to test your syntax and make sure you have everything correct before actually carrying out the copy command, you can append the --dry-run flag at the very end like this:
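Using the same placeholder paths and remote name as above:
rclone copyto /opt/backups/example-backup.zip b2:my-bucket/backups/example-backup.zip --dry-run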
And that’s it! A couple of other flags you can use for your own purposes, especially if the archive is pretty hefty, are --verbose and --progress. These will give you more output and update you with various stats such as upload completion percentage, data transfer rates, and more. Again, just optional cool stuff to keep an eye on progress.
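For example, again with the same placeholder names:
rclone copyto /opt/backups/example-backup.zip b2:my-bucket/backups/example-backup.zip --verbose --progress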
And That’s It!
Now you have everything you need to add the rclone command to your backup bash script. Now by executing a single file, you can roll up a copy of your website and ship it off to be stored for later! In the last part of this little blog series, we’ll go over exactly how to automate the execution of your backup script and how you can customize your very own running schedule so you can have backups make themselves whenever you want! Thanks for reading.
So you wanna roll your own backup solution, eh? Sure you could pay for some premium solution, but who are you kidding, they’re just copying files back and forth. How hard could it be? In this article, we’re going to take a look at writing our own backup solution. While technically, this solution can be molded to fit any type of website, app, or software, this particular script will be focused on making copies of and backing up WordPress sites. Let’s get into it!
Prerequisites
This tutorial assumes you already have root access to your hosting server, and will be performed pretty much exclusively from within a bash shell (hence the title). You will need a few things in order to get started:
SSH access to your host
Permissions to write and execute your own scripts
Permissions to install CLI tools on the server
Some place to store your backups (and access to that place)
With regards to that last bullet point, this can be virtually any cloud hosting service (Google Drive, Dropbox, Amazon, etc) or even a custom backup server.
Step 1
Once you’re logged in and have your permissions set up, you’re ready to go. Go ahead and create a new file with your favorite text editor (you can do this locally and upload it when you’re done, but I wrote this directly on the server using this command):
vim ~/backup.sh
Once you have your file open, you can start off with a shebang; this tells the computer that this is a bash script and that bash should be used to evaluate it:
#!/bin/bash
The first thing you’ll need is a place to temporarily store your backup files. Make sure you have permissions to access and write to this location, as you’ll be basing your entire backup out of this one folder. It can be anywhere, just so long as you can do stuff without permissions issues. For this example, we’ll be using /opt/backups/ but you can use any directory you like. Go ahead and make that backup directory if it doesn’t already exist:
mkdir -p /opt/backups/
Now that you have a temporary spot to keep the backup you’re about to create, let’s move into the root of the WordPress install so we can use wpcli to begin the process of making a backup. This can be located anywhere, but typically, WordPress installs are located in the /var/www/ directory, in a VPS or similar.
cd /var/www/example.com/
Next, using wpcli, we’ll export the entire database with the db export subcommand like so:
wp db export /opt/backups/backup.sql
This is just an example, but the premise of this command is to generate an SQL file that you can use to recreate the existing SQL database in another, empty database somewhere else. The final argument of this command simply provides a full filepath and filename for the resulting SQL dump.
Next, we need to make a copy of everything inside of /wp-content/ and save it in our backup directory. Luckily, we can copy the complete contents of the /wp-content/ folder over to our temporary backup directory and compress everything in-transit all in one single command:
tar -cvzf /opt/backups/wp-content.zip -C /var/www/example.com/ wp-content/
If you’re familiar with the tar command, this should be pretty straightforward. If not, here’s a quick breakdown. tar is a LEGENDARY command. It’s the OG method for creating archives of virtually anything. From what I understand, it’s what the dinosaurs used to back up their blogs.
That’s right, tar is short for “Tape Archive”. I told you that’s what the dinosaurs used! After invoking the tar command, you can pass various options. The common options for creating an archive are c, v, and f, which stand for create, verbose, and file (the archive file to write to). Doing it this way results in an uncompressed archive ending in .tar, but you can also pass the letter z to compress the archive with gzip. Strictly speaking, that produces what’s usually named .tar.gz rather than a true .zip, but tar doesn’t care what extension you give the output file. You can also use tar to extract archives, but we’ll get into that later.
Let’s review Step 1 Because That Was A Lot
At this point, you should have a fair amount in your bash file, so it would probably be a great idea to save what you’ve got so far. First, we opened with a shebang and created our backup directory. Now, technically speaking, we only need to run the mkdir command once (although with the -p flag it’s harmless to re-run), so you could optionally exclude it from your script. As long as you create your backup folder once and have correct permissions for it, there’s really no need to keep creating it every single time you run this script in the future.
Next, we used wpcli to export the database and save it in our backup directory. There are other ways to go about generating an SQL dump file; you could do it from phpMyAdmin or even from the SQL shell. By using wpcli, we’re exporting the database that we know for sure is being used by WordPress, because just before it we used cd to change into the root folder of the WordPress install we’re trying to back up. wpcli uses the database credentials stored in wp-config.php to execute this command, and the final argument saves the resulting SQL dump to our previously-created backup folder.
And finally, we used the tar command to create a .zip archive of everything inside the wp-content/ folder and save that in our backup folder as well. Ultimately, the idea is to create a single archive that contains everything you need to stand your website up somewhere else. Knowing this, our folder structure is a little awkward right now. Assuming the file names from the commands above, our backup folder should look like this:
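/opt/backups/
├── backup.sql
└── wp-content.zip
(The exact file names just depend on what you passed to wp db export and tar above.)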
So yeah, technically, this is really all you need to completely restore a WordPress site, but you’ll notice it’s actually not including WordPress core or the all-important wp-config.php. So in order to restore this site, you’ll need to download WordPress core as well as set up a new database and wp-config.php file. That’s quite a lot to tackle when the time comes to actually restore the backup you’re taking.
Additionally, suppose a bit of time passes after you take your backup and the WordPress team releases a major core update, and now you find yourself needing to restore a backup. Doing things this way would require you to download the latest version of WordPress which could potentially cause issues with some of the items located in wp-content/plugins or even wp-content/themes.
Small Advantage of Doing Backups This Way
If you’re restoring backups pretty infrequently (if ever), the odds of encountering a real problem with WP core updates disrupting a restore are pretty slim. Additionally, only targeting the wp-content/ folder for backups lets you back up the main content of your site without having to save core files every time. Admittedly, this is a very small space-saving tactic, and it’s ultimately your choice, but that’s one of the advantages of rolling your own backup script with bash! We get to decide what we keep and what we throw away (or don’t make copies of).
Disadvantage of Doing Backups This Way
It would be disingenuous to say that the method described above is the best way to go about this, because, as mentioned before, you could run into some problems at restore time. It’s unfortunately common for WP core updates to break websites: plugin developers weren’t ready for the core update, their plugin depended on some feature of WP core, that feature was removed when core was updated, and the result broke the website.
In order to avoid this situation, you can optionally (and at this point, I would probably just go ahead and recommend this way) include the core files in your backup. To do this, you need to modify your tar command and your wpcli command to look like this:
# Export the SQL dump directly into the root of the WP install instead of putting it in the root of the backup folder.
# If no arguments are passed after 'export', the default action is to generate an SQL file in the same directory where you run the command.
wp db export
This will put your WP database into the root of your WP install with a default name and timestamp. Next, you’ll just zip your entire WordPress directory (database included) and place it wherever you want, named however you like. To put it in your user’s home directory, just run
tar -cvzf ~/site-backup.zip /var/www/example.com/
Additional Space-Saving Techniques
There’s another really cool feature of tar that we’re going to take advantage of in our script: the -X or --exclude-from option. If you’re familiar with Git, or have used version control before, you’ll likely be familiar with the .gitignore file. When using tar to create archives, you can use the -X option to pass in your own ignore file specifying which files and folders should be excluded from your archive.
Why is this cool? Well, it’s common to find WordPress installs using a caching plugin of some kind. This allows the server to save cached versions of the various pages that visitors are requesting, to improve overall site performance. For the purposes of creating a backup, there’s really no need to store this cache as part of our archive, especially since these cache folders can get quite large.
As a small example, WP Fastest Cache running for a few weeks on a simple, low-traffic blog produced a cache folder of about 37MB. Not much right now, but if you consider daily backups with even a max storage time of 10 days, that 37MB folder would quickly balloon to over 370MB across 10 daily archives, and that’s assuming the cache folder stays the same size for 10 days.
To exclude this folder (and others), we’ll need to create a small text file to tell tar which files and folders we’d like to exclude from our archive. You can name this file whatever you like: ignore.txt, excluded_files.txt, or even something more descriptive like cache_folders_for_tar_to_ignore.txt. But since you’ll be passing the name of this file to tar when you execute the command, the more concise, the better.
Inside the ignore file, let’s add a path to the cache folder we want to ignore. Keep in mind when defining file paths to exclude in tar, these file paths will need to be relative to the directory in which the tar command is being run. So for us, since we’ll be running tar from within the directory containing our WordPress install (/var/www/) we should have no problem.
Inside /var/www/ignore.txt:
wp-content/cache
With that added, suppose our WordPress site is located at /var/www/example.com and you’ve already exported your database.sql to the root of that install. To zip that entire site up, database and everything, but exclude the cache files located at /var/www/example.com/wp-content/cache/ you can run something like this:
cd /var/www/
tar -cvzf /opt/backups/example-backup.zip -X ignore.txt example.com
As tar evaluates this command, it’s going to take a look at /var/www/ignore.txt and know to ignore all our cache when creating this archive. Then it’s going to compress everything located at /var/www/example.com/ (except the cache folder) and copy it over to /opt/backups/example-backup.zip. Obviously, you can tweak this command to suit your own needs, but this exclusion feature of the tar command is very cool and pretty useful in our case!
Moving Your Archives Off-Server
Here’s the thing… Making regular backups of your site is pretty important. Lots of people use plugins to do this, and there are lots of great plugins out there for doing just that. But here’s one of the biggest mistakes I see site owners make: they set that backup plugin to make routine backups… and they never move those archives off-server. Why? Simply put, lots of folks don’t put this much thought and energy into considering what a site backup is and does.
Unfortunately, I’ve come across many, many WordPress sites that have a backup plugin, but it’s just saving tons and tons of backups to wp-content/, and the result is terrible. First, it’s not really making a “backup”. Sure, it’s creating a copy of your website and creating an archive like we’ve been talking about… but it stores it IN YOUR WEBSITE! So if your server goes down… so do all your backups!
The whole point of creating a backup archive like this is so we can move it off-server. There’s a metric ton of ways to do that, and we’ll get into that in my next post.
While I was updating a WordPress plugin locally, something happened on my local server to trigger a 500 error code as the plugin was updating. On the frontend, I refreshed the page and was greeted with “Briefly Unavailable for Scheduled Maintenance. Check back in a minute”
This message is typical during the one or two seconds it takes to update a plugin or a theme, but this was taking a while. After refreshing my admin screen, I received the same scheduled maintenance error. The admin dashboard was no longer accessible. The following error was in my local server log:
[ERROR] mysqld.exe: Table '.\\local\\wp_options' is marked as crashed and should be repaired.
After a quick search, I found this awesome article by Chris Jean (thanks, Chris!) explaining how to resolve errors in database tables. I wanted to do a quick write-up because I don’t find myself in the SQL shell all too often, and it’s mostly for my own notes. In my scenario, I was running LocalWP, so the database was named local and the user/pass are both root, which is fine for local development but not for production.
To connect, click the “Open site shell” button:
Since this is a local environment with default login credentials, you can open the MySQL shell simply by running mysql with no params or creds.
It should be noted that, at the time of writing, Local sites running PHP 8 or higher won’t open the site shell under ZSH, as documented in this support forum.
Also, as Chris’s article covers, you can correct the issue by running the repair query in the shell or via phpMyAdmin, which is Adminer in LocalWP:
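A minimal version of that repair, assuming LocalWP’s default database name (local) and the default wp_ table prefix:
USE local;
CHECK TABLE wp_options;
REPAIR TABLE wp_options;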
From there you can repair the offending table. Just make sure you know exactly what your database name is and what your table prefix is. That threw me off for a second when Chris references prefix_post in his code; that prefix depends on whatever prefix your WordPress tables use. The default is typically wp_, but that isn’t always the case. Hope this helps. I was hoping to have screenshots of WP-CLI running locally, but I ran into some issues. Perhaps that’ll be in the next blog post!
I was recently tasked with the challenge of creating a WordPress theme generator. The idea being, instead of writing your style.css file from scratch, you’d be able to just answer a few simple questions about how you’d like your theme set up, and those starter files would be automatically generated.
How it’s Going
Once I started diving into the specifics of how exactly to create a tool that would generate the files I was looking for, I realized very quickly that my scope for this project was perhaps a bit too broad. It ultimately had to answer the question, “What does this starter theme look like, anyway?”
In today’s fast-paced and rapidly changing tech environments, even the world of WordPress is looking at some pretty radical changes that forced me to ask the question, “FSE theme or Classic theme?” Because the files required to create even the most basic of themes would be different based on how the user answered that simple question.
Admittedly, the majority of the questions in this command-line program are just filling out various fields in the style.css file. And if you’re planning on making a parent or even a child theme, you’re going to have to fill that file out anyway.
And perhaps that should be (or should have been) the primary objective of the entire tool: essentially a stylesheet generator, and perhaps that can still be a web app down the road. However, one item on my ever-growing list of things I needed this tool to do was the ability to download the latest versions of various WordPress plugins from the official WordPress plugin repository.
This project ended up being a crash course on the finer points of JavaScript async and await, as I needed to figure out how to make a whole slew of HTTP GET requests in a specific order, waiting on responses from the first request before I could act on the second, and build it in such a way that it’s a modular set of download instructions that is easily repeatable for any WordPress plugin in the official repository.
I had an absolute blast on this project, there are still loose ends to tie up, and I’m probably going to continue messing with it at least for the next few months. I’ll need to remind myself to put a badge up if it gets too far out-of-date. If you made it this far, thanks for reading. If you’d like to try out my theme generator and play around with it, feel free to fork it on GitHub: https://github.com/mjones129/themegen
I’ve done a write-up on hosting local WordPress sites before, but I definitely prefer this method over the previous one. If you’ve never tried hosting multiple sites using LocalWP, then I’d encourage you to check it out. I’ve had really bad issues with it in the past, and it may have been a combination of lack of Linux support and my incompetence, but I decided to give it another shot and this time it was an absolute homerun.
What is LocalWP?
For the uninitiated, LocalWP is a software program designed for developers to get local copies of WordPress up and running quickly and push to production with ease. Just a full disclaimer before we go any farther, I am not being paid by LocalWP, they are not sponsoring me, and this is just a reflection of my own experience and my own opinions.
Let’s Talk About Setup and Updates
This is mainly a concern for Linux users, because build-it-yourself software can definitely have its ups and downs. However, I really like how Local distributes for Linux systems. While they don’t distribute via official Linux repositories like Ubuntu’s or Arch’s, they do offer a simple .deb that you can either drop into your default software manager or something like Gdebi. Updates are as easy as uninstalling the existing .deb and downloading and installing the latest one. Gdebi handles all of this for you.
Setting Up Your First Local Dev Environment
This really couldn’t be easier. It’s as simple as clicking the “add site” button, choosing where you want to save everything, and choosing your config options (PHP version, MySQL version, and either Apache or NGINX). The fact that all of these settings are a simple dropdown is amazing. When configuring your local dev environment with XAMPP (as far as I know), you make those decisions one time and it’s a pain to change them later.
And what’s more, these settings are independent of each other. So I can have 2 or 3 copies of a site all running different versions of PHP and one on Apache and one on NGINX. To me, that alone is worth the price of admission.
Cool Features
In addition to the fact that you can manually set up different environments with different versions of your LAMP or LNMP stack, there are other features inside LocalWP that make it a real standout.
Easy To Find Your Config Files
When you’re running multiple sites on your local machine, especially when each of them has different setups, it can be easy to lose track of all your config files. LocalWP does all of this for you. Inside each local environment install, above the root directory of your WP install, is one folder that contains all the config files for that particular environment. From that one folder, you can control your php.ini and other vital config settings that could make or break your site.
WPEngine Integration
LocalWP has built-in integration with WPEngine, so once you connect your WPEngine and LocalWP accounts, you can drop in the WPEngine API key and gain access to all of your WPEngine environments right from within LocalWP. If you don’t use WPEngine for your hosting, then this feature doesn’t matter, but for those who do host with WPEngine, this is a huge bonus as opposed to figuring out how to migrate or push changes from a LocalWP instance manually via Git or SSH or FTP or whatever.
Automatic SSL Certs
If you’re developing locally with something like XAMPP, doing SSL certs can be an absolute pain, especially if you have to set them up yourself and generate your own certificates and keys and all that. Plus, modern web browsers are getting so restrictive on locally hosted content that it becomes a real chore just to get a simple lock icon in the top corner. Having LocalWP just automagically generate SSL certificates for each environment is not only a huge time saver, but depending on the project you’re working on, some features actually depend on SSL certs being in place, especially in e-commerce situations.
File Tree Diffs on Push
One of the more recent updates is a feature called “Magic Sync” that automatically compares the files in your local file tree with the files in the remote environment that you’re pushing to. This is especially convenient if you’re concerned about overwriting important files. It’s great to have this “look before you leap” feature.
Final Thoughts
All in all, LocalWP has been an excellent upgrade over something like XAMPP, especially if you need to adhere to specific PHP versions and keep multiple sites locally as you work on several projects at once. All of the VirtualHosts and related Apache config becomes a thing of the past, as local sites can literally be up and running within 60 seconds with about 5 clicks. I really don’t know how I’d work without LocalWP; it’s become a vital part of my everyday workflow.
Yes, technically it’s possible to just drop scripts into template files, but it’s not the correct method to use on WordPress sites. WordPress is pretty particular on how it handles JS.
Register Scripts via functions.php
Just a heads up: the functions.php file is pretty sensitive. WordPress requires everything in functions.php to have absolutely perfect syntax, otherwise it will break your entire site. Please don’t mess with it if you’re on a live install or don’t have a way to access this file should you get the dreaded white screen of death.
Once you agree to the above-mentioned terms and conditions, you’ll want to register your new JavaScript file with the wp_enqueue_script() function, which is most commonly hooked to the wp_enqueue_scripts action like so:
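Something along these lines — the handle, file path, and version number here are placeholders, so adjust them to your own script:
// in the child theme's functions.php
function my_theme_enqueue_scripts() {
    wp_enqueue_script(
        'my-script',                                          // handle (required)
        get_stylesheet_directory_uri() . '/js/my-script.js',  // path to the script file
        array(),                                              // dependencies (none here)
        '1.0',                                                // version string
        true                                                  // $in_footer — load before </body> instead of in the <head>
    );
}
add_action( 'wp_enqueue_scripts', 'my_theme_enqueue_scripts' );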
There’s definitely a lot going on here, but basically wp_enqueue_script is expecting a lot of arguments, some required, some optional. The first argument is the name (handle) of the script, and that’s required. Next is the filepath of the named script, which, funny enough, is actually optional; the default is just an empty string, in which case WordPress has no file to load for that handle (only useful if you’re registering a handle as an alias for its dependencies).
In the above example, I actually used another WordPress function, get_stylesheet_directory_uri(), which returns the URI of the currently activated (child) theme’s directory; then you can just concatenate the rest of the filepath with a period in PHP.
Next is an array of dependencies, so if your script depends on other scripts, you can pass those into this array. Mine didn’t have any dependencies, so I just left it as an empty array.
Next is the version number, which is given as a string in single quotes. And finally, the last argument is a boolean, $in_footer, which is set to false by default. By default, WordPress loads all scripts at the very top of the page, up in the <head>. If your JavaScript is manipulating or selecting an element further down the page, there’s a good chance your script won’t work from the <head>.
If that’s the case, and you need to load your JS in the footer in order for it to work, you can set the very last argument to true and that will drop your script all the way down to the bottom, just before the closing </body> tag.
If you only need to call a different sidebar on a single page and that page is totally unique, and there’s no need to have a plan for children of those pages, then a simple page template should do the trick. Just go to your parent theme, copy page.php (for classic themes) or page.html (for block themes) and paste it into the root directory of your child theme. You can also create page templates WYSIWYG style with blocks, but that’s for another time.
Rename page.php to include the slug of the one page you’re looking to change. If you want example.com/awesome-page to be different, then rename page.php to page-awesome-page.php. Open your freshly renamed file, find the get_sidebar() call that is likely used in your page.php file, and pass the name of the sidebar you want into that function.
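For example, a one-line sketch (the sidebar name here is just a placeholder; get_sidebar( 'bicycles' ) loads a template file named sidebar-bicycles.php from your theme):
<?php get_sidebar( 'bicycles' ); ?>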
Conditional Statement (Best for Groups of Pages)
I had a situation in which I needed to manage a handful of different sidebars that all needed to be displayed on different pages, depending on what kind of page it is. For example, if it’s a page about bicycles, it needs to display the bicycle sidebar. If it’s a child page of the bicycle page, it also needs to display the bicycle sidebar. If it’s a page or child page about underwater basket weaving, it needs to display the sidebar for basket weaving. Otherwise, just output a normal, generic sidebar. This little function came in really handy when placed in the child theme’s page.php:
<?php
if ( is_page( [ID OR SLUG] ) || $post->post_parent == [postID] ) {
    wp_nav_menu( array( 'menu' => 'menu_name' ) );
}
?>
It should also be mentioned: the wp_nav_menu() function can sort of take just a normal menu name or ID, but if you were like me and needed to stack a few if..elseif..elseif..else statements, I ended up seeing the same menu output regardless of the ID or name passed into wp_nav_menu(). That was resolved when I passed the menu name inside an array, like above: the array key must be 'menu', and the value is the name of whatever your menu is.
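Pieced together, the conditional for that whole scenario might look something like this — the slugs, parent IDs, and menu names are all placeholders:
<?php
if ( is_page( 'bicycles' ) || $post->post_parent == 42 ) {
    wp_nav_menu( array( 'menu' => 'bicycle-sidebar' ) );
} elseif ( is_page( 'underwater-basket-weaving' ) || $post->post_parent == 43 ) {
    wp_nav_menu( array( 'menu' => 'basket-weaving-sidebar' ) );
} else {
    wp_nav_menu( array( 'menu' => 'generic-sidebar' ) );
}
?>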
Method 1: Firefox Settings
In the Firefox address bar, type about:config. You’ll be greeted with a warning.
Click “Accept the Risk and Continue”. Next, you’ll see a search bar where you can search for specific preferences. If you made it this far, there’s a good chance you know exactly the setting you’re looking for.
Search for content.cors.disable.
No need to type true or false. To toggle the boolean value, simply click the toggle button on the far right. CORS will then be disabled for Firefox.
Method 2: Apache Settings (recommended)
On your Apache server, head over to /opt/lampp/etc/httpd.conf and locate the lines that declare the directory you’re looking to allow CORS on. Once you find the <Directory> block, drop inside and add the following lines (this requires the mod_headers module to be enabled):
Header set Access-Control-Allow-Origin "*"
Header set Access-Control-Allow-Headers "*"
At this point, you can restart your server and you should be good to go. Don’t forget to clear your cache!
Today I ran into a situation where I was getting a new user registration every half hour for the entire afternoon. I came across this solution that will block registrations by email domain.
The first snippet blocks registrations by email domain. This simple denial of registration does not require WordFence; it’s just a really useful filter you can drop into your functions.php.
add_filter( 'registration_errors', 'disable_user_registration_for_email_domain', 10, 3 );
function disable_user_registration_for_email_domain ( $errors, $sanitized_user_login, $user_email ) {
    // only if it's an email address at all
    if ( ! is_email( $user_email ) ) {
        return $errors;
    }
    // get domain from email address
    $email_domain = substr( $user_email, strrpos( $user_email, '@' ) + 1 );
    $block_domains = [ // partial domain names allowed (doesn't need to include the TLD, for example)
        'spammersgalore.com',
    ];
    foreach ( $block_domains as $domain_partial ) {
        if ( stripos( $email_domain, $domain_partial ) !== false ) {
            // throw registration error
            $errors->add( 'email_error', '<strong>ERROR</strong>: Registration not allowed.' );
        }
    }
    return $errors;
}
Take it one step further by adding the IP address to WordFence’s blocked IPs list. This isn’t permanent; the default block duration is 4 hours. However, if the problem persists, you can make the block permanent within the WordFence GUI. If you get WordFence emails from your website, they will include the IP of the blocked user, so you can go back and permanently block IPs whose lockout duration may have already expired. (This extended version replaces the first snippet; don’t keep both in functions.php, since the function name is the same.)
add_filter( 'registration_errors', 'disable_user_registration_for_email_domain', 10, 3 );
function disable_user_registration_for_email_domain ( $errors, $sanitized_user_login, $user_email ) {
    // only execute when the relevant WordFence class methods can be called
    if ( ! is_callable( array( 'wfUtils', 'getIP' ) ) || ! is_callable( array( 'wfBlock', 'isWhitelisted' ) ) || ! is_callable( array( 'wordfence', 'lockOutIP' ) ) ) {
        return $errors;
    }
    // only if it's an email address at all
    if ( ! is_email( $user_email ) ) {
        return $errors;
    }
    // get domain from email address
    $email_domain = substr( $user_email, strrpos( $user_email, '@' ) + 1 );
    $block_domains = [ // partial domain names allowed (doesn't need to include the TLD, for example)
        'baikcm.ru',
    ];
    foreach ( $block_domains as $domain_partial ) {
        if ( stripos( $email_domain, $domain_partial ) !== false ) {
            $IP = \wfUtils::getIP();
            if ( \wfBlock::isWhitelisted( $IP ) ) {
                return $errors; // don't block whitelisted IPs
            }
            // lock out the IP
            \wordfence::lockOutIP( $IP, "Registration attempt from blocked email domain {$domain_partial}" );
            // throw registration error
            $errors->add( 'email_error', '<strong>ERROR</strong>: Registration not allowed.' );
        }
    }
    return $errors;
}
To search and replace using phpMyAdmin, first you’ll need to select and open the database you want to run the search and replace query on. Once you’ve selected it, you’ll see the tabs across the top of the screen change.
Execute Query
Click on the SQL tab at the top, and you’ll be greeted by a large textbox. This is where you can run your SQL queries like such:
-- the field inside REPLACE() is the same field you're updating
UPDATE `table_name`
SET `field_name` = REPLACE(`field_name`, 'unwanted_text', 'wanted_text');
Keep In Mind When Using Search and Replace
You can only target one table at a time when running queries like this. It’s a good thing and a bad thing. Good because it prevents huge data loss across multiple tables with one wrong SQL query (there is no undo). Bad because it makes it kinda difficult to make large scale changes with a few keystrokes. Depends on your situation and what you’re trying to accomplish.
My Use Case
I was recently transitioning a WordPress site away from Divi and moving into a custom Gutenberg block theme. When I made the transition, there was lots of code embedded in all my posts that was left over from the Divi builder.
Various shortcodes left over indicated what kind of Divi module it used to be, what the layout was, what version of the builder I was using at the time, etc.
So rather than go in and manually clean up well over a hundred posts, I thought it was better to just run a search and replace query against the database to get rid of that extra code, making the transition a lot easier, and WordPress would have better luck converting existing posts over to Gutenberg blocks.
Using Search and Replace to Delete Stuff
With regard to the query outlined above, you can modify it to replace your target string with an empty one, which basically deletes it.
UPDATE `table_name`
SET `field_name` = REPLACE(`field_name`, 'unwanted_text', '');
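In my Divi cleanup case, for instance, a query along these lines (assuming the default wp_ table prefix; the closing shortcode shown is just an illustration) strips a leftover shortcode out of every post’s content:
UPDATE `wp_posts`
SET `post_content` = REPLACE(`post_content`, '[/et_pb_text]', '');
Opening Divi shortcodes usually carry attributes, so they vary from post to post and take a few more passes (or a smarter tool) than a plain REPLACE.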
Using Search and Replace Across Multiple Tables
Since you’re not allowed to do this with a single SQL query, the best way to do something like this is to dump the database and run a search and replace using your text editor of choice locally. Once that’s done, you can import the edited dump back into phpMyAdmin, overwriting your existing database. (Best to do all of this locally before trying anything live.)
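A rough sketch of that workflow from the command line — the database name, credentials, and search strings are placeholders, and keep in mind that a blind find-and-replace can corrupt serialized data stored by WordPress (something a tool like WP-CLI’s search-replace command handles for you):
# dump the database to a local file
mysqldump -u root -p my_database > dump.sql
# search and replace in the dump (any editor works; sed shown here)
sed -i 's/unwanted_text/wanted_text/g' dump.sql
# re-import, overwriting the existing tables
mysql -u root -p my_database < dump.sql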