I recently needed to update Search API Solr to 4.3.1, and I ran into some issues that I wasn't immediately clear on how to resolve.

To be clear, while I've used Search API Solr for a while now, it's pretty much a black box to me. I've always gotten the results I needed by following various tutorials and documentation, plus just enough experimentation that it seems to work like magic.

So, when I recently upgraded to 4.3.1, I was a bit dismayed by the release notes. They describe how, due to some changes made to support various features, your existing implementation will still be able to read existing indexes, but adding to or editing those indexes will cause problems. Specifically, you'd receive errors like:

cannot change field "xyz" from index options=DOCS_AND_FREQS_AND_POSITIONS to inconsistent index options=DOCS_AND_FREQS_AND_POSITIONS_AND_OFFSETS

Which is exactly what happened to me. 

However, I was stumped about how to solve the issue. The release notes state, "There're different ways to deal with the required update. But all require Solr knowledge." Well, that is not good news, since I don't have a lot of Solr knowledge.

The notes go on to describe the "safest way" to handle the update, which is to delete the "existing core". I wasn't entirely sure what the "core" was. Core is not terminology seen in the Search API UI, nor anything I can recall interacting with. I was pretty sure this wasn't the "server" or the "index", but that was about all I was sure of.

While looking through the issue queue, I found this issue, Updating to 4.3 Errors, in which mkalkbrenner points to the release notes. The next comment states that with Pantheon, we have no option to delete and recreate cores, and suggests closing the issue. That's where I quit reading. I didn't see any other explanations or steps in the queue, and I didn't see anything in the search_api_pantheon queue either. I searched Slack (Drupal and Pantheon), I searched Google, and I didn't find anything close to something I could follow.

Having exhausted my abilities, I opened a ticket with Pantheon, and the support engineer pointed back to the release notes and the issue I had already investigated. This time, however, I read the issue more thoroughly, and the comment just after the one that suggested I was barking up the wrong tree confirmed that the steps outlined by the issue reporter would actually work, with one extra step.

For that extra step, we need the search_api_solr_devel submodule, which provides a drush command that deletes *all* documents on a Solr server.

One thing I experienced contradicts both the release notes and what was mentioned in the issue: I never needed to delete (or create) a server or index to make this work.

Here is the complete list of commands:

terminus drush <site>.<env> -- en devel search_api_solr_devel search_api_solr_admin -y
terminus drush <site>.<env> -- sapps <server_id> /code/web/modules/contrib/search_api_solr/jump-start/solr8/config-set/
terminus drush <site>.<env> -- sapi-sd <server_id>
terminus drush <site>.<env> -- search-api-solr:devel-delete-all <server_id>
terminus drush <site>.<env> -- sapi-se <server_id>
terminus drush <site>.<env> -- solr-reload <server_id>
terminus drush <site>.<env> -- sapi-rt
terminus drush <site>.<env> -- sapi-i
terminus drush <site>.<env> -- pm-uninstall search_api_solr_devel search_api_solr_admin -y

Where:

<site>
Is the Pantheon site.
<env>
Is the Pantheon environment.
<server_id>
The Search API server created in Drupal.
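The whole sequence can be collected into a small wrapper script. This is a sketch of my own making, not something from the module's documentation: the site name, environment, and server ID are placeholders (on Pantheon the Search API server ID is often pantheon_solr8, but check yours at /admin/config/search/search-api), and it defaults to a dry run that only prints the commands so you can review them before touching anything.

```shell
#!/usr/bin/env bash
# Sketch of the full upgrade sequence. SITE, ENV, and SERVER_ID are
# placeholders; override them via the environment. DRY_RUN=1 (the default)
# prints each command instead of executing it.
set -euo pipefail

SITE="${SITE:-my-site}"
ENV="${ENV:-dev}"
SERVER_ID="${SERVER_ID:-pantheon_solr8}"
DRY_RUN="${DRY_RUN:-1}"

run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "terminus drush ${SITE}.${ENV} -- $*"
  else
    terminus drush "${SITE}.${ENV}" -- "$@"
  fi
}

main() {
  run en devel search_api_solr_devel search_api_solr_admin -y
  run sapps "$SERVER_ID" /code/web/modules/contrib/search_api_solr/jump-start/solr8/config-set/
  run sapi-sd "$SERVER_ID"
  run search-api-solr:devel-delete-all "$SERVER_ID"
  run sapi-se "$SERVER_ID"
  run solr-reload "$SERVER_ID"
  run sapi-rt
  run sapi-i
  run pm-uninstall search_api_solr_devel search_api_solr_admin -y
}

main
```

Run it once with the default DRY_RUN=1 to eyeball the commands, then again with DRY_RUN=0 to execute for real.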

Obviously you'll want to test this before you deploy, but here's the catch: first you need to be able to reproduce the problem, and to do that, you need a Solr instance with indexes already built. Now, if you are like me, you've already deployed the 4.3.1 version of search_api_solr (oops), so it's going to be a bit of a challenge to test in a new environment. I wound up testing in the Pantheon site's Dev environment. The important thing is that you are able to reproduce the error, which should be easy enough: just try to re-index, and if you get errors in the log, perfect. Fortunately, we don't need a bunch of config changes or anything; just make sure devel is in the codebase and that you don't have any config overrides that would affect search.
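To make "reproduce first" concrete, a minimal check might look like the following. This is a sketch, not from the release notes: the site name is a placeholder, and watchdog:show assumes the core Database Logging module is enabled. It prints the commands as a dry run; drop the leading "echo" to run them for real.

```shell
#!/usr/bin/env bash
# Dry-run sketch of the reproduction check: trigger an index run, then
# look for the "cannot change field ..." error in recent log entries.
set -euo pipefail

SITE="my-site"  # placeholder

reproduce_check() {
  echo terminus drush "${SITE}.dev" -- sapi-i
  echo terminus drush "${SITE}.dev" -- watchdog:show --severity=Error
}

reproduce_check
```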

So, let's go through the commands, because there are a few additional things to be aware of:

Step 1 - Install required modules

We need both search_api_solr_devel and search_api_solr_admin for the Drush commands.

terminus drush <site>.<env> -- en devel search_api_solr_devel search_api_solr_admin -y

Step 2 - Post config set

Search API Solr 4.3.1 ships with an updated config-set that needs to be deployed and should serve most cases. If you have a custom config-set, you'll need to deploy that instead; I don't know whether custom config-sets need any updates to remain compatible.

terminus drush <site>.<env> -- sapps <server_id> /code/web/modules/contrib/search_api_solr/jump-start/solr8/config-set/

You'll notice this command requires the server_id.  If you have multiple servers configured in Search API that use the same Solr instance, you should run this command for each server_id.

Step 3 - Disable Search API servers

According to danreb's comment, disabling the search API server "removes all the data in the disk".  This is actually counter to the instructions in the release notes, and maybe specific to Pantheon, so if you are not using Pantheon, you may need to follow the instructions that indicate you should create new search API servers.  However, in my tests, simply disabling the server is enough.

terminus drush <site>.<env> -- sapi-sd <server_id>

Again, if you have multiple search API servers, run this command for each server.

Step 4 - Delete all Solr documents

This command is made available by search_api_solr_devel.  

terminus drush <site>.<env> -- search-api-solr:devel-delete-all <server_id>

Step 5 - Enable Search API servers

Re-enable the search API servers.

terminus drush <site>.<env> -- sapi-se <server_id>

Step 6 - Reload Solr servers

From the search_api_solr_admin command doc, "Forces the Solr server to reload the core or collection to apply config changes."

terminus drush <site>.<env> -- solr-reload <server_id>

Step 7 - Rebuild index tracking information

This command differs from sapi-r in that it not only resets the tracker, it also deletes the existing indexed data.

terminus drush <site>.<env> -- sapi-rt

Step 8 - Index

Finally, we need to index our data.

terminus drush <site>.<env> -- sapi-i

If everything went according to plan, the indexing will work.  If you are still receiving errors, go back through the commands listed.  If it fails again, you might just do as recommended and create new servers / indexes.  Apparently, Solr has rename / alias capabilities, so you might be able to leverage those.
