@thomasst
Created May 14, 2015 18:26
Migrate Redis data on Amazon ElastiCache
"""
Copies all keys from the source Redis host to the destination Redis host.
Useful to migrate Redis instances where commands like SLAVEOF and MIGRATE are
restricted (e.g. on Amazon ElastiCache).
The script scans through the keyspace of the given database number and uses
a pipeline of DUMP and RESTORE commands to migrate the keys.
Requires Redis 2.8.0 or higher.
Python requirements:
click==4.0
progressbar==2.3
redis==2.10.3
"""
import click
from progressbar import ProgressBar
from progressbar.widgets import Percentage, Bar, ETA
import redis
from redis.exceptions import ResponseError

@click.command()
@click.argument('srchost')
@click.argument('dsthost')
@click.option('--db', default=0, help='Redis db number, default 0')
@click.option('--flush', default=False, is_flag=True, help='Delete all keys from destination before migrating')
def migrate(srchost, dsthost, db, flush):
    if srchost == dsthost:
        print 'Source and destination must be different.'
        return

    source = redis.Redis(srchost, db=db)
    dest = redis.Redis(dsthost, db=db)

    if flush:
        dest.flushdb()

    size = source.dbsize()
    if size == 0:
        print 'No keys found.'
        return

    progress_widgets = ['%d keys: ' % size, Percentage(), ' ', Bar(), ' ', ETA()]
    pbar = ProgressBar(widgets=progress_widgets, maxval=size).start()

    COUNT = 2000  # SCAN batch size

    cnt = 0
    non_existing = 0
    already_existing = 0

    cursor = 0
    while True:
        cursor, keys = source.scan(cursor, count=COUNT)

        # Fetch TTL and serialized value for each key in one round trip.
        pipeline = source.pipeline()
        for key in keys:
            pipeline.pttl(key)
            pipeline.dump(key)
        result = pipeline.execute()

        # Queue RESTORE commands on the destination for keys that still exist.
        pipeline = dest.pipeline()
        restored = []  # (key, dump) pairs actually queued for RESTORE
        for key, ttl, data in zip(keys, result[::2], result[1::2]):
            if ttl is None:
                # redis-py returns None from PTTL when the key has no expiry
                ttl = 0
            if data is not None:
                pipeline.restore(key, ttl, data)
                restored.append((key, data))
            else:
                # Key expired or was deleted between SCAN and DUMP.
                non_existing += 1

        results = pipeline.execute(raise_on_error=False)
        for (key, data), result in zip(restored, results):
            if result != 'OK':
                e = result
                if hasattr(e, 'message') and (e.message == 'BUSYKEY Target key name already exists.' or e.message == 'Target key name is busy.'):
                    already_existing += 1
                else:
                    print 'Key failed:', key, `data`, `result`
                    raise e

        if cursor == 0:
            break

        cnt += len(keys)
        pbar.update(min(size, cnt))

    pbar.finish()
    print 'Keys disappeared on source during scan:', non_existing
    print 'Keys already existing on destination:', already_existing

if __name__ == '__main__':
    migrate()
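
For reference, a typical invocation of the click command above looks like the following; the filename migrate.py and the hostnames are placeholders, not the gist's actual values:

  pip install click==4.0 progressbar==2.3 redis==2.10.3
  python migrate.py old-redis.example.com new-redis.example.com --db 0 --flush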

papile commented Aug 4, 2015

Very nice. This worked perfectly to copy my DB over to AWS ElastiCache.


fredrik commented Sep 20, 2016

Thanks for this, @thomasst!


sourabh100486 commented Feb 10, 2017

Thanks. Saved a lot of time. (y)


TechCanuck commented Mar 3, 2017

Saved us a considerable amount of time -- super appreciated!

@kitwalker12

I added a couple of settings to my fork to add auth and specify the port.
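
For anyone who doesn't want to dig through the fork, here is a minimal sketch of what such settings could look like with click and redis-py. The option names, defaults, and hostnames are illustrative only, not necessarily what the fork uses:

import click
import redis

@click.command()
@click.argument('srchost')
@click.argument('dsthost')
@click.option('--port', default=6379, help='Redis port, default 6379')
@click.option('--password', default=None, help='Password sent with AUTH, if the server requires one')
@click.option('--db', default=0, help='Redis db number, default 0')
def check(srchost, dsthost, port, password, db):
    # Both connections reuse the same port/password here for simplicity.
    source = redis.Redis(srchost, port=port, db=db, password=password)
    dest = redis.Redis(dsthost, port=port, db=db, password=password)
    click.echo('source ping: %s, dest ping: %s' % (source.ping(), dest.ping()))

if __name__ == '__main__':
    check()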

@hylaride

You saved us a hassle and a half, as MIGRATE is disabled on AWS ElastiCache! Thank you so much.


tourhop commented Jan 30, 2018

I used this script successfully to migrate data on multiple AWS environments, but one instance (with the most data) is failing due to

  File "/usr/local/lib/python2.7/site-packages/redis/connection.py", line 208, in read
    return data[:-2]
MemoryError

Any clues on what could be tweaked to migrate this redis instance?
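
In case it helps anyone else hitting this: COUNT in the script above is the SCAN batch size, and every batch DUMPs up to that many values into a single pipeline reply, so a batch containing very large values has to fit in memory all at once. A possible (untested) tweak is simply lowering it, e.g.:

COUNT = 200  # smaller SCAN/DUMP batches -> smaller pipeline replies held in memory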

@metamatik

Thank you @thomasst!

Just in case anybody stumbles upon this gist, I forked @kitwalker12's fork (thanks too!), updated it to support Python 3 and added a couple of options to handle SSL connections and basic filtering on the names of the keys to migrate. The result can be found here :)
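
For readers who only need the connection and filtering pieces rather than the whole fork, a rough sketch of what they look like in redis-py; the hostname, key pattern, and parameter values below are placeholders, not taken from the fork:

import redis

# TLS connection, e.g. an ElastiCache cluster with in-transit encryption
# enabled; the hostname is a placeholder.
source = redis.Redis('source-host.example.com', port=6379, ssl=True)

# Basic key-name filtering: SCAN's MATCH argument limits which keys are
# visited, so only matching keys would be DUMPed and RESTOREd.
for key in source.scan_iter(match='sessions:*', count=1000):
    print(key)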


r44 commented Mar 1, 2018

👍

@learndevops007

Is there any way to overwrite the values of keys when a key's value has changed on the source and the script is run again? Currently, if I run the script a second time after a key's value has changed on the source, that change is not reflected on the destination host.

@prashanth612

Hello, you can use restore(name, ttl, value, replace=True) to overwrite the keys every time you copy.
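
For reference, a minimal sketch of that approach. It assumes a destination Redis >= 3.0 (which added RESTORE ... REPLACE) and a redis-py release whose restore() accepts the replace keyword, which may be newer than the 2.10.3 pinned in the gist; the hostnames are placeholders:

import redis

src = redis.Redis('source-host.example.com')
dst = redis.Redis('dest-host.example.com')

for key in src.scan_iter(count=1000):
    ttl = src.pttl(key)
    data = src.dump(key)
    if data is None:
        continue  # key expired or was deleted mid-scan
    # replace=True sends RESTORE ... REPLACE, overwriting an existing key
    # instead of failing with a BUSYKEY error.
    dst.restore(key, ttl if ttl and ttl > 0 else 0, data, replace=True)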


joshmbg commented Apr 18, 2018

I'm getting the error redis.exceptions.ResponseError: DUMP payload version or checksum are wrong. Any idea what that means?

@nishant-jain

Thanks a lot! 👍


ruchit47 commented Nov 9, 2019

Thanks a ton. You made migrating on ElastiCache easier. I tried a lot of other ways, but this was the fastest option.

@marianodav

Super fast.
