One of the most interesting tasks in the work of an application administrator, in my opinion, is data migration when moving to a new system. Today I want to share my experience of transferring data from the not-so-well-known VisionFlow helpdesk to the better-known ServiceNow.
What the customer wanted
Transfer all data from VisionFlow to ServiceNow, keeping the registration and closing dates of tickets
Move the entire history of correspondence for each ticket (merging all comments into a single thread would have been enough, but we went a little further)
Move all files attached to tickets
What we had
The server version of the VisionFlow helpdesk deployed on a Linux virtual machine, with a MySQL database for storage.
A ServiceNow instance with a table prepared for the customer in advance.
At this stage, all the nuances were discussed, such as:
Status model
Required fields
Logic of automatic assignment of tickets to the executor
Data to be transferred
Data transfer
ServiceNow allows you to use Excel files as a source for importing data. I will not describe the import process in detail (it is well covered in the product documentation), but in general terms it looks like this:
Data import
The transform map lets us set a key (coalesce) field, by which the system understands that a record with these parameters is already present in the table and only its fields need to be updated
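To make the idea concrete, here is a toy in-memory model of how a key (coalesce) field behaves during import: a row whose key already exists updates the existing record instead of creating a new one. Field values below are made up for illustration.

```python
# Toy model of an import with a coalesce key: an in-memory "table"
# keyed by ticket number (hypothetical data, not the ServiceNow engine)
table = {}

def import_row(row, key='number'):
    existing = table.get(row[key])
    if existing is not None:
        existing.update(row)          # key matched: update fields only
    else:
        table[row[key]] = dict(row)   # no match: insert a new record

import_row({'number': 'TKT-1', 'status': 'Open'})
import_row({'number': 'TKT-1', 'status': 'Closed'})  # same key: update, not insert
print(len(table))  # -> 1
```

With the ticket number as the coalesce field, re-running an import is safe: matching rows only refresh the existing records.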
Next we need to prepare the xlsx file itself, and for that the data has to be pulled out of VisionFlow's MySQL database. The query used:
Query to the VisionFlow database
SELECT
projectissue.projectIssueId,
projectissue.ticketId as 'Number',
reporter.email as 'Reporter',
projectissue.name as 'Short Description',
projectissue.Description as 'Description',
projectissue.companycustomfield15 as 'Product',
projectissue.companycustomfield13 as 'Document',
issuestatus.name as 'Status',
assignee.name as 'Assignee',
ADDTIME(projectissue.CreateDate, '-01:00') as 'Created',
ADDTIME(projectissue.completionDate, '-01:00') as 'Closed',
issuehistory.EventText as 'Comment',
author.name as 'commentAuthor'
FROM
projectissue
INNER JOIN issuestatus
ON projectissue.IssueStatusId = issuestatus.IssueStatusId
INNER JOIN systemuser assignee
ON projectissue.ResponsibleSystemUserId = assignee.SystemUserId
INNER JOIN systemuser reporter
ON projectissue.CreatedBySystemUserId = reporter.SystemUserId
INNER JOIN issuehistory ON
issuehistory.ProjectIssueId = projectissue.ProjectIssueId
INNER JOIN systemuser author
ON issuehistory.SystemUserId = author.SystemUserId
WHERE
projectissue.ProjectId = 54 AND issuehistory.IssueEventTypeId IN (2, 5, 10)
ORDER BY projectissue.TicketId ASC, issuehistory.EventDate ASC
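The ADDTIME(..., '-01:00') calls in the query shift the stored timestamps back by one hour, presumably to align VisionFlow's local time with the target instance. The same adjustment in Python, as a small sketch:

```python
from datetime import datetime, timedelta

def shift_hour(ts: str) -> str:
    # Equivalent of MySQL ADDTIME(ts, '-01:00'):
    # subtract one hour from the stored local timestamp
    dt = datetime.strptime(ts, '%Y-%m-%d %H:%M:%S') - timedelta(hours=1)
    return dt.strftime('%Y-%m-%d %H:%M:%S')

print(shift_hour('2021-03-15 10:30:00'))  # -> 2021-03-15 09:30:00
```

A fixed offset only works because the interval is constant here; for arbitrary timezones a proper tz-aware conversion would be safer.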
The query returns one row per comment of each ticket. The result is exported to an Excel file (JSON would also work), which is then attached to the ServiceNow Data Source and run through the import / transform.
A pleasant side effect: since ServiceNow matched every imported row to the same ticket by the key field, each row appended its comment as a separate entry. In other words, instead of one merged blob of text, every ticket received a proper thread of comments, with the original author recorded for each entry.
The second (and, to my mind, the more interesting) part of the job was moving the attachments. First we need to find out where VisionFlow stores the files and how to match them to tickets.
The query:
Query to the VisionFlow database
SELECT
document.documentId,
document.name,
document.FullPath,
SUBSTRING_INDEX(SUBSTRING_INDEX(document.FullPath, '/', -2), '/', 1) as 'projectIssueId',
projectissue.ticketId as 'Number'
FROM
visionflow.document
INNER JOIN projectissue
ON projectissue.ProjectIssueId = SUBSTRING_INDEX(SUBSTRING_INDEX(document.FullPath, '/', -2), '/', 1)
WHERE
document.FullPath like '%/54/issuedocuments/%'
ORDER BY projectissueid
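The nested SUBSTRING_INDEX calls extract the folder name (the internal issue id) from the stored file path. The same parsing in Python, assuming the path layout shown in the query's WHERE clause:

```python
def issue_id_from_path(full_path: str) -> str:
    # FullPath looks like ".../54/issuedocuments/<projectIssueId>/<file>",
    # so the second-to-last path segment is the issue id; this mirrors
    # SUBSTRING_INDEX(SUBSTRING_INDEX(FullPath, '/', -2), '/', 1)
    return full_path.split('/')[-2]

print(issue_id_from_path('/home/tomcat/vflowdocs/54/issuedocuments/8123/report.pdf'))
# -> 8123
```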
As it turned out, VisionFlow stores attachments in the server's file system, and the document table keeps the full path to each file. Within VF, the files of each ticket live in their own folder under issuedocuments, and that folder is named after the internal issueId rather than the ticket number. This is why the query also pulls the TicketId: it is what lets us match a folder to the corresponding record already created in ServiceNow.
Now the files have to be delivered to ServiceNow, and since there were a couple of thousand of them, I decided to automate the process with Python.
ServiceNow provides a REST API for working with attachments; the endpoint is described in the SN documentation, along with code samples. The request takes the following parameters:
file_name (Required) - the name the file will get in ServiceNow
table_name (Required) - the table that contains the record the file is attached to
table_sys_id (Required) - sys_id of the record the file is attached to
Content-Type (Header) - MIME type of the file
So to upload a file we need the sys_id of the target record, while the folders on disk are named after the VisionFlow issueId, and VisionFlow, of course, knows nothing about sys_id. The mapping had to be built by hand: I exported sys_id + ticketId from ServiceNow and issueId + ticketId from VisionFlow, then matched them with VLOOKUP in Excel into a csv file with three columns:
old_folder_name
ticket_id
new_folder_name
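The same matching can also be done without Excel. A pandas sketch of that VLOOKUP step, with hypothetical column names and ids standing in for the real exports:

```python
import pandas as pd

# Hypothetical exports: sys_id + number pulled from ServiceNow after the
# ticket import, issue_id + ticket_id pulled from VisionFlow
sn = pd.DataFrame({'number': ['TKT-1', 'TKT-2'],
                   'sys_id': ['sysid_a', 'sysid_b']})
vf = pd.DataFrame({'ticket_id': ['TKT-1', 'TKT-2'],
                   'issue_id': ['8123', '8124']})

# Join the two exports on the ticket number (the VLOOKUP step)
mapping = vf.merge(sn, left_on='ticket_id', right_on='number', how='left')
mapping = mapping.rename(columns={'issue_id': 'old_folder_name',
                                  'sys_id': 'new_folder_name'})
mapping = mapping[['old_folder_name', 'ticket_id', 'new_folder_name']]
mapping.to_csv('folder_rename.csv', index=False)
```

A left join keeps every VisionFlow folder in the output, so folders without a ServiceNow match show up with an empty new_folder_name instead of silently disappearing.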
A small Python script then renames the folders from issueId to sys_id and removes the folders that are left empty:
import os
import pandas as pd
from tqdm import tqdm

DOCS_DIR = '/Downloads/home/tomcat/vflowdocs/54/issuedocuments'

def renameFolders():
    df = pd.read_csv('/Downloads/folder_rename.csv')
    pbar = tqdm(total=len(df))
    for _, row in df.iterrows():
        old_name = row['old_folder_name']
        new_name = row['new_folder_name']
        try:
            os.rename(f'{DOCS_DIR}/{old_name}', f'{DOCS_DIR}/{new_name}')
        except OSError:
            pass  # folder already renamed or missing
        finally:
            pbar.update(1)

def removeEmptyFolders():
    for folder in os.listdir(DOCS_DIR):
        path = f'{DOCS_DIR}/{folder}'
        try:
            os.rmdir(path)  # succeeds only for empty folders
        except OSError:
            # a non-empty folder that still has a short issueId name means
            # no matching sys_id was found for it in ServiceNow
            if len(os.path.basename(path)) < 6 and os.path.basename(path) != 'nan':
                print(f'ServiceNow SysId not found for item: {os.path.basename(path)}')

renameFolders()
removeEmptyFolders()
Before the upload itself, a few points had to be handled:
Files smaller than 3000 bytes are skipped: these turned out to be auto-generated service files (signatures, logos and the like) - see def getSize()
VisionFlow stored some files in duplicate, so duplicates are filtered by name - see def removeDuplicatesByName()
For some files the detected mime type came back as None. For those the type is guessed from the extension with mimetypes (*msg, *txt, *eml)
The outcome of every upload (status, file name, path, target record) is written to a log file
import os, glob, filetype, requests, mimetypes
from tqdm import tqdm
import pandas as pd

DOCS_DIR = '/Downloads/home/tomcat/vflowdocs/54/issuedocuments'

def number_of_files():
    files_number = 0
    for folder in os.listdir(DOCS_DIR):
        files_number += len(os.listdir(f'{DOCS_DIR}/{folder}/'))
    return files_number

#Progress bar (upper bound: duplicates and small stubs are filtered out later)
pbar = tqdm(total=number_of_files())

log_messages_status = []
log_messages_filepath = []
log_messages_filename = []
log_messages_target = []

def uploadAllFiles(folder_name):
    entire_list = glob.glob(f'{DOCS_DIR}/{folder_name}/*')

    #Get file size in bytes
    def getSize(path_to_file):
        return os.path.getsize(path_to_file)

    #Upload a single file through the Attachment API
    def uploadFunc(filename, sys_id, path_to_file, content_type):
        url = (f'https://instance.service-now.com/api/now/attachment/file'
               f'?file_name={filename}&table_name=table_name&table_sys_id={sys_id}')
        with open(path_to_file, 'rb') as f:
            payload = f.read()
        headers = {
            'Accept': 'application/json',
            'Authorization': 'Bearer ',  # access token omitted
            'Content-Type': content_type,
        }
        response = requests.post(url, headers=headers, data=payload)
        log_messages_status.append('Success' if response.status_code == 201 else 'Error')
        log_messages_filename.append(filename)
        log_messages_filepath.append(path_to_file)
        log_messages_target.append(sys_id)
        pbar.update(1)

    #Keep only the first file for each base name (VisionFlow stores some files twice)
    def removeDuplicatesByName(list_of_elements):
        seen = set()
        unique = []
        for item in sorted(list_of_elements):
            stem = os.path.basename(item).split('.')[0]
            if stem not in seen:
                seen.add(stem)
                unique.append(item)
        return unique

    my_list = removeDuplicatesByName(entire_list)
    #Skip auto-generated stubs smaller than 3000 bytes
    my_list_updated = [item for item in my_list if getSize(item) > 3000]

    for attach in my_list_updated:
        kind = filetype.guess_mime(attach)
        if kind is None:
            #Fall back to guessing by extension; anything still unknown
            #goes up as a generic binary stream
            kind, _ = mimetypes.guess_type(attach)
        uploadFunc(os.path.basename(attach),
                   os.path.dirname(attach).split('/')[-1],  # folder name = sys_id
                   attach,
                   kind or 'application/octet-stream')

def getFolders():
    for folder in os.listdir(DOCS_DIR):
        if folder != '.DS_Store':
            uploadAllFiles(folder)

getFolders()

data_to_write = pd.DataFrame({
    'status': log_messages_status,
    'file_name': log_messages_filename,
    'file_path': log_messages_filepath,
    'target': log_messages_target
})
data_to_write.to_csv('/Downloads/results_log.csv')
"We had two bags of grass..." ©. We had 6,000,000 records to transfer (not that many: the old system had not been in service for long), about 2,000 attachments, and not much time. Preparation (reading up, trial and error, etc.) took me about 14 hours of unhurried work, while the transfer itself takes about 30 minutes in total.
Of course, a lot could still be improved, and the process could be fully automated end to end (from data export to upload), but, unfortunately, this is a one-off task. It was interesting to try Python on this project, and I can say it handled the job admirably.
Ultimately, the main goal of any migration is to make it as unnoticeable as possible for the customer, and on my side that goal was achieved.