
I have 200 gdbs, with the same structure: one dataset with five features (see below)

[screenshot: geodatabase structure showing one dataset containing five feature classes]

I need to copy the feature class named "GSUP_USO_Vegetacao" from all 200 GDBs to another GDB, and name each output feature class "GSUP_USO_Vegetacao_&lt;name of gdb&gt;" ("GSUP_USO_Vegetacao_A1" for the example above). I tried using arcpy functions, but I'm stuck on this script:

import arcpy
import os

workspace = # geodatabase path
outgdb = # new geodatabase path named "all.gdb"
search = "GSUP_USO_Vegetacao"
fc = []
walk = arcpy.da.Walk(workspace, datatype="FeatureClass", type="Polygon")
for gdb, datasets, features in arcpy.da.Walk(os.environ.workspace):
    for dataset in datasets:
        for feature in fc:
            arcpy.CopyFeatures_management(feature, os.path.join(outgdb, "GSUP_USO_Vegetacao" + gdb)

What do I need to do with this script?

asked Apr 29, 2022 at 13:44

2 Answers


Assuming all of the geodatabases are within the same folder, you just need to iterate through that folder and save a copy of the feature class under its new name. Just change the folder_of_gdbs and output_gdb_path variables to match your environment. It's not necessary to iterate through the datasets once you set the workspace, because table names must be unique throughout a geodatabase.

Edit: this script doesn't create the output geodatabase, so you must create it first (for example with CreateFileGDB).

import os
import arcpy

input_fc_name = 'GSUP_USO_Vegetacao'
folder_of_gdbs = r'C:\Path\To\FOLDER\Containing\GDBS'
output_gdb_path = r'C:\Path\To\Output.gdb'

# assumes all gdbs are in folder_of_gdbs
input_gdbs = [g for g in os.listdir(folder_of_gdbs) if g.upper().endswith('.GDB')]
for gdb in input_gdbs:
    arcpy.env.workspace = os.path.join(folder_of_gdbs, gdb)  # set workspace
    out_fc_name = input_fc_name + '_' + os.path.splitext(gdb)[0]  # gets rid of extension (e.g. ".gdb")
    out_fc_path = os.path.join(output_gdb_path, out_fc_name)
    arcpy.AddMessage(out_fc_name)
    arcpy.management.Copy(input_fc_name, out_fc_path)
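The naming scheme itself is plain Python, so you can sanity-check it without arcpy (the gdb names below are hypothetical stand-ins for whatever os.listdir returns):

```python
import os

input_fc_name = 'GSUP_USO_Vegetacao'
# Hypothetical gdb folder names, as os.listdir() would return them
for gdb in ['A1.gdb', 'A2.gdb']:
    # os.path.splitext() splits off the ".gdb" extension
    out_fc_name = input_fc_name + '_' + os.path.splitext(gdb)[0]
    print(out_fc_name)
# prints:
# GSUP_USO_Vegetacao_A1
# GSUP_USO_Vegetacao_A2
```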
answered Apr 29, 2022 at 15:30

Here's an attempt at solving your problem by making a few modifications to the code in this other answer.

# Import system modules
import arcpy
import os
import pathlib

def export_layers_to_single_gdb(input_gdb_folder,
                                target_dataset_name,
                                target_layer_name,
                                output_gdb,
                                overwrite_output_gdb):
    '''
    Function that reads an input folder with multiple GDB files and copies one
    specific layer from each GDB to the output GDB.

    Parameters
    ----------
    input_gdb_folder : str
        The input folder that contains all the GDBs that will be looked into
    target_dataset_name : str
        The name of the dataset to search for the target layer. If the target
        layer is not contained in a dataset, just use either an empty string or
        None for this value: '' or None
    target_layer_name : str
        The name of the layer to be copied from all input GDBs to the
        consolidated output GDB
    output_gdb : str
        Path for the consolidated output GeoDataBase (.gdb) file.
    overwrite_output_gdb : boolean
        Flag that indicates whether or not the consolidated output GDB file
        should be overwritten in this process in case a file with the same name
        already exists on disk.

    Returns
    -------
    None.

    Outputs
    -------
    This function copies the target layer from every input GDB into a single
    master GDB file.

    '''
    # Fishing out the output folder and the GDB filename
    output_gdb_path = pathlib.Path(output_gdb)
    output_folder = str(output_gdb_path.parent)
    master_gdb = output_gdb_path.stem + output_gdb_path.suffix

    # Creating the master consolidated output GDB. This will store the
    # finalized combined layers.
    master_gdb_fullpath = os.path.join(output_folder, master_gdb)
    if os.path.exists(master_gdb_fullpath) and overwrite_output_gdb:
        arcpy.Delete_management(master_gdb_fullpath)
    if not os.path.exists(master_gdb_fullpath):
        arcpy.CreateFileGDB_management(output_folder, master_gdb)

    # Get the list of all GDBs in the input folder
    list_of_input_gdbs = [f for f in os.listdir(input_gdb_folder)
                          if f.lower().endswith('.gdb')]

    # Looping over every input GDB file
    for input_gdb in list_of_input_gdbs:
        # Getting the full path of the input GDB
        input_gdb_fullpath = os.path.join(input_gdb_folder, input_gdb)
        arcpy.env.workspace = input_gdb_fullpath

        # Copying the feature class from the input GDB to the master GDB
        print(f"COPYING: {target_layer_name} FROM: {input_gdb}")
        if (target_dataset_name is None) or (target_dataset_name == ''):
            fcCopy = os.path.join(input_gdb_fullpath,
                                  target_layer_name)
        else:
            fcCopy = os.path.join(input_gdb_fullpath,
                                  target_dataset_name,
                                  target_layer_name)

        # Drop the ".gdb" extension so the output name is a valid
        # feature class name (e.g. "GSUP_USO_Vegetacao_A1")
        new_layer_name = target_layer_name + '_' + os.path.splitext(input_gdb)[0]

        arcpy.FeatureClassToFeatureClass_conversion(
            fcCopy, master_gdb_fullpath, new_layer_name)

By the way, I wasn't able to test it myself, so please let me know if it crashes at some step.
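The pathlib bookkeeping at the top of the function can at least be verified on its own, without arcpy (the output path below is hypothetical; forward slashes also work in arcpy paths):

```python
import pathlib

output_gdb = 'C:/Data/Output/all.gdb'  # hypothetical consolidated GDB path
output_gdb_path = pathlib.Path(output_gdb)
output_folder = str(output_gdb_path.parent)                 # 'C:/Data/Output'
master_gdb = output_gdb_path.stem + output_gdb_path.suffix  # 'all.gdb'
print(output_folder, master_gdb)
```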

answered Apr 29, 2022 at 16:15
