Merged

91 commits
1f5234a
Create slf_generator class for generating storey loss functions
mouayed-nafeh Mar 7, 2025
900651e
Add tests to calibration, modeller and slf_generator
mouayed-nafeh Mar 7, 2025
11bd417
Changes to plotter class for styling
mouayed-nafeh Mar 7, 2025
eae6b98
Minor changes to modeller and plotter classes
mouayed-nafeh Mar 7, 2025
f6e0004
Change font type in plotter class to 'Arial'
mouayed-nafeh Mar 7, 2025
c31e2b9
Update README
mouayed-nafeh Mar 9, 2025
d0ed7dc
Update README
mouayed-nafeh Mar 9, 2025
5b4e959
Remove unnecessary classes from slf_generator
mouayed-nafeh Mar 10, 2025
105b33a
Minor fix in slf_generator
mouayed-nafeh Mar 10, 2025
fd0d3fd
Update README
mouayed-nafeh Mar 10, 2025
b207606
Merge branch 'slf-generator' of https://github.com/GEMScienceTools/oq…
mouayed-nafeh Mar 10, 2025
f649ee4
Refactor slf_generator
mouayed-nafeh Mar 11, 2025
b3fdf1c
Merge branch 'slf-generator' of https://github.com/GEMScienceTools/oq…
mouayed-nafeh Mar 11, 2025
81262e8
Refactor slf_generator and create example
mouayed-nafeh Mar 11, 2025
0f9cdd5
Finalise example_4
mouayed-nafeh Mar 11, 2025
c1ab8ec
Create README for demos
mouayed-nafeh Mar 11, 2025
cf17fc0
Update README
mouayed-nafeh Mar 14, 2025
ca7c443
Update contribute_guidelines.md
mouayed-nafeh Mar 14, 2025
a9d0f1f
Update README
mouayed-nafeh Mar 14, 2025
07e040c
Merge branch 'slf-generator' of https://github.com/GEMScienceTools/oq…
mouayed-nafeh Mar 14, 2025
1d0d17c
Update calibration class
mouayed-nafeh Mar 16, 2025
0d17d07
Update calibration/modeller classes test
mouayed-nafeh Mar 16, 2025
5670b51
Lump examples 2 and 3 into end-to-end demonstration of cloud analysis
mouayed-nafeh Mar 16, 2025
a8f3fd5
Remove out folder from demos
mouayed-nafeh Mar 17, 2025
50b10f1
Update get_fragility_function to account distinctively for both epist…
mouayed-nafeh Mar 18, 2025
39b037c
Add method to calculate rotated fragility function with unit test
mouayed-nafeh Mar 18, 2025
7b7265e
Fixed bug in target percentile for fragility function rotation
mouayed-nafeh Mar 18, 2025
97676a1
Revert manually last commit
mouayed-nafeh Mar 18, 2025
16b77f4
Fixed issues in fragility function rotation
mouayed-nafeh Mar 18, 2025
ad0c3d0
Fixed bug in cloud regression
mouayed-nafeh Mar 19, 2025
4222e61
Removed get_fragility_function
mouayed-nafeh Mar 20, 2025
50a644b
Add calculate_lognormal_fragility to postprocessor class
mouayed-nafeh Mar 20, 2025
6d10027
Add calculate_glm_fragility to postprocessor class with docstrings
mouayed-nafeh Mar 20, 2025
085297f
Add calculate_ordinal_fragility to postprocessor class with docstring
mouayed-nafeh Mar 20, 2025
a0651ba
Update calculate_rotated_fragility in postprocessor class
mouayed-nafeh Mar 20, 2025
f9e6d21
Updated do_cloud_analysis method in postprocessor class
mouayed-nafeh Mar 20, 2025
9e22bcb
Update docstrings in postprocessor class
mouayed-nafeh Mar 20, 2025
eb23231
Update postprocessor_test to account for changes in class
mouayed-nafeh Mar 20, 2025
ab4cfce
Update plotter class
mouayed-nafeh Mar 20, 2025
9ca97db
Bug fixes in plotter and update docstrings for slf_generator
mouayed-nafeh Mar 20, 2025
9c32b98
Update README.md to include test badges
mouayed-nafeh Mar 20, 2025
59120c5
Update README.md to include test badges
mouayed-nafeh Mar 20, 2025
bf1a040
Update README.md to include test badges
mouayed-nafeh Mar 20, 2025
f744502
Update docstring for calibration
mouayed-nafeh Mar 20, 2025
5532161
Merge branch 'slf-generator' of https://github.com/GEMScienceTools/oq…
mouayed-nafeh Mar 20, 2025
a6b3a54
Update docstrings for im_calculator
mouayed-nafeh Mar 20, 2025
95cffee
Update docstrings for modeller class
mouayed-nafeh Mar 20, 2025
5af8b4d
Update docstrings for plotter class
mouayed-nafeh Mar 20, 2025
e0e4348
Update docstrings for postprocessor class
mouayed-nafeh Mar 20, 2025
518a9ff
Fix bug in postprocessor
mouayed-nafeh Mar 20, 2025
f6e27fd
Fix bug in postprocessor
mouayed-nafeh Mar 20, 2025
28e6e49
Update README.md with missing package installation step
mouayed-nafeh Mar 21, 2025
0e87182
Create documentation folder
mouayed-nafeh Mar 21, 2025
b2ea4db
Uncomment statsmodels imports
mouayed-nafeh Mar 21, 2025
827f37e
Add docs folder for documentation
mouayed-nafeh Mar 21, 2025
131eb9f
Create HTML build for documentation
mouayed-nafeh Mar 21, 2025
119faf1
Rename .rst files for documentation
mouayed-nafeh Mar 21, 2025
013f0a3
Update imc.rst with syntax fixes
mouayed-nafeh Mar 21, 2025
74744d5
Update plo.rst with syntax fixes
mouayed-nafeh Mar 21, 2025
b3f8824
Update pos.rst with syntax fixes
mouayed-nafeh Mar 21, 2025
f860caf
Update slf.rst with syntax fixes
mouayed-nafeh Mar 21, 2025
082e0df
Update slf.rst with syntax fixes
mouayed-nafeh Mar 21, 2025
7dcf620
Update index.rst with syntax fixes
mouayed-nafeh Mar 21, 2025
9bfd518
Created build for HTML pages
mouayed-nafeh Mar 21, 2025
ee52254
Add least-square-regression as cloud analysis method (classical option)
mouayed-nafeh Mar 21, 2025
7f53fe4
Refactor MLE-based cloud fitting
mouayed-nafeh Mar 22, 2025
687e21e
Add experimental code to produce "dummy" parameters from GLM and ordi…
mouayed-nafeh Mar 22, 2025
2cf7c84
Update README.md
mouayed-nafeh Mar 23, 2025
7f061f6
Update test_postprocessor.py to account for change in output structure
mouayed-nafeh Mar 23, 2025
2173b36
Clean up utilities.py
mouayed-nafeh Mar 25, 2025
b16183f
add pydantic-2.10.6-py3-none-any.whl as new deps
vot4anto Mar 25, 2025
a0b743a
add statsmodels 0.14.4 as new deps
vot4anto Mar 25, 2025
11ac4a1
add new reqs also on pyproject.toml
vot4anto Mar 25, 2025
e6a8ad7
add py version to install requirements files
vot4anto Mar 25, 2025
eea347a
remove 64 in CI script
vot4anto Mar 25, 2025
b799c06
add installation of requirements files on windows
vot4anto Mar 25, 2025
3b273f9
use only python3.11 on windows
vot4anto Mar 26, 2025
5842714
change version of openseespy on python3.12
vot4anto Mar 26, 2025
57d7243
change version of openseespy on python3.12
vot4anto Mar 26, 2025
c07d472
test last version on linux
vot4anto Mar 26, 2025
56ebf06
Could not find a version that satisfies the requirement openseespywin…
vot4anto Mar 26, 2025
315f485
use opensees 3.7.0.4
vot4anto Mar 26, 2025
49b515e
use opensees 3.6.0.3
mouayed-nafeh Mar 26, 2025
770d931
openseespywin==3.7.0.3 on python3.11 windows
vot4anto Mar 26, 2025
7e2d527
add also python3.10 on windows for CI
vot4anto Mar 26, 2025
6c7f370
openseespy==3.7.0.3.1 openseespywin==3.7.0.3.1 on windows
vot4anto Mar 26, 2025
3e0fc1c
openseespy==3.7.0.3 and openseespywin==3.7.0.3.1 on windows
vot4anto Mar 26, 2025
111677e
dismiss python3.10 on windows
mouayed-nafeh Mar 26, 2025
1ff4faf
use openseespy 3.7.0.3 on windows
mouayed-nafeh Mar 26, 2025
94bb326
updated openseespy versions for windows
mouayed-nafeh Mar 26, 2025
fee9e01
Remove support of python3.10 on Windows from README
mouayed-nafeh Mar 26, 2025
4 changes: 4 additions & 0 deletions .github/workflows/linux_test.yml
@@ -40,6 +40,10 @@ jobs:
- name: Install vmtk Package
run: |
source ~/openquake/bin/activate
PY_VER=`echo py${{ matrix.python-version }} | tr -d .`
echo $PY_VER
export PIP_DEFAULT_TIMEOUT=100
pip install -r requirements-$PY_VER-linux.txt
pip install -e .
- name: Run tests
run: |
6 changes: 5 additions & 1 deletion .github/workflows/windows_test.yml
@@ -22,7 +22,7 @@ jobs:
fail-fast: false
matrix:
os: [windows-2022]
python-version: ["3.10", "3.11", "3.12"]
python-version: ["3.11", "3.12"]
steps:
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
@@ -37,6 +37,10 @@
run: |
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser
C:\Users\runneradmin\openquake\Scripts\activate.ps1
$PY_VER="py${{ matrix.python-version }}"
$py = $PY_VER.replace(".","")
set PIP_DEFAULT_TIMEOUT=100
python -m pip install -r requirements-$py-win64.txt
pip install -e .
- name: Run tests
run: |
305 changes: 305 additions & 0 deletions .virtual_documents/demos/example_2.ipynb
@@ -0,0 +1,305 @@






import os
import sys
import shutil
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Import the classes necessary for structural analysis
from openquake.vmtk.units import units                   # oq-vmtk units class
from openquake.vmtk.calibration import calibrate_model   # oq-vmtk sdof-to-mdof calibration function
from openquake.vmtk.modeller import modeller             # oq-vmtk numerical modelling class
from openquake.vmtk.postprocessor import postprocessor   # oq-vmtk postprocessing class
from openquake.vmtk.plotter import plotter               # oq-vmtk plotting class
from openquake.vmtk.utilities import sorted_alphanumeric, import_from_pkl, export_to_pkl  # oq-vmtk utility functions





# Define the directory of the ground-motion records
gm_directory = './in/records'

# Define the main output directory
nrha_directory = './out/nltha'
os.makedirs(nrha_directory, exist_ok=True)

# Define a directory for temporary analysis outputs: it stores the temporary .txt files used as acceleration recorders
temp_nrha_directory = os.path.join(nrha_directory,'temp')
os.makedirs(temp_nrha_directory, exist_ok=True)





# Import the intensity measure dictionary (output from example 1)
ims = import_from_pkl(os.path.join(gm_directory, 'imls_esrm20.pkl'))








# Number of storeys
number_storeys = 2

# Relative floor heights list
floor_heights = [2.80, 2.80]

# First-mode based participation factor
gamma = 1.33

# SDOF capacity (First row are Spectral Displacement [m] values - Second row are Spectral Acceleration [g] values)
sdof_capacity = np.array([[0.00060789, 0.00486316, 0.02420000, 0.04353684],
[0.10315200, 0.20630401, 0.12378241, 0.12502023]]).T
# Frame flag
isFrame = False

# Soft-storey mechanism flag
isSOS = False

# Degradation flag
mdof_degradation = True

# Inherent damping
mdof_damping = 0.05
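For context, the first-mode transformation (participation) factor gamma used above can be obtained from a mode shape and the floor masses. A minimal sketch with hypothetical values (illustrative only, not the values or the internal method of `calibrate_model`):

```python
import numpy as np

# Hypothetical two-storey example of a first-mode participation factor
masses = np.array([50.0, 50.0])   # floor masses [t] (illustrative values)
phi = np.array([0.5, 1.0])        # first-mode shape, normalised to the roof
gamma = (masses @ phi) / (masses @ phi**2)
print(round(gamma, 2))  # 1.2
```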





# Intensity measures to use for postprocessing cloud analyses
IMTs = ['PGA', 'SA(0.3s)', 'SA(0.6s)', 'SA(1.0s)','AvgSA']

# Damage thresholds (maximum peak storey drift values in rad)
damage_thresholds = [0.00150, 0.00545, 0.00952, 0.0135]

# Lower limit applied when censoring EDP values (below 0.1 times the minimum threshold, i.e. the slight-damage threshold, a record is treated as a negligible-damage case)
lower_limit = 0.1*damage_thresholds[0]

# Upper limit applied when censoring EDP values (above 1.5 times the maximum threshold, a record is treated as a collapse case)
censored_limit = 1.5*damage_thresholds[-1]

# Define consequence model to relate structural damage to a decision variable (i.e., expected loss ratio)
consequence_model = [0.05, 0.20, 0.60, 1.00] # damage-to-loss ratios
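As a rough illustration of how the censoring limits partition the demand data (assuming, as the comments above describe, that values below the lower limit are treated as negligible cases and values above the censored limit as collapse cases):

```python
import numpy as np

damage_thresholds = [0.00150, 0.00545, 0.00952, 0.0135]
lower_limit = 0.1 * damage_thresholds[0]       # 0.00015 rad
censored_limit = 1.5 * damage_thresholds[-1]   # 0.02025 rad

edps = np.array([0.0001, 0.004, 0.011, 0.030])  # hypothetical drift demands [rad]
negligible = edps < lower_limit                 # flagged as negligible-damage cases
collapse = edps > censored_limit                # flagged as censored/collapse cases
print(negligible.tolist())  # [True, False, False, False]
print(collapse.tolist())    # [False, False, False, True]
```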











# Calibrate the model using the Lu et al. (2020) method
floor_masses, storey_disps, storey_forces, mdof_phi = calibrate_model(number_storeys, gamma, sdof_capacity, isFrame, isSOS)

print('The mass of each floor (in tonnes):', floor_masses)
print('The first-mode shape used for calibration:', mdof_phi)

# Plot the capacities to visualise the outcome of the calibration
for i in range(storey_disps.shape[0]):
    plt.plot(np.concatenate(([0.0], storey_disps[i,:])), np.concatenate(([0.0], storey_forces[i,:]*9.81)), label=f'Storey #{i+1}')
plt.plot(np.concatenate(([0.0], sdof_capacity[:,0])), np.concatenate(([0.0], sdof_capacity[:,1]*9.81)), label='SDOF Capacity')
plt.xlabel('Storey Deformation [m]', fontsize=16)
plt.ylabel('Storey Shear [kN]', fontsize=16)
plt.legend(loc='lower right')
plt.grid(visible=True, which='major')
plt.grid(visible=True, which='minor')
plt.xlim([0.00, 0.03])
plt.show()





# Initialise MDOF storage lists
conv_index_list = [] # List for convergence indices
peak_disp_list = [] # List for peak floor displacement (returns all peak values along the building height)
peak_drift_list = [] # List for peak storey drift (returns all peak values along the building height)
peak_accel_list = [] # List for peak floor acceleration (returns all peak values along the building height)
max_peak_drift_list = [] # List for maximum peak storey drift (returns the maximum value)
max_peak_drift_dir_list = [] # List for maximum peak storey drift directions
max_peak_drift_loc_list = [] # List for maximum peak storey drift locations
max_peak_accel_list = [] # List for maximum peak floor acceleration (returns the maximum value)
max_peak_accel_dir_list = [] # List for maximum peak floor acceleration directions
max_peak_accel_loc_list = [] # List for maximum peak floor acceleration locations

# Loop over ground-motion records, compile MDOF model and run NLTHA
gmrs = sorted_alphanumeric(os.listdir(os.path.join(gm_directory,'acc'))) # Sort the ground-motion records alphanumerically
dts = sorted_alphanumeric(os.listdir(os.path.join(gm_directory,'dts'))) # Sort the ground-motion time-step files alphanumerically

# Run the analysis
for i in range(len(gmrs)):
    ### Print the analysis iteration
    print('================================================================')
    print('============== Analysing: {:d} out of {:d} =================='.format(i+1, len(gmrs)))
    print('================================================================')

    ### Compile the MDOF model
    model = modeller(number_storeys,
                     floor_heights,
                     floor_masses,
                     storey_disps,
                     storey_forces*units.g,
                     mdof_degradation)   # Initialise the class (build the model)

    model.compile_model()                # Compile the MDOF model

    if i == 0:
        model.plot_model()               # Visualise the model (only on the first iteration)
    model.do_gravity_analysis()          # Do gravity analysis

    if number_storeys == 1:
        num_modes = 1
    else:
        num_modes = 3
    T, phi = model.do_modal_analysis(num_modes=num_modes)  # Do modal analysis and get the periods of vibration (essential step before NLTHA)

    ### Define ground-motion objects
    fnames = [os.path.join(gm_directory, 'acc', f'{gmrs[i]}')]  # Ground-motion record file
    fdts = os.path.join(gm_directory, 'dts', f'{dts[i]}')       # Ground-motion time-step file
    dts_df = pd.read_csv(fdts, header=None)
    dt_gm = dts_df[0].loc[1] - dts_df[0].loc[0]                 # Ground-motion time-step
    t_max = dts_df[0].iloc[-1]                                  # Ground-motion duration

    ### Define analysis params and do NLTHA
    dt_ansys = dt_gm   # Set the analysis time-step
    sf = units.g       # Set the scaling factor (if records are in g, a factor of 9.81 m/s2 must be used for consistency with OpenSees)
    control_nodes, conv_index, peak_drift, peak_accel, max_peak_drift, max_peak_drift_dir, max_peak_drift_loc, max_peak_accel, max_peak_accel_dir, max_peak_accel_loc, peak_disp = model.do_nrha_analysis(
        fnames, dt_gm, sf, t_max, dt_ansys, temp_nrha_directory, pflag=False, xi=mdof_damping)

    ### Store the analysis results
    conv_index_list.append(conv_index)
    peak_drift_list.append(peak_drift)
    peak_accel_list.append(peak_accel)
    peak_disp_list.append(peak_disp)
    max_peak_drift_list.append(max_peak_drift)
    max_peak_drift_dir_list.append(max_peak_drift_dir)
    max_peak_drift_loc_list.append(max_peak_drift_loc)
    max_peak_accel_list.append(max_peak_accel)
    max_peak_accel_dir_list.append(max_peak_accel_dir)
    max_peak_accel_loc_list.append(max_peak_accel_loc)

# Remove the temporary directory
shutil.rmtree(temp_nrha_directory)

# Store the analysis results in a dictionary
ansys_dict = {}
labels = ['T','control_nodes', 'conv_index_list',
'peak_drift_list','peak_accel_list',
'max_peak_drift_list', 'max_peak_drift_dir_list',
'max_peak_drift_loc_list','max_peak_accel_list',
'max_peak_accel_dir_list','max_peak_accel_loc_list',
'peak_disp_list']

for label in labels:
    ansys_dict[label] = vars()[label]
# Export the analysis output variable to a pickle file using the "export_to_pkl" function from "utilities"
export_to_pkl(os.path.join(nrha_directory,'ansys_out.pkl'), ansys_dict)

print('ANALYSIS COMPLETED!')





# Initialise the postprocessor class
pp = postprocessor()

# Initialise the plotter class
pl = plotter()

# Loop over the intensity measure types and perform cloud regression to fit the probabilistic seismic demand-capacity model
for current_imt in IMTs:

    # Extract the intensity measure levels for the current intensity measure type
    imls = ims[f'{current_imt}']

    # Extract the engineering demand parameters (i.e., the maximum peak storey drifts) from the analysis dictionary
    edps = ansys_dict['max_peak_drift_list']

    # Process cloud analysis results using the "do_cloud_analysis" method of "postprocessor"
    # The output is automatically stored in a dictionary
    cloud_dict = pp.do_cloud_analysis(imls,
                                      edps,
                                      damage_thresholds,
                                      lower_limit,
                                      censored_limit)

    ## Visualise the cloud analysis results
    pl.plot_cloud_analysis(cloud_dict,
                           output_directory=None,
                           plot_label=f'cloud_analysis_{current_imt}',
                           xlabel=f'{current_imt} [g]',
                           ylabel=r'Maximum Peak Storey Drift, $\theta_{max}$ [%]')  # Drift values on the y-axis are converted to % automatically by the plotter

    ## Visualise the fragility functions
    pl.plot_fragility_analysis(cloud_dict,
                               output_directory=None,
                               plot_label=f'fragility_{current_imt}',
                               xlabel=f'{current_imt}')

    ## Visualise the seismic demands
    pl.plot_demand_profiles(ansys_dict['peak_drift_list'],
                            ansys_dict['peak_accel_list'],
                            ansys_dict['control_nodes'],
                            output_directory=None,
                            plot_label='seismic_demand_profiles')  # Drift and acceleration values are converted to % and g automatically by the plotter

    ## Visualise the entire set of results using subplots
    pl.plot_ansys_results(cloud_dict,
                          ansys_dict['peak_drift_list'],
                          ansys_dict['peak_accel_list'],
                          ansys_dict['control_nodes'],
                          output_directory=None,
                          plot_label=f'analysis_output_{current_imt}',
                          cloud_xlabel=f'{current_imt}',
                          cloud_ylabel=r'Maximum Peak Storey Drift, $\theta_{max}$ [%]')





# In this example, since the last iteration of the previous cell uses 'AvgSA' as the intensity
# measure, all variables stored in the "cloud_dict" dictionary correspond to that IM. Hence, the
# vulnerability function derived here represents the continuous relationship between the expected
# structural loss ratio and increasing levels of ground shaking expressed in terms of the average
# spectral acceleration (in g)

structural_vulnerability = pp.get_vulnerability_function(cloud_dict['poes'],
consequence_model,
uncertainty=True)
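Conceptually, a vulnerability function weights the probability of being in each damage state by the corresponding damage-to-loss ratio. A minimal sketch of that computation with hypothetical PoE values (illustrative only, not the actual implementation of "get_vulnerability_function"):

```python
import numpy as np

# Hypothetical probabilities of exceeding each of four damage states at three IM levels
poes = np.array([[0.90, 0.60, 0.30, 0.10],
                 [0.99, 0.85, 0.55, 0.25],
                 [1.00, 0.97, 0.80, 0.50]])
consequence_model = np.array([0.05, 0.20, 0.60, 1.00])  # damage-to-loss ratios

# Probability of being in damage state i = PoE(i) - PoE(i+1), with PoE beyond the last state = 0
probs_ds = -np.diff(np.hstack([poes, np.zeros((poes.shape[0], 1))]), axis=1)
loss_ratio = probs_ds @ consequence_model  # expected loss ratio at each IM level
print(loss_ratio)  # increases with shaking intensity
```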


# Plot the structural vulnerability function
pl.plot_vulnerability_analysis(structural_vulnerability['IMLs'],
                               structural_vulnerability['Loss'],
                               structural_vulnerability['COV'],
                               'AvgSA',
                               'Structural Loss Ratio',
                               output_directory=None,
                               plot_label='Structural Vulnerability')


# The output is a DataFrame with three columns: IMLs (i.e., intensity measure levels), Loss and COV
print(structural_vulnerability)


