Compare commits

...

63 commits

Author SHA1 Message Date
166331cd31
Merge branch 'master' of ssh://git.edgarpierre.fr:39529/m2cce/internship 2023-05-16 11:56:12 +02:00
35cbecd0c8
Add openfoam_env fish function 2023-05-16 11:55:23 +02:00
4ae8a0def6
Update wavelet script 2023-05-16 11:54:53 +02:00
479230840b
Add README SWASH 2022-07-06 10:06:18 +02:00
e1588f6bca
Update README, mat_npz 2022-07-06 10:05:34 +02:00
f2dd8d557e
Update README, animate 2022-07-06 10:01:50 +02:00
e8d925ac48
Update README, run_ola 2022-07-06 09:54:27 +02:00
8ef9a8446c
Update README, sws_wavedict 2022-07-06 09:38:57 +02:00
89a05e191f
Update README, sws_ola 2022-07-06 09:32:20 +02:00
65c97cf8d1
Update README, pickle 2022-07-06 09:22:42 +02:00
2a4ae8ad0c
Update README, flow_velocity 2022-07-06 09:17:50 +02:00
7fd2395914
Update README, diff 2022-07-06 09:15:31 +02:00
5c23dd1d93
Update README, bathy 2022-07-06 09:11:50 +02:00
e7316717f7
Update README, animate 2022-07-06 09:03:13 +02:00
71927ce67e
Update README 2022-07-06 08:57:01 +02:00
0e79473b59
Update README 2022-07-06 08:56:09 +02:00
db5096922f
Update README, zero_cross 2022-07-06 08:51:01 +02:00
c5f419b587
Update README, ts 2022-07-06 08:48:05 +02:00
067bf2af9a
Update README, subdomain 2022-07-06 08:45:18 +02:00
31df415726
Update README, spec 2022-07-06 08:40:55 +02:00
934be9d593
Update README, projection 2022-07-06 08:36:28 +02:00
f6ac178e86
Update README, plot 2022-07-06 08:26:45 +02:00
d3358fa40d
Update README, orbitals 2022-07-06 08:21:47 +02:00
c5e2478b91
Update README 2022-07-06 08:02:15 +02:00
ac56466ddd
Merge branch 'nature' 2022-07-06 07:56:13 +02:00
b2c4227f83
Olaflow processing for article 2022-07-06 07:55:46 +02:00
14ef85246a
Swash processing for article 2022-07-06 07:55:33 +02:00
34fcb4e879
Data processing for article 2022-07-06 07:55:18 +02:00
c443947e3c
Outputs for article 2022-07-06 07:53:59 +02:00
9ece349d43
End report 2022-07-04 19:20:46 +02:00
6a3c8cee5c
Todo 2022-07-04 15:04:20 +02:00
c90e3b8b86
Nature v1 2022-07-04 14:39:16 +02:00
b8da550849
Update zero_cross and add nansasena 2022-06-29 09:31:07 +02:00
2422c412f7
Methods olaflow 2022-06-28 16:08:48 +02:00
acfb2ee382
Discussion, methods swash 2022-06-28 15:05:11 +02:00
db148e8bdd
Reflection 2022-06-27 14:56:09 +02:00
9fec212d98
Rogue 2022-06-27 10:58:32 +02:00
f1bc55843a
Update prints from data zero_cross 2022-06-27 10:27:29 +02:00
d4e9c3fd78
Merge branch 'master' of ssh://git.edgarpierre.fr:39529/m2cce/internship 2022-06-24 16:50:53 +02:00
b92e52ecbb
Server side scripts & all 2022-06-24 16:50:38 +02:00
25e0f91bf0
Data processing + photo analysis 2022-06-24 16:49:36 +02:00
aadfda6a86
Wavelet 2022-06-24 16:44:28 +02:00
b14dd702cb
Olaflow results, better fig 4 2022-06-24 15:00:37 +02:00
eab1c68b8a
Olaflow results, better fig 1 2022-06-23 11:12:00 +02:00
5fbcdcfa8a
Olaflow results 2022-06-22 17:02:44 +02:00
1039ee6379
Wave transformation 2022-06-22 15:59:32 +02:00
6a9fff542b
Wave transformation 2022-06-22 15:57:58 +02:00
f113808de4
Add swash propa 2022-06-07 14:11:15 +02:00
a000c67e93
Merge branch 'master' of ssh://git.edgarpierre.fr:39529/m2cce/internship 2022-06-07 12:10:00 +02:00
339c27b99b
Update plot outputs 2022-06-07 12:09:22 +02:00
95785f9df5
Results from swash 1 2022-06-07 11:41:48 +02:00
ec051a2054
Finalise introduction 2022-06-07 11:04:23 +02:00
d1b6819e0c
Add model names in introduction 2022-06-07 09:25:55 +02:00
564127e5b3
Introduction 2022-06-07 09:23:19 +02:00
174b764b3a
Testing kalliope 2022-06-06 09:56:14 +02:00
384481c4e4
Use inter sf font 2022-06-06 09:53:23 +02:00
f7d287394f
Nature: start introduction 2022-06-06 09:41:23 +02:00
045b6ae439
Merge branch 'jngcgc' 2022-06-01 13:19:07 +02:00
12fc34fa0c
Remove .doc 2022-05-31 17:04:21 +02:00
eda60c39eb
Abstract + mots-clés jngcgc 2022-05-31 17:03:48 +02:00
38e5ae5c99
Add transmission plot 2022-05-25 11:06:15 +02:00
fd15259134
Add compare and info to wave siz 2022-05-25 10:27:36 +02:00
97c856a73c
Update colormap and limits for animate & diff 2022-05-18 10:05:49 +02:00
51 changed files with 2122 additions and 435 deletions

106
README.md
View file

@ -5,101 +5,31 @@ displacement at the Artha breakwater on February 28, 2017"
[Report](https://kalliope.edgarpierre.fr/latex/m2cce/internship/report.pdf)
## Configuration
## Structure
Each part can be run in its respective directory.
```
$ python -m processing.swash -h
usage: swash.py [-h] [-v] [-c CONFIG]
The project is split into one directory per part. The directories are the following:
Run swash model
`data`
: Input data and processing (buoy data, bathymetry)
options:
-h, --help show this help message and exit
-v, --verbose
-c CONFIG, --config CONFIG
```
`jngcgc`
: Article for the Journée Nationale Génie Côtier Génie Civil
## Bathymetry
`nature`
: Article for Nature Communications
Configuration:
```
[inp]
root: input directory
base: bathymetry database (.xyz) -- not provided in this repository
hires: 1d hires smoothed breakwater profile (.dat)
hstru: 1d hires smoothed breakwater porosity height (.dat)
poro: 1d breakwater porosity (.dat)
psize: 1d porosity size (.dat)
hires_step: mesh sizing of 1d profile
`olaflow`
: 2Dv VOF VARANS Olaflow model
[out]
margin: margin around the buoy and breakwater for subdomain
no_breakwater: option to remove breakwater
root: output directory
sub: output file for subdomain
out: output file for 1d projection
step: mesh size
left: margin for 1d projection (breakwater side)
right: margin for 1d projection (buoy side)
`report`
: Intermediate results and bibliography
[artha]
lat: latitude of the breakwater
lon: longitude of the breakwater
`swash`
: 1D NLSW Swash model
[buoy]
lat: latitude of the buoy
lon: longitude of the buoy
```
## Usage
* Insert database in `data/data`
* Run `processing.subdomain` to generate 2D data in a smaller domain
* Run `processing.projection` to generate a 1D bathymetry
Each directory contains a `README.md` file describing how it works.
## Swash
Configuration:
```
[data]
out: input directory (output of bathymetry)
[swash]
input: swash input file (.sws)
path: path to swash directory (should contain swashrun and swash.exe)
out: output directory for swash (created at the end of the computation)
mpi: number of mpi threads (omit to use serial)
[post]
inp: input directory for post-processing (output of sws_npz)
compare: input directory for post-processing comparison (omit to display results from single case)
out: output directory for figures
x0: position of reflection calculations
t0: minimum time for reflection calculations (necessary because of boundary condition ramping)
```
* Run `processing.swash` to run the Swash model
* Run `processing.sws_npz` to convert Swash output to Numpy files
* Run `processing.post` to plot wave height at a point, water level at an
instant, reflection coefficient at a point
* Run `processing.animate` to plot an animation of water level
* Run `processing.layers` to plot an animation of layers with flow velocity
## OlaFlow
Configuration:
```
[swash]
np_out: input of olaflow from swash (output of sws_npz)
[bathy]
bathy: bathymetry to be used by olaflow
hstru: height of porous domain (see bathy)
scale: openscad scaling (mesh sizing in [x,y,z])
out: output folder for stl bathymetry
[olaflow]
root: olaflow computation directory
```
* Run `processing.bathy` to generate stl input files
* Run `run_ola.sh` to run OlaFlow model
The Python scripts are grouped in a `processing` directory and can generally be run as modules:
`python -m processing.script`.

169
data/README.md Normal file
View file

@ -0,0 +1,169 @@
# Data
This directory contains the input data and the data pre-processing scripts.
## Initial configuration
To limit the space used, the bathymetry file is not provided. It is recommended to use the database built by
V. Roeber.
The scripts in this directory are configured with a `config.ini` file. The options are detailed for each script
below.
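All of these scripts read the configuration with Python's standard `configparser`, as `plot.py` later in this diff does; a minimal sketch of loading a few of the options documented below (the file name is whatever is passed with `-c`, `config.ini` by default):
```python
import configparser
import pathlib

config = configparser.ConfigParser()
config.read("config.ini")  # or config2.ini, provided in this directory

# Root directories and a pair of coordinates, read the same way the scripts do
inp_root = pathlib.Path(config.get("inp", "root"))
out_root = pathlib.Path(config.get("out", "root"))
artha = (config.getfloat("artha", "lon"), config.getfloat("artha", "lat"))
print(inp_root, out_root, artha)
```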
## Scripts
### Lambert
`lambert.py` provides a class that projects the bathymetry from spherical coordinates to Cartesian coordinates.
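A minimal usage sketch, based on how `plot.py` (shown later in this diff) instantiates the class and calls its `cartesian(lon, lat)` method; the coordinates are the breakwater and buoy positions from `config2.ini`, and running from the `data` directory is assumed:
```python
import numpy as np

from processing.lambert import Lambert

# Breakwater (Artha) and buoy positions in spherical coordinates (lon, lat)
lon = np.array([-1.673097, -1.681667])
lat = np.array([43.398450, 43.408333])

projection = Lambert()                 # assumed: default projection parameters
x, y = projection.cartesian(lon, lat)  # Cartesian coordinates in metres
print(np.hypot(x - x[0], y - y[0]))    # distance of each point from the breakwater
```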
### Nandasena
`nandasena.py` computes the Nandasena (2011) block displacement criterion. It is configured directly in the code by
editing the contents of the `const` variable.
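For reference, the criterion implemented in `nandasena.py` (shown later in this diff) appears to be the saltation threshold of Nandasena et al. (2011); with the `const` values used in the repository it gives roughly the 19.4 m/s quoted in the article draft:
```python
import numpy as np

# Saltation threshold: u^2 >= 2 (rho_s/rho_w - 1) g c cos(theta) / C_l
rho_s, rho_w = 2.7e3, 1e3    # block and water density (kg/m^3)
c, C_l, theta = 2, 0.178, 0  # smallest block axis (m), lift coefficient, bed slope (deg)
g = 9.81

u_min = np.sqrt(2 * (rho_s / rho_w - 1) * g * c * np.cos(np.radians(theta)) / C_l)
print(f"u_min = {u_min:.1f} m/s")  # ~ 19.4 m/s
```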
### Orbitals
`orbitals.py` plots the trajectory of the buoy, in 2 and 3 dimensions, during the passage of the identified rogue
wave.
```python -m processing.orbitals [-c CONFIG] [-v]```
* `-c CONFIG`: configuration file to use (`.ini`)
* `-v`: verbose
```
[inp]
root: input root directory
raw_ts: list of paths to the raw buoy data files
[out]
root: output root directory
```
### Plot
`plot.py` plots the local bathymetry in 2 dimensions.
```python -m processing.plot [-c CONFIG] [-v]```
* `-c CONFIG`: configuration file to use (`.ini`)
* `-v`: verbose
```
[inp]
root: input root directory
[out]
root: output root directory
sub: bathymetry subdomain, must be generated with subdomain
```
### Projection
`projection.py` uses the `lambert` module to compute a 1D projection in Cartesian coordinates of the
spherical-coordinate bathymetry, using the high-resolution data from PA Poncet for the breakwater.
```python -m processing.projection [-c CONFIG] [-v]```
* `-c CONFIG`: configuration file to use (`.ini`)
* `-v`: verbose
```
[inp]
root: input root directory
hires: high-resolution bathymetry (PA Poncet)
hstru: porous layer height (PA Poncet)
poro: porosity (PA Poncet)
psize: porosity size (PA Poncet)
hires_step: resolution of the hires data
[out]
root: output root directory
no_breakwater*: remove the breakwater (default: False)
sub: bathymetry subdomain, must be generated with subdomain
step: resolution of the output domain
left: margin past the breakwater on the output domain
right: margin past the buoy on the output domain
[artha]
lat: latitude of the breakwater
lon: longitude of the breakwater
[buoy]
lat: latitude of the buoy
lon: longitude of the buoy
```
### Spec
`spec.py` generates a SWASH input spectrum from the spectrum measured by the buoy.
```python -m processing.spec [-c CONFIG] [-v]```
* `-c CONFIG`: configuration file to use (`.ini`)
* `-v`: verbose
```
[inp]
root: input root directory
raw_spec: spectrum file to use
cycle: duration over which the spectrum is used
[out]
root: output root directory
```
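The `cycle` option sets the frequency step of the output spectrum; a minimal sketch of the regridding, mirroring the lines visible in the `spec.py` diff later in this document (the input frequencies and densities below are illustrative values, not buoy data):
```python
import numpy as np
from scipy.interpolate import griddata

cycle = 14400  # spectrum duration in seconds, as in config2.ini

# Illustrative measured spectrum (frequency in Hz, spectral density)
f_in = np.linspace(0.03, 0.6, 64)
S_in = np.exp(-(((f_in - 0.08) / 0.03) ** 2))

# Regrid onto a frequency axis with step 1/cycle, as spec.py does
f = np.arange(f_in.min(), f_in.max() + 1 / cycle, 1 / cycle)
S = griddata(f_in, S_in, f)
```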
### Subdomain
`subdomain.py` generates a subdomain of the bathymetry for use with `plot` and `projection`.
```python -m processing.subdomain [-c CONFIG] [-v]```
* `-c CONFIG`: configuration file to use (`.ini`)
* `-v`: verbose
```
[inp]
root: input root directory
base: bathymetry database (V Roeber)
[out]
margin: margins (in degrees) around the buoy and breakwater positions
root: output root directory
sub: output file for the bathymetry subdomain
[artha]
lat: latitude of the breakwater
lon: longitude of the breakwater
[buoy]
lat: latitude of the buoy
lon: longitude of the buoy
```
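`subdomain.py` itself is not shown in this diff; purely as an illustration, here is a hypothetical sketch of the kind of selection described above (a bounding box around the breakwater and buoy, widened by `margin` degrees), assuming the database is an (N, 3) array of (lon, lat, z) as `plot.py` later reads it:
```python
import numpy as np

def select_subdomain(base, artha, buoy, margin):
    """Keep (lon, lat, z) points inside the breakwater-buoy bounding box, widened by margin degrees."""
    lon_min = min(artha[0], buoy[0]) - margin
    lon_max = max(artha[0], buoy[0]) + margin
    lat_min = min(artha[1], buoy[1]) - margin
    lat_max = max(artha[1], buoy[1]) + margin
    keep = (
        (base[:, 0] >= lon_min) & (base[:, 0] <= lon_max)
        & (base[:, 1] >= lat_min) & (base[:, 1] <= lat_max)
    )
    return base[keep]

# Example with the positions from config2.ini and margin=0.005:
# sub = select_subdomain(base, (-1.673097, 43.398450), (-1.681667, 43.408333), 0.005)
```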
### TS
`ts.py` generates a time series for SWASH from several raw buoy time series.
```python -m processing.ts [-c CONFIG] [-v]```
* `-c CONFIG`: configuration file to use (`.ini`)
* `-v`: verbose
```
[inp]
root: input root directory
raw_ts: list of raw buoy data files to use
[out]
root: output root directory
```
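A condensed sketch of what `ts.py` does, mirroring the loading and time-axis construction visible in the `ts.py` diff later in this document (2304 samples per 30-minute file, elevation in cm converted to m); the single file path is taken from `config2.ini`:
```python
import numpy as np

files = ["cerema/raw/201702281700.raw"]  # raw_ts list from the configuration
raw = np.concatenate([
    np.loadtxt(
        f,
        dtype=[("state", int), ("z", float), ("y", float), ("x", float)],
        delimiter=",",
        max_rows=2304,
    )
    for f in files
])

n = len(files)
t = np.linspace(0, 30 * 60 * n, 2304 * n + 1)[:-1]           # time in seconds
np.savetxt("ts.dat", np.stack((t, raw["z"] / 100), axis=1))  # elevation in metres
```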
### Zero-cross
`zero_cross.py` extracts the largest waves from all the time series and computes the wavelet transform of the
series.
```python -m processing.zero_cross [-c CONFIG] [-v]```
* `-c CONFIG`: configuration file to use (`.ini`)
* `-v`: verbose
```
[inp]
root: input root directory (the raw data must be in the `cerema/raw` subdirectory)
[out]
root: output root directory
```
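A minimal sketch of the zero-crossing step, mirroring the filtering and crest/trough pairing done in `zero_cross.py` (full script later in this diff); the random test signal is only a stand-in for the buoy elevation:
```python
import numpy as np
import scipy.signal as sgl

def wave_heights(z, fs, fc=0.2):
    """Individual wave heights by zero-crossing on a low-pass filtered signal."""
    sos = sgl.butter(1, fc, btype="lowpass", fs=fs, output="sos")
    zf = sgl.sosfiltfilt(sos, z)
    cr0 = np.where(np.diff(np.sign(zf)))[0]  # indices of zero crossings
    return np.array([
        np.max(np.abs(zf[cr0[i - 1]:cr0[i]])) + np.max(np.abs(zf[cr0[i]:cr0[i + 1]]))
        for i in range(1, len(cr0) - 1)
    ])

# Stand-in signal at the buoy sampling rate (2304 samples per 30 min)
rng = np.random.default_rng(0)
z = rng.normal(size=2304)
H = wave_heights(z, fs=2304 / (30 * 60))
print(np.quantile(H, 2 / 3))  # H1/3 estimate, as in zero_cross.py
```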

29
data/config2.ini Normal file
View file

@ -0,0 +1,29 @@
[inp]
root=data
base=Database_20220224.xyz
hires=bathyhires.dat
hstru=Hstru.dat
poro=Poro.dat
psize=Psize.dat
raw_ts=cerema/raw/201702281700.raw
raw_spec=cerema/spt/201702281715.spt
hires_step=0.5
cycle=14400
[out]
margin=0.005
#no_breakwater=True
root=out2
sub=bathy_sub.npy
out=bathy.npy
step=1
left=-300
right=150
[artha]
lat=43.398450
lon=-1.673097
[buoy]
lat=43.408333
lon=-1.681667

65
data/photos.md Normal file
View file

@ -0,0 +1,65 @@
# Times
## Block Displacement
18:28:12
## Previous
18:27:39
## Next
18:28:28
## Other Waves
18:25:01
18:23:13
18:22:55
18:22:00
18:09:08
18:08:43
18:08:27
18:07:18
18:04:37
17:59:08
17:58:31
17:54:15
17:53:55
17:53:39
17:47:53
17:39:40
17:38:45
17:38:28
17:32:06
17:31:46
17:26:06
17:25:12
17:24:47
17:23:55
17:23:18
# Periods
## Block Displacement
### Before
33
### After
16
## Other Waves
18
25
16
37
20
16
17
20
25
37

View file

@ -0,0 +1,27 @@
import numpy as np
def u_nandasena(b, c, C_d, C_l, rho_s, rho_w=1e3, theta=0, g=9.81, **_):
    """Minimum flow velocity for block displacement (saltation threshold, Nandasena et al., 2011)."""
    return np.sqrt(
        (2 * (rho_s / rho_w - 1) * g * c * np.cos(np.radians(theta)))
        / C_l
    )

const = {
    "a": 4,
    "b": 2.5,
    "c": 2,
    "C_d": 1.95,
    "C_l": 0.178,
    "theta": 0,
    "rho_s": 2.7e3,
}

u = u_nandasena(**const)
u_ola = 14.5

print(f"Nandasena: u={u:5.2f}m/s")
print(f"Olaflow: u={u_ola:5.2f}m/s")
print(f"Gap: du={abs(u_ola - u):5.2f}m/s")
print(f"Gap: du={abs(u_ola - u)/u:5.2%}")
print(f"Gap: du={abs(u_ola - u)/u_ola:5.2%}")

View file

@ -4,6 +4,7 @@ import logging
import pathlib
import matplotlib.pyplot as plt
from matplotlib.ticker import MultipleLocator
import numpy as np
parser = argparse.ArgumentParser(description="Plot orbitals")
@ -21,8 +22,6 @@ config.read(args.config)
inp_root = pathlib.Path(config.get("inp", "root"))
out_root = pathlib.Path(config.get("out", "root"))
out_ts = out_root.joinpath("ts.dat")
raw_ts = []
for tsi in config.get("inp", "raw_ts").split(","):
raw_ts.append(
@ -81,18 +80,23 @@ ax3d.quiver3D(
ax3d.set(xlabel="x (cm)", ylabel="y (cm)", zlabel="z (cm)")
theta = np.angle(raw_ts["x"] + 1j * raw_ts["y"]).mean()
fig2dv, ax2dv = plt.subplots()
fig2dv, ax2dv = plt.subplots(figsize=(5/2.54, 2/3*10/2.54), dpi=200, constrained_layout=True)
x0 = ts_flt["x"] * np.cos(theta) + ts_flt["y"] * np.sin(theta)
ax2dv.plot(x0, z0, c="#0066ff", lw=1)
#ax2dv.plot(x0, z0, c="#0066ff", lw=1)
ax2dv.quiver(
x0[:-1],
z0[:-1],
np.diff(x0)[:],
np.diff(z0)[:],
color="#0066ff",
x0[:-1] * 1e-2,
z0[:-1] * 1e-2,
np.diff(x0)[:] * 1e-2,
np.diff(z0)[:] * 1e-2,
color="k",
scale_units="xy",
scale=1,
)
ax2dv.grid()
ax2dv.set(aspect="equal")
fig2dv.savefig("out_orbitals.pdf")
ax2dv.grid(c="k", alpha=.2)
ax2dv.set(aspect="equal", xlabel="x (m)", ylabel="z (m)")
ax2dv.set(ylim=(-10, 10))
ax2dv.yaxis.set_minor_locator(MultipleLocator(1))
fig2dv.savefig(out_root.joinpath("orbitals.pdf"))
fig2dv.savefig(out_root.joinpath("out_orbitals.jpg"))
plt.show()

71
data/processing/plot.py Normal file
View file

@ -0,0 +1,71 @@
import argparse
import configparser
import logging
import pathlib
import numpy as np
from scipy import interpolate
import matplotlib.pyplot as plt
from .lambert import Lambert
parser = argparse.ArgumentParser(description="Pre-process bathymetry")
parser.add_argument("-v", "--verbose", action="count", default=0)
parser.add_argument("-c", "--config", default="config.ini")
args = parser.parse_args()
logging.basicConfig(level=max((10, 20 - 10 * args.verbose)))
log = logging.getLogger("bathy")
log.info("Starting bathymetry pre-processing")
config = configparser.ConfigParser()
config.read(args.config)
inp_root = pathlib.Path(config.get("inp", "root"))
out_root = pathlib.Path(config.get("out", "root"))
bathy_inp = out_root.joinpath(config.get("out", "sub"))
log.info(f"Loading bathymetry from {bathy_inp}")
bathy_curvi = np.load(bathy_inp)
projection = Lambert()
bathy = np.stack(
(
*projection.cartesian(bathy_curvi[:, 0], bathy_curvi[:, 1]),
bathy_curvi[:, 2],
),
axis=1,
)
log.debug(f"Cartesian bathy: {bathy}")
artha_curvi = np.array(
(config.getfloat("artha", "lon"), config.getfloat("artha", "lat"))
)
buoy_curvi = np.array((config.getfloat("buoy", "lon"), config.getfloat("buoy", "lat")))
artha = np.asarray(projection.cartesian(*artha_curvi))
buoy = np.asarray(projection.cartesian(*buoy_curvi))
bathy[:, :2] = bathy[:, :2] - artha
fig, ax = plt.subplots(figsize=(6 / 2.54, 5 / 2.54), constrained_layout=True, dpi=200)
c = ax.tricontourf(
bathy[:, 0],
bathy[:, 1],
bathy[:, 2],
cmap="plasma",
levels=np.arange(-30, 10, 5),
extend="both",
)
ax.plot(*(np.stack((artha, buoy)) - artha).T, lw=1, ls="-.", c="k", marker="x")
ax.set(xlim=(bathy[np.argmax(bathy[:, 1]), 0], bathy[np.argmin(bathy[:, 1]), 0]))
ax.set(ylim=(bathy[np.argmin(bathy[:, 0]), 1], bathy[np.argmax(bathy[:, 0]), 1]))
ax.set(xlabel="x (m)", ylabel="y (m)")
fig.colorbar(c, label="z (m)")
ax.set_aspect("equal")
ax.set_rasterization_zorder(1.5)
fig.savefig(out_root.joinpath("bathy2d.pdf"))
plt.show()

View file

@ -60,6 +60,12 @@ x = np.arange(
)
theta = np.angle(D.dot((1, 1j)))
log.info(f"N points: {bathy.size:e}")
S = bathy[:, 0].ptp() * bathy[:, 1].ptp()
log.info(f"Surface: {S*1e-6:.2f}km^2")
res = np.sqrt(S / bathy.size)
log.info(f"Resolution: {res:.2f}m")
coords = artha + (x * np.stack((np.cos(theta), np.sin(theta)))).T
log.info("Interpolating bathymetry in 1D")
@ -131,8 +137,12 @@ np.savetxt(out_root.joinpath("hstru.dat"), hstru[::-1], newline=" ")
np.savetxt(out_root.joinpath("poro.dat"), poro[::-1], newline=" ")
np.savetxt(out_root.joinpath("psize.dat"), psize[::-1], newline=" ")
fig, ax = plt.subplots()
fig, ax = plt.subplots(figsize=(16 / 2.54, 2 / 3 * 10 / 2.54), constrained_layout=True)
ax.plot(-x, z, color="k")
ax.fill_between(-x, z + hstru, z, color="k", alpha=0.2)
ax.set_title(f"N={z.size-1}, x=[{-x.max()};{-x.min()}]")
#ax.set_title(f"N={z.size-1}, x=[{-x.max()};{-x.min()}]")
ax.set(ylim=(-40, 15))
ax.set(xlabel="x (m)", ylabel="z (m)")
ax.autoscale(True, "x", True)
ax.grid()
fig.savefig(out_root.joinpath("bathy.pdf"))

View file

@ -8,14 +8,13 @@ import numpy as np
from scipy.interpolate import griddata
parser = argparse.ArgumentParser(description="Pre-process time-series")
parser.add_argument("-v", "--verbose", action="count", default=0)
parser.add_argument("-c", "--config", default="config.ini")
args = parser.parse_args()
logging.basicConfig(level=max((10, 20 - 10 * args.verbose)))
log = logging.getLogger("bathy")
log.info("Starting time-series pre-processing")
log.info("Starting pre-processing")
config = configparser.ConfigParser()
config.read(args.config)
@ -45,7 +44,7 @@ if cycle is None:
f = inp["f"]
S = inp["S"] * Sm
else:
f = np.arange(inp["f"].min(), inp["f"].max() + 1/cycle, 1/cycle)
f = np.arange(inp["f"].min(), inp["f"].max() + 1 / cycle, 1 / cycle)
S = griddata(inp["f"], inp["S"] * Sm, f)
with out_spec.open("w") as out:

View file

@ -4,6 +4,7 @@ import logging
import pathlib
import matplotlib.pyplot as plt
from matplotlib.ticker import MultipleLocator
import numpy as np
parser = argparse.ArgumentParser(description="Pre-process time-series")
@ -25,12 +26,14 @@ out_ts = out_root.joinpath("ts.dat")
raw_ts = []
for tsi in config.get("inp", "raw_ts").split(","):
raw_ts.append(np.loadtxt(
inp_root.joinpath(tsi),
dtype=[("state", int), ("z", float), ("y", float), ("x", float)],
delimiter=",",
max_rows=2304,
))
raw_ts.append(
np.loadtxt(
inp_root.joinpath(tsi),
dtype=[("state", int), ("z", float), ("y", float), ("x", float)],
delimiter=",",
max_rows=2304,
)
)
n = len(raw_ts)
raw_ts = np.concatenate(raw_ts)
log.debug(f"{raw_ts=}")
@ -39,13 +42,43 @@ if (errs := np.count_nonzero(raw_ts["state"])) != 0:
log.warning(f"{errs} transmission errors!")
log.debug(f"{dict(zip(*np.unique(raw_ts['state'], return_counts=True)))}")
t = np.linspace(0, 30 * 60 * n, 2304*n+1)[:-1]
t = np.linspace(0, 30 * 60 * n, 2304 * n + 1)[:-1]
log.debug(f"{t=}")
log.info(f"Saving timeseries to '{out_ts}'")
np.savetxt(out_ts, np.stack((t, raw_ts["z"]/100), axis=1))
np.savetxt(out_ts, np.stack((t, raw_ts["z"] / 100), axis=1))
fig, ax = plt.subplots()
ax.plot(t, raw_ts["z"])
ax.set(xlabel="t (s)", ylabel="z (cm)")
fig, ax = plt.subplots(figsize=(8 / 2.54, 2 / 3 * 10 / 2.54), constrained_layout=True)
tp = np.datetime64("2017-02-28T17:00:00") + t.astype(np.timedelta64)[-(t.size // 3) :]
ax.plot(
tp,
raw_ts["z"][-(t.size // 3) :] * 1e-2,
color="k",
lw=1,
)
ax.axvline(
np.datetime64("2017-02-28T17:00:00") + np.timedelta64(23 * 60 + 8),
color="k",
alpha=0.1,
lw=20,
)
ax.autoscale(True, "x", True)
ax.set(xlabel="t (s)", ylabel="z (m)")
yabs_max = abs(max(ax.get_ylim(), key=abs))
ax.set(ylim=(-10, 10))
ax.set(
xticks=(
np.datetime64("2017-02-28T17:20:00"),
np.datetime64("2017-02-28T17:25:00"),
np.datetime64("2017-02-28T17:30:00"),
),
xticklabels=(
"17:20",
"17:25",
"17:30",
),
)
ax.yaxis.set_minor_locator(MultipleLocator(1))
ax.grid(color="k", alpha=0.2)
fig.savefig(out_root.joinpath("ts.pdf"))
fig.savefig(out_root.joinpath("ts.jpg"), dpi=200)

View file

@ -0,0 +1,181 @@
import argparse
import configparser
import logging
import pathlib
import sys
import matplotlib.pyplot as plt
import matplotlib.dates as mdates
import numpy as np
import scipy.signal as sgl
from scipy import fft
parser = argparse.ArgumentParser(description="Pre-process time-series")
parser.add_argument("-v", "--verbose", action="count", default=0)
parser.add_argument("-c", "--config", default="config.ini")
args = parser.parse_args()
logging.basicConfig()
log = logging.getLogger("bathy")
log.setLevel(max((10, 20 - 10 * args.verbose)))
log.info("Starting time-series pre-processing")
config = configparser.ConfigParser()
config.read(args.config)
inp_root = pathlib.Path(config.get("inp", "root"), "cerema/raw")
out_root = pathlib.Path(config.get("out", "root"))
out_root.mkdir(exist_ok=True)
raw_ts = []
#for tsi in sorted(inp_root.glob("2017022817*.raw")):
for tsi in sorted(inp_root.glob("*.raw")):
raw_ts.append(
np.loadtxt(
tsi,
dtype=[("state", int), ("z", float), ("y", float), ("x", float)],
delimiter=",",
max_rows=2304,
)
)
log.debug(f"Loading <{tsi}>")
n = len(raw_ts)
raw_ts = np.concatenate(raw_ts)
log.debug(f"{raw_ts=}")
t0 = np.linspace(0, 30 * 60 * n, 2304 * n, endpoint=False)
t = (t0 * 1e3).astype("timedelta64[ms]") + np.datetime64("2017-02-28T00:00")
if (errs := np.count_nonzero(raw_ts["state"])) != 0:
log.warning(f"{errs} transmission errors!")
log.debug(f"{dict(zip(*np.unique(raw_ts['state'], return_counts=True)))}")
# log.debug(f"{t[raw_ts['state'] != 0]}")
sos = sgl.butter(1, 0.2, btype="lowpass", fs=2305 / (30 * 60), output="sos")
z = sgl.sosfiltfilt(sos, raw_ts["z"]*1e-2)
cr0 = np.where(np.diff(np.sign(z)))[0]
wave = np.fromiter(
(
np.max(np.abs(z[cr0[i - 1] : cr0[i]])) + np.max(np.abs(z[cr0[i] : cr0[i + 1]]))
#(np.max(np.abs(raw_ts["z"][cr0[i - 1] : cr0[i]])) + np.max(np.abs(raw_ts["z"][cr0[i] : cr0[i + 1]]))) * 1e-2
for i in range(1, len(cr0) - 1)
),
dtype=np.single,
)
log.debug(f"{wave=}")
log.debug(f"{t=}")
# plt.plot(t[cr0[1:-1]], wave)
dt = 30 * 60 / 2304
# Mlims = (int(5 / dt), int(30 / dt))
N = t.size // 24
s0 = 2 * dt
dj = 0.5
J = 1 / dj * np.log2(N * dt / s0)
j = np.arange(0, J)
sj = s0 * 2 ** (j * dj)
Tj = 2 * sj * np.pi / 5
# sj = s0 * np.arange(1, 2 ** (J * dj))
Mw = sj / dt
Mlims = sj[[0, -1]]
M = (np.abs(sgl.cwt(raw_ts["z"]*1e-2, sgl.morlet2, Mw))/np.std(raw_ts["z"]*1e-2))**2
# M = np.abs(sgl.cwt(z, sgl.morlet, Mw))
v = np.max(np.abs(M))
fig, ax = plt.subplots()
# ax2 = ax.twinx()
# ax.plot(t0, raw_ts["z"], lw=.5, c="k", alpha=.2)
# ax.plot(t0, z, ls="-.", lw=.25, alpha=.2, c="k")
st = raw_ts["state"][raw_ts["state"] != 0]
c = np.asarray(["g", "b", "r"])
# ax.vlines(t0[raw_ts["state"] != 0], -20, 20, colors=c[np.where(st != 777, st, 0)])
# ax.set(xlabel="t (s)", ylabel="z (cm)")
# ax.set(xlim=(17 * 3600 + 20 * 60, 17 * 3600 + 30 * 60))
ax.grid(c="k", alpha=0.2)
ax.set(zorder=1, frame_on=False)
ax.semilogy()
a = [t0[0], t0[-1], *Mlims]
# c = ax.imshow(M, extent=a, aspect="auto", cmap="plasma", vmin=0)
c = ax.contourf(t, Tj, M, cmap="Greys", vmin=0, vmax=v)
fig.colorbar(c)
H13 = np.quantile(wave, 2 / 3)
Hs = 4*np.std(raw_ts["z"]*1e-2)
th = 2 * Hs
log.info(f"Threshold: {th}m")
bigw = np.where(wave > th)[0]
ym = 1.1 * np.max(np.abs(z))
nw = wave.size / 2
nlw = bigw.size
log.info(f"Number of waves: {nw}")
log.info(f"Number of waves >m: {nlw}")
log.info(f"Frequency: {nlw/nw:e}")
log.info(f"Frequency: 1/{nw/nlw:.0f}")
log.info(f"H1/3: {H13}m")
log.info(f"Hs: {Hs}")
if bigw.size > 32:
log.warning(f"Number of large waves: {bigw.size}")
sys.exit()
fig, ax_ = plt.subplots(2 * (bigw.size // 2), 2, figsize=(15/2.54, 4/3*10/2.54), constrained_layout=True)
for w, ax2, ax in zip(bigw, ax_[::2].flatten(), ax_[1::2].flatten()):
i0 = cr0[w] - int(400 / dt)
i1 = cr0[w + 2] + int(400 / dt)
# a = [t0[i0], t0[i1], *Mlims]
# c = ax2.imshow(M[:, i0:i1], extent=a, aspect="auto", cmap="Spectral", vmin=-v, vmax=+v)
ws = np.ptp(raw_ts["z"][cr0[w]:cr0[w+2]]) * 1e-2
log.info(f"Wave [{w}] size: {ws:.2f}m")
c = ax2.contourf(
t[i0:i1],
Tj,
M[:, i0:i1],
cmap="Greys",
vmin=0,
levels=[1, 2.5, 5, 10, 20, 40],
extend="both",
)
fig.colorbar(c, ax=ax2, label="NWPS")
ax.plot(t[i0:i1], (raw_ts["z"]*1e-2)[i0:i1], c="k", lw=1)
#ax.plot(t[i0:i1], z[i0:i1], c="k", lw=1, alpha=0.2, ls="-.")
# ax.vlines(t[raw_ts["state"] != 0], -20, 20, colors=c[np.where(st != 777, st, 0)])
ax.set(xlim=(t[i0], t[i1 - 1]), ylim=(-ym, ym))
ax2.set(ylim=(2, 200))
ax2.set(ylabel="T (s)")
ax2.grid(c="k", alpha=0.2)
ax2.semilogy()
ax.grid(c="k", alpha=.2)
#ax.axhline(0, c="k", alpha=0.2, lw=1, ls="-.")
#ax.set(zorder=1, frame_on=False)
ax.set_xlabel("t (s)", loc="left")
ax.set_ylabel("z (m)")
ax.axvspan(t[cr0[w]], t[cr0[w+2]], color="k", alpha=.1)
locator = mdates.AutoDateLocator(minticks=3, maxticks=7)
formatter = mdates.ConciseDateFormatter(locator)
ax.xaxis.set_major_locator(locator)
ax.xaxis.set_major_formatter(formatter)
ax2.xaxis.set_major_locator(locator)
ax2.xaxis.set_major_formatter(formatter)
ax2.axes.set_xticklabels([])
ax2.set_rasterization_zorder(1.5)
fig.savefig(out_root.joinpath(f"wavelet.pdf"), dpi=300)
fig.savefig(out_root.joinpath(f"wavelet.png"), dpi=200)
#fig, ax = plt.subplots(constrained_layout=True)
## ax.plot(fft.rfftfreq(raw_ts["z"].size, dt), np.abs(fft.rfft(raw_ts["z"])), c="k", alpha=.2, lw=1)
#ax.plot(*sgl.welch(raw_ts["z"], 1 / dt), c="k", alpha=0.2, label="PSD")
#ax.plot(1 / sj, N * np.mean(M, axis=1), c="k", label="CWT")
## ax.grid(color="k", alpha=.2)
#ax.set(xlabel="T (s)", ylabel="PSD")
## ax2.set(ylabel="Average Wavelet Transform")
#ax.set(xlim=1 / Mlims)
#ax.legend()
plt.show()

Binary file not shown.

Binary file not shown.

12
nature/README.md Normal file
View file

@ -0,0 +1,12 @@
# Nature
Article for Nature Communications.
* <https://www.nature.com/ncomms/submit/how-to-submit>
* <https://www.nature.com/ncomms/submit/article>
* 5000 words
* 10 figures
## Compilation
* Use `xelatex` (`latexmk -xelatex main.tex`)

BIN nature/fig/U.pdf Normal file (binary file not shown)
BIN nature/fig/aw_t0.pdf Normal file (binary file not shown)
BIN nature/fig/bathy2d.pdf Normal file (binary file not shown)
BIN nature/fig/maxw.pdf Normal file (binary file not shown)
BIN nature/fig/out_orbitals.pdf Normal file (binary file not shown)
BIN nature/fig/pic1.jpg Executable file (binary file not shown; 1.8 MiB)
BIN nature/fig/pic2.jpg Executable file (binary file not shown; 2.4 MiB)
BIN nature/fig/ts.pdf Normal file (binary file not shown)
BIN nature/fig/wavelet.pdf Normal file (binary file not shown)
BIN nature/fig/wavelet_sw.pdf Normal file (binary file not shown)
BIN nature/fig/x.pdf Normal file (binary file not shown)

4
nature/kalliope.conf Normal file
View file

@ -0,0 +1,4 @@
[latex]
engine=xelatex
main=main.tex
out=nature.pdf

137
nature/library.bib Normal file
View file

@ -0,0 +1,137 @@
@misc{olaFlow,
author={Higuera, P.},
title={olaFlow: {CFD} for waves [{S}oftware].},
year=2017,
doi={10.5281/zenodo.1297013},
url={https://doi.org/10.5281/zenodo.1297013}
}
@report{parvin2020,
author={Amir Parvin},
title={Processes allowing large block displacement under wave action},
year={2020},
}
@article{cox2018,
title={Extraordinary boulder transport by storm waves (west of Ireland, winter 2013--2014), and criteria for analysing coastal boulder deposits},
author={Cox, R{\'o}nadh and Jahn, Kalle L and Watkins, Oona G and Cox, Peter},
journal={Earth-Science Reviews},
volume={177},
pages={623--636},
year={2018},
publisher={Elsevier}
}
@article{shah2013,
title={Coastal boulders in Martigues, French Mediterranean: evidence for extreme storm waves during the Little Ice Age},
author={Shah-Hosseini, M and Morhange, C and De Marco, A and Wante, J and Anthony, EJ and Sabatier, F and Mastronuzzi, G and Pignatelli, C and Piscitelli, A},
journal={Zeitschrift f{\"u}r Geomorphologie},
volume={57},
number={Suppl 4},
pages={181--199},
year={2013}
}
@article{nott1997,
title={Extremely high-energy wave deposits inside the Great Barrier Reef, Australia: determining the cause—tsunami or tropical cyclone},
author={Nott, Jonathan},
journal={Marine Geology},
volume={141},
number={1-4},
pages={193--207},
year={1997},
publisher={Elsevier}
}
@article{nott2003,
title={Waves, coastal boulder deposits and the importance of the pre-transport setting},
author={Nott, Jonathan},
journal={Earth and Planetary Science Letters},
volume={210},
number={1-2},
pages={269--276},
year={2003},
publisher={Elsevier}
}
@article{nandasena2011,
title={Reassessment of hydrodynamic equations: Minimum flow velocity to initiate boulder transport by high energy events (storms, tsunamis)},
author={Nandasena, NAK and Paris, Rapha{\"e}l and Tanaka, Norio},
journal={Marine Geology},
volume={281},
number={1-4},
pages={70--84},
year={2011},
publisher={Elsevier}
}
@article{weiss2015,
title={Untangling boulder dislodgement in storms and tsunamis: Is it possible with simple theories?},
author={Weiss, R and Diplas, P},
journal={Geochemistry, Geophysics, Geosystems},
volume={16},
number={3},
pages={890--898},
year={2015},
publisher={Wiley Online Library}
}
@article{zijlema2011,
title={SWASH: An operational public domain code for simulating wave fields and rapidly varied flows in coastal waters},
author={Zijlema, Marcel and Stelling, Guus and Smit, Pieter},
journal={Coastal Engineering},
volume={58},
number={10},
pages={992--1012},
year={2011},
publisher={Elsevier}
}
@phdthesis{higuera2015,
title={Application of computational fluid dynamics to wave action on structures},
author={Higuera, P},
school={Universidad de Cantabria},
year={2015}
}
@phdthesis{poncet2021,
title={Characterization of wave impact loading on structures at full scale: field experiment, statistical analysis and 3D advanced numerical modeling},
author={Poncet, Pierre-Antoine},
year={2021},
school={Université de Pau et des Pays de l'Adour},
chapter={4},
}
@article{torrence1998,
title={A practical guide to wavelet analysis},
author={Torrence, Christopher and Compo, Gilbert P},
journal={Bulletin of the American Meteorological society},
volume={79},
number={1},
pages={61--78},
year={1998},
publisher={American Meteorological Society}
}
@article{dysthe2008,
title={Oceanic rogue waves},
author={Dysthe, Kristian and Krogstad, Harald E and M{\"u}ller, Peter},
journal={Annu. Rev. Fluid Mech.},
volume={40},
pages={287--310},
year={2008},
publisher={Annual Reviews}
}
@article{lodhi2020,
title={The role of hydrodynamic impact force in subaerial boulder transport by tsunami—Experimental evidence and revision of boulder transport equation},
author={Lodhi, Hira A and Hasan, Haider and Nandasena, NAK},
journal={Sedimentary Geology},
volume={408},
pages={105745},
year={2020},
publisher={Elsevier}
}
@book{violeau2012,
title={Fluid mechanics and the SPH method: theory and applications},
author={Violeau, Damien},
year={2012},
publisher={Oxford University Press}
}
@article{violeau2007,
title={Numerical modelling of complex turbulent free-surface flows with the SPH method: an overview},
author={Violeau, Damien and Issa, Reza},
journal={International Journal for Numerical Methods in Fluids},
volume={53},
number={2},
pages={277--304},
year={2007},
publisher={Wiley Online Library}
}

369
nature/main.tex Normal file
View file

@ -0,0 +1,369 @@
\documentclass[a4paper, twocolumn]{article}
\usepackage{polyglossia} \usepackage{authblk}
\usepackage[sfdefault]{inter}
\usepackage[math-style=french]{unicode-math}
\setmathfont{Fira Math}
\usepackage{graphicx}
\usepackage[hmargin=2.1cm, vmargin=2.97cm]{geometry}
\usepackage{hyperref}
\usepackage{siunitx}
\sisetup{
mode=text,
reset-text-family=false,
reset-text-series=false,
reset-text-shape=false,
propagate-math-font=true,
}
\setmainlanguage{english}
\usepackage[
backend=biber,
style=iso-authoryear,
sorting=nyt,
]{biblatex}
\bibliography{library}
\hypersetup{
pdftitle={Analysis of the displacement of a large concrete block under an extreme wave},
pdfauthor={Edgar P. Burkhart}
}
\title{Analysis of the displacement of a large concrete block under an extreme wave}
\author[1]{Edgar P. Burkhart}
\author[*,1]{Stéphane Abadie}
\affil[1]{Université de Pau et des Pays de l'Adour, E2S-UPPA, SIAME, France}
\affil[*]{Corresponding Author, stephane.abadie@univ-pau.fr}
\begin{document}
\maketitle
\section{Introduction}
% Displacement of blocks studies
Displacement of large blocks or boulders by waves is an interesting phenomenon in the study of extreme historical
coastal events. The existence of block deposits at unusual heights can be a clue to past events such as extreme storms
or tsunamis. For instance, \textcite{cox2018} studied coastal deposits on the coast of Ireland in relation to the storms
from winter 2013--2014, and extracted criteria for analysing such deposits. Similarly, \textcite{shah2013} found boulder
deposits on the Mediterranean coast to be evidence of extreme storms in the Little Ice Age.
% Need for analytical equations
For such studies to be possible, analytical criteria are needed to ascertain the cause of the displacement of a
block. \textcite{nott1997,nott2003} proposed a set of equations that have been widely used for that purpose. Those
equations rely on an equilibrium relation between the lift force produced by a wave and restraining forces depending
on the initial setting of the block, from which a minimum flow velocity necessary for movement initiation can be
extracted. A parametrisation of waves depending on their source is also used to provide minimum wave heights depending
on the type of scenario --- wave or tsunami. Those equations were later revised by \textcite{nandasena2011}, as they
were found to be partially incorrect; a revised formulation based on the same considerations was provided.
The assumptions on which \textcite{nott2003, nandasena2011} are based were then criticized by \textcite{weiss2015}:
according to them, the initiation of movement is not sufficient to guarantee block displacement.
\textcite{weiss2015} highlights the importance of time dependency in block displacement, and proposes a method for
finding the wave amplitude that leads to block displacement. Additionally, more recent research by
\textcite{lodhi2020} has shown that the equations proposed by \textcite{nott2003, nandasena2011} tend to overestimate
the minimum flow velocity needed to displace a block.
% Lack of observations -> observation
Whether it is \textcite{nott2003}, \textcite{nandasena2011} or \textcite{weiss2015}, all the proposed analytical
equations suffer from a major flaw: they are all based on very simplified analytical models and statistical analysis.
Unfortunately, no block displacement event seems to have been observed directly in the past, and those events are
difficult to predict.
In this paper, we study such an event. On February 28, 2017, a \SI{50}{\tonne} concrete block was dropped by a wave on
the crest of the Artha breakwater (Figure~\ref{fig:photo}). Luckily, the event was captured by a photographer, and a
wave buoy located \SI{1.2}{\km} offshore recorded the sea state. Information from the photographer made it possible to
establish the approximate time at which the block displacement occurred. The goal of this paper is to model the
hydrodynamic conditions near the breakwater that led to the displacement of the \SI{50}{\tonne} concrete block.
% Modeling flow accounting for porous media
Several approaches can be used when modelling flow near a breakwater. Depth-averaged models can be used to study the
transformation of waves on complex bottoms. Studying the hydrodynamic conditions under the surface can be achieved
using smoothed-particles hydrodynamics (SPH) or volume of fluid (VOF) models. SPH models rely on a Lagrangian
representation of the fluid \parencite{violeau2012}, while VOF models rely on an Eulerian representation. VOF models
are generally more mature for the study of multiphase incompressible flows, while SPH models generally require more
processing power for similar results \parencite{violeau2007}.
In this paper, we first use a one-dimensional depth-averaged non-linear non-hydrostatic model to verify that the
signal measured by the wave buoy can be used as an incident wave input for the determination of hydrodynamic conditions
near the breakwater. For this model, we use a SWASH model \parencite{zijlema2011} already calibrated by
\textcite{poncet2021} on a domain reaching \SI{1450}{\m} offshore of the breakwater.
Then, we use a nested VOF model in two vertical dimensions that uses the output from the larger scale SWASH model as
initial and boundary conditions to obtain the hydrodynamic conditions on the breakwater. The model uses olaFlow
\parencite{higuera2015}, a VOF model based on volume averaged Reynolds averaged Navier-Stokes (VARANS) equations, and
which relies on a macroscopic representation of the porous armour of the breakwater. The model is qualitatively
calibrated using photographs from the storm of February 28, 2017. Results from the nested models are finally compared
to the analytical equations provided by \textcite{nandasena2011}.
\begin{figure*}
\centering
\includegraphics[height=.4\textwidth]{fig/pic1.jpg}
\includegraphics[height=.4\textwidth]{fig/pic2.jpg}
\caption{Photographs taken during and after the wave that displaced a \SI{50}{\tonne} concrete block onto the Artha
breakwater.}\label{fig:photo}
\end{figure*}
\section{Results}
\subsection{Identified wave}
Preliminary work with the photographer made it possible to identify the time at which the block displacement event
happened. Using the data from the wave buoy located \SI{1250}{\m} offshore of the Artha breakwater, a seemingly
abnormally large wave of \SI{14}{\m} amplitude was identified, which is presumed to have led to the block displacement.
Initial analysis of the buoy data plotted in Figure~\ref{fig:wave} shows that the movement of the buoy follows two
orbitals that correspond to an incident wave direction. These results would indicate that the identified wave is
essentially an incident wave, with a minor reflected component.
The wavelet power spectrum displayed in Figure~\ref{fig:wavelet} highlights a primary infragravity wave in the signal,
with a period of over \SI{30}{\s}.
\begin{figure*}
\centering
\includegraphics{fig/ts.pdf}
\includegraphics{fig/out_orbitals.pdf}
\caption{\textit{Left}: Free surface elevation measured during the extreme wave of February 28, 2017 at 17:23 UTC.
\textit{Right}: Trajectory of the wave buoy during the passage of this particular wave.}\label{fig:wave}
\end{figure*}
\subsection{Reflection analysis}
The results from the large scale SWASH model using two configurations --- one of them being the real bathymetry, and
the other being a simplified bathymetry without the breakwater --- are compared in Figure~\ref{fig:swash}. The results
obtained with both simulations show a maximum wave amplitude of \SI{13.9}{\m} for the real bathymetry, and
\SI{12.1}{\m} in the case where the breakwater is removed.
\begin{figure*}
\centering
\includegraphics{fig/maxw.pdf}
\caption{Free surface elevation obtained with the SWASH model in two configurations. \textit{Case 1}: With breakwater;
\textit{Case 2}: Without breakwater.}\label{fig:swash}
\end{figure*}
\subsection{Wave transformation}
The free surface obtained with the SWASH model using raw buoy measurements as an elevation boundary condition is plotted
in Figure~\ref{fig:swash_trans}. Those results display a strong transformation of the wave between the buoy and the
breakwater. Not only the amplitude, but also the shape of the wave is strongly impacted by the propagation over the
domain. While the amplitude of the wave is reduced as the wave propagates shorewards, the length of the trough and the
crest increases, with a zone reaching \SI{400}{\m} long in front of the wave where the water level is below \SI{0}{\m}.
\begin{figure*}
\centering
\includegraphics{fig/x.pdf}
\caption{Propagation of the wave supposed to be responsible for the block displacement; highlighted zone:
qualitatively estimated position of the wave front.}\label{fig:swash_trans}
\end{figure*}
In an attempt to understand the identified wave, a wavelet analysis is conducted on raw buoy data as well as at
different points along the SWASH model using the method proposed by \textcite{torrence1998}. The results are displayed
in Figure~\ref{fig:wavelet} and Figure~\ref{fig:wavelet_sw}. The wavelet power spectrum shows that the major component
in identified rogue waves is a high energy infragravity wave, with a period of around \SI{60}{\s}.
The SWASH model seems to indicate that the observed transformation of the wave can be characterized by a transfer of
energy from the infragravity band to shorter waves from around \SI{600}{\m} to \SI{300}{\m}, and returning to the
infragravity band at \SI{200}{\m}.
\begin{figure*}
\centering
\includegraphics{fig/wavelet.pdf}
\caption{Normalized wavelet power spectrum from the raw buoy time series for identified rogue waves on February 28,
2017.}\label{fig:wavelet}
\end{figure*}
\begin{figure*}
\centering
\includegraphics{fig/wavelet_sw.pdf}
\caption{Normalized wavelet power spectrum along the SWASH domain.}\label{fig:wavelet_sw}
\end{figure*}
\subsection{Hydrodynamic conditions on the breakwater}
The two-dimensional olaFlow model near the breakwater made it possible to compute the flow velocity near and on the breakwater
during the passage of the identified wave. The results displayed in Figure~\ref{fig:U} show that the flow velocity
reaches a maximum of \SI{14.5}{\m\per\s} towards the breakwater during the identified extreme wave. Although the
maximum reached velocity is similar to earlier shorter waves, the flow velocity remains high for twice as long as
during those earlier waves. The tail of the identified wave also exhibits a water level over \SI{5}{\m} for over
\SI{40}{\s}.
\begin{figure*}
\centering
\includegraphics{fig/U.pdf}
\caption{Horizontal flow velocity computed with the olaFlow model at $x=\SI{-20}{\m}$ on the breakwater armor.
Bottom: horizontal flow velocity at $z=\SI{5}{\m}$. The identified wave reaches this point around
$t=\SI{175}{\s}$.}\label{fig:U}
\end{figure*}
\section{Discussion}
\subsection{Incident wave}
According to the criteria proposed by \textcite{dysthe2008}, rogue waves can be defined as waves with an amplitude over
twice the significant wave height over a given period. The identified wave fits this definition, as its amplitude is
\SI{14.7}{\m}, over twice the significant wave height of \SI{6.3}{\m} on that day. According to \textcite{dysthe2008},
rogue waves often occur from non-linear superposition of smaller waves. This seems to be what we observe on
Figure~\ref{fig:wave}.
As displayed in Figure~\ref{fig:wavelet}, a total of 4 rogue waves were identified on February 28, 2017 in the raw buoy
time series using the wave height criterion proposed by \textcite{dysthe2008}. The wavelet power spectrum shows that a
very prominent infragravity component is present, which usually corresponds to non-linear interactions of smaller
waves. \textcite{dysthe2008} mentions that such waves in coastal waters are often the result of refractive focusing. On
February 28, 2017, the frequency of rogue waves was found to be 1 wave in 1627, which is considerably higher than the
exceedance probability of 1 in 10\textsuperscript{4} calculated by \textcite{dysthe2008}. Additional studies should be
conducted to understand focusing and the formation of rogue waves in front of the Saint-Jean-de-Luz bay.
An important point to note is that rogue waves are often short-lived: their nature means that they often separate into
shorter waves shortly after appearing. One reason why such rogue waves can be maintained over longer distances is a
change from a dispersive environment such as deep water to a non-dispersive environment. The bathymetry near the
wave buoy (Figure~\ref{fig:bathy}) shows that this might be what we observe here, as the buoy is located near a step in
the bathymetry, from around \SI{40}{\m} to \SI{20}{\m} depth.
\subsection{Reflection analysis}
The 13\% difference between those values highlights the existence of a notable amount of reflection at the buoy.
Nonetheless, the gap between the values is still fairly small and the extreme wave identified on February 28, 2017 at
17:23:08 could still be considered as an incident wave.
Unfortunately, the spectral wave generation method used by SWASH could not reproduce waves similar to the one observed
at the buoy. As mentioned by \textcite{dysthe2008}, such rogue waves cannot be obtained deterministically from the wave
spectrum. For this reason, this study only allows us to observe the influence of reflection on short waves, while
mostly ignoring infragravity waves. Those results are only useful if we consider that infragravity waves behave
similarly to shorter waves regarding reflection.
\subsection{Wave transformation}
The SWASH model yields a strongly changing wave over the domain, highlighting the highly complex composition of this
wave. Although the peak amplitude of the wave is reduced as the wave propagates, its length becomes apparent in the
results. At $T+\SI{60}{\s}$ for instance, the water level is under \SI{0}{\m} for \SI{400}{\m}, and then over
\SI{0}{\m} for around the same length, showing the main infragravity component of the studied wave.
The wavelet analysis conducted at several points along the domain (Figure~\ref{fig:wavelet_sw}) shows that the energy
of the studied wave (slightly before $t=\SI{1500}{\s}$) initially displays a strong infragravity component. Energy is
then transferred from the infragravity band towards shorter waves, and back to the infragravity band. This behavior is
quite unexpected, and further investigations should be conducted to understand and validate those results.
\subsection{Hydrodynamic conditions on the breakwater}
The hydrodynamic conditions on the breakwater are the main focus of this study. Considering an initially submerged
block, the analytical equations proposed by \textcite{nandasena2011} yield a minimum flow velocity for block
displacement by saltation of \SI{19.4}{\m\per\s}. The results from the Olaflow model yield a maximum flow velocity
during the displacement of the \SI{50}{\tonne} concrete block of \SI{14.5}{\m\per\s}. The results from the model are
25\% lower than the analytical value.
Those results tend to confirm recent research by \textcite{lodhi2020}, where it was found that block displacement
thresholds tend to overestimate the minimum flow velocity needed for block movement, although further validation of the
model that is used would be needed to confirm those findings.
Additionally, other shorter waves yield similar flow velocities on the breakwater, but over a shorter timeframe. The
importance of time dependency in studying block displacement would be
in accordance with research from \textcite{weiss2015}, who suggested that the use of time-dependent equations for block
displacement would lead to a better understanding of the phenomenon.
Although those results are a major step towards a better understanding of block displacement in coastal regions, further
work is needed to understand in more depth the formation and propagation of infragravity waves in the near-shore
region. Furthermore, this study was limited to a single block displacement event, and further work should be done to
obtain more measurements and observations of such events, although their rarity and unpredictability makes this task
difficult.
\section{Methods}
\subsection{Buoy data analysis}
\subsubsection{Rogue wave identification}
Identifying rogue waves requires two main steps: computing the significant wave height, and computing the height of
individual waves. The first step is straightforward: $H_s=4\sigma$, where $\sigma$ is the standard deviation of the
surface elevation. The height of individual waves is computed using the zero-crossing method: the time domain is split
into sections where the water level is strictly positive or negative, and wave height is computed from the maxima and
minima in each zone. This method can fail to identify some waves, or wrongly identify waves, in case of measurement
errors or when a small oscillation around 0 occurs in the middle of a larger wave. To account for those issues, the
signal is first fed through a low-pass filter to remove high-frequency oscillations above \SI{0.2}{\Hz}.
\subsubsection{Wavelet analysis}
All wavelet analysis in this study is conducted using a continuous wavelet transform over a Morlet window. The wavelet
power spectrum is normalized by the variance of the time series, following the method proposed by
\textcite{torrence1998}. This analysis extracts a time-dependent power spectrum and makes it possible to identify the
composition of waves in a time series.
\subsection{SWASH models}
\subsubsection{Domain}
\begin{figure}
\centering
\includegraphics{fig/bathy2d.pdf}
\caption{Bathymetry in front of the Artha breakwater. The extremities of the line are the buoy and the
breakwater.}\label{fig:bathy}
\end{figure}
A \SI{1750}{\m} long domain is constructed in order to study wave reflection and wave transformation over the bottom
from the wave buoy to the breakwater. Bathymetry with a resolution of around \SI{1}{\m} was used for most of the domain
(Figure~\ref{fig:bathy}).
The breakwater model used in the study is taken from \textcite{poncet2021}. A smoothed section is created and considered
as a porous media in the model.
A second domain is constructed for reflection analysis. The second model is the same as the first, excepted that the
breakwater is replaced by a smooth slope in order to remove the reflection generated by the structure.
The reflection analysis is conducted over \SI{4}{\hour} in order to generate a fair range of conditions. The wave
transformation study was conducted over a \SI{1}{\hour} timeframe in order to allow the model to reach steady-state
before the studied wave was generated.
\subsubsection{Model}
A non-linear non-hydrostatic shallow water model (SWASH, \cite{zijlema2011}) is used to model wave reflection and
transformation on the studied domain. The study is conducted using a layered one-dimensional model, which allows
porous media to be taken into account in the domain.
The reflection analysis was conducted with 2 layers so as to prevent model instability in overtopping conditions. The study
of wave transformation and the generation of boundary conditions for the Olaflow model is done with 4 layers.
\subsubsection{Porosity}
In the SWASH model, the porous breakwater armour is represented using macroscale porosity. The porosity parameters were
calibrated in \textcite{poncet2021}.
\subsubsection{Boundary conditions}
Two different sets of boundary conditions were used for both studies. In all cases, a sponge layer was added to the
shorewards boundary to prevent wave reflection on the boundary. In the reflection analysis, offshore conditions were
generated using the wave spectrum extracted from buoy data during the storm. The raw vertical surface elevation measured
by the wave buoy was used in a second part.
\subsection{Olaflow model}
\begin{figure*}
\centering
\includegraphics{fig/aw_t0.pdf}
\caption{Domain studied with Olaflow. Initial configuration.}\label{fig:of}
\end{figure*}
\subsubsection{Domain}
A \SI{150}{\m} long domain is built in order to obtain the hydrodynamic conditions on the Artha breakwater during the
passage of the identified extreme wave. The bathymetry with \SI{50}{\cm} resolution from \textcite{poncet2021} is used.
The domain extends \SI{30}{\m} up in order to be able to capture the largest waves hitting the breakwater. Measurements
are extracted \SI{20}{\m} shorewards from the breakwater crest. The domain is displayed in Figure~\ref{fig:of}.
A mesh in two-vertical dimensions with \SI{20}{\cm} resolution was generated using the interpolated bathymetry. As with
the SWASH model, the porous armour was considered at a macroscopic scale.
\subsubsection{Model}
A volume-of-fluid (VOF) model in two-vertical dimensions based on volume-averaged Reynolds-averaged Navier-Stokes
(VARANS) equations is used (olaFlow, \cite{higuera2015}). The model was initially setup using generic values for porous
breakwater studies. A sensitivity study conducted on the porosity parameters found the influence of these values on
the final results to be very minor.
The k-ω SST turbulence model was used, as it produced much more realistic results than the default k-ε model,
especially compared to the photographs from the storm of February 28, 2017. The k-ε model yielded very high viscosity
and thus strong dissipation in the entire domain, preventing an accurate wave breaking representation.
\subsubsection{Boundary conditions}
Initial and boundary conditions were generated using the output from the SWASH wave transformation model. The boundary
condition is generated by a paddle-like wavemaker, using the water level and depth-averaged velocity computed by the
SWASH model.
\printbibliography
\end{document}

131
olaflow/README.md Normal file
View file

@ -0,0 +1,131 @@
# Olaflow
This directory contains all the scripts needed to run the Olaflow model and to post-process the data.
## Scripts
### Run Olaflow
`run_ola.sh` runs the Python and OpenFOAM scripts needed to launch the Olaflow model.
```./run_ola.sh CASE```
* `CASE`: Olaflow case to use
The base case is in the `of` directory. The modified files are in the `of_$CASE` directory. The outputs go to
`out_of_$CASE` and then `out_post_$CASE`.
### Animate
`animate.py` produces an animation of `alpha.water` and `U` from the Olaflow model output.
```python -m processing.animate -o OUTPUT [-v] [-m] [-i]```
* `-o OUTPUT`: output directory to work in
* `-v`: verbose
* `-m`: instead of an animation, plot the maximum of each field over the run
* `-i`: instead of an animation, plot the initial value of each field
### Bathy
`bathy.py` generates the bathymetry used by Olaflow.
```python -m processing.bathy [-c CONFIG] [-v]```
* `-c CONFIG`: configuration file to use
* `-v`: verbose
```
[bathy]
bathy: bathymetry generated in data
hstru: porous layer height generated in data
scale*: bathymetry scaling in [x,y,z]
translate*: bathymetry translation in [x,y,z]
out: output directory for the bathymetry
```
### Diff
`diff.py` compares the outputs of several Olaflow models.
```python -m processing.diff -o OUTPUT ... [-t TIMESTEP] [-f FUNCTION] [-y FIELD]```
* `-o OUTPUT`: Olaflow output directory to use (repeat for each model to compare)
* `-t TIMESTEP`: time at which to compare the models
* `-f FUNCTION`: post-processing function to compare (`graphUniform` or `graphUniform2`)
* `-y FIELD`: field to compare (`alpha.water` or `U`)
### Flow velocity
`flow_velocity.py` plots the flow velocity from an Olaflow post-processing function.
```python -m processing.flow_velocity -o OUTPUT [-f FUNCTION]```
* `-o OUTPUT`: Olaflow output directory to use
* `-f FUNCTION`: post-processing function to use (`graphUniform` or `graphUniform2`)
### Olaflow
`olaflow.py` définit la classe Python OFModel permettant de traiter les données de sortie de Olaflow.
### Pickle
`pickle.py` permet de lire les données de sortie d'un modèle Olaflow et de les convertir en objet `OFModel`, puis
d'enregistrer cet objet pour une utilisation efficace avec Python.
```python -m processing.pickle -i INPUT -o OUTPUT [-z]```
* `-i INPUT` : dossier de sortie d'Olaflow
* `-o OUTPUT` : dossier de sortie à utiliser
* `-z` : activer la compression gzip (déconseillé)
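A minimal sketch (not part of the repository) showing how a pickled model can be re-loaded for further analysis. It assumes it is run from this directory, so that `processing.olaflow` (which defines `OFModel`) can be imported during unpickling, and uses a hypothetical output directory name:
```
import gzip
import pathlib
import pickle

out = pathlib.Path("out_post_CASE")  # hypothetical output directory
path = out.joinpath("pickle")

# pickle.py writes either an uncompressed "pickle" file or, with -z, "pickle.gz"
with (
    path.open("rb") if path.exists() else gzip.open(path.with_suffix(".gz"), "rb")
) as f:
    model = pickle.load(f)

# OFModel exposes the time steps, cell coordinates and raw Olaflow fields
print(model.t.shape, model.x.shape, model.z.shape)
print(model.fields["alpha.water"].shape)  # water fraction per time step and cell
print(model.fields["U"].shape)            # velocity vectors per time step and cell
```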
### STL
`stl.py` defines a function that converts a bathymetry array into an STL file. Requires Openscad.
### SWS Olaflow
`sws_ola.py` converts the output data of a Swash model into an initial condition for an Olaflow model.
```python -m processing.sws_ola -o OUTPUT [-c CONFIG]```
* `-o OUTPUT`: output directory to use
* `-c CONFIG`: configuration file to use
```
[swash]
np_out : swash output directory
[bathy]
level : water level in SWASH
[olaflow]
t0 : initial time of the Olaflow model
```
### SWS Wavedict Irregular
`sws_wavedict_irregular.py` is an attempt to convert the output data of a SWASH model into an irregular
spectral boundary condition for an Olaflow model. It does not work.
### SWS Wavedict Paddle
`sws_wavedict_paddle.py` converts the output data of a SWASH model into a water level and velocity
boundary condition for an Olaflow model (an example configuration is given below).
```python -m processing.sws_wavedict_paddle -o OUTPUT [-c CONFIG]```
* `-o OUTPUT`: output directory to use
* `-c CONFIG`: configuration file to use
```
[swash]
np_out : swash output directory
[bathy]
level : water level in SWASH
[olaflow]
t0 : initial time of the Olaflow model
tf : final time of the Olaflow model
x0 : position of the Olaflow model boundary
```
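A minimal, hypothetical example configuration (all paths and values are placeholders):
```
[swash]
np_out=inp_post/CASE

[bathy]
level=0.5

[olaflow]
t0=170
tf=200
x0=-150
```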


@ -0,0 +1,29 @@
/*--------------------------------*- C++ -*----------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 2.1.0 |
| \\ / A nd | Web: www.OpenFOAM.org |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
FoamFile
{
version 2.0;
format ascii;
class dictionary;
location "constant";
object porosityDict;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
// Materials: clear region, porous armour layer (two entries per list)
// a,b,c: tuning parameters
a 2(0 50);
b 2(0 1.2);
c 2(0 0.34);
// D50: mean nominal diameter
D50 2(1 4);
// porosity (phi)
porosity 2(1 0.4);
// ************************************************************************* //


@ -0,0 +1,29 @@
/*--------------------------------*- C++ -*----------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 2.1.0 |
| \\ / A nd | Web: www.OpenFOAM.org |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
FoamFile
{
version 2.0;
format ascii;
class dictionary;
location "constant";
object turbulenceProperties;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
simulationType RAS;
RAS
{
RASModel kOmegaSST;
turbulence on;
printCoeffs on;
}
// ************************************************************************* //


@ -0,0 +1,81 @@
/*--------------------------------*- C++ -*----------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 1.7.1 |
| \\ / A nd | Web: www.OpenFOAM.com |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
FoamFile
{
version 2.0;
format ascii;
class dictionary;
object blockMeshDict;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
scale 1;
vertices
(
(-150 0 -30)
(0 0 -30)
(0 0 30)
(-150 0 30)
(-150 1 -30)
(0 1 -30)
(0 1 30)
(-150 1 30)
);
blocks
(
hex (0 1 5 4 3 2 6 7) (750 1 300) simpleGrading (1 1 1)
);
edges
(
);
boundary
(
inlet
{
type patch;
faces
(
(0 4 7 3)
);
}
/*outlet
{
type patch;
faces
(
(1 5 6 2)
);
}*/
wall1
{
type wall;
faces
(
(0 1 5 4)
);
}
atmosphere
{
type patch;
faces
(
(3 2 6 7)
(1 5 6 2)
);
}
);
mergePatchPairs
(
);
// ************************************************************************* //


@ -0,0 +1,138 @@
/*---------------------------------------------------------------------------*\
| ========= | |
| \\ / F ield | OpenFOAM: The Open Source CFD Toolbox |
| \\ / O peration | Version: 1.3 |
| \\ / A nd | Web: http://www.openfoam.org |
| \\/ M anipulation | |
\*---------------------------------------------------------------------------*/
FoamFile
{
version 2.0;
format ascii;
location "system";
class dictionary;
object controlDict;
}
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
application olaFlow;
startFrom latestTime;
startTime 0;
stopAt endTime;
endTime 400;
deltaT 0.1;
writeControl adjustableRunTime;
writeInterval 0.5;
purgeWrite 0;
writeFormat ascii;
writePrecision 6;
compression on;
timeFormat general;
timePrecision 6;
runTimeModifiable yes;
adjustTimeStep yes;
maxCo 0.45;
maxAlphaCo 0.45;
maxDeltaT 0.5;
/*
functions
{
gaugesVOF
{
type sets;
libs ("libsampling.so");
writeControl outputTime;
writeInterval 1;
setFormat raw;
surfaceFormat raw;
interpolationScheme cell;
fields ( alpha.water );
sets
(
GaugeVOF01
{
type lineCellFace;
axis xyz;
start ( 0.5 0.001 0 );
end ( 0.5 0.001 1.2 );
}
GaugeVOF02
{
type lineCellFace;
axis xyz;
start ( 9.25 0.001 0 );
end ( 9.25 0.001 1.2 );
}
GaugeVOF03
{
type lineCellFace;
axis xyz;
start ( 15.75 0.001 0 );
end ( 15.75 0.001 1.2 );
}
GaugeVOF04
{
type lineCellFace;
axis xyz;
start ( 17.75 0.001 0 );
end ( 17.75 0.001 1.2 );
}
GaugeVOF05
{
type lineCellFace;
axis xyz;
start ( 21.1 0.001 0 );
end ( 21.1 0.001 1.2 );
}
);
}
gaugesP
{
type sets;
libs ("libsampling.so");
writeControl outputTime;
writeInterval 1;
setFormat raw;
surfaceFormat raw;
interpolationScheme cellPointFace;
fields ( p );
sets
(
GaugesP
{
type boundaryPoints;
axis xyz;
patches 1(caisson);
points ((18.0 0.01 0.75)
(18.00 0.01 0.80)
(18.00 0.01 0.85)
(18.00 0.01 0.95)
(18.01 0.01 0.70)
(18.25 0.01 0.70)
(18.50 0.01 0.70)
(18.75 0.01 0.70));
maxDistance 0.01;
}
);
}
}
*/
// ************************************************************************* //


@ -0,0 +1,25 @@
/*--------------------------------*- C++ -*----------------------------------*\
========= |
\\ / F ield | OpenFOAM: The Open Source CFD Toolbox
\\ / O peration | Website: https://openfoam.org
\\ / A nd | Version: 9
\\/ M anipulation |
-------------------------------------------------------------------------------
Description
Writes graph data for specified fields along a line, specified by start and
end points. A specified number of graph points are used, distributed
uniformly along the line.
\*---------------------------------------------------------------------------*/
start (-50 0.5 -15);
end (-50 0.5 15);
nPoints 100;
fields (alpha.water U);
axis z;
#includeEtc "caseDicts/postProcessing/graphs/graphUniform.cfg"
// ************************************************************************* //


@ -0,0 +1,25 @@
/*--------------------------------*- C++ -*----------------------------------*\
========= |
\\ / F ield | OpenFOAM: The Open Source CFD Toolbox
\\ / O peration | Website: https://openfoam.org
\\ / A nd | Version: 9
\\/ M anipulation |
-------------------------------------------------------------------------------
Description
Writes graph data for specified fields along a line, specified by start and
end points. A specified number of graph points are used, distributed
uniformly along the line.
\*---------------------------------------------------------------------------*/
start (-20 0.5 -15);
end (-20 0.5 15);
nPoints 100;
fields (alpha.water U);
axis z;
#includeEtc "caseDicts/postProcessing/graphs/graphUniform.cfg"
// ************************************************************************* //


@ -8,6 +8,7 @@ import pickle
import matplotlib.pyplot as plt
import matplotlib.animation as animation
from matplotlib.gridspec import GridSpec
from matplotlib.ticker import MultipleLocator
import numpy as np
from scipy import interpolate
@ -28,6 +29,12 @@ parser.add_argument(
help="Only compute maximum rather than animation",
action="store_true",
)
parser.add_argument(
"-i",
"--initial",
help="Only compute initial domain",
action="store_true",
)
args = parser.parse_args()
logging.basicConfig(level=max((10, 20 - 10 * args.verbose)))
@ -61,11 +68,19 @@ AW[:, iz0[idz0], ix0[idx0]] = model.fields["alpha.water"]
U = np.full((model.t.size, *X.shape), np.nan)
U[:, iz0[idz0], ix0[idx0]] = np.linalg.norm(model.fields["U"], axis=1)
fig = plt.figure()
gs = GridSpec(3, 1, figure=fig, height_ratios=[1, 0.05, 0.05])
fig = plt.figure(
figsize=(15 / 2.54, 4 / 2.54), dpi=200, constrained_layout=True
)
gs = GridSpec(
3 if not args.initial else 1,
1,
figure=fig,
height_ratios=[1, 0.1, 0.1] if not args.initial else [1],
)
ax = fig.add_subplot(gs[0])
cax1 = fig.add_subplot(gs[1])
cax2 = fig.add_subplot(gs[2])
if not args.initial:
cax1 = fig.add_subplot(gs[1])
cax2 = fig.add_subplot(gs[2])
aw_m = ax.imshow(
AW[0],
vmin=0,
@ -86,10 +101,15 @@ p_m = ax.imshow(
ax.axhline(4.5, ls="-.", lw=1, c="k", alpha=0.2, zorder=1.2)
fig.colorbar(aw_m, label=r"$\alpha_w$", cax=cax1, shrink=0.6, orientation="horizontal")
fig.colorbar(p_m, label=r"Porosity", cax=cax2, shrink=0.6, orientation="horizontal")
if not args.initial:
fig.colorbar(
aw_m, label=r"$\alpha_w$", cax=cax1, shrink=0.6, orientation="horizontal"
)
fig.colorbar(p_m, label=r"Porosity", cax=cax2, shrink=0.6, orientation="horizontal")
ax.set(xlabel="x (m)", ylabel="z (m)", aspect="equal", facecolor="#000000")
ax.grid(c="k", alpha=0.2)
ax.xaxis.set_minor_locator(MultipleLocator(5))
ax.yaxis.set_minor_locator(MultipleLocator(5))
figU = plt.figure()
@ -107,7 +127,7 @@ u_m = axU.imshow(
U[0],
cmap="BuPu",
vmin=0,
vmax=np.nanmax(np.where(AW > 0.5, U, np.nan), initial=0),
vmax=20,
extent=(x0.min(), x0.max(), z0.min(), z0.max()),
zorder=1,
alpha=np.nan_to_num(AW[0]).clip(0, 1),
@ -116,7 +136,7 @@ ur_m = axU.imshow(
U[0],
cmap="YlOrBr",
vmin=0,
vmax=np.nanmax(np.where(AW > 0.5, U, np.nan), initial=0),
vmax=20,
extent=(x0.min(), x0.max(), z0.min(), z0.max()),
zorder=1,
alpha=1 - np.nan_to_num(AW[0]).clip(0, 1),
@ -137,6 +157,13 @@ if args.max:
fig.savefig(out.joinpath("max_aw.pdf"))
figU.savefig(out.joinpath("max_U.pdf"))
elif args.initial:
aw_m.set_array(AW[0])
ax.vlines(-20, -15, 15, color="k", lw=1, ls="--", label="Measurements")
ax.text(-20, 15, "Measurements", ha="right", va="bottom")
fig.savefig(out.joinpath("aw_t0.pdf"))
fig.savefig(out.joinpath("aw_t0.jpg"), dpi=200)
else:
fig.set(figwidth=19.2, figheight=10.8, dpi=100)
figU.set(figwidth=19.2, figheight=10.8, dpi=100)


@ -1,104 +0,0 @@
import argparse
import gzip
import logging
import multiprocessing as mp
import pathlib
import pickle
import matplotlib.pyplot as plt
import matplotlib.animation as animation
from matplotlib.gridspec import GridSpec
import numpy as np
from scipy import interpolate
from .olaflow import OFModel
parser = argparse.ArgumentParser(description="Post-process olaflow results")
parser.add_argument("-v", "--verbose", action="count", default=0)
parser.add_argument(
"-o",
"--output",
type=pathlib.Path,
help="Output directory for pickled data",
required=True,
)
args = parser.parse_args()
logging.basicConfig(level=max((10, 20 - 10 * args.verbose)))
log = logging.getLogger("ola_post")
log.info("Animating olaFlow output")
out = args.output
out.mkdir(parents=True, exist_ok=True)
with (
path.open("rb")
if (path := out.joinpath("pickle")).exists()
else gzip.open(path.with_suffix(".gz"), "rb")
) as f:
model = pickle.load(f)
flt = np.where((model.x >= -60) & (model.x <= -20) & (model.z >= 0) & (model.z <= 10))[
0
]
x0, idx0 = np.unique(model.x[flt].astype(np.half), return_inverse=True)
z0, idz0 = np.unique(model.z[flt].astype(np.half), return_inverse=True)
X, Z = np.meshgrid(x0, z0)
P = np.full((model.t.size, *X.shape), np.nan)
P[:, idz0, idx0] = model.fields["porosity"][:, flt]
AW = np.full((model.t.size, *X.shape), np.nan)
AW[:, idz0, idx0] = model.fields["alpha.water"][:, flt]
watl = z0[np.argmax((AW > 0.5)[:, ::-1, :], axis=1)]
U = np.full((model.t.size, 2, *X.shape), np.nan)
UU = np.full((model.t.size, *X.shape), np.nan)
U[..., idz0, idx0] = model.fields["U"][..., flt][:, (0, 2)]
UU[..., idz0, idx0] = np.linalg.norm(model.fields["U"][..., flt], axis=1)
figU = plt.figure(figsize=(19.2, 10.8), dpi=100)
gsU = GridSpec(2, 1, figure=figU, height_ratios=[1, 0.05])
axU = figU.add_subplot(gsU[0])
caxu1 = figU.add_subplot(gsU[1])
# caxu2 = figU.add_subplot(gsU[2])
alp = np.clip(np.nan_to_num(AW), 0, 1)
axU.pcolormesh(X, Z, P[1], vmin=0, vmax=1, cmap="Greys_r")
u_m = axU.quiver(
X,
Z,
*U[0],
UU[0],
alpha=alp[0],
cmap="inferno_r",
clim=(0, np.nanmax(UU)),
)
# (wat_p,) = axU.plot(x0, watl[0])
axU.set(xlabel="x (m)", ylabel="z (m)", aspect="equal", facecolor="#bebebe")
axU.grid(c="k", alpha=0.2)
titU = axU.text(
0.5,
0.95,
f"t={model.t[0]}s",
horizontalalignment="center",
verticalalignment="top",
transform=axU.transAxes,
)
figU.colorbar(u_m, label=r"$U$", cax=caxu1, shrink=0.6, orientation="horizontal")
def animU(i):
titU.set_text(f"t={model.t[i]}s")
u_m.set_UVC(*U[i], UU[i])
u_m.set_alpha(alp[i])
# wat_p.set_ydata(watl[i])
aniU = animation.FuncAnimation(figU, animU, frames=model.t.size, interval=1 / 24)
aniU.save(out.joinpath("animUzoom.mp4"), fps=24)


@ -9,7 +9,7 @@ import sys
from cycler import cycler
import matplotlib.pyplot as plt
import matplotlib.gridspec as gridspec
from matplotlib.ticker import MultipleLocator
import numpy as np
from scipy import interpolate
@ -67,14 +67,16 @@ def get_pickle(out):
models = list(map(get_pickle, args.output))
figsize = 15 / 2.54, 4 / 2.54 * len(models)
fig, (ax,) = plt.subplots(
fig, ax_ = plt.subplots(
len(models),
figsize=(6, 1.5 * len(models)),
dpi=100,
figsize=figsize,
dpi=200,
constrained_layout=True,
squeeze=False,
)
ax = ax_[:, 0]
if args.timestep is None:
match args.field:
@ -89,31 +91,71 @@ if args.timestep is None:
)
case "U":
for i, (_ax, _model) in enumerate(zip(ax, models)):
v = np.nanmax(np.abs(np.where(
_model.post_fields[args.func]["alpha.water"] > 0.5,
#np.linalg.norm(_model.post_fields[args.func][args.field], axis=2),
_model.post_fields[args.func][args.field][..., 0],
np.nan,
)))
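# Maximum x-velocity over wet cells (alpha.water > 0.5), restricted to t in [170, 200] s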
v150 = np.nanmax(np.abs(np.where(
(_model.post_fields[args.func]["alpha.water"] > 0.5) & (_model.t[:, None] > 170) & (_model.t[:, None] < 200),
#np.linalg.norm(_model.post_fields[args.func][args.field], axis=2),
_model.post_fields[args.func][args.field][..., 0],
np.nan,
)))
_data = _model.post_fields[args.func][args.field][..., 0].T
#_c = _ax.contourf(
# _model.t,
# _model.post_fields[args.func][f"x_{args.field}"],
# _data,
# cmap="PiYG",
# #levels=[-15, -10, -5, -2, -1, 0, 1, 2, 5, 10, 15],
# vmin=-np.nanmax(np.abs(_data)),
# vmax=np.nanmax(np.abs(_data)),
# extend="both",
#)
_c = _ax.imshow(
np.linalg.norm(_model.post_fields[args.func][args.field], axis=2).T,
vmin=0,
cmap="inferno",
_data[::-1],
cmap="PiYG",
alpha=np.clip(_model.post_fields[args.func]["alpha.water"], 0, 1).T[::-1],
extent=(
_model.t.min(),
_model.t.max(),
_model.post_fields[args.func][f"x_{args.field}"].min(),
_model.post_fields[args.func][f"x_{args.field}"].max(),
),
vmin=-v150,
vmax=v150,
aspect="auto",
)
_ax.set(xlim=(100, 300))
_ax.set(facecolor="k")
_ax.xaxis.set_minor_locator(MultipleLocator(5))
_ax.yaxis.set_minor_locator(MultipleLocator(1))
fig.colorbar(_c, label=f"{args.field} (m/s)", ax=_ax)
log.info(f"Vitesse max: {v}m/s")
log.info(f"Vitesse max [170,200]: {v150}m/s")
log.info(f"Écart: {abs(np.nanmax(_data)-17.7)/17.7:%}")
case _:
log.error(f"Cannot plot field {args.field} from {args.func}")
sys.exit(1)
for i, (_ax, _model) in enumerate(zip(ax, models)):
_ax.set(xlabel="t (s)", ylabel="z (m)", title=f"Case {i}")
_ax.grid(color="k", alpha=0.2)
_ax.set(xlabel="t (s)", ylabel="z (m)")
if len(models) > 1:
_ax.set(title=f"Case {i}")
#_ax.grid(color="#797979", alpha=0.5)
fig.savefig(
args.output[0].joinpath(
f"diff_{args.func}_{args.field}_{'_'.join([o.name for o in args.output])}.pdf"
)
)
fig.savefig(
args.output[0].joinpath(
f"diff_{args.func}_{args.field}_{'_'.join([o.name for o in args.output])}.jpg"
)
)
else:
match args.field:
case "alpha.water":
@ -121,7 +163,9 @@ else:
_ax.tricontour(
_model.x,
_model.z,
_model.fields[args.field][np.where(_model.t == args.timestep)[0]][0],
_model.fields[args.field][np.where(_model.t == args.timestep)[0]][
0
],
levels=(0.5,),
colors="k",
)


@ -0,0 +1,137 @@
import argparse
import gzip
from itertools import starmap
import logging
from multiprocessing import pool
import pathlib
import pickle
import sys
from cycler import cycler
import matplotlib.pyplot as plt
from matplotlib.ticker import MultipleLocator
import numpy as np
from scipy import interpolate
from .olaflow import OFModel
parser = argparse.ArgumentParser(description="Post-process olaflow results")
parser.add_argument("-v", "--verbose", action="count", default=0)
parser.add_argument(
"-o",
"--output",
type=pathlib.Path,
help="Output directory for pickled data",
required=True,
)
parser.add_argument(
"-f",
"--func",
type=str,
help="Post-process function to compare",
default="graphUniform",
choices=("graphUniform", "graphUniform2"),
)
args = parser.parse_args()
logging.basicConfig(level=max((10, 20 - 10 * args.verbose)))
log = logging.getLogger("ola_post")
log.info("Plotting comparison of model output")
def get_pickle(out):
with (
path.open("rb")
if (path := out.joinpath("pickle")).exists()
else gzip.open(path.with_suffix(".gz"), "rb")
) as f:
return pickle.load(f)
model = get_pickle(args.output)
figsize = 15 / 2.54, 6 / 2.54
fig, ax_ = plt.subplots(
2,
figsize=figsize,
dpi=200,
constrained_layout=True,
)
_ax = ax_[0]
v = np.nanmax(np.abs(np.where(
model.post_fields[args.func]["alpha.water"] > 0.5,
#np.linalg.norm(model.post_fields[args.func]["U"], axis=2),
model.post_fields[args.func]["U"][..., 0],
np.nan,
)))
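# Maximum x-velocity over wet cells (alpha.water > 0.5), restricted to t in [170, 200] s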
v150 = np.nanmax(np.abs(np.where(
(model.post_fields[args.func]["alpha.water"] > 0.5) & (model.t[:, None] > 170) & (model.t[:, None] < 200),
#np.linalg.norm(model.post_fields[args.func]["U"], axis=2),
model.post_fields[args.func]["U"][..., 0],
np.nan,
)))
_data = model.post_fields[args.func]["U"][..., 0].T
#_c = _ax.contourf(
# model.t,
# model.post_fields[args.func]["x_U"],
# _data,
# cmap="PiYG",
# #levels=[-15, -10, -5, -2, -1, 0, 1, 2, 5, 10, 15],
# vmin=-np.nanmax(np.abs(_data)),
# vmax=np.nanmax(np.abs(_data)),
# extend="both",
#)
_c = _ax.imshow(
_data[::-1],
cmap="PiYG",
alpha=np.clip(model.post_fields[args.func]["alpha.water"], 0, 1).T[::-1],
extent=(
model.t.min(),
model.t.max(),
model.post_fields[args.func]["x_U"].min(),
model.post_fields[args.func]["x_U"].max(),
),
vmin=-v150,
vmax=v150,
aspect="auto",
)
_ax.set(xlim=(0, 400))
_ax.set(facecolor="k")
_ax.xaxis.set_minor_locator(MultipleLocator(10))
_ax.yaxis.set_minor_locator(MultipleLocator(1))
_ax.set(ylabel="z (m)")
_ax.axes.set_xticklabels([])
fig.colorbar(_c, label=f"U (m/s)", ax=_ax)
log.info(f"Vitesse max: {v}m/s")
log.info(f"Vitesse max [170,200]: {v150}m/s")
log.info(f"Écart: {abs(np.nanmax(_data)-17.7)/17.7:%}")
x = model.post_fields[args.func]["x_U"]
i0 = np.argmin(np.abs(x - 5))
_data = model.post_fields[args.func]["U"][..., i0, 0]
_alpha = model.post_fields[args.func]["alpha.water"][..., i0]
ax = ax_[1]
ax.fill_between(model.t, np.where(_alpha > 0.5, _data, 0), lw=1, color="#898989", edgecolor="k")
#ax.autoscale(True, "x", True)
ax.set(xlim=(0, 400))
ax.set(xlabel="t (s)", ylabel="U (m/s)")
ax.grid(c="k", alpha=.2)
ax.xaxis.set_minor_locator(MultipleLocator(10))
ax.yaxis.set_minor_locator(MultipleLocator(2))
fig.savefig(
args.output.joinpath(
f"U_{args.func}.pdf"
)
)
fig.savefig(
args.output.joinpath(
f"U_{args.func}.jpg"
)
)


@ -1,90 +0,0 @@
import argparse
import configparser
import gzip
from itertools import starmap
import logging
import multiprocessing as mp
import pathlib
import pickle
from cycler import cycler
import matplotlib.pyplot as plt
import matplotlib.gridspec as gridspec
import numpy as np
from scipy import interpolate
from .olaflow import OFModel
parser = argparse.ArgumentParser(description="Post-process olaflow results")
parser.add_argument("-v", "--verbose", action="count", default=0)
parser.add_argument("-c", "--config", default="config.ini")
args = parser.parse_args()
logging.basicConfig(level=max((10, 20 - 10 * args.verbose)))
log = logging.getLogger("ola_post")
log.info("Animating olaFlow output")
config = configparser.ConfigParser()
config.read(args.config)
out = pathlib.Path(config.get("post", "out"))
out.mkdir(parents=True, exist_ok=True)
with (
path.open("rb")
if (path := out.joinpath("pickle")).exists()
else gzip.open(path.with_suffix(".gz"), "rb")
) as f:
model = pickle.load(f)
x0_conf = config.getfloat("post", "x")
x0_val = model.x[np.argmin(np.abs(model.x - x0_conf))]
# z0 = config.getfloat("post", "z")
# z0 = np.linspace(-5, 5, 16)
c0_ = ((model.x == x0_val)[None, :] & (model.fields["alpha.water"] > 0.95)).any(axis=0)
c0 = model.coords[c0_][:: (c0_.sum() // 8 + 1)]
i0 = np.argmin(
np.linalg.norm(model.coords[..., None] - c0.T[None, ...], axis=1),
axis=0,
)
aw = model.fields["alpha.water"][:, i0]
U = np.where(aw > 0.95, np.linalg.norm(model.fields["U"][..., i0], axis=1), np.nan)
P = np.where(aw > 0.95, model.fields["p"][..., i0], np.nan)
P_rgh = np.where(aw > 0.95, model.fields["p_rgh"][..., i0], np.nan)
with plt.rc_context(
{
"axes.prop_cycle": cycler(
color=np.linspace(0, 1, i0.size + 1)[:-1].astype("U")
),
"axes.grid": True,
"axes.xmargin": 0,
}
):
fig, ax = plt.subplots(3, constrained_layout=True)
ax1, ax2, ax3 = ax
ha = ax1.plot(model.t, U, lw=1)
ax1.set(xlabel="t (s)", ylabel="U (m/s)")
ax2.plot(model.t, P * 1e-3, lw=1)
ax2.set(xlabel="t (s)", ylabel="P (kPa)")
ax3.plot(model.t, P_rgh * 1e-3, lw=1)
ax3.set(xlabel="t (s)", ylabel="P_rgh (kPa)")
for a in ax:
a.set(ylim=0)
ax2.legend(
ha,
list(
starmap(lambda x, z: f"x={x:8}m; z={z:8}m", zip(model.x[i0], model.z[i0]))
),
bbox_to_anchor=(1.05, 0.5),
loc="center left",
)
fig.savefig(out.joinpath("fig.pdf"))

openfoam_env.fish Normal file

@ -0,0 +1,4 @@
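# Open a nested shell with the OpenFOAM 9 environment: bash sources /opt/openfoam9/etc/bashrc, then launches fish with an "openfoam" prompt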
function openfoam_env
bash --rcfile /opt/openfoam9/etc/bashrc -ic "fish -C 'function fish_prompt; echo -s (set_color 66cc00) \"openfoam\" (set_color normal) \" \\\$ \"; end'"
end

swash/README.md Normal file

@ -0,0 +1,36 @@
# SWASH
This directory gathers all the scripts needed to run the SWASH model and to post-process its
output.
## Scripts
### Animate
`animate.py` produces an animation of the SWASH results.
```python -m processing.animate [-c CONFIG]```
* `-c CONFIG`: configuration file to use
```
[post]
inp : directory containing the input data
out : output directory
```
### Mat Npz
`mat_npz` converts the SWASH output data into Numpy arrays that are easier to work with in Python
(see the example configuration below).
```python -m processing.mat_npz [-c CONFIG]```
* `-c CONFIG`: configuration file to use
```
[swash]
out : swash output directory
[post]
inp : output directory for the numpy data
```
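A minimal, hypothetical example configuration covering both scripts (directory names are placeholders):
```
[swash]
out=out_sws_CASE

[post]
inp=inp_post/CASE
out=out_post/CASE
```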


@ -10,5 +10,5 @@ mpi=8
[post]
inp=inp_post/ts_4lay_1h
out=out_post/ts_4lay_1h
x0=-1250
x0=-50
t0=180


@ -9,7 +9,7 @@ mpi=8
[post]
inp=inp_post/real_spec_interp
#compare=inp_post/real_spec_interp_nb
compare=inp_post/real_spec_interp_nb
out=out_post/real_spec_interp
x0=-1250
t0=180


@ -20,7 +20,6 @@ config = configparser.ConfigParser()
config.read(args.config)
inp = pathlib.Path(config.get("post", "inp"))
root = pathlib.Path(config.get("swash", "out"))
out = pathlib.Path(config.get("post", "out"))
out.mkdir(parents=True, exist_ok=True)


@ -1,90 +0,0 @@
import argparse
import configparser
import logging
import pathlib
import matplotlib.animation as animation
import matplotlib.pyplot as plt
import numpy as np
parser = argparse.ArgumentParser(description="Animate swash output")
parser.add_argument("-v", "--verbose", action="count", default=0)
parser.add_argument("-c", "--config", default="config.ini")
args = parser.parse_args()
logging.basicConfig(level=max((10, 20 - 10 * args.verbose)))
log = logging.getLogger("post")
log.info("Starting post-processing")
config = configparser.ConfigParser()
config.read(args.config)
inp = pathlib.Path(config.get("post", "inp"))
root = pathlib.Path(config.get("swash", "out"))
out = pathlib.Path(config.get("post", "out"))
out.mkdir(parents=True, exist_ok=True)
def data(var):
return np.load(inp.joinpath(f"{var}.npy"))
x = data("xp")
t = data("tsec")
watl = data("watl")
botl = data("botl")
zk = data("zk")
velk = data("velk")
vz = data("vz")
wl = np.maximum(watl, -botl)
# print(x.size, -np.arange(0, 1 * bathy.hstru.size, 1)[::-1].size)
fig, ax = plt.subplots()
# ax.plot(x, -botl, c="k")
# ax.fill_between(
# x, -botl, -data["botl"] + bathy.hstru, color="k", alpha=0.2
# )
n = 0
vk = np.sqrt((velk[n] ** 2).sum(axis=1))
# print(vk.shape)
# plt.imshow(vk)
# plt.colorbar()
lines = ax.plot(x, zk[n].T, c="#0066cc")
quiv = []
for i in range(len(lines) - 1):
quiv.append(
ax.quiver(
x[::50],
(zk[n, i, ::50] + zk[n, i + 1, ::50]) / 2,
velk[n, i, 0, ::50],
vz[n, i, ::50],
units="dots",
width=2,
scale=0.05,
)
)
ax.autoscale(True, "w", True)
ax.set_ylim(top=15)
def animate(k):
for i, q in enumerate(quiv):
q.set_UVC(
velk[k, i, 0, ::50],
vz[k, i, ::50],
)
for i, l in enumerate(lines):
l.set_ydata(zk[k, i])
return *quiv, *lines
ani = animation.FuncAnimation(
fig, animate, frames=wl[:, 0].size, interval=20, blit=True
)
ani.save(out.joinpath("layers.mp4"))


@ -7,8 +7,6 @@ import matplotlib.pyplot as plt
import numpy as np
import scipy.signal as sgl
from .read_swash import *
parser = argparse.ArgumentParser(description="Post-process swash output")
parser.add_argument("-v", "--verbose", action="count", default=0)
parser.add_argument("-c", "--config", default="config.ini")
@ -30,7 +28,8 @@ t = np.load(inp.joinpath("t.npy"))
botl = np.load(inp.joinpath("botl.npy"))
watl = np.load(inp.joinpath("watl.npy"))
vel = np.load(inp.joinpath("vel_x.npy"))
#vel = np.load(inp.joinpath("vel_x.npy"))
vel = np.load(inp.joinpath("vel.npy"))[0]
# Cospectral calculations
x0 = config.getint("post", "x0")
@ -65,22 +64,25 @@ R = np.sqrt(
if config.has_option("post", "compare"):
inp_comp = pathlib.Path(config.get("post", "compare"))
x_ = np.load(inp_comp.joinpath("xp.npy"))
t_ = np.load(inp_comp.joinpath("tsec.npy"))
x_ = np.load(inp_comp.joinpath("x.npy"))
t_ = np.load(inp_comp.joinpath("t.npy"))
botl_ = np.load(inp_comp.joinpath("botl.npy"))
watl_ = np.load(inp_comp.joinpath("watl.npy"))
vel_ = np.load(inp_comp.joinpath("vel_x.npy"))
# Cospectral calculations
arg_x0_ = np.abs(x_ - x0).argmin()
arg_t0_ = np.abs(t_ - t0).argmin()
dt_ = np.diff(t_).mean() * 1e-3
f_ = 1 / dt_
eta_ = watl_[t_ > t0, arg_x0_]
u_ = vel_[t_ > t0, arg_x0_]
phi_eta_ = sgl.welch(eta_, f, nperseg=nperseg)
phi_u_ = sgl.welch(u_, f, nperseg=nperseg)
phi_eta_u_ = sgl.csd(eta_, u_, f, nperseg=nperseg)
phi_eta_ = sgl.welch(eta_, f_, nperseg=nperseg)
phi_u_ = sgl.welch(u_, f_, nperseg=nperseg)
phi_eta_u_ = sgl.csd(eta_, u_, f_, nperseg=nperseg)
H_ = np.sqrt(np.abs(phi_eta_[1]))
U_ = np.sqrt(np.abs(phi_u_[1]))
@ -91,10 +93,6 @@ if config.has_option("post", "compare"):
(np.abs(phi_eta_[1]) + np.abs(phi_u_[1]) - 2 * phi_eta_u_[1].real)
/ (np.abs(phi_eta_[1]) + np.abs(phi_u_[1]) + 2 * phi_eta_u_[1].real)
)
# R_ = np.sqrt(
# (1 + G_**2 - 2 * G_ * np.cos(th_eta_u_))
# / (1 + G_**2 + 2 * G_ * np.cos(th_eta_u_))
# )
# Plotting


@ -0,0 +1,72 @@
import argparse
import configparser
import logging
import pathlib
import matplotlib.pyplot as plt
import numpy as np
import scipy.signal as sgl
parser = argparse.ArgumentParser(description="Post-process swash output")
parser.add_argument("-v", "--verbose", action="count", default=0)
parser.add_argument("-c", "--config", default="config.ini")
args = parser.parse_args()
logging.basicConfig(level=max((10, 20 - 10 * args.verbose)))
log = logging.getLogger("post")
log.info("Starting post-processing")
config = configparser.ConfigParser()
config.read(args.config)
inp = pathlib.Path(config.get("post", "inp"))
root = pathlib.Path(config.get("swash", "out"))
log.info(f"Reading data from '{inp}'")
x = np.load(inp.joinpath("x.npy"))
t = np.load(inp.joinpath("t.npy"))
botl = np.load(inp.joinpath("botl.npy"))
watl = np.load(inp.joinpath("watl.npy"))
vel = np.load(inp.joinpath("vel.npy"))[0]
t0 = np.linspace(23 * 60 + 8, 23 * 60 + 8 + 100, 6)
# Plotting
log.info("Plotting results")
vlim = np.nanmin(np.maximum(watl, -botl)), np.nanmax(np.maximum(watl, -botl))
fig_x, ax = plt.subplots(
3, 2, figsize=(15 / 2.54, 2 / 3 * 10 / 2.54), constrained_layout=True
)
i0 = np.argmin(np.abs(t * 1e-3 - t0[0]))
for ax_x, t0_x in zip(ax.reshape(-1), t0):
ax_x.plot(x, -botl, color="k")
i = np.argmin(np.abs(t * 1e-3 - t0_x))
ax_x.plot(
x,
np.maximum(watl[i, :], -botl),
color="#0066ff",
)
ax_x.axvline(-1450 + 1450 * ((t[i] - t[i0]) * 1e-3+5) / 100, color="k", alpha=0.2, lw=10)
ax_x.grid(color="k", alpha=0.2)
ax_x.set(ylabel="z (m)", ylim=vlim)
ax_x.text(
0.95,
0.95,
f"$T+{(t[i]-t[i0])*1e-3:.1f}s$",
horizontalalignment="right",
verticalalignment="top",
transform=ax_x.transAxes,
)
ax_x.autoscale(axis="x", tight=True)
out = pathlib.Path(config.get("post", "out")).joinpath(f"trans")
log.info(f"Saving plots in '{out}'")
out.mkdir(parents=True, exist_ok=True)
fig_x.savefig(out.joinpath("x.pdf"))
fig_x.savefig(out.joinpath("x.jpg"), dpi=200)
log.info("Finished post-processing")


@ -0,0 +1,93 @@
import argparse
import configparser
import logging
import pathlib
import matplotlib.pyplot as plt
from matplotlib.ticker import MultipleLocator, LogLocator, NullFormatter
import numpy as np
import scipy.signal as sgl
parser = argparse.ArgumentParser(description="Post-process swash output")
parser.add_argument("-v", "--verbose", action="count", default=0)
parser.add_argument("-c", "--config", default="config.ini")
args = parser.parse_args()
logging.basicConfig(level=max((10, 20 - 10 * args.verbose)))
log = logging.getLogger("post")
log.info("Starting post-processing")
config = configparser.ConfigParser()
config.read(args.config)
inp = pathlib.Path(config.get("post", "inp"))
root = pathlib.Path(config.get("swash", "out"))
log.info(f"Reading data from '{inp}'")
x = np.load(inp.joinpath("x.npy"))
t = np.load(inp.joinpath("t.npy")) * 1e-3
botl = np.load(inp.joinpath("botl.npy"))
watl = np.load(inp.joinpath("watl.npy"))
vel = np.load(inp.joinpath("vel.npy"))[0]
# Plotting
log.info("Plotting results")
vlim = np.nanmin(np.maximum(watl, -botl)), np.nanmax(np.maximum(watl, -botl))
x0 = np.linspace(-700, -200, 6)
i0 = np.argmin(np.abs(x[:, None] - x0), axis=0)
fig_x, ax = plt.subplots(
3, 2, figsize=(15 / 2.54, 7.5 / 2.54), constrained_layout=True
)
dt = np.mean(np.diff(t))
N = t.size
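# Dyadic set of wavelet scales, from s0 = 2*dt up to the length of the record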
s0 = 2 * dt
dj = 0.5
J = 1 / dj * np.log2(N * dt / s0)
j = np.arange(0, J)
sj = s0 * 2 ** (j * dj)
Mw = sj / dt
sig = np.std(watl[:, i0])
M = np.stack([(np.abs(sgl.cwt(watl[:, i], sgl.morlet2, Mw))/sig)**2 for i in i0])
v = np.max(M)
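# Convert wavelet scales to equivalent periods (morlet2 with w=5: T = 2*pi*s/5)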
T = 2 * sj * np.pi / 5
for ax_x, M_, x_ in zip(ax.flatten(), M, x[i0]):
c = ax_x.contourf(t, T, M_, cmap="Greys", vmin=0, levels=[1, 4, 16, 64], extend="both")
fig_x.colorbar(c, ax=ax_x, label="NWPS")
ax_x.grid(color="k", alpha=0.2)
ax_x.text(
0.95,
0.95,
f"x={x_:.0f}m",
horizontalalignment="right",
verticalalignment="top",
transform=ax_x.transAxes,
#c="w",
)
ax_x.semilogy()
ax_x.autoscale(True, "both", True)
ax_x.set_rasterization_zorder(1.5)
ax_x.set(ylabel="T (s)", ylim=(sj[0], sj[-1]))
ax_x.xaxis.set_minor_locator(MultipleLocator(100))
ax_x.yaxis.set_major_locator(LogLocator(10, numticks=2**10))
ax_x.yaxis.set_minor_locator(LogLocator(10, subs=np.arange(10), numticks=2**10))
ax_x.yaxis.set_minor_formatter(NullFormatter())
if ax_x not in ax[-1]:
ax_x.axes.set_xticklabels([])
else:
ax_x.set(xlabel="t (s)")
out = pathlib.Path(config.get("post", "out")).joinpath(f"trans")
log.info(f"Saving plots in '{out}'")
out.mkdir(parents=True, exist_ok=True)
fig_x.savefig(out.joinpath("wavelet.pdf"), dpi=300)
fig_x.savefig(out.joinpath("wavelet.jpg"), dpi=200)
fig_x.savefig(out.joinpath("wavelet.png"), dpi=200)
log.info("Finished post-processing")


@ -19,6 +19,7 @@ config = configparser.ConfigParser()
config.read(args.config)
inp = pathlib.Path(config.get("post", "inp"))
inp_comp = pathlib.Path(config.get("post", "compare"))
root = pathlib.Path(config.get("swash", "out"))
log.info(f"Reading data from '{inp}'")
@ -26,13 +27,19 @@ x = np.load(inp.joinpath("x.npy"))
t = np.load(inp.joinpath("t.npy"))
watl = np.load(inp.joinpath("watl.npy"))
watl_comp = np.load(inp_comp.joinpath("watl.npy"))
# Cospectral calculations
x0 = config.getint("post", "x0")
arg_x0 = np.abs(x - x0).argmin()
w0 = watl[:, arg_x0]
w0_comp = watl_comp[:, arg_x0]
cr0 = np.where(np.diff(np.sign(w0)))[0]
cr0_comp = np.where(np.diff(np.sign(w0_comp)))[0]
log.info(f"1: {cr0.size} waves")
log.info(f"2: {cr0_comp.size} waves")
wave = np.fromiter(
(
@ -44,6 +51,16 @@ wave = np.fromiter(
),
dtype=np.single,
)
wave_comp = np.fromiter(
(
np.abs(
np.max(np.abs(w0_comp[cr0_comp[i - 1] : cr0_comp[i]]))
+ np.max(np.abs(w0_comp[cr0_comp[i] : cr0_comp[i + 1]]))
)
for i in range(1, len(cr0_comp) - 1)
),
dtype=np.single,
)
i0 = np.argmax(wave)
@ -53,8 +70,39 @@ out.mkdir(parents=True, exist_ok=True)
fig, ax = plt.subplots()
ax.plot(t[cr0[1:-1]] * 1e-3, wave)
ax.set(xlabel="t (s)", ylabel="z (m)")
ax.autoscale(True, "x", True)
ax.grid()
fig.savefig(out.joinpath("wsize.pdf"))
fig2, ax2 = plt.subplots()
ax2.plot(t[cr0[i0 - 5] : cr0[i0 + 7]], w0[cr0[i0 - 5] : cr0[i0 + 7]])
fig2, ax2 = plt.subplots(
figsize=(10 / 2.54, 4 / 2.54), constrained_layout=True
)
ax2.plot(
t[cr0[i0 - 5] : cr0[i0 + 7]] * 1e-3,
w0[cr0[i0 - 5] : cr0[i0 + 7]],
color="k",
label="Cas 1",
)
ax2.plot(
t[cr0[i0 - 5] : cr0[i0 + 7]] * 1e-3,
w0_comp[cr0[i0 - 5] : cr0[i0 + 7]],
color="k",
ls="-.",
label="Cas 2",
)
ax2.set(xlabel="t (s)", ylabel="z (m)")
ax2.autoscale(True, "x", True)
ax2.grid()
ax2.legend()
fig2.savefig(out.joinpath("maxw.pdf"))
fig2.savefig(out.joinpath("maxw.jpg"), dpi=200)
log.info(
f"RMS difference: {np.sqrt(np.mean((w0_comp-w0)**2))}m ; {np.sqrt(np.mean((w0_comp-w0)**2))/(w0.max()-w0.min()):%}"
)
log.info(f"Bias: {np.mean(w0_comp-w0)}m")
log.info(f"Maximum wave size: {wave.max()}m ; {wave_comp.max()}m")
log.info(
f"Maximum wave size difference: {abs(wave_comp.max()-wave.max())/wave.max():%}"
)


@ -18,3 +18,18 @@ Olaflow comparison with photos
Comparison of olaflow output with block displacement theories
Report format: Journal of Geophysical Research
Add figures: photos: wave & block on the breakwater, wavelet analysis of other waves at the buoy,
snapshots of waves from the olaflow output
Flesh out the content a little
Figure 6: line plot of the flow velocity at a single point
Link the photos to the splashes in swash
Rogue wave generation: combination + dispersion -> non-dispersive zone: add the bathymetry profile
Test a flat bathymetry with swash to see whether energy is transferred from IG to G