<?xml version="1.0" encoding="utf-8" standalone="yes" ?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
<channel>
<title>Joris Paret</title>
<link>https://jorisparet.github.io/</link>
<atom:link href="https://jorisparet.github.io/index.xml" rel="self" type="application/rss+xml" />
<description>Joris Paret</description>
<generator>Wowchemy (https://wowchemy.com)</generator><language>en-us</language><lastBuildDate>Sat, 01 Jun 2030 13:00:00 +0000</lastBuildDate>
<image>
<url>https://jorisparet.github.io/media/icon_huf5b7e9305d8ca00c36bcf7194f7e4936_69678_512x512_fill_lanczos_center_3.png</url>
<title>Joris Paret</title>
<link>https://jorisparet.github.io/</link>
</image>
<item>
<title>Example Talk</title>
<link>https://jorisparet.github.io/talk/example-talk/</link>
<pubDate>Sat, 01 Jun 2030 13:00:00 +0000</pubDate>
<guid>https://jorisparet.github.io/talk/example-talk/</guid>
<description><div class="alert alert-note">
<div>
Click on the <strong>Slides</strong> button above to view the built-in slides feature.
</div>
</div>
<p>Slides can be added in a few ways:</p>
<ul>
<li><strong>Create</strong> slides using Wowchemy&rsquo;s <a href="https://wowchemy.com/docs/managing-content/#create-slides" target="_blank" rel="noopener"><em>Slides</em></a> feature and link them using the <code>slides</code> parameter in the front matter of the talk file</li>
<li><strong>Upload</strong> an existing slide deck to <code>static/</code> and link it using the <code>url_slides</code> parameter in the front matter of the talk file</li>
<li><strong>Embed</strong> your slides (e.g. Google Slides) or presentation video on this page using <a href="https://wowchemy.com/docs/writing-markdown-latex/" target="_blank" rel="noopener">shortcodes</a>.</li>
</ul>
<p>Further event details, including <a href="https://wowchemy.com/docs/writing-markdown-latex/" target="_blank" rel="noopener">page elements</a> such as image galleries, can be added to the body of this page.</p>
</description>
</item>
<item>
<title>submv</title>
<link>https://jorisparet.github.io/project/submv/</link>
<pubDate>Sat, 07 Jan 2023 00:00:00 +0000</pubDate>
<guid>https://jorisparet.github.io/project/submv/</guid>
<description><h1 id="submv">submv</h1>
<p><strong>submv</strong> is a simple command-line tool that lets you shift a subtitle file by a given amount to synchronize it with a video stream.</p>
<h2 id="quickstart">Quickstart</h2>
<p>In a console, simply type</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">submv path/to/subtitles.srt -1.5
</span></span></code></pre></div><p>to shift the file <code>subtitles.srt</code> by -1.5s. By default, it will <strong>overwrite</strong> the original file.</p>
<p>More options:</p>
<ul>
<li>The shifted subtitles can be written to a new file by using the <code>--output</code> flag (or <code>-o</code> for short). Example: <code>submv file.srt 2.1 --output new_file.srt</code>.</li>
<li>The default format is <a href="https://en.wikipedia.org/wiki/SubRip" target="_blank" rel="noopener">SubRip</a> (<code>*.srt</code> files). Other formats can be read using the <code>--format</code> flag (or <code>-f</code> for short). Example: <code>submv file.sub --format sub</code>.</li>
<li>For certain formats such as <a href="https://en.wikipedia.org/wiki/MicroDVD" target="_blank" rel="noopener">MicroDVD</a> (<code>*.sub</code> files), the timecodes depend on the video framerate. To account for this, the correct framerate must be specified with the <code>--framerate</code> flag (or <code>-r</code> for short). Example: <code>submv file.sub --framerate 30</code>.</li>
</ul>
<h2 id="installation">Installation</h2>
<p><strong>1.</strong> From <a href="https://pypi.org/project/submv/" target="_blank" rel="noopener">PyPI</a>:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">pip install submv
</span></span></code></pre></div><hr>
<p><strong>2.</strong> From the <a href="https://github.com/jorisparet/submv" target="_blank" rel="noopener">code repository</a>:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-fallback" data-lang="fallback"><span class="line"><span class="cl">git clone https://github.com/jorisparet/submv
</span></span><span class="line"><span class="cl">cd submv
</span></span><span class="line"><span class="cl">pip install .
</span></span></code></pre></div><h4 id="linux">Linux</h4>
<p>The default folder should be under <code>/home/&lt;user&gt;/.local/bin/</code>. Make sure this location (or the correct one, if different) is included in your <code>$PATH</code> environment variable so that you can run the script from the console. If not, run <code>export PATH=$PATH:/path/to/submv/script/</code> in the console or add it to your <code>.bashrc</code> file.</p>
<h4 id="windows">Windows</h4>
<p>The default folder should be under <code>C:\Users\&lt;user&gt;\AppData\Local\Programs\Python\&lt;python_version&gt;\Scripts\</code>. Make sure this location (or the correct one, if different) is included in your <code>PATH</code> environment variable so that you can run the script from the console. If not, run <code>set PATH=%PATH%;C:\path\to\submv\script\</code> in the console, or select <code>Edit the system environment variables</code> in the search bar, click <code>Environment Variables…</code>, select <code>PATH</code>, click <code>Edit...</code>, and add the correct path to the script.</p>
<h2 id="author">Author</h2>
<p><a href="https://www.jorisparet.com/" target="_blank" rel="noopener">Joris Paret</a></p>
</description>
</item>
<item>
<title>“Dimensionality reduction of local structure in glassy binary mixtures” - The Journal of Chemical Physics</title>
<link>https://jorisparet.github.io/post/2022-11-22_jcp/</link>
<pubDate>Mon, 24 Oct 2022 00:00:00 +0000</pubDate>
<guid>https://jorisparet.github.io/post/2022-11-22_jcp/</guid>
<description><h1 id="authors">Authors</h1>
<ul>
<li>Daniele Coslovich</li>
<li>Robert L. Jack</li>
<li>Joris Paret</li>
</ul>
<h1 id="abstract">Abstract</h1>
<p>We consider unsupervised learning methods for characterizing the disordered microscopic structure of supercooled liquids and glasses. Specifically, we perform dimensionality reduction of smooth structural descriptors that describe radial and bond-orientational correlations and assess the ability of the method to grasp the essential structural features of glassy binary mixtures. In several cases, a few collective variables account for the bulk of the structural fluctuations within the first coordination shell and also display a clear connection with the fluctuations of particle mobility. Fine-grained descriptors that characterize the radial dependence of bond-orientational order better capture the structural fluctuations relevant for particle mobility but are also more difficult to parameterize and to interpret. We also find that principal component analysis of bond-orientational order parameters provides identical results to neural network autoencoders while having the advantage of being easily interpretable. Overall, our results indicate that glassy binary mixtures have a broad spectrum of structural features. In the temperature range we investigate, some mixtures display well-defined locally favored structures, which are reflected in bimodal distributions of the structural variables identified by dimensionality reduction.</p>
<h1 id="article-and-data-availability">Article and data availability</h1>
<p>This paper is published in <a href="https://doi.org/10.1063/5.0128265" target="_blank" rel="noopener">The Journal of Chemical Physics</a> and is also available on <a href="https://arxiv.org/abs/2211.01904" target="_blank" rel="noopener">arXiv</a>. The computation of descriptors and most of the dimensionality reduction analysis were performed using the <a href="https://www.jorisparet.com/partycls" target="_blank" rel="noopener">partycls</a> package. The data analysis was carried out using a reproducible workflow, deposited in a public <a href="https://doi.org/10.5281/zenodo.7108317" target="_blank" rel="noopener">Zenodo repository</a>.</p>
</description>
</item>
<item>
<title>New major release of partycls (2.0.0)</title>
<link>https://jorisparet.github.io/post/2022-10-24_partycls-2-0-0/</link>
<pubDate>Mon, 24 Oct 2022 00:00:00 +0000</pubDate>
<guid>https://jorisparet.github.io/post/2022-10-24_partycls-2-0-0/</guid>
<description><p>A new major release of <strong>partycls</strong> (v2.0.0) is now available on <a href="https://github.com/jorisparet/partycls" target="_blank" rel="noopener">GitHub</a> and <a href="https://pypi.org/project/partycls/" target="_blank" rel="noopener">PyPI</a>. The new <a href="https://www.jorisparet.com/partycls" target="_blank" rel="noopener">homepage</a> also features new tutorials and a more consistent documentation, with a more detailed API presentation and cross-references between the different pages.</p>
<p>The most important changes include:</p>
<ul>
<li>A variety of new structural descriptors and features (<em>e.g.</em> Voronoi tessellation).</li>
<li>A global optimization for most computations.</li>
<li>Several bug fixes.</li>
</ul>
<p>See the <a href="https://www.jorisparet.com/partycls/changelog.html" target="_blank" rel="noopener">changelog</a> for more details on this new version.</p>
</description>
</item>
<item>
<title>Dimensionality reduction of local structure in glassy binary mixtures</title>
<link>https://jorisparet.github.io/publication/2022_jcp/</link>
<pubDate>Fri, 30 Sep 2022 00:00:00 +0000</pubDate>
<guid>https://jorisparet.github.io/publication/2022_jcp/</guid>
<description><!-- Supplementary notes can be added here, including [code, math, and images](https://wowchemy.com/docs/writing-markdown-latex/). -->
</description>
</item>
<item>
<title>First official release of hamoco</title>
<link>https://jorisparet.github.io/post/2022-09-07_hamoco-first-release/</link>
<pubDate>Wed, 07 Sep 2022 00:00:00 +0000</pubDate>
<guid>https://jorisparet.github.io/post/2022-09-07_hamoco-first-release/</guid>
<description><p><strong>hamoco</strong> (<em>handy mouse controller</em>) allows you to take control of your mouse by using hand gestures that are captured in real time by your webcam. It relies on <a href="https://google.github.io/mediapipe/" target="_blank" rel="noopener">MediaPipe</a> to track hands, and the nature of the different hand poses is predicted by a small neural network built with <a href="https://www.tensorflow.org/" target="_blank" rel="noopener">TensorFlow</a>. Basically, I thought that it might be fun to try and replicate the famous scene from the movie <a href="https://en.wikipedia.org/wiki/Minority_Report_%28film%29" target="_blank" rel="noopener">Minority Report</a> (spoiler: it&rsquo;s cooler when Tom Cruise does it).</p>
<p>You can perform all the basic mouse actions: <em>motion</em>, <em>left/right click</em>, <em>vertical scrolling</em> and <em>drag &amp; drop</em>. There are many options to adjust the experience to your liking (<em>e.g.</em> sensitivity, motion smoothing) and even an automated pipeline to record your own data and train a custom neural network tailored to your needs.</p>
<p>The code is now available on <a href="https://pypi.org/project/hamoco/" target="_blank" rel="noopener">PyPI</a> in version 1.0.1, and you can also check out the page of the project on <a href="https://github.com/jorisparet/hamoco" target="_blank" rel="noopener">GitHub</a>.</p>
</description>
</item>
<item>
<title>Synth Road is now available on Android</title>
<link>https://jorisparet.github.io/post/2022-06-11_synth-road-first-release/</link>
<pubDate>Sat, 11 Jun 2022 00:00:00 +0000</pubDate>
<guid>https://jorisparet.github.io/post/2022-06-11_synth-road-first-release/</guid>
<description><p><strong>Synth Road</strong> is an arcade mobile game with synthwave vibes created using the <a href="https://unity.com/" target="_blank" rel="noopener">Unity</a> game engine. The goal is to get as far as possible down the road by avoiding the obstacles, using powers to help you along the way. You can then publish your highscores on the leaderboard to compare your results with other players. The game is free to download on <a href="https://play.google.com/store/apps/details?id=com.JorisParet.SynthRoad" target="_blank" rel="noopener">Google Play</a>, and it contains a single <strong>optional</strong> advertisement; any money it generates will be donated to humanitarian NGOs.</p>
<p>
<figure >
<div class="d-flex justify-content-center">
<div class="w-100" ><img alt="screenshot" srcset="
/post/2022-06-11_synth-road-first-release/screenshot_hu6d331ad27abbe38bd918b5eef0c16235_87861_db9f84950c1efad89660bc9d7b150d17.webp 400w,
/post/2022-06-11_synth-road-first-release/screenshot_hu6d331ad27abbe38bd918b5eef0c16235_87861_4013f908aaeacf7461da3669d5a07f56.webp 760w,
/post/2022-06-11_synth-road-first-release/screenshot_hu6d331ad27abbe38bd918b5eef0c16235_87861_1200x1200_fit_q100_h2_lanczos.webp 1200w"
src="https://jorisparet.github.io/post/2022-06-11_synth-road-first-release/screenshot_hu6d331ad27abbe38bd918b5eef0c16235_87861_db9f84950c1efad89660bc9d7b150d17.webp"
width="270"
height="540"
loading="lazy" data-zoomable /></div>
</div></figure>
</p>
<p>This game is the logical consequence of two things:</p>
<ol>
<li>My obsession for the indie game <a href="https://www.mobiusdigitalgames.com/outer-wilds.html" target="_blank" rel="noopener">Outer Wilds</a>, created by the geniuses at <a href="https://www.mobiusdigitalgames.com/" target="_blank" rel="noopener">Mobius Digital Games</a>, which was also created with Unity.</li>
<li>The amazing YouTube channel of <a href="https://www.youtube.com/c/SebastianLague" target="_blank" rel="noopener">Sebastian Lague</a>, where he creates various Unity projects and explains them in the most pedagogical, poetic and relaxing way.</li>
</ol>
<p>These two things motivated me to experiment with Unity, and after a few weeks I wanted to prove to myself that I could create an entire project on my own from scratch. The most common mistake among amateur game developers is to directly start with unreasonably big projects, so I tried to keep it simple and came up with this mobile game. This was an opportunity to have a first look at various aspects of game design (gameplay, VFX, SFX, UI, etc.) and to learn more about the countless features of Unity. The game mechanics and visuals are simple, but for a first try I am pretty satisfied with the result, and I hope that I will have more time in the future for more elaborate projects.</p>
<p>The code and all the game assets are available on the <a href="https://github.com/jorisparet/synth-road" target="_blank" rel="noopener">GitHub page</a> of the project.</p>
</description>
</item>
<item>
<title>hamoco</title>
<link>https://jorisparet.github.io/project/hamoco/</link>
<pubDate>Sun, 29 May 2022 00:00:00 +0000</pubDate>
<guid>https://jorisparet.github.io/project/hamoco/</guid>
<description><p>
<figure >
<div class="d-flex justify-content-center">
<div class="w-100" ><img alt=""
src="https://jorisparet.github.io/project/hamoco/logo.svg"
loading="lazy" data-zoomable /></div>
</div></figure>
</p>
<h1 id="hamoco">Hamoco</h1>
<p><strong>hamoco</strong> (<em>handy mouse controller</em>) is a Python application that allows you to control your mouse from your webcam using various hand gestures. You have a laptop equipped with a webcam? Well, good news, that&rsquo;s all you need to feel like Tom Cruise in <a href="https://en.wikipedia.org/wiki/Minority_Report_%28film%29" target="_blank" rel="noopener">Minority Report</a>! Kind of.</p>
<h3 id="demonstration">Demonstration</h3>
<p>In the example below, the hand is used to move the pointer, open a file by double-clicking on it, scroll through it, select a paragraph and cut it. The file is then dragged and dropped into a folder.</p>
<p>
<figure >
<div class="d-flex justify-content-center">
<div class="w-100" ><img src="https://raw.githubusercontent.com/jorisparet/hamoco/main/images/demo.gif" alt="" loading="lazy" data-zoomable /></div>
</div></figure>
</p>
<h3 id="how-does-it-work">How does it work?</h3>
<p>By using the power of <a href="https://pypi.org/project/PyAutoGUI/" target="_blank" rel="noopener">PyAutoGUI</a> to control the mouse, <a href="https://pypi.org/project/opencv-python/" target="_blank" rel="noopener">OpenCV</a> to process the video feed, and <a href="https://google.github.io/mediapipe/" target="_blank" rel="noopener">MediaPipe</a> to track hands, <strong>hamoco</strong> predicts the nature of a hand pose in real-time thanks to a neural network built with <a href="https://keras.io/" target="_blank" rel="noopener">Keras</a> and uses it to perform various kinds of mouse pointer actions.</p>
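<p>To give an idea of how these pieces fit together, here is a minimal sketch of such a pipeline. This is <strong>not</strong> hamoco&rsquo;s actual implementation: the model path, the feature encoding, the pose labels and the absolute pointer mapping are assumptions made for illustration only.</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-python" data-lang="python">import cv2
import mediapipe as mp
import numpy as np
import pyautogui
from tensorflow.keras.models import load_model

model = load_model('pose_classifier.h5')  # hypothetical trained pose classifier
hands = mp.solutions.hands.Hands(max_num_hands=1)
capture = cv2.VideoCapture(0)
screen_w, screen_h = pyautogui.size()

while capture.isOpened():
    ok, frame = capture.read()
    if not ok:
        break
    # MediaPipe expects RGB images, while OpenCV delivers BGR frames
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        landmarks = results.multi_hand_landmarks[0].landmark
        # Assumed feature encoding: flattened normalized landmark coordinates
        features = np.array([[p.x, p.y, p.z] for p in landmarks]).reshape(1, -1)
        pose = model.predict(features).argmax()
        palm = landmarks[0]  # wrist landmark used as a crude proxy for the palm center
        if pose == 0:  # assumed label for the OPEN pose
            # Absolute mapping for simplicity; the real application moves the
            # pointer relative to its previous position
            pyautogui.moveTo(palm.x * screen_w, palm.y * screen_h)
</code></pre></div>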
<h2 id="installation">Installation</h2>
<p><strong>1.</strong> From <a href="https://pypi.org/project/hamoco/" target="_blank" rel="noopener">PyPI</a>:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">pip install hamoco
</span></span></code></pre></div><hr>
<p><strong>2.</strong> From the <a href="https://github.com/jorisparet/hamoco" target="_blank" rel="noopener">code repository</a>:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-fallback" data-lang="fallback"><span class="line"><span class="cl">git clone https://github.com/jorisparet/hamoco
</span></span><span class="line"><span class="cl">cd hamoco
</span></span><span class="line"><span class="cl">pip install -r requirements.txt
</span></span><span class="line"><span class="cl">pip install .
</span></span></code></pre></div><hr>
<p>The installation copies three scripts into the default script folder of <code>pip</code>:</p>
<ol>
<li><code>hamoco-run</code></li>
<li><code>hamoco-data</code></li>
<li><code>hamoco-train</code></li>
</ol>
<h4 id="linux">Linux</h4>
<p>The default folder should be under <code>/home/&lt;user&gt;/.local/bin/</code>. Make sure this location (or the correct one, if different) is included in your <code>$PATH</code> environment variable so that you can run the scripts from the console. If not, run <code>export PATH=$PATH:/path/to/hamoco/scripts/</code> in the console or add it to your <code>.bashrc</code> file.</p>
<h4 id="windows">Windows</h4>
<p>The default folder should be under <code>C:\Users\&lt;user&gt;\AppData\Local\Programs\Python\&lt;python_version&gt;\Scripts\</code>. Make sure this location (or the correct one, if different) is included in your <code>PATH</code> environment variable so that you can run the scripts from the console. If not, run <code>set PATH=%PATH%;C:\path\to\hamoco\scripts\</code> in the console, or select <code>Edit the system environment variables</code> in the search bar, click <code>Environment Variables…</code>, select <code>PATH</code>, click <code>Edit...</code>, and add the correct path to the scripts.</p>
<h3 id="requirements">Requirements:</h3>
<ul>
<li><a href="https://pypi.org/project/PyAutoGUI/" target="_blank" rel="noopener">PyAutoGUI</a></li>
<li><a href="https://pypi.org/project/numpy/" target="_blank" rel="noopener">NumPy</a></li>
<li><a href="https://pypi.org/project/opencv-python/" target="_blank" rel="noopener">OpenCV</a></li>
<li><a href="https://google.github.io/mediapipe/" target="_blank" rel="noopener">MediaPipe</a></li>
<li><a href="https://www.tensorflow.org" target="_blank" rel="noopener">TensorFlow</a></li>
</ul>
<h2 id="quick-start">Quick start</h2>
<h3 id="running-the-scripts">Running the scripts</h3>
<p><strong>hamoco</strong> is composed of three executable scripts: <em><a href="#hamoco-run">hamoco-run</a></em>, <em><a href="#hamoco-data">hamoco-data</a></em>, and <em><a href="#hamoco-train">hamoco-train</a></em>, which are described below. Run these scripts directly from the console, <em>e.g.</em> <code>hamoco-run --sensitivity 0.5 --show</code>.</p>
<h3 id="hamoco-run">hamoco-run</h3>
<p><em>hamoco-run</em> is the <strong>main application</strong>. It activates the webcam and allows you to use hand gestures to take control of the mouse pointer. Several basic actions can then be performed, such as <em>left click</em>, <em>right click</em>, <em>drag and drop</em> and <em>scrolling</em>. Note that it requires <strong>a bit of practice</strong> before getting comfortable with the controls. Various settings can be adjusted to customize the hand controller to your liking, such as the global sensitivity, parameters for motion smoothing and much more. Type <code>hamoco-run --help</code> for more information on the available options.</p>
<p>Examples:</p>
<ul>
<li><code>hamoco-run --sensitivity 0.4 --scrolling_threshold 0.2</code> : adapts the sensitivity and sets a custom threshold value to trigger scrolling motions.</li>
<li><code>hamoco-run --min_cutoff_filter 0.05 --show</code> : sets a custom value for the cutoff frequency used for motion smoothing and opens a window that shows the processed video feed in real-time.</li>
<li><code>hamoco-run --scrolling_speed 20</code> : sets a custom value for the scrolling speed. Note that for a given value, results may differ significantly depending on the operating system.</li>
<li><code>hamoco-run --margin 0.2 --stop_sequence THUMB_SIDE CLOSE INDEX_MIDDLE_UP</code> : adapts the size of the detection margin (indicated by the dark frame in the preview windows using <code>--show</code>), and changes the sequence of consecutive poses to stop the application.</li>
</ul>
<p>Configuration files with default values for the control parameters can be found in the installation folder, under <code>hamoco/config/</code>. Simply edit the file that corresponds to your operating system (<code>posix.json</code> for <strong>Linux</strong> and <code>nt.json</code> for <strong>Windows</strong>) to save your settings permanently, and hence avoid specifying the parameters by hand in the console.</p>
<h4 id="hand-poses--mouse-actions">Hand poses &amp; Mouse actions:</h4>
<ul>
<li><code>OPEN</code> : the pointer is free and follows the center of the palm (indicated by the white square) ;</li>
<li><code>CLOSE</code> : the pointer stops all actions. The hand can be moved anywhere in the frame without moving the pointer. This is used to reset the origin of motion (see the <em>nota bene</em> below) ;</li>
<li><code>INDEX_UP</code> : performs a left-click at the current pointer location. Execute twice rapidly for a double-click ;</li>
<li><code>PINKY_UP</code> : performs a right click at the current pointer location ;</li>
<li><code>INDEX_MIDDLE_UP</code> : holds the left mouse button down and moves the pointer by following the center of the palm. This is used for selection and drag &amp; drop ;</li>
<li><code>THUMB_SIDE</code> : enables vertical scrolling using the first triggering location as origin. Scrolling up or down is done by moving the hand up or down relative to the origin while keeping the same hand pose ;</li>
</ul>
<p><strong>N.B.</strong> Much like a real mouse, the recorded motion of the pointer is <em>relative</em> to its previous position. When your mouse reaches the edge of your mouse pad, you simply lift it and land it back somewhere on the pad to start moving again. Similarly, if your hand reaches the edge of the frame, the pointer will stop moving: simply close your fist and move it back into the frame to reset the origin of motion (exactly like when lifting and moving a real mouse).</p>
<p>The various hand poses are illustrated below:</p>
<p>
<figure >
<div class="d-flex justify-content-center">
<div class="w-100" ><img src="https://raw.githubusercontent.com/jorisparet/hamoco/main/images/hand_poses.jpg" alt="" loading="lazy" data-zoomable /></div>
</div></figure>
</p>
<h4 id="exiting-the-application">Exiting the application:</h4>
<p>There are two ways to exit the application:</p>
<ol>
<li>In the preview mode (<code>--show</code> option enabled), simply click on the preview window and press <code>ESC</code> ;</li>
<li>Execute a predetermined sequence of consecutive hand poses. The default sequence can be found in the help message (<code>hamoco-run --help</code>). A new sequence can be specified with the <code>--stop_sequence</code> option followed by the consecutive hand poses, or it can simply be changed in the <code>.json</code> configuration file.</li>
</ol>
<h3 id="hamoco-data">hamoco-data</h3>
<p><em>hamoco-data</em> activates the webcam and allows you to record your own labeled data for hand poses in order to train a custom neural-network-based classification model for the main application. This model can then be used in place of the one provided by default and should perform better, as it will be trained on your personal, natural hand poses (see <em><a href="#hamoco-train">hamoco-train</a></em>). Type <code>hamoco-data --help</code> for more information on the available options.</p>
<p>This application requires two arguments:</p>
<ul>
<li><code>pose</code>: a string that indicates the type of hand pose you intend to record. It should be one of: <code>OPEN</code>, <code>CLOSE</code>, <code>INDEX_UP</code>, <code>PINKY_UP</code>, <code>THUMB_SIDE</code>, <code>INDEX_MIDDLE_UP</code>.</li>
<li><code>path_to_data</code>: path to the folder in which the recorded data will be saved.</li>
</ul>
<p>Examples:</p>
<ul>
<li><code>hamoco-data OPEN data/ --delay 1.0</code> : starts the recording for the <code>OPEN</code> hand pose, stores the resulting data in the <code>data</code> folder (provided it exists!), and takes a new snapshot every second.</li>
<li><code>hamoco-data INDEX_UP data/ --delay 0.25 --images</code> : starts the recording for the <code>INDEX_UP</code> hand pose, stores the resulting data in the <code>data</code> folder, takes a new snapshot every 0.25s, and saves the images (in addition to the numeric data file used for training the model). Saving images can be useful if you want to manually check if your hand was in a correct position when its numerical data was recorded, and hence keep or remove specific data files accordingly.</li>
<li><code>hamoco-data CLOSE data/ --reset --stop_after 200</code> : starts the recording for the <code>CLOSE</code> hand pose, stores the resulting data in the <code>data</code> folder, deletes every previously recorded file for this hand pose, and automatically stops the recording after taking 200 snapshots.</li>
</ul>
<h3 id="hamoco-train">hamoco-train</h3>
<p>Provided a path to a directory with compatible data, <em>hamoco-train</em> trains a customizable NN-based classification model to predict a hand pose. This classification model can then be used in the main application in place of the one provided by default. Type <code>hamoco-train --help</code> for more information on the available options.</p>
<p>This application requires two arguments:</p>
<ul>
<li><code>path_to_model</code> : path to save the newly trained model.</li>
<li><code>path_to_data</code> : path to the data folder to use to train the model (see <em><a href="#hamoco-data">hamoco-data</a></em>).</li>
</ul>
<p>Examples:</p>
<ul>
<li><code>hamoco-train my_custom_model.h5 data/ --hiden_layers 50 25 --epochs 20</code> : trains and saves a model named <code>my_custom_model.h5</code> that contains two hidden layers (with dimensions 50 and 25, respectively) over 20 epochs, using the compatible data in the <code>data</code> folder.</li>
<li><code>hamoco-train my_custom_model.h5 data/ --epochs 10 --learning_rate 0.1</code> : trains and saves a model named <code>my_custom_model.h5</code> with default dimensions over 10 epochs and with a learning rate of 0.1, using the compatible data in the <code>data</code> folder.</li>
</ul>
<p>Your model can then be used in the main application with the <code>--model</code> flag of <em><a href="#hamoco-run">hamoco-run</a></em>, <em>e.g.</em> <code>hamoco-run --model &lt;path_to_your_model&gt;</code>, or you can change the <code>.json</code> configuration file to point to it.</p>
<h2 id="author">Author</h2>
<p><a href="https://www.jorisparet.com/" target="_blank" rel="noopener">Joris Paret</a></p>
</description>
</item>
<item>
<title>Synth Road</title>
<link>https://jorisparet.github.io/project/synth-road/</link>
<pubDate>Fri, 28 Jan 2022 00:00:00 +0000</pubDate>
<guid>https://jorisparet.github.io/project/synth-road/</guid>
<description><p>
<figure >
<div class="d-flex justify-content-center">
<div class="w-100" ><img alt="" srcset="
/project/synth-road/logo_hubc7924bf52b605797180a57e07dea06c_474684_8ee3a0519e0d5514617e91764f8db2af.webp 400w,
/project/synth-road/logo_hubc7924bf52b605797180a57e07dea06c_474684_4f637fca32d2473f216cbfd89eae1a83.webp 760w,
/project/synth-road/logo_hubc7924bf52b605797180a57e07dea06c_474684_1200x1200_fit_q100_h2_lanczos_3.webp 1200w"
src="https://jorisparet.github.io/project/synth-road/logo_hubc7924bf52b605797180a57e07dea06c_474684_8ee3a0519e0d5514617e91764f8db2af.webp"
width="760"
height="371"
loading="lazy" data-zoomable /></div>
</div></figure>
</p>
<p><em>Synth Road</em> is an arcade obstacle-course mobile game for Android with <a href="https://en.wikipedia.org/wiki/Synthwave" target="_blank" rel="noopener">synthwave</a> vibes. Money generated from the <strong>optional</strong> advertisement is donated to NGOs.</p>
<p>
<figure >
<div class="d-flex justify-content-center">
<div class="w-100" ><img alt="" srcset="
/project/synth-road/screenshot_huaf22aa47176846097c1370633a1b7763_167575_9c919d0d28f0bc78df46291cb3815f4d.webp 400w,
/project/synth-road/screenshot_huaf22aa47176846097c1370633a1b7763_167575_9a948736c494547099dea9803b1f3934.webp 760w,
/project/synth-road/screenshot_huaf22aa47176846097c1370633a1b7763_167575_1200x1200_fit_q100_h2_lanczos.webp 1200w"
src="https://jorisparet.github.io/project/synth-road/screenshot_huaf22aa47176846097c1370633a1b7763_167575_9c919d0d28f0bc78df46291cb3815f4d.webp"
width="380"
height="760"
loading="lazy" data-zoomable /></div>
</div></figure>
</p>
<h2 id="game-rules">Game rules</h2>
<hr>
<ol>
<li>Hold your finger down at the bottom of the screen and move it to slide the player left and right to avoid the obstacles.</li>
<li>Collect as many blue bonuses as possible to increase the score multiplier.</li>
<li>Collect gold bonuses to temporarily become invincible: bash through obstacles and collect as many blue bonuses as possible.</li>
<li>Use the power buttons to temporarily slow time, shrink the player or disintegrate obstacles.</li>
<li>Publish your highscores on the leaderboard to compare your results with other players.</li>
</ol>
<h2 id="about">About</h2>
<hr>
<p>Synth Road is made with <a href="https://unity.com/" target="_blank" rel="noopener">Unity</a>. Illustrations, music and sound effects are original creations.</p>
<h2 id="author">Author</h2>
<hr>
<p><a href="https://jorisparet.github.io" target="_blank" rel="noopener">Joris Paret</a></p>
</description>
</item>
<item>
<title>Hidden order in disordered materials</title>
<link>https://jorisparet.github.io/publication/2021_thesis/</link>
<pubDate>Fri, 26 Nov 2021 00:00:00 +0000</pubDate>
<guid>https://jorisparet.github.io/publication/2021_thesis/</guid>
<description><!-- Supplementary notes can be added here, including [code, math, and images](https://wowchemy.com/docs/writing-markdown-latex/). -->
</description>
</item>
<item>
<title>“partycls: A Python package for structural clustering” - The Journal of Open Source Software</title>
<link>https://jorisparet.github.io/post/2021-11-08_partycls-joss/</link>
<pubDate>Mon, 08 Nov 2021 00:00:00 +0000</pubDate>
<guid>https://jorisparet.github.io/post/2021-11-08_partycls-joss/</guid>
<description><h1 id="presentation">Presentation</h1>
<p><strong>partycls</strong> is a Python framework for cluster analysis of systems of interacting particles. By grouping particles that share similar structural or dynamical properties, partycls enables rapid and unsupervised exploration of the system’s relevant features. It provides descriptors suitable for applications in condensed matter physics and integrates the necessary tools of unsupervised learning, such as dimensionality reduction, into a streamlined workflow. Through a simple and expressive interface, partycls allows one to open a trajectory file, perform a clustering based on the selected structural descriptor, and analyze and save the results with only a few lines of code.</p>
<hr>
<h1 id="related-publication">Related publication</h1>
<p>A <a href="https://joss.theoj.org/papers/10.21105/joss.03723" target="_blank" rel="noopener">short paper</a> presenting the code, written in collaboration with <a href="https://www2.units.it/daniele.coslovich/" target="_blank" rel="noopener">Daniele Coslovich</a>, was published in <em>The Journal of Open Source Software</em>.</p>
<hr>
<h1 id="short-example">Short example</h1>
<p>As a simple example, we consider the detection of the grain boundaries in a polycrystal formed by differently oriented FCC crystallites. This is easily achieved even with a simple radial descriptor, since the average radial distribution of particles at the boundaries differs from that of the bulk crystal. The following short piece of code opens the input trajectory stored in the file <code>grains.xyz</code>, computes the local radial distribution functions of the particles, applies a standard <a href="https://en.wikipedia.org/wiki/Standard_score" target="_blank" rel="noopener">Z-Score</a> normalization to the data, and finally performs a clustering using the <a href="https://towardsdatascience.com/gaussian-mixture-models-explained-6986aaf5a95" target="_blank" rel="noopener">Gaussian mixture model</a> (GMM) with $K = 2$ clusters (default):</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-python" data-lang="python"><span class="line"><span class="cl"><span class="kn">from</span> <span class="nn">partycls</span> <span class="kn">import</span> <span class="n">Workflow</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="n">wf</span> <span class="o">=</span> <span class="n">Workflow</span><span class="p">(</span><span class="s1">&#39;grains.xyz&#39;</span><span class="p">,</span>
</span></span><span class="line"><span class="cl"> <span class="n">descriptor</span><span class="o">=</span><span class="s1">&#39;gr&#39;</span><span class="p">,</span>
</span></span><span class="line"><span class="cl"> <span class="n">scaling</span><span class="o">=</span><span class="s1">&#39;zscore&#39;</span><span class="p">,</span>
</span></span><span class="line"><span class="cl"> <span class="n">clustering</span><span class="o">=</span><span class="s1">&#39;gmm&#39;</span><span class="p">)</span>
</span></span><span class="line"><span class="cl"><span class="n">wf</span><span class="o">.</span><span class="n">run</span><span class="p">()</span>
</span></span></code></pre></div><p>Each of these steps is easily tunable, so as to change the workflow with little effort. The labels are available as a simple attribute of the <code>Workflow</code> instance. Optionally, a set of output files can be produced for further analysis, including a trajectory file with the cluster labels. Quick visualization of the clusters, as in the following figure, is possible within partycls through optional visualization backends.</p>
<p>
<figure >
<div class="d-flex justify-content-center">
<div class="w-100" ><img alt="" srcset="
/post/2021-11-08_partycls-joss/grains_huf7f5a4cdd2bca08a8f624fe01d8b681f_360916_485585a79fefc515a99cd33b3de36443.webp 400w,
/post/2021-11-08_partycls-joss/grains_huf7f5a4cdd2bca08a8f624fe01d8b681f_360916_29b3139895507953058505ba0aa2846c.webp 760w,
/post/2021-11-08_partycls-joss/grains_huf7f5a4cdd2bca08a8f624fe01d8b681f_360916_1200x1200_fit_q100_h2_lanczos_3.webp 1200w"
src="https://jorisparet.github.io/post/2021-11-08_partycls-joss/grains_huf7f5a4cdd2bca08a8f624fe01d8b681f_360916_485585a79fefc515a99cd33b3de36443.webp"
width="760"
height="217"
loading="lazy" data-zoomable /></div>
</div></figure>
</p>
<ul>
<li><strong>(a)</strong> A polycrystalline material with differently oriented FCC crystallites.</li>
<li><strong>(b)</strong> Using the individual radial distributions of the particles as structural descriptor, the algorithm identifies the crystalline domains (blue, $k = 0$) and the grain boundaries (red, $k = 1$).</li>
<li><strong>(c)</strong> The radial distribution functions restricted to these two clusters display a marked difference, with higher peaks for the crystals. The 3D visualization was performed with <a href="https://www.ovito.org/" target="_blank" rel="noopener">OVITO</a>.</li>
</ul>
</description>
</item>
<item>
<title>partycls: A Python package for structural clustering</title>
<link>https://jorisparet.github.io/publication/2021_joss/</link>
<pubDate>Mon, 01 Nov 2021 00:00:00 +0000</pubDate>
<guid>https://jorisparet.github.io/publication/2021_joss/</guid>
<description><!-- Supplementary notes can be added here, including [code, math, and images](https://wowchemy.com/docs/writing-markdown-latex/). -->
</description>
</item>
<item>
<title>partycls</title>
<link>https://jorisparet.github.io/project/partycls/</link>
<pubDate>Tue, 16 Mar 2021 00:00:00 +0000</pubDate>
<guid>https://jorisparet.github.io/project/partycls/</guid>
<description><p>
<figure >
<div class="d-flex justify-content-center">
<div class="w-100" ><img alt=""
src="https://jorisparet.github.io/project/partycls/logo.svg"
loading="lazy" data-zoomable /></div>
</div></figure>
</p>
<h2 id="homepage">Homepage</h2>
<p>For more details and tutorials, visit the <a href="https://www.jorisparet.com/partycls" target="_blank" rel="noopener">homepage</a> of the project.</p>
<h2 id="quick-start">Quick start</h2>
<p>This quick example shows how to use partycls to identify grain boundaries in a polycrystalline system. The system configuration is stored in an <a href="https://en.wikipedia.org/wiki/XYZ_file_format" target="_blank" rel="noopener">XYZ</a> trajectory file with a single frame. We use the local distribution of bond angles around each particle as a structural descriptor and perform a clustering using the <a href="https://en.wikipedia.org/wiki/K-means_clustering" target="_blank" rel="noopener">K-Means</a> algorithm.</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-python" data-lang="python"><span class="line"><span class="cl"><span class="kn">from</span> <span class="nn">partycls</span> <span class="kn">import</span> <span class="n">Trajectory</span><span class="p">,</span> <span class="n">Workflow</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="n">traj</span> <span class="o">=</span> <span class="n">Trajectory</span><span class="p">(</span><span class="s1">&#39;grains.xyz&#39;</span><span class="p">)</span>
</span></span><span class="line"><span class="cl"><span class="n">wf</span> <span class="o">=</span> <span class="n">Workflow</span><span class="p">(</span><span class="n">traj</span><span class="p">,</span> <span class="n">descriptor</span><span class="o">=</span><span class="s1">&#39;ba&#39;</span><span class="p">,</span> <span class="n">clustering</span><span class="o">=</span><span class="s1">&#39;kmeans&#39;</span><span class="p">)</span>
</span></span><span class="line"><span class="cl"><span class="n">wf</span><span class="o">.</span><span class="n">run</span><span class="p">()</span>
</span></span><span class="line"><span class="cl"><span class="n">traj</span><span class="p">[</span><span class="mi">0</span><span class="p">]</span><span class="o">.</span><span class="n">show</span><span class="p">(</span><span class="n">color</span><span class="o">=</span><span class="s1">&#39;label&#39;</span><span class="p">,</span> <span class="n">backend</span><span class="o">=</span><span class="s1">&#39;ovito&#39;</span><span class="p">)</span>
</span></span></code></pre></div><p>
<figure >
<div class="d-flex justify-content-center">
<div class="w-100" ><img src="https://raw.githubusercontent.com/jorisparet/partycls/master/data/snapshots/grains_labels.png" alt="" loading="lazy" data-zoomable /></div>
</div></figure>
</p>
<p>The results are also written to a set of files, including a labeled trajectory file and additional information on the clustering results. The whole workflow can be tuned and customized; check out the <a href="https://www.jorisparet.com/partycls/tutorials" target="_blank" rel="noopener">tutorials</a> to see how, and for further examples.</p>
<p>Thanks to a flexible system of filters, partycls makes it easy to restrict the analysis to a given subset of particles based on arbitrary particle properties. Say we have a binary mixture composed of particles with types A and B, and we are only interested in analyzing the bond angles of B particles in a vertical slice:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-python" data-lang="python"><span class="line"><span class="cl"><span class="kn">from</span> <span class="nn">partycls</span> <span class="kn">import</span> <span class="n">Trajectory</span>
</span></span><span class="line"><span class="cl"><span class="kn">from</span> <span class="nn">partycls.descriptors</span> <span class="kn">import</span> <span class="n">BondAngleDescriptor</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="n">traj</span> <span class="o">=</span> <span class="n">Trajectory</span><span class="p">(</span><span class="s1">&#39;trajectory.xyz&#39;</span><span class="p">)</span>
</span></span><span class="line"><span class="cl"><span class="n">D</span> <span class="o">=</span> <span class="n">BondAngleDescriptor</span><span class="p">(</span><span class="n">traj</span><span class="p">)</span>
</span></span><span class="line"><span class="cl"><span class="n">D</span><span class="o">.</span><span class="n">add_filter</span><span class="p">(</span><span class="s2">&#34;species == &#39;B&#39;&#34;</span><span class="p">)</span>
</span></span><span class="line"><span class="cl"><span class="n">D</span><span class="o">.</span><span class="n">add_filter</span><span class="p">(</span><span class="s2">&#34;x &gt; 0.0&#34;</span><span class="p">)</span>
</span></span><span class="line"><span class="cl"><span class="n">D</span><span class="o">.</span><span class="n">add_filter</span><span class="p">(</span><span class="s2">&#34;x &lt; 1.0&#34;</span><span class="p">)</span>
</span></span><span class="line"><span class="cl"><span class="n">D</span><span class="o">.</span><span class="n">compute</span><span class="p">()</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="c1"># Angular correlations for the selected particles</span>
</span></span><span class="line"><span class="cl"><span class="nb">print</span><span class="p">(</span><span class="n">D</span><span class="o">.</span><span class="n">features</span><span class="p">)</span>
</span></span></code></pre></div><p>We can then perform a clustering based on these structural features and ask for 3 clusters:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-python" data-lang="python"><span class="line"><span class="cl"><span class="kn">from</span> <span class="nn">partycls</span> <span class="kn">import</span> <span class="n">KMeans</span>
</span></span><span class="line"><span class="cl">
</span></span><span class="line"><span class="cl"><span class="n">clustering</span> <span class="o">=</span> <span class="n">KMeans</span><span class="p">(</span><span class="n">n_clusters</span><span class="o">=</span><span class="mi">3</span><span class="p">)</span>
</span></span><span class="line"><span class="cl"><span class="n">clustering</span><span class="o">.</span><span class="n">fit</span><span class="p">(</span><span class="n">D</span><span class="o">.</span><span class="n">features</span><span class="p">)</span>
</span></span><span class="line"><span class="cl"><span class="nb">print</span><span class="p">(</span><span class="s1">&#39;Cluster membership of the particles&#39;</span><span class="p">,</span> <span class="n">clustering</span><span class="o">.</span><span class="n">labels</span><span class="p">)</span>
</span></span></code></pre></div><h2 id="main-features">Main features</h2>
<h3 id="trajectory-formats">Trajectory formats</h3>
<p>partycls accepts several trajectory formats (including custom ones) either through its built-in trajectory reader or via third-party packages, such as <a href="https://www.mdtraj.org" target="_blank" rel="noopener">MDTraj</a> and <a href="https://framagit.org/atooms/atooms" target="_blank" rel="noopener">atooms</a>. The code is currently optimized for small and medium system sizes (of order 10⁴ particles). Multiple trajectory frames can be analyzed to extend the structural dataset.</p>
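<p>For example, the file format and the reader backend can in principle be selected when opening the trajectory. This is only a sketch: the <code>fmt</code> and <code>backend</code> argument names are assumptions and may differ, see the <a href="https://www.jorisparet.com/partycls/api" target="_blank" rel="noopener">API</a> for the exact signature.</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-python" data-lang="python">from partycls import Trajectory

# Built-in reader with an explicit file format (argument names are assumed)
traj = Trajectory('trajectory.xyz', fmt='xyz')

# Same file opened through a third-party backend such as MDTraj
traj = Trajectory('trajectory.xyz', backend='mdtraj')
</code></pre></div>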
<h3 id="structural-descriptors">Structural descriptors</h3>
<p>partycls implements various structural descriptors:</p>
<ul>
<li><a href="https://www.jorisparet.com/partycls/tutorials/descriptors/gr.html" target="_blank" rel="noopener">Radial descriptor</a></li>
<li><a href="https://www.jorisparet.com/partycls/tutorials/descriptors/tetra.html" target="_blank" rel="noopener">Tetrahedral descriptor</a></li>
<li><a href="https://www.jorisparet.com/partycls/tutorials/descriptors/ba.html" target="_blank" rel="noopener">Bond-angle descriptor</a></li>
<li><a href="https://www.jorisparet.com/partycls/tutorials/descriptors/sba.html" target="_blank" rel="noopener">Smoothed bond-angle descriptor</a></li>
<li><a href="https://www.jorisparet.com/partycls/tutorials/descriptors/bo.html" target="_blank" rel="noopener">Bond-orientational descriptor</a></li>
<li><a href="https://www.jorisparet.com/partycls/tutorials/descriptors/sbo.html" target="_blank" rel="noopener">Smoothed bond-orientational descriptor</a></li>
<li><a href="https://www.jorisparet.com/partycls/tutorials/descriptors/labo.html" target="_blank" rel="noopener">Locally averaged bond-orientational descriptor</a></li>
<li><a href="https://www.jorisparet.com/partycls/tutorials/descriptors/rbo.html" target="_blank" rel="noopener">Radial bond-orientational descriptor</a></li>
<li><a href="https://www.jorisparet.com/partycls/tutorials/descriptors/compact.html" target="_blank" rel="noopener">Compactness descriptor</a></li>
<li><a href="https://www.jorisparet.com/partycls/tutorials/descriptors/coord.html" target="_blank" rel="noopener">Coordination descriptor</a></li>
</ul>
<h3 id="machine-learning">Machine learning</h3>
<p>partycls performs feature scaling, dimensionality reduction and cluster analysis using the <a href="https://scikit-learn.org" target="_blank" rel="noopener">scikit-learn</a> package and additional built-in algorithms.</p>
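<p>Since the descriptor features are exposed as a plain array (<code>D.features</code> in the example above), they can also be processed directly with scikit-learn instead of the built-in wrappers. Below is a minimal sketch that chains feature scaling, dimensionality reduction and clustering by hand, assuming the same kind of input file as in the previous examples.</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-python" data-lang="python">from partycls import Trajectory
from partycls.descriptors import BondAngleDescriptor
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Compute a structural descriptor as in the example above
traj = Trajectory('trajectory.xyz')
D = BondAngleDescriptor(traj)
D.compute()

# Feature scaling, dimensionality reduction and clustering done directly with scikit-learn
X = StandardScaler().fit_transform(D.features)
X_reduced = PCA(n_components=2).fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10).fit_predict(X_reduced)
print(labels)
</code></pre></div>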
<h2 id="dependencies">Dependencies</h2>
<p>partycls relies on several external packages, most of which only provide additional features and are not necessarily required.</p>
<h3 id="required">Required</h3>
<ul>
<li>Fortran compiler (<em>e.g.</em> <a href="https://gcc.gnu.org/wiki/GFortran" target="_blank" rel="noopener">gfortran</a>)</li>
<li><a href="https://pypi.org/project/numpy/" target="_blank" rel="noopener">NumPy</a></li>
<li><a href="https://scikit-learn.org" target="_blank" rel="noopener">scikit-learn</a></li>
</ul>
<h3 id="optional">Optional</h3>
<ul>
<li><a href="https://www.mdtraj.org" target="_blank" rel="noopener">MDTraj</a> (additional trajectory formats)</li>
<li><a href="https://framagit.org/atooms/atooms" target="_blank" rel="noopener">atooms</a> (additional trajectory formats)</li>
<li><a href="https://singroup.github.io/dscribe" target="_blank" rel="noopener">DScribe</a> (additional descriptors)</li>
<li><a href="https://matplotlib.org/" target="_blank" rel="noopener">Matplotlib</a> (visualization)</li>
<li><a href="https://ovito.org/" target="_blank" rel="noopener">OVITO</a> &lt; 3.7.0 (visualization)</li>
<li><a href="https://github.com/avirshup/py3dmol" target="_blank" rel="noopener">Py3DMol</a> (interactive 3D visualization)</li>
<li><a href="https://github.com/joe-jordan/pyvoro" target="_blank" rel="noopener">pyvoro</a> or its <a href="https://framagit.org/coslo/pyvoro" target="_blank" rel="noopener">memory-optimized fork</a> for large systems (Voronoi neighbors and tessellation)</li>
<li><a href="https://tqdm.github.io/" target="_blank" rel="noopener">tqdm</a> (progress bars)</li>
</ul>
<h2 id="documentation">Documentation</h2>
<p>Check the <a href="https://www.jorisparet.com/partycls/tutorials" target="_blank" rel="noopener">tutorials</a> to see various examples and detailed instructions on how to run the code, as well as an in-depth presentation of the built-in structural descriptors.</p>
<p>For a more detailed documentation, you can check the <a href="https://www.jorisparet.com/partycls/api" target="_blank" rel="noopener">API</a>.</p>
<h2 id="installation">Installation</h2>
<h3 id="from-pypi">From PyPI</h3>
<p>The latest stable release is available on <a href="https://pypi.org/project/partycls/" target="_blank" rel="noopener">PyPI</a>. Install it with <code>pip</code>:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">pip install partycls
</span></span></code></pre></div><h3 id="from-source">From source</h3>
<p>To install the latest development version from source, clone the source code from the official <a href="https://github.com/jorisparet/partycls" target="_blank" rel="noopener">GitHub repository</a> and install it with:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">git clone https://github.com/jorisparet/partycls.git
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> partycls
</span></span><span class="line"><span class="cl">make install
</span></span></code></pre></div><p>Run the tests using:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl">make <span class="nb">test</span>
</span></span></code></pre></div><p>or manually compile the Fortran sources and run the tests:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-bash" data-lang="bash"><span class="line"><span class="cl"><span class="nb">cd</span> partycls/
</span></span><span class="line"><span class="cl">f2py -c -m neighbors_wrap neighbors.f90
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> descriptor/
</span></span><span class="line"><span class="cl">f2py -c -m realspace_wrap realspace.f90
</span></span><span class="line"><span class="cl"><span class="nb">cd</span> ../../
</span></span><span class="line"><span class="cl">pytest tests/
</span></span></code></pre></div><h2 id="support-and-contribution">Support and contribution</h2>
<p>If you wish to contribute or report an issue, feel free to <a href="mailto:[email protected]">contact us</a> or to use the <a href="https://github.com/jorisparet/partycls/issues" target="_blank" rel="noopener">issue tracker</a> and <a href="https://github.com/jorisparet/partycls/pulls" target="_blank" rel="noopener">pull requests</a> from the <a href="https://github.com/jorisparet/partycls" target="_blank" rel="noopener">code repository</a>.</p>
<p>We largely follow the <a href="https://guides.github.com/introduction/flow/" target="_blank" rel="noopener">GitHub flow</a> to integrate community contributions. In essence:</p>
<ol>
<li>Fork the repository.</li>
<li>Create a feature branch from <code>master</code>.</li>
<li>Unleash your creativity.</li>
<li>Run the tests.</li>
<li>Open a pull request.</li>
</ol>
<p>We also welcome contributions from other platforms, such as GitLab instances. Just let us know where to find your feature branch.</p>
<h2 id="citing-partycls">Citing partycls</h2>
<p>If you use partycls in a scientific publication, please consider citing the following article:</p>
<p><em><a href="https://joss.theoj.org/papers/10.21105/joss.03723" target="_blank" rel="noopener">partycls: A Python package for structural clustering</a>. Paret et al., (2021). Journal of Open Source Software, 6(67), 3723</em></p>
<p>Bibtex entry:</p>
<div class="highlight"><pre tabindex="0" class="chroma"><code class="language-go" data-lang="go"><span class="line"><span class="cl"><span class="err">@</span><span class="nx">article</span><span class="p">{</span><span class="nx">Paret2021</span><span class="p">,</span>
</span></span><span class="line"><span class="cl"> <span class="nx">doi</span> <span class="p">=</span> <span class="p">{</span><span class="mf">10.21105</span><span class="o">/</span><span class="nx">joss</span><span class="mf">.03723</span><span class="p">},</span>
</span></span><span class="line"><span class="cl"> <span class="nx">url</span> <span class="p">=</span> <span class="p">{</span><span class="nx">https</span><span class="p">:</span><span class="c1">//doi.org/10.21105/joss.03723},
</span></span></span><span class="line"><span class="cl"><span class="c1"></span> <span class="nx">year</span> <span class="p">=</span> <span class="p">{</span><span class="mi">2021</span><span class="p">},</span>
</span></span><span class="line"><span class="cl"> <span class="nx">publisher</span> <span class="p">=</span> <span class="p">{</span><span class="nx">The</span> <span class="nx">Open</span> <span class="nx">Journal</span><span class="p">},</span>
</span></span><span class="line"><span class="cl"> <span class="nx">volume</span> <span class="p">=</span> <span class="p">{</span><span class="mi">6</span><span class="p">},</span>
</span></span><span class="line"><span class="cl"> <span class="nx">number</span> <span class="p">=</span> <span class="p">{</span><span class="mi">67</span><span class="p">},</span>
</span></span><span class="line"><span class="cl"> <span class="nx">pages</span> <span class="p">=</span> <span class="p">{</span><span class="mi">3723</span><span class="p">},</span>
</span></span><span class="line"><span class="cl"> <span class="nx">author</span> <span class="p">=</span> <span class="p">{</span><span class="nx">Joris</span> <span class="nx">Paret</span> <span class="nx">and</span> <span class="nx">Daniele</span> <span class="nx">Coslovich</span><span class="p">},</span>
</span></span><span class="line"><span class="cl"> <span class="nx">title</span> <span class="p">=</span> <span class="p">{</span><span class="nx">partycls</span><span class="p">:</span> <span class="nx">A</span> <span class="nx">Python</span> <span class="kn">package</span> <span class="k">for</span> <span class="nx">structural</span> <span class="nx">clustering</span><span class="p">},</span>
</span></span><span class="line"><span class="cl"> <span class="nx">journal</span> <span class="p">=</span> <span class="p">{</span><span class="nx">Journal</span> <span class="nx">of</span> <span class="nx">Open</span> <span class="nx">Source</span> <span class="nx">Software</span><span class="p">}</span>
</span></span><span class="line"><span class="cl"><span class="p">}</span>
</span></span></code></pre></div><h2 id="authors">Authors</h2>
<p><a href="https://www.jorisparet.com/" target="_blank" rel="noopener">Joris Paret</a></p>
<p><a href="https://www.units.it/daniele.coslovich/" target="_blank" rel="noopener">Daniele Coslovich</a></p>
</description>
</item>
<item>
<title>Assessing the structural heterogeneity of supercooled liquids through community inference</title>
<link>https://jorisparet.github.io/publication/2020_jcp/</link>
<pubDate>Thu, 09 Apr 2020 00:00:00 +0000</pubDate>
<guid>https://jorisparet.github.io/publication/2020_jcp/</guid>
<description><!-- Supplementary notes can be added here, including [code, math, and images](https://wowchemy.com/docs/writing-markdown-latex/). -->
</description>
</item>
<item>
<title></title>
<link>https://jorisparet.github.io/admin/config.yml</link>
<pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate>
<guid>https://jorisparet.github.io/admin/config.yml</guid>
<description></description>
</item>
</channel>
</rss>