add ep64 results

haya2333 committed Dec 30, 2021
1 parent 08baf41 commit dabad43
Showing 21 changed files with 3,678 additions and 18 deletions.
45 changes: 28 additions & 17 deletions README.md
@@ -1,5 +1,4 @@
# *FaRL* for *Fa*cial *R*epresentation *L*earning


[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/general-facial-representation-learning-in-a/face-alignment-on-300w)](https://paperswithcode.com/sota/face-alignment-on-300w?p=general-facial-representation-learning-in-a)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/general-facial-representation-learning-in-a/face-alignment-on-aflw-19)](https://paperswithcode.com/sota/face-alignment-on-aflw-19?p=general-facial-representation-learning-in-a)
@@ -22,11 +21,11 @@ After the pre-training, the image encoder can be utilized for various downstream

We offer different pre-trained transformer backbones as below.

| Model Name | Pre-training Data | Link |
| ----------- | -------------- | ----- |
| FaRL-Base-Patch16-LAIONFace20M-ep16 (used in paper) | LAION Face 20M | [OneDrive](https://1drv.ms/u/s!AperexS2nqQomyPsG2M4uPXay7Au?e=Ocvk1T), [BLOB](https://facevcstandard.blob.core.windows.net/haya/releases/farl/FaRL-Base-Patch16-LAIONFace20M-ep16.pth?sv=2020-08-04&st=2021-12-17T13%3A00%3A07Z&se=2025-01-18T13%3A00%3A00Z&sr=b&sp=r&sig=D0ZPJgp8BrAgHIdACfZzqPnyOcX1ivGdHnF8qgtWdoI%3D) |
| FaRL-Base-Patch16-LAIONFace20M-ep64 | LAION Face 20M | [BLOB](https://facevcstandard.blob.core.windows.net/haya/releases/farl/FaRL-Base-Patch16-LAIONFace20M-ep64.pth?sv=2020-08-04&st=2021-12-27T05%3A22%3A56Z&se=2025-12-21T05%3A22%3A00Z&sr=b&sp=r&sig=til1J9u%2FQqf6qRc6cPx9nPyOGl%2F9ahTyvQ3VBPePs6A%3D) |
| FaRL-Base-Patch16-LAIONFace50M-ep16 | LAION Face 50M | [OneDrive](https://1drv.ms/u/s!AperexS2nqQomyZp2z2DdUNoqTVp?e=T7C1QA), [BLOB](https://facevcstandard.blob.core.windows.net/haya/releases/farl/FaRL-Base-Patch16-LAIONFace50M-ep16.pth?sv=2020-08-04&st=2021-12-17T13%3A01%3A48Z&se=2025-01-17T13%3A01%3A00Z&sr=b&sp=r&sig=6g1B3f4vEmFc1tmz8QWSH6lRoK%2BABA%2FWfmqXLGS61MM%3D) |
| Model Name | Data | Epoch | Link |
| ----------- | -------------- | ----- | ---- |
| FaRL-Base-Patch16-LAIONFace20M-ep16 (used in paper) | LAION Face 20M | 16 | [BLOB](https://facevcstandard.blob.core.windows.net/haya/releases/farl/FaRL-Base-Patch16-LAIONFace20M-ep16.pth?sv=2020-08-04&st=2021-12-17T13%3A00%3A07Z&se=2025-01-18T13%3A00%3A00Z&sr=b&sp=r&sig=D0ZPJgp8BrAgHIdACfZzqPnyOcX1ivGdHnF8qgtWdoI%3D) |
| FaRL-Base-Patch16-LAIONFace20M-ep64 | LAION Face 20M | 64 | [BLOB](https://facevcstandard.blob.core.windows.net/haya/releases/farl/FaRL-Base-Patch16-LAIONFace20M-ep64.pth?sv=2020-08-04&st=2021-12-27T05%3A22%3A56Z&se=2025-12-21T05%3A22%3A00Z&sr=b&sp=r&sig=til1J9u%2FQqf6qRc6cPx9nPyOGl%2F9ahTyvQ3VBPePs6A%3D) |
<!-- | FaRL-Base-Patch16-LAIONFace50M-ep16 | LAION Face 50M | [BLOB](https://facevcstandard.blob.core.windows.net/haya/releases/farl/FaRL-Base-Patch16-LAIONFace50M-ep16.pth?sv=2020-08-04&st=2021-12-17T13%3A01%3A48Z&se=2025-01-17T13%3A01%3A00Z&sr=b&sp=r&sig=6g1B3f4vEmFc1tmz8QWSH6lRoK%2BABA%2FWfmqXLGS61MM%3D) | -->
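FaRL backbones are pre-trained in a CLIP-like visual-linguistic manner, so a minimal loading sketch looks like the following (an assumption, not the repo's official loader: it presumes the `.pth` file stores ViT-B/16 weights under a `state_dict` key compatible with OpenAI CLIP; adjust key names to the released file if the layout differs):

```python
import torch
import clip  # OpenAI CLIP: pip install git+https://github.com/openai/CLIP.git

# Build a standard CLIP ViT-B/16, then overwrite its weights with FaRL's.
model, _ = clip.load("ViT-B/16", device="cpu")
farl_state = torch.load("FaRL-Base-Patch16-LAIONFace20M-ep64.pth", map_location="cpu")
# strict=False because the checkpoint may carry extra pre-training state.
model.load_state_dict(farl_state["state_dict"], strict=False)
image_encoder = model.visual  # the image encoder used for downstream tasks
```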


## Setup Downstream Training
@@ -66,7 +65,7 @@ python -m blueprint.run \
--exp_name farl --blob_root ./blob
```

It is also easy to create new config files for training and evaluation on your own. For example, you can customize your own face parsing task on CelebAMask-HQ by editing the values below.
It is also easy to create new config files for training and evaluation on your own. For example, you can customize your own face parsing task on CelebAMask-HQ by editing the values below (remember to remove the comments before running).

```yaml
package: farl.experiments.face_parsing

class: blueprint.ml.DistributedGPURun
local_run:
  $PARSE('./trainers/celebm_farl.yaml',
    cfg_file=FILE,
    train_data_ratio=None,  # ratio of training data to use; None means all of it
    batch_size=5,
    model_type='base',  # size of the pre-trained backbone
    model_path=BLOB('checkpoint/FaRL-Base-Patch16-LAIONFace20M-ep16.pth'),  # path to the pre-trained backbone weights
    input_resolution=448,  # input image resolution, e.g. 224 or 448
    head_channel=768,  # number of channels in the task head
    optimizer_name='refine_backbone',  # 'refine_backbone' or 'freeze_backbone'
    enable_amp=False)  # automatic mixed precision
```
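In these configs, `$PARSE` instantiates the referenced trainer template (here `./trainers/celebm_farl.yaml`) with the given arguments, and `BLOB(...)` presumably resolves its argument under the `--blob_root` directory passed to `blueprint.run`.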
## Performance
The following table illustrates their performances reported in the paper (Paper) or reproduced using this repo (Rep). There are small differences between their performances due to code refactoring.
The following table illustrates the performance of our `FaRL-Base-Patch16-LAIONFace20M-ep16` pre-training, which is pre-trained for 16 epochs, both as reported in the paper (Paper) and as reproduced using this repo (Rep). There are small differences between the two due to code refactoring.

| File Name | Task | Benchmark | Metric | Score (Paper/Rep) | Logs (Paper/Rep) |
| Name | Task | Benchmark | Metric | Score (Paper/Rep) | Logs (Paper/Rep) |
| ---- | ---- | ---- | --- | --- | --- |
| [face_parsing/<br/>train_celebm_farl-b-ep16_448_refinebb.yaml](./farl/experiments/face_parsing/train_celebm_farl-b-ep16_448_refinebb.yaml) | Face Parsing | CelebAMask-HQ | F1-mean ⇑ | 89.56/89.65 | [Paper](./logs/paper/face_parsing.train_celebm_farl-b-ep16-448_refinebb), [Rep](./logs/reproduce/face_parsing.train_celebm_farl-b-ep16_448_refinebb) |
| [face_parsing/<br/>train_lapa_farl-b-ep16_448_refinebb.yaml](./farl/experiments/face_parsing/train_lapa_farl-b-ep16_448_refinebb.yaml) | Face Parsing | LaPa | F1-mean ⇑ | 93.88/93.86 | [Paper](./logs/paper/face_parsing.train_lapa_farl-b-ep16_448_refinebb), [Rep](./logs/reproduce/face_parsing.train_lapa_farl-b-ep16_448_refinebb) |
| [face_alignment/<br/>train_aflw19_farl-b-ep16_448_refinebb.yaml](./farl/experiments/face_alignment/train_aflw19_farl-b-ep16_448_refinebb.yaml) | Face Alignment | AFLW-19 (Full) | NME_diag ⇓ | 0.943/0.943 | [Paper](./logs/paper/face_alignment.train_aflw19_farl-b-ep16_448_refinebb), [Rep](./logs/reproduce/face_alignment.train_aflw19_farl-b-ep16_448_refinebb) |
| [face_alignment/<br/>train_ibug300w_farl-b-ep16_448_refinebb.yaml](./farl/experiments/face_alignment/train_ibug300w_farl-b-ep16_448_refinebb.yaml) | Face Alignment | 300W (Full) | NME_inter-ocular ⇓ | 2.93/2.92 | [Paper](./logs/paper/face_alignment.train_ibug300w_farl-b-ep16_448_refinebb), [Rep](./logs/reproduce/face_alignment.train_ibug300w_farl-b-ep16_448_refinebb) |
| [face_alignment/<br/>train_wflw_farl-b-ep16_448_refinebb.yaml](./farl/experiments/face_alignment/train_wflw_farl-b-ep16_448_refinebb.yaml) | Face Alignment | WFLW (Full) | NME_inter-ocular ⇓ | 3.96/3.98 | [Paper](./logs/paper/face_alignment.train_wflw_farl-b-ep16_448_refinebb), [Rep](./logs/reproduce/face_alignment.train_wflw_farl-b-ep16_448_refinebb) |
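For reference, the alignment scores above are normalized mean errors (NME) reported ×100; a minimal sketch of the metric (illustrative only, not the repo's exact evaluation code — the normalization term is the inter-ocular distance for 300W/WFLW and the face-box diagonal for AFLW-19):

```python
import numpy as np

def nme(pred: np.ndarray, gt: np.ndarray, norm: float) -> float:
    """Normalized mean error over landmarks.

    pred, gt: (num_landmarks, 2) arrays of predicted / ground-truth (x, y).
    norm: inter-ocular distance (300W, WFLW) or face bounding-box diagonal (AFLW-19).
    The scores in the tables are this value multiplied by 100.
    """
    return float(np.mean(np.linalg.norm(pred - gt, axis=1)) / norm)
```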

We also report results using the 50M pre-trained backbone, showing further enhancement on LaPa and AFLW-19.

| File Name | Task | Benchmark | Metric | Score | Logs |
Below we also report results of our new `FaRL-Base-Patch16-LAIONFace20M-ep64`, which is pre-trained for 64 epochs instead of 16, showing further improvements on most tasks.

| Name | Task | Benchmark | Metric | Score | Logs |
| ---- | ---- | ---- | --- | --- | --- |
| [face_parsing/<br/>train_celebm_farl-b-ep64_448_refinebb.yaml](./farl/experiments/face_parsing/train_celebm_farl-b-ep64_448_refinebb.yaml) | Face Parsing | CelebAMask-HQ | F1-mean ⇑ | 89.57 | [Rep](./logs/reproduce/face_parsing.train_celebm_farl-b-ep64_448_refinebb) |
| [face_parsing/<br/>train_lapa_farl-b-ep64_448_refinebb.yaml](./farl/experiments/face_parsing/train_lapa_farl-b-ep64_448_refinebb.yaml) | Face Parsing | LaPa | F1-mean ⇑ | 94.04 | [Rep](./logs/reproduce/face_parsing.train_lapa_farl-b-ep64_448_refinebb) |
| [face_alignment/<br/>train_aflw19_farl-b-ep64_448_refinebb.yaml](./farl/experiments/face_alignment/train_aflw19_farl-b-ep64_448_refinebb.yaml) | Face Alignment | AFLW-19 (Full) | NME_diag ⇓ | 0.938 | [Rep](./logs/reproduce/face_alignment.train_aflw19_farl-b-ep64_448_refinebb) |
| [face_alignment/<br/>train_ibug300w_farl-b-ep64_448_refinebb.yaml](./farl/experiments/face_alignment/train_ibug300w_farl-b-ep64_448_refinebb.yaml) | Face Alignment | 300W (Full) | NME_inter-ocular ⇓ | 2.88 | [Rep](./logs/reproduce/face_alignment.train_ibug300w_farl-b-ep64_448_refinebb) |
| [face_alignment/<br/>train_wflw_farl-b-ep64_448_refinebb.yaml](./farl/experiments/face_alignment/train_wflw_farl-b-ep64_448_refinebb.yaml) | Face Alignment | WFLW (Full) | NME_inter-ocular ⇓ | 3.88 | [Rep](./logs/reproduce/face_alignment.train_wflw_farl-b-ep64_448_refinebb) |


<!-- We also report results using the 50M pre-trained backbone, showing further enhancement on LaPa and AFLW-19.

| Config | Task | Benchmark | Metric | Score | Logs |
| ---- | ---- | ---- | --- | --- | --- |
| [face_parsing/<br/>train_celebm_farl-b-50m-ep16_448_refinebb.yaml](./farl/experiments/face_parsing/train_celebm_farl-b-50m-ep16_448_refinebb.yaml) | Face Parsing | CelebAMask-HQ | F1-mean ⇑ | 89.68 | [Rep](./logs/reproduce/face_parsing.train_celebm_farl-b-50m-ep16_448_refinebb) |
| [face_parsing/<br/>train_lapa_farl-b-50m-ep16_448_refinebb.yaml](./farl/experiments/face_parsing/train_lapa_farl-b-50m-ep16_448_refinebb.yaml) | Face Parsing | LaPa | F1-mean ⇑ | 94.01 | [Rep](./logs/reproduce/face_parsing.train_lapa_farl-b-50m-ep16_448_refinebb) |
| [face_alignment/<br/>train_aflw19_farl-b-50m-ep16_448_refinebb.yaml](./farl/experiments/face_alignment/train_aflw19_farl-b-50m-ep16_448_refinebb.yaml) | Face Alignment | AFLW-19 (Full) | NME_diag ⇓ | 0.937 | [Rep](./logs/reproduce/face_alignment.train_aflw19_farl-b-50m-ep16_448_refinebb) |
| [face_alignment/<br/>train_ibug300w_farl-b-50m-ep16_448_refinebb.yaml](./farl/experiments/face_alignment/train_ibug300w_farl-b-50m-ep16_448_refinebb.yaml) | Face Alignment | 300W (Full) | NME_inter-ocular ⇓ | 2.92 | [Rep](./logs/reproduce/face_alignment.train_ibug300w_farl-b-50m-ep16_448_refinebb) |
| [face_alignment/<br/>train_wflw_farl-b-50m-ep16_448_refinebb.yaml](./farl/experiments/face_alignment/train_wflw_farl-b-50m-ep16_448_refinebb.yaml) | Face Alignment | WFLW (Full) | NME_inter-ocular ⇓ | 3.99 | [Rep](./logs/reproduce/face_alignment.train_wflw_farl-b-50m-ep16_448_refinebb) |
| [face_alignment/<br/>train_wflw_farl-b-50m-ep16_448_refinebb.yaml](./farl/experiments/face_alignment/train_wflw_farl-b-50m-ep16_448_refinebb.yaml) | Face Alignment | WFLW (Full) | NME_inter-ocular ⇓ | 3.99 | [Rep](./logs/reproduce/face_alignment.train_wflw_farl-b-50m-ep16_448_refinebb) | -->


## Contact

For help or issues concerning the code and the released models, feel free to submit a GitHub issue, or contact [Hao Yang](https://haya.pro) ([[email protected]](mailto:[email protected])).


## Citation
@@ -120,11 +136,6 @@ If you find our work helpful, please consider citing
}
```
## Contact

For help or issues concerning the code and the released models, please submit a GitHub issue.
Otherwise, please contact [Hao Yang](https://haya.pro) (`[email protected]`).

## Trademarks
2 changes: 1 addition & 1 deletion SUPPORT.md
@@ -6,7 +6,7 @@ This project uses GitHub Issues to track bugs and feature requests. Please search the existing
issues before filing new issues to avoid duplicates. For new issues, file your bug or
feature request as a new Issue.

For help and questions about using this project, please contact [Hao Yang](https://haya.pro) (`[email protected]`).
For help and questions about using this project, please contact [Hao Yang](https://haya.pro) ([[email protected]](mailto:[email protected])).

## Microsoft Support Policy

17 changes: 17 additions & 0 deletions farl/experiments/face_alignment/train_aflw19_farl-b-ep64_448_refinebb.yaml
@@ -0,0 +1,17 @@
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT License.

package: farl.experiments.face_alignment

class: blueprint.ml.DistributedGPURun
local_run:
  $PARSE('./trainers/aflw19_farl.yaml',
    cfg_file=FILE,
    train_data_ratio=None,
    batch_size=5,
    model_type='base',
    model_path=BLOB('checkpoint/FaRL-Base-Patch16-LAIONFace20M-ep64.pth'),
    input_resolution=448,
    head_channel=768,
    optimizer_name='refine_backbone',
    enable_amp=False)
17 changes: 17 additions & 0 deletions farl/experiments/face_alignment/train_ibug300w_farl-b-ep64_448_refinebb.yaml
@@ -0,0 +1,17 @@
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT License.

package: farl.experiments.face_alignment

class: blueprint.ml.DistributedGPURun
local_run:
  $PARSE('./trainers/ibug300w_farl.yaml',
    cfg_file=FILE,
    train_data_ratio=None,
    batch_size=5,
    model_type='base',
    model_path=BLOB('checkpoint/FaRL-Base-Patch16-LAIONFace20M-ep64.pth'),
    input_resolution=448,
    head_channel=768,
    optimizer_name='refine_backbone',
    enable_amp=False)
17 changes: 17 additions & 0 deletions farl/experiments/face_alignment/train_wflw_farl-b-ep64_448_refinebb.yaml
@@ -0,0 +1,17 @@
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT License.

package: farl.experiments.face_alignment

class: blueprint.ml.DistributedGPURun
local_run:
  $PARSE('./trainers/wflw_farl.yaml',
    cfg_file=FILE,
    train_data_ratio=None,
    batch_size=5,
    model_type='base',
    model_path=BLOB('checkpoint/FaRL-Base-Patch16-LAIONFace20M-ep64.pth'),
    input_resolution=448,
    head_channel=768,
    optimizer_name='refine_backbone',
    enable_amp=False)
17 changes: 17 additions & 0 deletions farl/experiments/face_parsing/train_celebm_farl-b-ep64_448_refinebb.yaml
@@ -0,0 +1,17 @@
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT License.

package: farl.experiments.face_parsing

class: blueprint.ml.DistributedGPURun
local_run:
  $PARSE('./trainers/celebm_farl.yaml',
    cfg_file=FILE,
    train_data_ratio=None,
    batch_size=5,
    model_type='base',
    model_path=BLOB('checkpoint/FaRL-Base-Patch16-LAIONFace20M-ep64.pth'),
    input_resolution=448,
    head_channel=768,
    optimizer_name='refine_backbone',
    enable_amp=False)
17 changes: 17 additions & 0 deletions farl/experiments/face_parsing/train_lapa_farl-b-ep64_448_refinebb.yaml
@@ -0,0 +1,17 @@
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT License.

package: farl.experiments.face_parsing

class: blueprint.ml.DistributedGPURun
local_run:
  $PARSE('./trainers/lapa_farl.yaml',
    cfg_file=FILE,
    train_data_ratio=None,
    batch_size=5,
    model_type='base',
    model_path=BLOB('checkpoint/FaRL-Base-Patch16-LAIONFace20M-ep64.pth'),
    input_resolution=448,
    head_channel=768,
    optimizer_name='refine_backbone',
    enable_amp=False)
51 changes: 51 additions & 0 deletions logs/reproduce/face_alignment.train_aflw19_farl-b-ep64_448_refinebb
@@ -0,0 +1,51 @@
global_step epoch inter_ocular inter_pupil box diag auc_box_7 fr_box_7
500 1 0.2814381430261058 0.41885215943663073 0.10713332872438583 0.07575439787106703 0.0019018304996417192 0.9758321933424533
1000 2 0.17002066400460983 0.2540649748044203 0.061316688909848094 0.043357271611065396 0.14344540529390057 0.1094391244870041
1500 3 0.07078309413563753 0.10633050628954344 0.024175536942384147 0.017094629081590394 0.6611914533255163 0.004103967168262668
2000 4 0.04625239200478974 0.06961887566618218 0.015649217152454188 0.01106563225436091 0.7817782229170739 0.0029639762881896736
2500 5 0.04223303701363358 0.06360247360208618 0.014264682955900611 0.010086619685460485 0.800483518988991 0.002507979936160476
3000 6 0.041085674183258876 0.06185928393432227 0.013851238829981937 0.009794269149508913 0.8061440622760733 0.0020519835841312783
3500 7 0.040619098264987316 0.06113140560207549 0.01367669425160712 0.009670848124548012 0.8086129568106313 0.0020519835841312783
4000 8 0.04047411193708497 0.0609115203982665 0.013616412293677237 0.009628222839939938 0.8093884763207609 0.0020519835841312783
4500 9 0.04013620246995055 0.06040206609987745 0.013516743977864584 0.009557745742623903 0.8106336720734807 0.0015959872321021917
5000 10 0.039965249064628575 0.060082104670311075 0.013439680309095613 0.00950325280088189 0.8118835906455607 0.001367989056087593
5500 11 0.040075445631786506 0.060206053629891274 0.013456335037307983 0.009515030874381911 0.8117806657546739 0.001367989056087593
6000 12 0.03995748399003042 0.06002157048661579 0.01340347547483292 0.009477652755438113 0.8125644909126443 0.0015959872321021917
6500 13 0.03976160720011115 0.059695291236498235 0.013342023711198006 0.00943420034426834 0.8134608494560618 0.0015959872321021917
7000 14 0.0398093770205893 0.05979287651538631 0.013339652789003275 0.009432523477800459 0.8138977916748096 0.0018239854081167906
7500 15 0.03962548878222256 0.05947365958669987 0.013290326996475825 0.009397645003166146 0.8143570451436389 0.0018239854081167906
8000 16 0.03943022777106846 0.059187075453591684 0.013265424485734973 0.009380037905253399 0.8145327665950103 0.0015959872321021917
8500 17 0.039339854409582695 0.05909946092013295 0.013268546866761785 0.009382244445828613 0.8142730115301936 0.0018239854081167906
9000 18 0.0396895123789205 0.059661684327619126 0.01332956636761945 0.009425392446591872 0.8141977721321088 0.0018239854081167906
9500 19 0.03979430261725875 0.05978074184683866 0.013332530455234875 0.00942749026916476 0.8141378411829849 0.0018239854081167906
10000 20 0.03963351478002629 0.05950143573167821 0.013289879078306241 0.009397331025575429 0.814310956940916 0.0015959872321021917
10500 21 0.039648157790323615 0.05950849805310932 0.013291422873939274 0.009398422554318784 0.8143164940394765 0.0015959872321021917
11000 22 0.03967515464466128 0.05954689203306686 0.013296124710103881 0.009401746715957914 0.8141725294769073 0.0015959872321021917
11500 23 0.039708559772928066 0.059597504176997836 0.013305010014998005 0.009408031486236392 0.8140298677610581 0.0015959872321021917
12000 24 0.039667114730953246 0.05953555057106418 0.01331041982279376 0.009411854880942202 0.8138487720669665 0.0015959872321021917
12500 25 0.03967443449661387 0.05953680300245098 0.013326420503504136 0.009423168511136285 0.8136080711354311 0.0015959872321021917
13000 26 0.03973255079193742 0.05964009380210115 0.013359941307725397 0.009446872514619493 0.8130911666992381 0.0015959872321021917
13500 27 0.039732943916233825 0.0596550046935562 0.013384079313473891 0.00946394124098472 0.8126283304019284 0.0015959872321021917
14000 28 0.03985245370234304 0.05984229884949983 0.013416291152947143 0.00948671722760007 0.8121495342322979 0.0015959872321021917
14500 29 0.03988163535365545 0.059904976082456964 0.013447578454528614 0.00950884177594548 0.8116870236466682 0.0015959872321021917
15000 30 0.04000825616686082 0.06008586196447147 0.013489506204433763 0.009538488740307851 0.8111127939547914 0.0015959872321021917
15500 31 0.04010855156811089 0.06024135827909471 0.013525384014945458 0.009563858303586695 0.8105844896097976 0.0015959872321021917
16000 32 0.04018360699794089 0.06036461840141216 0.013547252858549397 0.009579322352237588 0.8102525894078564 0.0015959872321021917
16500 33 0.040370160131980665 0.06061522296395848 0.013583014993106617 0.009604610159578685 0.8098206957201488 0.0015959872321021917
17000 34 0.04045603865637825 0.060752329411052214 0.013616260958217998 0.009628117600913685 0.8094070418865223 0.0015959872321021917
17500 35 0.04057392028168462 0.06091937592624259 0.013644100594509698 0.00964780425677493 0.809091752980262 0.0015959872321021917
18000 36 0.04066703855529312 0.061058416684033716 0.013666668712304599 0.009663762320028394 0.8087605042016809 0.0015959872321021917
18500 37 0.04075045048565393 0.06120921638095645 0.013691585138949748 0.009681378985125345 0.8083904957331772 0.0015959872321021917
19000 38 0.04083007033209359 0.0613661808234831 0.01371606147512148 0.00969868776083971 0.8080146244544331 0.0015959872321021917
19500 39 0.04091616803198387 0.061525712750352686 0.013736956205424575 0.009713462102483843 0.8077224610774544 0.0015959872321021917
20000 40 0.04096929895459274 0.06161789170042108 0.013757116002031944 0.009727718076692886 0.8074250863135953 0.0015959872321021917
20500 41 0.04102421111295172 0.06168608658943228 0.013780491242276116 0.009744244952534521 0.8071664712396589 0.0015959872321021917
21000 42 0.04103054980735935 0.0616953406657903 0.013792915535581965 0.009753030236866503 0.8069518272425251 0.0015959872321021917
21500 43 0.04113477297259747 0.06183125730306657 0.013818980894075699 0.009771460982364876 0.8066736694677872 0.0015959872321021917
22000 44 0.04120781407528037 0.061927360538146944 0.013836534937603312 0.009783873099254464 0.8064834538466551 0.0015959872321021917
22500 45 0.04127905654733278 0.06199781676160653 0.013850349951594919 0.009793641194327476 0.806355449156407 0.0015959872321021917
23000 46 0.04136341128020976 0.06212810441559436 0.013865397392860896 0.009804281642651156 0.8061795648491956 0.0015959872321021917
23500 47 0.0413935009442776 0.062190183264666694 0.013880151730392603 0.009814715091898405 0.8059027099211781 0.0015959872321021917
24000 48 0.04138021473464931 0.062125961366332506 0.013887421050900147 0.009819855279048393 0.8057958764901311 0.0015959872321021917
24500 49 0.04144283282501318 0.06218319748070922 0.013904946392875145 0.009832248261569572 0.8056330206501207 0.0015959872321021917
25000 50 0.04150123439637483 0.06225391810635046 0.013912215713382692 0.009837388448719559 0.8054792847371509 0.0015959872321021917
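These raw logs are whitespace-separated tables (metric values are fractions; the README tables report them ×100). A quick sketch for locating the best epoch, assuming the log is saved locally (the file name below is hypothetical):

```python
import pandas as pd

# Parse the whitespace-delimited training log into a DataFrame.
df = pd.read_csv("train_aflw19_farl-b-ep64_448_refinebb.log", sep=r"\s+")

# Locate the epoch with the lowest diagonal-normalized NME.
best = df.loc[df["diag"].idxmin()]
print(f"best NME_diag = {100 * best['diag']:.3f} at epoch {int(best['epoch'])}")
# -> best NME_diag = 0.938 at epoch 17
```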