Assume we have a function that produces a weight bringing each cross entropy loss to the same constant scale $a$.
$L_{i} \cdot f(c_{i}) = a$

Since the cross entropy loss of a task with $c_{i}$ classes is on the order of $\log_{10}(c_{i})$, this becomes:

$f(c_{i}) \cdot \log_{10}(c_{i}) = a$

From here we can solve for $f(c_{i})$:

$f(c_{i}) = \frac{a}{\log_{10}(c_{i})}$
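A minimal sketch of this weighting, assuming $c_{i}$ is the number of classes of task $i$ and using base-10 logarithms as in the formula above; `task_weight` is a hypothetical helper name, not part of the library:

```python
import math

def task_weight(n_classes: int, a: float = 1.0) -> float:
    """Weight f(c_i) = a / log10(c_i) that rescales a task's
    cross entropy loss toward the constant scale a."""
    assert n_classes > 1, "need at least 2 classes"
    return a / math.log10(n_classes)

# tasks with 10, 100, and 1000 classes get shrinking weights,
# compensating for their larger initial cross entropy losses
weights = [task_weight(c) for c in (10, 100, 1000)]
```

A task with 10 classes keeps weight $1.0$, while a 100-class task is weighted $0.5$, so both contribute roughly equally to the total loss.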
class MultiTaskCE
Let's build this adjustment into the loss function as an upgraded version of CrossEntropy.
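One way such an upgraded loss might look, assuming each task $i$ contributes a standard `nn.CrossEntropyLoss` scaled by $f(c_{i}) = a / \log_{10}(c_{i})$; the class name follows the docs, but this body is an illustrative sketch, not the library's implementation:

```python
import math
import torch
from torch import nn

class MultiTaskCE(nn.Module):
    """Cross entropy over several tasks, each rescaled by
    f(c_i) = a / log10(c_i) so every task sits at the same scale a.
    (Illustrative sketch, not the library implementation.)"""

    def __init__(self, class_counts, a: float = 1.0):
        super().__init__()
        # one weight per task, from the task's class count
        self.weights = [a / math.log10(c) for c in class_counts]
        self.ce = nn.CrossEntropyLoss()

    def forward(self, preds, targets):
        # preds: list of (batch, c_i) logit tensors
        # targets: list of (batch,) label tensors
        losses = [w * self.ce(p, t)
                  for w, p, t in zip(self.weights, preds, targets)]
        return torch.stack(losses).sum()
```

Usage with two tasks of 10 and 100 classes: `MultiTaskCE([10, 100])` weights the second task's loss by $0.5$ before summing.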
pct_to_float
ensure_pct
detect_number_column
class DataFilter
class LayerTorch
class RecursiveFi