API
Functions
Maxnet.complexity — Method
Get the number of non-zero coefficients in the model.
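For illustration, a minimal sketch of calling it on a fitted model (assuming complexity takes the MaxnetModel directly, as the description implies):
using Maxnet
p_a, env = Maxnet.bradypus();
model = maxnet(p_a, env; features = "lq")
Maxnet.complexity(model)  # number of non-zero coefficients in the fitted model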
Maxnet.default_features — Method
default_features(np)
Takes the number of presences np and returns a Vector of AbstractFeatureClass that are used by maxent as default.
If np is less than ten, then only LinearFeature and CategoricalFeature are used. If it is at least 10, then QuadraticFeature is additionally used. If it is at least 15, then HingeFeature is additionally used. If it is at least 80, then ProductFeature is additionally used.
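A hedged sketch of how the defaults scale with the number of presences, following the thresholds above (the exact ordering of the returned feature classes is not guaranteed here):
using Maxnet
Maxnet.default_features(9)    # LinearFeature and CategoricalFeature only
Maxnet.default_features(12)   # additionally QuadraticFeature
Maxnet.default_features(30)   # additionally HingeFeature
Maxnet.default_features(100)  # additionally ProductFeature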
Maxnet.maxnet — Method
maxnet(
p_a, X;
features, regularization_multiplier, regularization_function,
addsamplestobackground, weight_factor,
kw...
)
Fit a model using the maxnet algorithm.
Arguments
p_a: A BitVector where presences are true and background samples are false
X: A Tables.jl-compatible table of predictors. Categorical predictors should be CategoricalVectors
Keywords
features
: Either aVector
ofAbstractFeatureClass
to be used in the model, or aString
where "l" = linear and categorical, "q" = quadratic, "p" = product, "t" = threshold, "h" = hinge (e.g. "lqh"); or By default, the features are based on the number of presences are used. Seedefault_features
regularization_multiplier
: A constant to adjust regularization, where a higherregularization_multiplier
results in a higher penalization for featuresregularization_function
: A function to compute a regularization for each feature. A defaultregularization_function
is built in.addsamplestobackground
: A boolean, wheretrue
adds the background samples to the predictors. Defaults totrue
.n_knots
: the number of knots used for Threshold and Hinge features. Defaults to 50. Ignored if there are neither Threshold nor Hinge featuresweight_factor
: AFloat64
value to adjust the weight of the background samples. Defaults to 100.0.kw...
: Further arguments to be passed toGLMNet.glmnet
Returns
model: A model of type MaxnetModel
Examples
using Maxnet
p_a, env = Maxnet.bradypus();
bradypus_model = maxnet(p_a, env; features = "lq")
Fit Maxnet model
Features classes: Maxnet.AbstractFeatureClass[LinearFeature(), CategoricalFeature(), QuadraticFeature()]
Entropy: 6.114650341746531
Model complexity: 21
Variables selected: [:frs6190_ann, :h_dem, :pre6190_l1, :pre6190_l10, :pre6190_l4, :pre6190_l7, :tmn6190_ann, :vap6190_ann, :ecoreg, :cld6190_ann, :dtr6190_ann, :tmx6190_ann]
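As a sketch of the keywords in use, the features can also be passed as explicit feature classes and the regularization adjusted; this assumes the feature-class constructors are exported, as the printed output above suggests:
using Maxnet
p_a, env = Maxnet.bradypus();
model = maxnet(p_a, env;
    features = [LinearFeature(), CategoricalFeature(), HingeFeature()],
    regularization_multiplier = 2.0,  # penalize features more strongly than the default
    n_knots = 20)                     # fewer knots for the hinge features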
StatsAPI.predict — Method
predict(m, X; link, clamp)
Use a maxnet model to predict on new data.
Arguments
m: A MaxnetModel as returned by maxnet
X: A Tables.jl-compatible table of predictors. All columns that were used to fit m should be present in X
Keywords
link: The link function used. Defaults to CloglogLink(), which has been the default in the Maxent Java application since version 4.3. Alternatively, LogitLink() was the Maxent default in earlier versions. To get exponential output, which can be interpreted as predicted abundance, use LogLink(). IdentityLink() returns the exponent without any transformation.
clamp: If true, values in X will be clamped to the range the model was trained on. Defaults to false.
Returns
A Vector with the resulting predictions.
Example
using Maxnet
p_a, env = Maxnet.bradypus();
bradypus_model = maxnet(p_a, env; features = "lq")
prediction = predict(bradypus_model, env)
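A hedged sketch of the keywords in use; it assumes the link constructors (LogitLink etc., from the GLM.jl link hierarchy) are in scope, otherwise they may need to be imported or qualified:
using Maxnet
p_a, env = Maxnet.bradypus();
bradypus_model = maxnet(p_a, env; features = "lq")
p_cloglog = predict(bradypus_model, env)                                 # default cloglog output
p_logit = predict(bradypus_model, env; link = LogitLink(), clamp = true) # logistic output, clamped to the training range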
Types
Maxnet.MaxnetBinaryClassifier — Type
MaxnetBinaryClassifier
A model type for constructing a Maxnet, based on Maxnet.jl, and implementing the MLJ model interface.
From MLJ, the type can be imported using
MaxnetBinaryClassifier = @load MaxnetBinaryClassifier pkg=Maxnet
Do model = MaxnetBinaryClassifier() to construct an instance with default hyper-parameters. Provide keyword arguments to override hyper-parameter defaults, as in MaxnetBinaryClassifier(features=...).
The keywords link and clamp are passed to predict, while all other keywords are passed to maxnet. See the documentation of these functions for the meaning of these parameters and their defaults.
Example
using Maxnet, MLJBase
p_a, env = Maxnet.bradypus()
mach = machine(MaxnetBinaryClassifier(features = "lqp"), env, categorical(p_a), scitype_check_level = 0)
fit!(mach, verbosity = 0)
yhat = MLJBase.predict(mach, env)
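As a follow-up sketch, the probabilistic predictions can be reduced to plain presence probabilities with MLJ's pdf broadcast (assuming true is the positive class level of categorical(p_a)):
prob_presence = pdf.(yhat, true)  # probability assigned to the presence class for each row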