
###
**City simulation software for modeling, planning, and strategic assessment of territorial city units**

Svítek, M.; Přibyl, O.; Vorel, J.; Garlík, B.; Resler, Jaroslav; Kozhevnikov, S.; Krč, Pavel; Geletič, Jan; Daniel, Milan; Dostál, R.; Janča, T.; Myška, V.; Aralkina, O.; Pereira, A. M.

2021 - English
SVÍTEK, M., PŘIBYL, O., VOREL, J., GARLÍK, B., RESLER, J., KOZHEVNIKOV, S., KRČ, P., GELETIČ, J., DANIEL, M., DOSTÁL, R., JANČA, T., MYŠKA, V., ARALKINA, O., PEREIRA, A. M. City simulation software for modeling, planning, and strategic assessment of territorial city units. 1.1. Prague: CTU & ICS CAS, 2021. Technical Report. ABSTRACT: The Smart Resilience City concept is a new vision of a city as a digital platform and ecosystem of smart services in which agents of people, things, documents, robots, and other entities can directly negotiate with each other on resource-demand principles to provide the best possible solution. It creates a smart environment that makes possible self-organization toward the objectives of individuals, groups, and the whole system in a sustainable or, when needed, resilient way.
Keywords:
*Smart city; City simulation; Energy resource-demand modelling; Environmental modelling; Synthetic population; Transport modelling*
Available on request at various institutes of the ASCR

###
**Visual Images Segmentation based on Uniform Textures Extraction**

Goltsev, A.; Gritsenko, V.; Húsek, Dušan

2020 - English
A new effective procedure for partial texture segmentation of visual images is proposed. The procedure segments any input image into a number of non-overlapping homogeneous fine-grained texture areas. The main advantages of the proposed procedure are as follows. It is completely unsupervised, that is, it processes the input image without any prior knowledge of either the type of textures or the number of texture segments in the image. In addition, the procedure segments arbitrary images of all types, meaning that no changes to the procedure parameters are required to switch from one image type to another. Another major advantage of the procedure is that in most cases it extracts the uniform fine-grained texture segments present in the image, just as humans do. This result is supported by a series of experiments that demonstrate the ability of the procedure to delineate uniform fine-grained texture segments over a wide range of images. At a minimum, image processing according to the proposed technique leads to a significant reduction in the uncertainty of the internal structure of the analyzed image.
Keywords:
*Texture feature; Texture window; Homogeneous fine-grained texture segment; Texture segment extraction; Texture segmentation*
Available at various institutes of the ASCR

###
**Two limited-memory optimization methods with minimum violation of the previous quasi-Newton equations**

Vlček, Jan; Lukšan, Ladislav

2020 - English
Limited-memory variable metric methods based on the well-known BFGS update are widely used for large-scale optimization. The block version of the BFGS update, derived by Schnabel (1983), Hu and Storey (1991), and Vlček and Lukšan (2019), satisfies the quasi-Newton equations with all used difference vectors and, for quadratic objective functions, gives the best improvement of convergence in some sense, but the corresponding direction vectors are generally not descent directions. To guarantee the descent property of the direction vectors while violating the quasi-Newton equations as little as possible in some sense, two methods based on the block BFGS update are proposed. They can be advantageously combined with methods based on vector corrections for conjugacy (Vlček and Lukšan, 2015). Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new methods.
Keywords:
*unconstrained minimization; variable metric methods; limited-memory methods; variationally derived methods; global convergence; numerical results*
Available in a digital repository NRGL
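
The abstract builds on the classical BFGS machinery. As a hedged illustration of that baseline only (the standard L-BFGS two-loop recursion with Armijo backtracking, not the authors' block-BFGS update or the proposed minimum-violation corrections), here is a minimal sketch on an ill-conditioned quadratic; the test function, memory size, and step rule are illustrative choices:

```python
def lbfgs_direction(g, s_list, y_list):
    """Standard L-BFGS two-loop recursion: returns -H*g built from the
    stored difference pairs s_k = x_{k+1}-x_k, y_k = g_{k+1}-g_k."""
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    q = list(g)
    stack = []
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / dot(y, s)
        alpha = rho * dot(s, q)
        stack.append((alpha, rho, s, y))
        q = [qi - alpha * yi for qi, yi in zip(q, y)]
    # initial scaling H0 = gamma*I with gamma = s'y / y'y (last stored pair)
    gamma = dot(s_list[-1], y_list[-1]) / dot(y_list[-1], y_list[-1]) if s_list else 1.0
    r = [gamma * qi for qi in q]
    for alpha, rho, s, y in reversed(stack):
        beta = rho * dot(y, r)
        r = [ri + (alpha - beta) * si for ri, si in zip(r, s)]
    return [-ri for ri in r]

def minimize(f, grad, x, m=5, iters=100):
    """L-BFGS with Armijo backtracking; only the last m pairs are kept,
    and pairs with non-positive curvature s'y are skipped."""
    s_list, y_list = [], []
    g = grad(x)
    for _ in range(iters):
        d = lbfgs_direction(g, s_list, y_list)
        gd = sum(gi * di for gi, di in zip(g, d))
        t = 1.0
        while f([xi + t * di for xi, di in zip(x, d)]) > f(x) + 1e-4 * t * gd:
            t *= 0.5
        x_new = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x_new)
        s = [a - b for a, b in zip(x_new, x)]
        yv = [a - b for a, b in zip(g_new, g)]
        if sum(si * yi for si, yi in zip(s, yv)) > 1e-12:
            s_list.append(s); y_list.append(yv)
            if len(s_list) > m:
                s_list.pop(0); y_list.pop(0)
        x, g = x_new, g_new
    return x

# ill-conditioned quadratic: f(x) = 0.5*(x1^2 + 10*x2^2), minimizer at the origin
f = lambda x: 0.5 * (x[0] ** 2 + 10 * x[1] ** 2)
grad = lambda x: [x[0], 10 * x[1]]
x_star = minimize(f, grad, [1.0, 1.0])
```

The curvature check `s'y > 0` is what keeps the implicit Hessian approximation positive definite and the direction a descent direction; the block methods of the abstract address the same difficulty for block updates.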

###
**Linear-time Algorithms for Largest Inscribed Quadrilateral**

Keikha, Vahideh

2020 - English
Let P be a convex polygon of n vertices. We present a linear-time algorithm for the problem of computing the largest-area inscribed quadrilateral of P. We also design a parallel version of the algorithm with O(log n) time and O(n) work in the CREW PRAM model, which is quite work-optimal. Our parallel algorithm also computes all the antipodal pairs of a convex polygon in O(log n) time and O(log² n + s) work, where s is the number of antipodal pairs, which we hope is of independent interest. We also discuss several approximation algorithms (both constant-factor approximations and approximation schemes) for computing the largest inscribed k-gons for constant values of k, in both the area and perimeter measures.
Keywords:
*Maximum-area quadrilateral; extreme area k-gon*
Available in a digital repository NRGL
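
For a sense of the problem the abstract solves in linear time, here is a brute-force O(n⁴) baseline, not the paper's algorithm: the largest-area inscribed quadrilateral of a convex polygon has its corners at polygon vertices, so it suffices to score all 4-subsets of vertices (kept in cyclic order, hence convex) with the shoelace formula. The hexagon example is an illustrative choice.

```python
import math
from itertools import combinations

def quad_area(pts):
    """Shoelace area of a polygon given by its vertices in order."""
    a = 0.0
    for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]):
        a += x1 * y2 - x2 * y1
    return abs(a) / 2.0

def largest_inscribed_quad(poly):
    """Try all 4-subsets of vertices of a convex polygon given in
    counter-clockwise order; indices stay in cyclic order, so each
    candidate quadrilateral is convex and the shoelace area is valid."""
    return max(
        (quad_area([poly[i] for i in idx]), idx)
        for idx in combinations(range(len(poly)), 4)
    )

# regular hexagon on the unit circle; the optimum has area sqrt(3)
hexagon = [(math.cos(k * math.pi / 3), math.sin(k * math.pi / 3)) for k in range(6)]
best_area, best_idx = largest_inscribed_quad(hexagon)
```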

###
**Least Weighted Absolute Value Estimator with an Application to Investment Data**

Vidnerová, Petra; Kalina, Jan

2020 - English
While linear regression represents the most fundamental model in current econometrics, the least squares (LS) estimator of its parameters is notoriously known to be vulnerable to the presence of outlying measurements (outliers) in the data. The class of M-estimators, thoroughly investigated since the groundbreaking work by Huber in the 1960s, belongs to the classical robust estimation methodology (Jurečková et al., 2019). M-estimators are nevertheless not robust with respect to leverage points, which are defined as values outlying on the horizontal axis (i.e. outlying in one or more regressors). The least trimmed squares estimator therefore seems a more suitable highly robust method, i.e. one with a high breakdown point (Rousseeuw & Leroy, 1987). Its version with weights implicitly assigned to individual observations, denoted as the least weighted squares estimator, was proposed and investigated in Víšek (2011). A trimmed estimator based on the L1-norm is available as the least trimmed absolute value estimator (Hawkins & Olive, 1999), which, however, has not acquired the attention of practical econometricians. Moreover, to the best of our knowledge, a version with weights implicitly assigned to individual observations still seems to be lacking.
Keywords:
*robust regression; regression median; implicit weighting; computational aspects; nonparametric bootstrap*
Fulltext is available at external website.
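
The contrast the abstract draws (LS vs. L1 vs. trimmed estimators) is easiest to see in the location model, where LS gives the mean, the L1-norm gives the median, and least trimmed squares gives the mean of the best h-subset. The sketch below is this toy illustration only, not the weighted estimator the report proposes; the data set is invented.

```python
def mean(xs):
    return sum(xs) / len(xs)

def median(xs):
    s = sorted(xs)
    n = len(s)
    return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])

def lts_location(xs, h):
    """Least trimmed squares in the location model: the optimal h-subset
    is a contiguous window of order statistics, so scan all windows and
    keep the mean with the smallest residual sum of squares."""
    s = sorted(xs)
    best_rss, best_mean = float("inf"), None
    for i in range(len(s) - h + 1):
        w = s[i:i + h]
        m = mean(w)
        rss = sum((x - m) ** 2 for x in w)
        if rss < best_rss:
            best_rss, best_mean = rss, m
    return best_mean

data = [1.0, 2.0, 3.0, 4.0, 5.0, 100.0]  # one gross outlier
```

Here the mean is dragged toward 100 while the median (3.5) and the trimmed estimate (3.0, from the window 1..5) ignore the outlier, which is the breakdown-point argument of the abstract in miniature.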

###
**On the Effect of Human Resources on Tourist Infrastructure: New Ideas on Heteroscedastic Modeling Using Regression Quantiles**

Kalina, Jan; Janáček, Patrik

2020 - English
Tourism represents an important sector of the economy in many countries around the world. In this work, we are interested in the effect of the Human Resources and Labor Market pillar of the Travel and Tourism Competitiveness Index on tourist service infrastructure across 141 countries of the world. A regression analysis of these data requires handling heteroscedasticity, which is not an uncommon situation in the available human capital studies. Our first task focuses on testing the significance of individual variables in the model. It is illustrated here that significance tests are influenced by heteroscedasticity, and this remains true also for tests for regression quantiles or for robust regression estimators resistant to a possible contamination of the data by outliers. Only if a suitable model that takes heteroscedasticity into account is considered does the effect of the Human Resources and Labor Market pillar turn out to be significant. Further, we propose and present a new diagnostic tool, denoted as a quintile plot, which allows an immediate interpretation of the heteroscedastic structure of the linear regression model for possibly contaminated data.
Keywords:
*tourism infrastructure; human resources; regression; robustness; regression quantiles*
Fulltext is available at external website.
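
Regression quantiles, mentioned in the abstract, are defined by minimizing the check (pinball) loss rather than squared error. As a hedged sketch of that definition only (not the paper's tests or the quintile plot), the following computes a sample quantile by brute-force minimization of the summed pinball loss; the data are illustrative.

```python
def pinball(r, tau):
    """Check (pinball) loss defining the tau-th regression quantile:
    tau*r for nonnegative residuals r, (tau-1)*r for negative ones."""
    return tau * r if r >= 0 else (tau - 1) * r

def sample_quantile(xs, tau):
    """The tau-th sample quantile minimizes the summed pinball loss, and
    a minimum is attained at a data point, so search over the sample
    (sorted, so ties resolve to the smallest minimizer)."""
    return min(sorted(xs), key=lambda c: sum(pinball(x - c, tau) for x in xs))

xs = list(range(1, 11))  # 1, 2, ..., 10
```

For tau = 0.5 this recovers a median (5, the smaller of the tied minimizers 5 and 6), and for tau = 0.9 an upper quantile (9); in a regression model the constant c is replaced by a linear predictor, giving quantile-specific fits that expose heteroscedasticity.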

###
**Regression for High-Dimensional Data: From Regularization to Deep Learning**

Kalina, Jan; Vidnerová, Petra

2020 - English
Regression modeling is well known as a fundamental task in current econometrics. However, classical estimation tools for the linear regression model are not applicable to high-dimensional data. Although there is no agreement on a formal definition of high-dimensional data, these are usually understood either as data with the number of variables p (possibly greatly) exceeding the number of observations n, or as data with a large p on the order of (at least) thousands. In both situations, which appear in various fields including econometrics, the analysis of the data is difficult due to the so-called curse of dimensionality (cf. Kalina (2013) for a discussion). Compared to linear regression, nonlinear regression modeling with an unknown shape of the relationship of the response to the regressors requires even more intricate methods.
Keywords:
*regression; neural networks; robustness; high-dimensional data; regularization*
Fulltext is available at external website.
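
One reason classical tools fail when p approaches or exceeds n is that X'X becomes singular; regularization restores invertibility. As a hedged two-regressor sketch of that point only (the closed-form ridge estimator, not the methods surveyed in the paper), with an invented collinear design:

```python
def ridge2(X, y, lam):
    """Closed-form ridge estimator (X'X + lam*I)^{-1} X'y for p = 2,
    using the explicit 2x2 inverse; lam > 0 keeps the system solvable
    even for a perfectly collinear design, where OLS is undefined."""
    a = sum(x0 * x0 for x0, _ in X) + lam
    b = sum(x0 * x1 for x0, x1 in X)
    d = sum(x1 * x1 for _, x1 in X) + lam
    c0 = sum(x0 * yi for (x0, _), yi in zip(X, y))
    c1 = sum(x1 * yi for (_, x1), yi in zip(X, y))
    det = a * d - b * b
    return ((d * c0 - b * c1) / det, (a * c1 - b * c0) / det)

# duplicated column: X'X is singular, OLS breaks down, ridge still works
X = [(1.0, 1.0), (2.0, 2.0), (3.0, 3.0)]
y = [1.0, 2.0, 3.0]
b0, b1 = ridge2(X, y, 0.1)
```

By symmetry the penalty splits the unit effect evenly between the duplicated columns (b0 = b1, b0 + b1 ≈ 1), the characteristic shrinkage behavior that carries over to lasso-type and deep-learning regularizers discussed in the paper.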

###
**The scalar-valued score functions of continuous probability distribution**

Fabián, Zdeněk

2019 - English
In this report we give the theoretical basis of a probability theory of continuous random variables based on scalar-valued score functions. We consistently maintain the following point of view: it is not the observed value that is to be used in probabilistic and statistical considerations, but its 'treated form', the value of the scalar-valued score function of the distribution of the assumed model. Actually, the opinion that an observed value of a random variable should be 'treated' with respect to the underlying model is one of the main ideas of likelihood-based inference in classical statistics. However, the vector nature of the Fisher score functions of classical statistics does not enable a consistent use of this point of view; instead, various inference functions are suggested and used in solutions of various statistical problems. The inference function of this report is the scalar-valued score function of the distribution.
Keywords:
*Shortcomings of probability theory; Scalar-valued score functions; Characteristics of continuous random variables; Parametric estimation; Transformed distributions; Skew-symmetric distributions*
Available at various institutes of the ASCR
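
For a concrete sense of a scalar-valued score, consider a distribution supported on the whole real line, where such a score can be taken as -(log f)'(x); this sketch is an assumption-laden simplification, not Fabián's general transformation-based construction. For the normal density the score is the familiar standardized residual (x - mu) / sigma², checked here numerically:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def scalar_score(pdf, x, h=1e-6):
    """Numerical score -(d/dx) log f(x) via a central difference; for the
    normal distribution this equals (x - mu) / sigma^2, i.e. the 'treated
    form' of an observation under the assumed model."""
    return -(pdf(x + h) - pdf(x - h)) / (2 * h * pdf(x))

s = scalar_score(lambda t: normal_pdf(t, 1.0, 2.0), 3.0)  # analytically (3-1)/4 = 0.5
```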

###
**Lexicalized Syntactic Analysis by Restarting Automata**

Mráz, F.; Otto, F.; Pardubská, D.; Plátek, Martin

2019 - English
We study h-lexicalized two-way restarting automata that can rewrite at most i times per cycle for some i ≥ 1 (hRLWW(i)-automata). This model is considered useful for the study of lexical (syntactic) disambiguation, a concept from linguistics based on certain reduction patterns. We study lexical disambiguation through the formal notion of h-lexicalized syntactic analysis (hLSA). An hLSA is composed of a basic language and the corresponding h-proper language, which is obtained from the basic language by mapping all basic symbols to input symbols. We stress the sensitivity of hLSA by hRLWW(i)-automata to the size of their windows, the number of possible rewrites per cycle, and the degree of (non-)monotonicity. We introduce the concepts of contextually transparent languages (CTL) and contextually transparent lexicalized analyses based on very special reduction patterns, and we present two-dimensional hierarchies of their subclasses based on the size of windows and on the degree of synchronization. The bottoms of these hierarchies correspond to the context-free languages. CTL forms a proper subclass of the context-sensitive languages with syntactically natural properties.
Keywords:
*Restarting automaton; h-lexicalization; lexical disambiguation*
Fulltext is available at external website.

###
**Laplacian preconditioning of elliptic PDEs: Localization of the eigenvalues of the discretized operator**

Gergelits, Tomáš; Mardal, K.-A.; Nielsen, B. F.; Strakoš, Z.

2019 - English
This contribution represents an extension of our earlier studies on the paradigmatic example of the inverse problem of diffusion parameter estimation from spatio-temporal measurements of fluorescent particle concentration, see [6, 1, 3, 4, 5]. More precisely, we continue to look for an optimal bleaching pattern used in FRAP (Fluorescence Recovery After Photobleaching), i.e. the initial condition of the Fickian diffusion equation maximizing a sensitivity measure. In what follows, we define an optimization problem and show a special feature (the so-called complementarity principle) of the optimal binary-valued initial conditions.
Keywords:
*second order elliptic PDEs; preconditioning by the inverse Laplacian; eigenvalues of the discretized preconditioned problem; nodal values of the coefficient function; Hall’s theorem; convergence of the conjugate gradient method*
Available in digital repository of the ASCR

NRGL provides central access to information on grey literature produced in the Czech Republic in the fields of science, research and education. More information about grey literature and NRGL is available on the service website.
