Search results for: empirical orthogonal functions
4642 Modelling of Heat Generation in a 18650 Lithium-Ion Battery Cell under Varying Discharge Rates
Authors: Foo Shen Hwang, Thomas Confrey, Stephen Scully, Barry Flannery
Abstract:
Thermal characterization plays an important role in battery pack design. Lithium-ion batteries have to be maintained between 15 and 35 °C to operate optimally. Heat (Q) is generated internally within the batteries during both the charging and discharging phases, and can be quantified using several standard methods. The most common method of calculating a battery's heat generation is the addition of the joule heating effects and the entropic changes across the battery. Such values can be derived by identifying the open-circuit voltage (OCV), nominal voltage (V), operating current (I), battery temperature (T) and the rate of change of the open-circuit voltage with respect to temperature (dOCV/dT). This paper focuses on experimental characterization and comparative modelling of the heat generation rate (Q) across several current discharge rates (0.5C, 1C, and 1.5C) of an 18650 cell. The analysis is conducted using several non-linear mathematical functions, including polynomial, exponential, and power models. Parameter fitting is carried out over the respective function orders: polynomial (n = 3~7), exponential (n = 2) and power function. The fitted functions are then used as heat source functions in a 3-D computational fluid dynamics (CFD) solver under natural convection conditions. The generated temperature profiles are analyzed for errors against experimental discharge tests conducted at standard room temperature (25 °C). Initial results display low deviation between the experimental and CFD temperature plots. As such, the heat generation function formulated could be more easily utilized for larger battery applications than other available methods.
Keywords: computational fluid dynamics, curve fitting, lithium-ion battery, voltage drop
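The heat generation calculation the abstract describes combines an irreversible (joule) term and a reversible (entropic) term; a common form is Q = I(OCV − V) + I·T·(dOCV/dT). A minimal sketch in Python, with all numeric values hypothetical rather than taken from the paper:

```python
def heat_generation(current, ocv, voltage, temperature_k, docv_dt):
    """Battery heat generation rate Q in watts.

    First term: irreversible (joule/overpotential) heating, I*(OCV - V).
    Second term: reversible entropic heating, I*T*(dOCV/dT).
    """
    irreversible = current * (ocv - voltage)
    reversible = current * temperature_k * docv_dt
    return irreversible + reversible

# Example: 1C discharge of a ~2.5 Ah 18650 cell (hypothetical values)
q = heat_generation(current=2.5, ocv=3.7, voltage=3.5,
                    temperature_k=298.15, docv_dt=-0.0002)
```

Note the entropic term can be negative (endothermic), which is why dOCV/dT must be characterized experimentally per cell chemistry.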
Procedia PDF Downloads 96
4641 A Deterministic Approach for Solving the Hull and White Interest Rate Model with Jump Process
Authors: Hong-Ming Chen
Abstract:
This work considers the resolution of the Hull and White interest rate model with a jump process. A deterministic process is adopted to model the random behavior of interest rate variation as deterministic perturbations depending on time t. The uncertainty from Brownian motion and jumps is represented by a piecewise constant function w(t) and a point function θ(t), respectively. It is shown that the interest rate function and the yield function of the Hull and White interest rate model with jump process can be obtained by solving a nonlinear semi-infinite programming problem. A relaxed cutting plane algorithm is then proposed for solving the resulting optimization problem. The method is calibrated to 3-month U.S. Treasury securities data and is used to analyze several effects on interest rate prices, including interest rate variability and the negative correlation between stock returns and interest rates. The numerical results illustrate that our approach generates yield functions with minimal fitting errors and small oscillation.
Keywords: optimization, interest rate model, jump process, deterministic
Procedia PDF Downloads 161
4640 Exploring the Effect of Accounting Information on Systematic Risk: An Empirical Evidence of Tehran Stock Exchange
Authors: Mojtaba Rezaei, Elham Heydari
Abstract:
This paper presents the empirical results of analyzing the correlation between accounting information and systematic risk. This association between financial ratios and systematic risk is analyzed using the financial statements of 39 companies listed on the Tehran Stock Exchange (TSE) over five years (2014-2018). Financial ratios were categorized into four groups, and as representatives of accounting information we selected: Return on Assets (ROA), Debt Ratio (total debt to total assets), Current Ratio (current assets to current liabilities), Asset Turnover (net sales to total assets), and Total Assets. The hypotheses were tested through simple and multiple linear regression and Student's t-test. The findings illustrate that there is no significant relationship between accounting information and market risk. This indicates that, in the selected sample, historical accounting information does not fully reflect stock prices.
Keywords: accounting information, market risk, systematic risk, stock return, efficient market hypothesis, EMH, Tehran Stock Exchange, TSE
Procedia PDF Downloads 135
4639 Loss Function Optimization for CNN-Based Fingerprint Anti-Spoofing
Authors: Yehjune Heo
Abstract:
As biometric systems become widely deployed, identification systems can easily be attacked with various spoof materials. This paper contributes to finding a reliable and practical anti-spoofing method using Convolutional Neural Networks (CNNs), based on the choice of loss functions and optimizers. The CNNs used in this paper are AlexNet, VGGNet, and ResNet. By using various loss functions, including Cross-Entropy, Center Loss, Cosine Proximity, and Hinge Loss, and various optimizers, including Adam, SGD, RMSProp, Adadelta, Adagrad, and Nadam, we obtained significant performance changes. We find that choosing the correct loss function for each model is crucial, since different loss functions lead to different errors on the same evaluation. We validate our approach on a subset of the LivDet 2017 database to compare generalization power. It is important to note that the same subset of LivDet is used across all training and testing for each model; this way, we can compare generalization performance on unseen data across all models. The best CNN (AlexNet), with the appropriate loss function and optimizer, gains more than 3% in performance over the other CNN models with the default loss function and optimizer. In addition to the highest generalization performance, this paper also reports the models' parameter counts and mean average error rates, to find the model that consumes the least memory and computation time for training and testing. Although AlexNet has lower complexity than the other CNN models, it proves to be very efficient. For practical anti-spoofing systems, the deployed version should use a small amount of memory and run very fast with high anti-spoofing performance. For our deployed version on smartphones, additional processing steps, such as quantization and pruning algorithms, have been applied to our final model.
Keywords: anti-spoofing, CNN, fingerprint recognition, loss function, optimizer
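As an illustration of why the loss choice matters, the sketch below (plain NumPy, not the paper's training code) evaluates cross-entropy and hinge loss on the same set of predictions; the two losses penalize the same errors differently, which is the effect the paper observes at training time:

```python
import numpy as np

def cross_entropy(p, y):
    """Binary cross-entropy; y in {0,1}, p = predicted P(class 1)."""
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def hinge(score, y):
    """Hinge loss; y in {-1,+1}, score is the raw classifier output."""
    return np.mean(np.maximum(0.0, 1.0 - y * score))

p = np.array([0.9, 0.6, 0.8])   # predicted probabilities for three samples
y01 = np.array([1, 1, 0])       # ground-truth labels for cross-entropy
score = 2.0 * p - 1.0           # map [0,1] -> [-1,1] as a crude score
ypm = 2 * y01 - 1               # ground-truth labels for hinge

ce = cross_entropy(p, y01)      # dominated by the confident mistake (p=0.8, y=0)
hl = hinge(score, ypm)          # also penalizes the correct but low-margin p=0.6
```

The confidently wrong third sample dominates cross-entropy, while hinge loss additionally charges the correct-but-low-margin second sample; gradients, and hence the trained model, differ accordingly.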
Procedia PDF Downloads 137
4638 Conditions Required for New Sector Emergence: Results from a Systematic Literature Review
Authors: Laurie Prange-Martin, Romeo Turcan, Norman Fraser
Abstract:
The aim of this study is to identify the conditions required for, and describe the process of, the emergence of a new economic sector created from new or established businesses. A systematic literature review of English-language studies published from 1983 to 2016 was conducted using the following databases: ABI/INFORM Complete, Business Source Premier, Google Scholar, Scopus, and Web of Science. The two main terms, business sector and emergence, were used in the systematic literature search, along with seventeen synonyms for each of these main terms. From the search results, 65 publications met the requirements of an empirical study discussing and reporting the conditions of new sector emergence. A meta-analysis of the literature examined suggests that there are six favourable conditions and five key individuals or groups required for new sector emergence. In addition, the meta-analysis showed that eighteen theories, which can be grouped into three study disciplines, are used in the literature to explain the phenomenon of new sector emergence. Given such diversity in the theoretical frameworks used in the 65 empirical studies, the authors of this paper propose the development of a new theory of sector emergence.
Keywords: economic geography, new sector emergence, economic diversification, regional economies
Procedia PDF Downloads 271
4637 Strategic Citizen Participation in Applied Planning Investigations: How Planners Use Etic and Emic Community Input Perspectives to Fill-in the Gaps in Their Analysis
Authors: John Gaber
Abstract:
Planners regularly use citizen input as empirical data to help them better understand community issues they know very little about. This type of community data is based on the lived experiences of local residents and is known as "emic" data. What is becoming more common practice is planners' use of data from local experts and stakeholders (known as "etic" data, or the outsider perspective) to help them fill in the gaps in their analysis of applied planning research projects. Utilizing international Health Impact Assessment (HIA) data, I look at whom planners invite to their citizen input investigations. Research presented in this paper shows that planners access a wide range of emic and etic community perspectives in their search for the "community's view." The paper concludes with how planners can chart a new empirical path in their execution of emic/etic citizen participation strategies in their applied planning research projects.
Keywords: citizen participation, emic data, etic data, Health Impact Assessment (HIA)
Procedia PDF Downloads 484
4636 Face Recognition Using Discrete Orthogonal Hahn Moments
Authors: Fatima Akhmedova, Simon Liao
Abstract:
One of the most critical decision points in the design of a face recognition system is the choice of an appropriate face representation. Effective feature descriptors are expected to convey sufficient, invariant and non-redundant facial information. In this work, we propose a set of Hahn moments as a new approach to feature description. Hahn moments have been widely used in image analysis due to their invariance, non-redundancy and ability to extract features both globally and locally. To assess the applicability of Hahn moments to face recognition, we conduct two experiments on the Olivetti Research Laboratory (ORL) database and the University of Notre Dame (UND) X1 biometric collection. The fusion of the global features with features from local facial regions is used as input to a conventional k-NN classifier. The method reaches an accuracy of 93% of correctly recognized subjects for the ORL database and 94% for the UND database.
Keywords: face recognition, Hahn moments, recognition-by-parts, time-lapse
Procedia PDF Downloads 377
4635 5G Future Hyper-Dense Networks: An Empirical Study and Standardization Challenges
Authors: W. Hashim, H. Burok, N. Ghazaly, H. Ahmad Nasir, N. Mohamad Anas, A. F. Ismail, K. L. Yau
Abstract:
Future communication networks require devices that are able to work on a single platform but support heterogeneous operations, which leads to service diversity and functional flexibility. This paper proposes two cognitive mechanisms, termed cognitive hybrid functions, which are applied in multiple broadband user terminals in order to maintain reliable connectivity and prevent unnecessary interference. By employing such mechanisms, especially in future hyper-dense networks, we can observe their performance in terms of optimized speed and power-saving efficiency. Results were obtained from several empirical laboratory studies. It was found that selecting a reliable network yielded better speed performance, with up to 37% improvement compared with operation without such a function. In terms of power adjustment, our evaluation shows that this mechanism can reduce transmit power by 5 dB while maintaining the level of throughput achieved at higher power. We also discuss the issues impacting future telecommunication standards whenever such devices are put in place.
Keywords: dense network, intelligent network selection, multiple networks, transmit power adjustment
Procedia PDF Downloads 378
4634 Price to Earnings Growth (PEG) Predicting Future Returns Better than the Price to Earnings (PE) Ratio
Authors: Lindrianasari Stefanie, Aminah Khairudin
Abstract:
This study aims to provide empirical evidence regarding the ability of the Price to Earnings (PE) Ratio and the PEG Ratio to predict issuers' future stock returns. The samples used in this study are stocks included in the LQ45 index. The main contribution is empirical evidence on whether the PEG Ratio can provide better returns than the Price to Earnings Ratio. The data are limited to the financial statements of companies in the LQ45 group for the period July 2013-July 2014, using the financial statements and the company's closing stock price at the end of 2010 as a reference benchmark for the growth of the company's stock price compared to the closing price of 2013. This study found that the PEG Ratio method can outperform the PE Ratio method in predicting future returns on the LQ45 stock portfolio.
Keywords: price to earnings growth, price to earnings ratio, future returns, stock price
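The two ratios the study compares reduce to simple computations: PE = price / earnings per share, and PEG divides PE by the expected earnings growth rate, so that growth stocks are not penalized for a high PE alone. A sketch with hypothetical numbers (not the paper's LQ45 data):

```python
def pe_ratio(price, eps):
    """Price-to-earnings ratio: price per share / earnings per share."""
    return price / eps

def peg_ratio(price, eps, growth_pct):
    """PEG ratio: (P/E) / expected annual EPS growth in percent."""
    return pe_ratio(price, eps) / growth_pct

# Two hypothetical stocks with the same P/E (15) but different growth
a = peg_ratio(price=3000.0, eps=200.0, growth_pct=30.0)  # fast grower
b = peg_ratio(price=3000.0, eps=200.0, growth_pct=10.0)  # slow grower
```

Both stocks look identical on PE alone; PEG separates them (a = 0.5, b = 1.5), with values below 1 conventionally read as undervalued relative to growth.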
Procedia PDF Downloads 413
4633 Understanding Personal Well-Being among Entrepreneurial Breadwinners: Bibliographic and Empirical Analyses of Relative Resource Theory
Authors: E. Fredrick Rice
Abstract:
Over the past three decades, a substantial body of academic literature has asserted that the pressure to maintain household income can negatively affect the personal well-being of breadwinners. Given that scholars have not thoroughly explored this phenomenon with breadwinners who are also business owners, theory has been underdeveloped in the entrepreneurial context. To identify the most appropriate theories to apply to entrepreneurs, the current paper utilized two approaches. First, a comprehensive bibliographic analysis was conducted, focusing on works at the intersection of breadwinner status and well-being. Co-authorship and journal citation patterns highlighted relative resource theory as a boundary-spanning approach with promising applications in the entrepreneurial space. To build upon this theory, regression analysis was performed using data from the Panel Study of Entrepreneurial Dynamics (PSED). Empirical results showed evidence for the effects of breadwinner status and household income on entrepreneurial well-being. Further, the findings suggest that it is not merely income or job status that predicts well-being, but one's relative financial contribution compared to that of one's non-breadwinning, organizationally employed partner. This paper offers insight into how breadwinner status can be studied in relation to the entrepreneurial personality.
Keywords: breadwinner, entrepreneurship, household income, well-being
Procedia PDF Downloads 172
4632 Key Frame Based Video Summarization via Dependency Optimization
Authors: Janya Sainui
Abstract:
With the rapid growth of digital video and data communications, video summarization, which provides a shorter version of a video for fast browsing and retrieval, is necessary. Key frame extraction is one mechanism for generating a video summary. In general, the extracted key frames should both represent the entire video content and contain minimum redundancy. However, most existing approaches select key frames heuristically; hence, the selected key frames may not be the most distinct frames and/or may not cover the entire content of the video. In this paper, we propose a video summarization method that provides principled objective functions for selecting key frames. In particular, we apply a statistical dependency measure called quadratic mutual information as our objective function, maximizing the coverage of the entire video content while minimizing the redundancy among the selected key frames. The proposed key frame extraction algorithm finds key frames by solving an optimization problem. Through experiments, we demonstrate the success of the proposed approach, which produces video summaries with better coverage of the entire video content and less redundancy among key frames compared to state-of-the-art approaches.
Keywords: video summarization, key frame extraction, dependency measure, quadratic mutual information
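A greedy version of the coverage-versus-redundancy trade-off described above can be sketched as follows. This is an illustration, not the paper's algorithm: cosine similarity on frame features stands in for the quadratic mutual information dependency measure, and the 0.5 redundancy weight is an arbitrary choice.

```python
import numpy as np

def select_key_frames(features, k):
    """Greedily pick k key frames: maximize how well the selected set
    covers all frames while penalizing similarity within the set.
    `features` is an (n_frames, dim) array of per-frame descriptors."""
    n = len(features)
    f = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-12)
    sim = f @ f.T  # pairwise cosine similarity, stand-in dependency measure
    selected = []
    for _ in range(k):
        best, best_score = None, -np.inf
        for i in range(n):
            if i in selected:
                continue
            cand = selected + [i]
            coverage = sim[cand].max(axis=0).mean()      # coverage of all frames
            redundancy = sim[np.ix_(cand, cand)].mean()  # similarity inside cand
            score = coverage - 0.5 * redundancy
            if score > best_score:
                best, best_score = i, score
        selected.append(best)
    return sorted(selected)
```

On features forming two visual clusters, the greedy objective picks one frame from each cluster rather than two near-duplicates, which is the behavior the paper's optimization formalizes.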
Procedia PDF Downloads 267
4631 Path Integrals and Effective Field Theory of Large Scale Structure
Authors: Revant Nayar
Abstract:
In this work, we recast the equations describing large scale structure, and by extension all nonlinear fluids, in the path integral formalism. We first calculate the well-known two- and three-point functions using the Schwinger-Keldysh formalism, commonly used to perturbatively solve path integrals in non-equilibrium systems. Then we include EFT corrections due to pressure, viscosity, and noise as effects on the time-dependent propagator. We are able to express results for arbitrary two- and three-point correlation functions in LSS in terms of differential operators acting on a triple-K master integral. We also, for the first time, obtain analytical results for more general initial conditions deviating from the usual power law P∝kⁿ by introducing a mass scale in the initial conditions. This robust field-theoretic formalism empowers us with tools from strongly coupled QFT, such as the OPE and holographic duals, to study the strongly non-linear regime of LSS and turbulent fluid dynamics. These could be used to capture fully the strongly non-linear dynamics of fluids and move towards solving the open problem of classical turbulence.
Keywords: quantum field theory, cosmology, effective field theory, renormalisation
Procedia PDF Downloads 135
4630 Self-denigration in Doctoral Defense Sessions: Scale Development and Validation
Authors: Alireza Jalilifar, Nadia Mayahi
Abstract:
The dissertation defense, as a complicated, conflict-prone context, entails the adoption of elegant interactional strategies, one of which is self-denigration. This study aimed to develop and validate a self-denigration model that fits the context of doctoral defense sessions in applied linguistics. Two focus group discussions provided the basis for developing this conceptual model, which assumed 10 functions for self-denigration, namely good manners, modesty, affability, altruism, assertiveness, diffidence, coercive self-deprecation, evasion, diplomacy, and flamboyance. These functions were used to design a 40-item questionnaire on the attitudes of applied linguists concerning self-denigration in defense sessions. Confirmatory factor analysis of the questionnaire indicated the predictive ability of the measurement model. The findings suggest that self-denigration in doctoral defense sessions is the social representation of the participants' values, ideas and practices, adopted as a negotiation strategy and a conflict management policy for the purpose of establishing harmony and maintaining resilience. This study has implications for doctoral students and academics and illuminates further research on self-denigration in other contexts.
Keywords: academic discourse, politeness, self-denigration, grounded theory, dissertation defense
Procedia PDF Downloads 139
4629 Reduction of Impulsive Noise in OFDM System using Adaptive Algorithm
Authors: Alina Mirza, Sumrin M. Kabir, Shahzad A. Sheikh
Abstract:
Orthogonal Frequency Division Multiplexing (OFDM), with its high data rate, high spectral efficiency and ability to mitigate the effects of multipath, is highly suitable for wireless applications. Impulsive noise distorts OFDM transmission, and therefore methods must be investigated to suppress this noise. In this paper, a State Space Recursive Least Squares (SSRLS) algorithm-based adaptive impulsive noise suppressor for OFDM communication systems is proposed, and a comparison with another adaptive algorithm is conducted. The state-space model-dependent recursive parameters of the proposed scheme enable it to achieve lower steady-state mean squared error (MSE), lower bit error rate (BER), and faster convergence than some existing algorithms.
Keywords: OFDM, impulsive noise, SSRLS, BER
Procedia PDF Downloads 458
4628 An Experimental Analysis of Squeeze Casting Parameters for 2017 a Wrought Al Alloy
Authors: Mohamed Ben Amar, Najib Souissi, Chedly Bradai
Abstract:
A Taguchi design investigation has been made into the relationship between ductility and process variables in squeeze-cast 2017A wrought aluminium alloy. The process parameters considered were squeeze pressure, melt temperature and die preheating temperature. An orthogonal array (OA), main effects, the signal-to-noise (S/N) ratio, and analysis of variance (ANOVA) are employed to analyze the effect of the casting parameters. The results have shown that the selected parameters significantly affect the ductility of 2017A wrought Al alloy castings. Optimal squeeze casting process parameters were provided to illustrate the proposed approach, and the results were proven trustworthy through practical experiments.
Keywords: Taguchi method, squeeze casting, process parameters, ductility, microstructure
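Since ductility is a response to be maximized, the Taguchi S/N ratio used for each orthogonal-array run is the standard larger-the-better form, S/N = −10·log10(mean(1/yᵢ²)). A sketch with hypothetical repeat measurements (not the paper's data):

```python
import math

def sn_larger_is_better(values):
    """Taguchi larger-the-better signal-to-noise ratio in dB:
    S/N = -10 * log10( mean over repeats of 1/y_i^2 )."""
    mean_inv_sq = sum(1.0 / (y * y) for y in values) / len(values)
    return -10.0 * math.log10(mean_inv_sq)

# Hypothetical ductility (% elongation) from three repeats of one OA run
sn = sn_larger_is_better([8.2, 8.5, 7.9])
```

The parameter level with the highest mean S/N across its OA runs is taken as optimal; ANOVA then apportions how much each factor contributes to the variation.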
Procedia PDF Downloads 400
4627 A Practical and Efficient Evaluation Function for 3D Model Based Vehicle Matching
Authors: Yuan Zheng
Abstract:
3D model-based vehicle matching provides a new way to perform vehicle recognition, localization and tracking. Its key is to construct an evaluation function, also called a fitness function, to measure the degree of vehicle matching. Existing fitness functions often perform poorly when clutter and occlusion are present in traffic scenarios. In this paper, we present a practical and efficient fitness function. Unlike existing evaluation functions, the proposed fitness function studies the vehicle matching problem from both local and global perspectives, exploiting pixel gradient information as well as silhouette information. In view of the discrepancy between the 3D vehicle model and the real vehicle, a weighting strategy is introduced to treat the fitting of the model's wireframes differently. Additionally, a normalization operation on the model's projection is performed to improve the accuracy of the matching. Experimental results on real traffic videos reveal that the proposed fitness function is efficient and robust to cluttered backgrounds and partial occlusion.
Keywords: 3D-2D matching, fitness function, 3D vehicle model, local image gradient, silhouette information
Procedia PDF Downloads 399
4626 Optimized Text Summarization Model on Mobile Screens for Sight-Interpreters: An Empirical Study
Authors: Jianhua Wang
Abstract:
To obtain key information quickly from long texts on the small screens of mobile devices, sight-interpreters need an optimized summarization model for fast information retrieval. Four summarization models based on previous studies were examined: title+key words (TKW), title+topic sentences (TTS), key words+topic sentences (KWTS) and title+key words+topic sentences (TKWTS). Psychological experiments were conducted on the four models with three different genres of interpreting texts to establish the optimized summarization model for sight-interpreters. This empirical study shows that the optimized summarization model for sight-interpreters to quickly grasp the key information of the texts they interpret is title+key words (TKW) for cultural texts, title+key words+topic sentences (TKWTS) for economic texts and key words+topic sentences (KWTS) for political texts.
Keywords: different genres, mobile screens, optimized summarization models, sight-interpreters
Procedia PDF Downloads 316
4625 The Mediating Effect of Individual Readiness for Change in the Relationship between Organisational Culture and Individual Commitment to Change
Authors: Mohamed Haffar, Lois Farquharson, Gbola Gbadamosi, Wafi Al-Karaghouli, Ramadane Djbarni
Abstract:
A few recent research studies, mostly conceptual in nature, have paid attention to the relationship between organizational culture (OC), individual readiness for change (IRFC) and individual affective commitment to change (IACC). Surprisingly, there is a lack of empirical studies investigating the influence of all four OC types on IRFC and IACC. Moreover, very limited research has investigated the mediating role of individual readiness for change between OC types and individual affective commitment to change. This study is therefore proposed to fill this gap by providing empirical evidence leading to an advancement in the understanding of the direct and indirect influences of OC on individual affective commitment to change. To achieve this, a questionnaire-based survey was developed and self-administered to 226 middle managers in Algerian manufacturing organizations (AMOs). The results of this study indicate that group culture and adhocracy culture positively affect IACC. Furthermore, the findings support the mediating roles of self-efficacy and personal valence in the relationship between OC and IACC.
Keywords: individual readiness for change, individual commitment to change, organisational culture, manufacturing organisations
Procedia PDF Downloads 503
4624 Cultural Heritage, War and Heritage Legislations: An Empirical Review
Authors: Gebrekiros Welegebriel Asfaw
Abstract:
The conservation of cultural heritage during times of war is a topic of significant importance and concern in the field of heritage studies. The destruction, looting, and illicit acts against cultural heritage have devastating consequences. International and national legislation has been put in place to address these issues and provide a legal framework for protecting cultural heritage during armed conflicts. The aim of this review is to examine existing heritage legislation and evaluate its effectiveness in protecting cultural heritage during times of war, with special insight into the Tigray war. The review is based on a comprehensive empirical analysis of existing heritage legislation related to the protection of cultural heritage during war, with a special focus on the Tigray war. The review reveals that several international and national laws are in place to protect cultural heritage during times of war. However, their implementation has been insufficient and ineffective in the case of the Tigray war. The priceless cultural heritage sites in Tigray, once centers of investment and world pride, have been subjected to destruction, looting, and other illicit acts, in violation of both international conventions such as the UNESCO Convention and national legislation. Therefore, there is a need for consistent intervention and enforcement of legislation by the international community and organizations to rehabilitate, repatriate, and reinstitute the irreplaceable heritage of Tigray.
Keywords: cultural heritage, heritage legislations, Tigray, war
Procedia PDF Downloads 158
4623 An Interpolation Tool for Data Transfer in Two-Dimensional Ice Accretion Problems
Authors: Marta Cordero-Gracia, Mariola Gomez, Olivier Blesbois, Marina Carrion
Abstract:
One of the difficulties in icing simulations arises for extended periods of exposure, when very large ice shapes are created. As well as being large, they can have complex shapes, such as a double horn. For icing simulations, these configurations are currently computed in several steps. The icing step is stopped when the ice shapes become too large, at which point a new mesh has to be created to allow further CFD and ice growth simulations to be performed. This can be very costly and is a limiting factor in the simulations that can be performed. One way to avoid the costly human intervention in the re-meshing step of a multistep icing computation is to use mesh deformation instead of re-meshing. The aim of the present work is to apply an interpolation method based on Radial Basis Functions (RBF) to transfer deformations from the surface mesh to the volume mesh. This deformation tool has been developed specifically for icing problems. It is able to deal with localized, sharp and large deformations, unlike the tools traditionally used for smoother wing deformations. The tool is presented along with validation on typical two-dimensional icing shapes.
Keywords: ice accretion, interpolation, mesh deformation, radial basis functions
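The RBF transfer of surface displacements to volume nodes can be sketched as below. The Gaussian kernel and support radius here are illustrative assumptions, not the tool's actual basis function; the key property shown is that the interpolant exactly reproduces the prescribed displacement at every surface control point.

```python
import numpy as np

def rbf_deform(surface_pts, surface_disp, volume_pts, radius=1.0):
    """Propagate surface-mesh displacements to volume-mesh nodes via
    radial basis function interpolation (Gaussian kernel, illustrative)."""
    def kernel(r):
        return np.exp(-(r / radius) ** 2)

    # Solve for one weight vector per displacement component
    d = np.linalg.norm(surface_pts[:, None, :] - surface_pts[None, :, :], axis=2)
    weights = np.linalg.solve(kernel(d), surface_disp)

    # Evaluate the interpolant at the volume-mesh nodes
    dv = np.linalg.norm(volume_pts[:, None, :] - surface_pts[None, :, :], axis=2)
    return kernel(dv) @ weights

# Three 2-D surface control points with prescribed (dx, dy) displacements
sp = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
sd = np.array([[0.0, 0.1], [0.0, 0.0], [0.05, 0.0]])
vol = rbf_deform(sp, sd, np.array([[0.5, 0.5]]))  # smooth interior motion
```

Because the kernel matrix is symmetric positive definite for distinct points, the solve is well posed; for localized sharp deformations a compactly supported kernel is often preferred, which is a design choice the original line of work addresses.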
Procedia PDF Downloads 314
4622 Development of a Fuzzy Logic Based Model for Monitoring Child Pornography
Authors: Mariam Ismail, Kazeem Rufai, Jeremiah Balogun
Abstract:
A study was conducted to apply fuzzy logic to the development of a monitoring model for child pornography based on associated risk factors, which can be used by forensic experts or integrated into forensic systems for the early detection of child pornographic activities. A number of methods were adopted in the study. An extensive review of related works was done in order to identify the factors associated with child pornography, after which they were validated by an expert sex psychologist and guidance counselor, and relevant data were collected. Fuzzy membership functions were used to fuzzify the identified variables alongside the risk of the occurrence of child pornography, based on inference rules provided by the experts consulted, and the fuzzy logic expert system was simulated using the Fuzzy Logic Toolbox in MATLAB Release 2016. The results showed that there are 4 categories of risk factors required for assessing the risk of a suspect committing child pornography offenses, and that 2 and 3 triangular membership functions were used to formulate the risk factors, based on the 2 and 3 labels assigned, respectively. Five fuzzy logic models were formulated: the first 4 assess the impact of each category on child pornography, while the last one takes the outputs of the 4 models as inputs for assessing the overall risk of child pornography. The study concluded that factors related to personal traits, social traits, history of child pornography crimes, and self-regulatory deficiency traits of the suspect are required for the assessment of the risk of child pornography crimes committed by a suspect. Using the values of the identified risk factors, the risk of child pornography can be easily assessed in order to determine the likelihood of a suspect perpetrating the crime.
Keywords: fuzzy, membership functions, pornography, risk factors
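A triangular membership function of the kind the study fuzzifies its variables with can be sketched as follows; the 0-10 scale, label boundaries, and factor name are hypothetical, not taken from the study's models:

```python
def trimf(x, a, b, c):
    """Triangular membership: 0 at a, rising to 1 at the peak b,
    falling back to 0 at c (the shape MATLAB's trimf implements)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical 3-label risk factor on a 0-10 scale; an input of 6.0
# partially belongs to the "medium" label
medium_risk = trimf(6.0, 0.0, 5.0, 10.0)
```

Each crisp input thus maps to a degree of membership in each label, and the inference rules combine those degrees into the category-level and overall risk outputs.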
Procedia PDF Downloads 131
4621 OFDM Radar for High Accuracy Target Tracking
Authors: Mahbube Eghtesad
Abstract:
For a number of years, the problem of simultaneous detection and tracking of a target has been one of the most relevant and challenging issues in a wide variety of military and civilian systems. We develop methods for detecting and tracking a target using an orthogonal frequency division multiplexing (OFDM) based radar. As a preliminary step, we introduce the target trajectory and Gaussian noise model in discrete-time form. Then, resorting to the matched filter and the Kalman filter, we derive a detector and a target tracker. After that, we propose an OFDM radar in order to achieve further improvement in tracking performance. The motivation for employing multiple frequencies is that the different scattering centers of a target resonate differently at each frequency. Numerical examples illustrate our analytical results, demonstrating the performance improvement achieved by the OFDM signaling method.
Keywords: matched filter, target tracking, OFDM radar, Kalman filter
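The matched-filter detection step the abstract builds on can be sketched in isolation: correlate the received signal against the known transmit waveform, and take the correlation peak as the delay (range) estimate. The signal lengths and noise level below are arbitrary illustration values:

```python
import numpy as np

def matched_filter_delay(received, template):
    """Correlate the received signal with the known transmit waveform;
    the index of the peak magnitude estimates the echo delay in samples."""
    mf = np.correlate(received, template, mode="valid")
    return int(np.argmax(np.abs(mf)))

rng = np.random.default_rng(0)
template = rng.standard_normal(32)             # known transmit waveform
delay = 40
received = np.zeros(128)
received[delay:delay + 32] = template          # echo arriving at sample 40
received += 0.1 * rng.standard_normal(128)     # additive receiver noise
est = matched_filter_delay(received, template)
```

An OFDM radar extends this idea by running such processing across subcarriers, so that frequency-dependent scattering responses can all contribute to detection; the delay estimate then feeds the Kalman tracker as a measurement.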
Procedia PDF Downloads 399
4620 Trainability of Executive Functions during Preschool Age: Analysis of Inhibition of 5-Year-Old Children
Authors: Christian Andrä, Pauline Hähner, Sebastian Ludyga
Abstract:
Introduction: In the recent past, discussions on the importance of physical activity for child development have contributed to a growing interest in executive functions, which refer to cognitive control processes. By controlling, modulating and coordinating sub-processes, they make it possible to achieve superior goals. Major components include working memory, inhibition and cognitive flexibility. While executive functions can be trained easily in school children, there are still research deficits regarding their trainability during preschool age. Methodology: This quasi-experimental study with pre- and post-test design analyzes 23 children [age: 5.0 (mean value) ± 0.7 (standard deviation)] from four different sports groups. The intervention group was made up of 13 children (IG: 4.9 ± 0.6), while the control group consisted of ten children (CG: 5.1 ± 0.9). Between pre-test and post-test, children from the intervention group participated in special games that train executive functions (i.e., changing rules of the game, introduction of new stimuli in familiar games) for ten units of their weekly sports program. The sports program of the control group was not modified. A computer-based version of the Eriksen flanker task was employed in order to analyze the participants’ inhibition ability. In two rounds, the participants had to respond 50 times, as fast as possible, to a certain target (the direction of sight of a fish; the target was always placed in a central position between five fish). Congruent (all fish have the same direction of sight) and incongruent (central fish faces the opposite direction) stimuli were used. The relevant parameters were response time and accuracy. The main objective was to investigate whether children from the intervention group show more improvement in the two parameters than children from the control group.
Major findings: The intervention group revealed significant improvements in congruent response time (pre: 1.34 s, post: 1.12 s, p<.01), while the control group did not show any statistically relevant difference (pre: 1.31 s, post: 1.24 s). Likewise, the comparison of incongruent response times indicates a comparable result (IG: pre: 1.44 s, post: 1.25 s, p<.05 vs. CG: pre: 1.38 s, post: 1.38 s). In terms of accuracy for congruent stimuli, the intervention group showed significant improvements (pre: 90.1 %, post: 95.9 %, p<.01), whereas no significant improvement was found for the control group (pre: 88.8 %, post: 92.9 %). Conversely, the intervention group did not display any significant improvement for incongruent stimuli (pre: 74.9 %, post: 83.5 %), while the control group revealed a significant difference (pre: 68.9 %, post: 80.3 %, p<.01). The analysis of three out of four criteria demonstrates that children who took part in the special sports program improved more than children who did not. The contrary result for the last criterion could be caused by the control group’s low pre-test scores. Conclusion: The findings illustrate that inhibition can be trained as early as preschool age. The combination of familiar games with increased requirements for attention and control processes appears to be particularly suitable.Keywords: executive functions, flanker task, inhibition, preschool children
Procedia PDF Downloads 253
4619 Assessment of Artists’ Socioeconomic and Working Conditions: The Empirical Case of Lithuania
Authors: Rusne Kregzdaite, Erika Godlevska, Morta Vidunaite
Abstract:
The main aim of this research is to explore existing methodologies for studying the artistic labour force and to build them into a model for assessing artists’ socio-economic and creative conditions. Artists have dual aims in their creative working process: 1) income and 2) artistic self-expression. The valuation of their conditions takes both sides into consideration: the factors related to income and the satisfaction with the creative process and its result. The problem addressed in the study is the set of tangible and intangible criteria used for assessing artists’ creative conditions. The proposed model includes objective factors (working time, income, etc.) and subjective factors (salary covering essential needs, self-satisfaction). Other intangible indicators are also taken into account: the impact on the common culture, social values, and the possibility to receive awards or to represent the country in the international market. The empirical model consists of 59 separate indicators, grouped into eight categories. The deviation of each indicator from the general evaluation allows identifying the strongest and the weakest components of artists’ conditions.Keywords: artist conditions, artistic labour force, cultural policy, indicator, assessment model
Procedia PDF Downloads 152
4618 Aggregate Fluctuations and the Global Network of Input-Output Linkages
Authors: Alexander Hempfing
Abstract:
The desire to understand business cycle fluctuations, trade interdependencies and co-movement has a long tradition in economic thinking. From input-output economics to business cycle theory, researchers have aimed to find appropriate answers from both an empirical and a theoretical perspective. This paper empirically analyses how the production structure of the global economy and of several states developed over time, what its distributional properties are, and whether there are network-specific metrics that allow identifying structurally important nodes on a global, national and sectoral scale. For this, the World Input-Output Database was used, and different statistical methods were applied. Empirical evidence is provided that the importance of the Eastern hemisphere in the global production network increased significantly between 2000 and 2014. Moreover, it was possible to show that the sectoral eigenvector centrality indices on a global level are power-law distributed, providing evidence that certain national sectors are more critical to the world economy than others, serving as hubs within the global production network. However, further findings suggest that global production cannot be characterized as a scale-free network.Keywords: economic integration, industrial organization, input-output economics, network economics, production networks
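Eigenvector centrality of the kind computed in the paper can be sketched with plain power iteration; the 4-sector linkage matrix below is an invented toy example, not WIOD data:

```python
import numpy as np

# Toy input-output linkage matrix: entry A[i, j] is the (assumed)
# flow from sector i to sector j in a 4-sector economy.
A = np.array([
    [0.0, 0.4, 0.1, 0.0],
    [0.2, 0.0, 0.3, 0.1],
    [0.1, 0.3, 0.0, 0.4],
    [0.0, 0.1, 0.2, 0.0],
])

def eigenvector_centrality(A, iters=200):
    """Power iteration on A^T: a sector's centrality grows with the
    centrality of the sectors that feed into it."""
    v = np.ones(A.shape[0])
    for _ in range(iters):
        v = A.T @ v
        v /= np.linalg.norm(v)
    return v

c = eigenvector_centrality(A)
hub = int(np.argmax(c))   # index of the structurally most central sector
```

Because the linkage matrix is nonnegative and strongly connected, the iteration converges to the Perron eigenvector, whose entries are strictly positive.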
Procedia PDF Downloads 279
4617 Indoor Visible Light Communication Channel Characterization for User Mobility: A Use-Case Study
Authors: Pooja Sanathkumar, Srinidhi Murali, Sethuraman TV, Saravanan M, Paventhan Arumugam, Ashwin Ashok
Abstract:
The last decade has witnessed significant interest in visible light communication (VLC) technology, as VLC can potentially achieve high-data-rate links and secure communication channels. However, the use of VLC under mobile settings is fundamentally limited, as it is a line-of-sight (LOS) technology, and there have been few breakthroughs in realizing VLC for mobile settings. In this regard, this work studies the VLC channel under mobility. Through a use-case analysis of experimental data traces, this paper presents an empirical VLC channel study considering the application of VLC for smart lighting in an indoor room environment. The paper contributes a calibration study of a prototype VLC smart lighting system in an indoor environment and, based on the inferences gained from the calibration and considering a user carrying a mobile device fitted with a VLC receiver, presents recommendations for the user's position adjustments, with the goal of ensuring maximum connectivity across the room.Keywords: visible light communication, mobility, empirical study, channel characterization
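One standard way to reason about LOS connectivity versus receiver position is the Lambertian DC-gain model commonly used for indoor optical wireless channels. The sketch below uses that textbook model with assumed geometry (distance, angles, detector area, field of view), not the calibration data of the paper:

```python
import math

def los_gain(d, phi, psi, m=1.0, area=1e-4, fov=math.radians(60)):
    """DC gain of a line-of-sight optical channel (Lambertian model):
    H(0) = (m+1)*A / (2*pi*d^2) * cos(phi)^m * cos(psi) for psi < FOV,
    where phi is the irradiance angle at the LED and psi the incidence
    angle at the receiver. Outside the field of view the gain is zero."""
    if psi > fov:
        return 0.0
    return (m + 1) * area / (2 * math.pi * d ** 2) \
        * math.cos(phi) ** m * math.cos(psi)

# Gain falls off as the receiver moves off-axis (assumed geometry)
g_center = los_gain(d=2.0, phi=0.0, psi=0.0)
g_edge = los_gain(d=2.5, phi=math.radians(40), psi=math.radians(40))
```

The sharp cutoff at the receiver's field of view is what makes position adjustment recommendations of the kind the paper proposes necessary in the first place.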
Procedia PDF Downloads 127
4616 Handwriting Velocity Modeling by Artificial Neural Networks
Authors: Mohamed Aymen Slim, Afef Abdelkrim, Mohamed Benrejeb
Abstract:
Handwriting is a physical demonstration of a complex cognitive process that people learn from childhood. People with disabilities or various neurological diseases face many difficulties, resulting from problems located at the level of the muscle stimuli (EMG) or of the brain signals (EEG), which arise at the writing stage. The handwriting velocity of the same writer, or of different writers, varies according to different criteria: age, attitude, mood, writing surface, etc. It is therefore interesting to build an experimental database of recordings taking, as primary reference, the writing speed of different writers, which would allow studying the global system during the handwriting process. This paper deals with a new approach to modeling the handwriting system based on the velocity criterion through the concepts of artificial neural networks, specifically Radial Basis Function (RBF) neural networks. The obtained simulation results show satisfactory agreement between the responses of the developed neural model and the experimental data for various letters and forms, demonstrating the efficiency of the proposed approach.Keywords: Electro Myo Graphic (EMG) signals, experimental approach, handwriting process, Radial Basis Functions (RBF) neural networks, velocity modeling
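A minimal RBF-network fit of a velocity profile can be sketched as follows; the "velocity" signal and the center/width choices are assumed stand-ins, not the EMG/EEG-based experimental data of the study:

```python
import numpy as np

def rbf_fit(X, y, centers, gamma):
    """Solve for the output weights of a Gaussian RBF network
    by linear least squares on the design matrix Phi."""
    Phi = np.exp(-gamma * (X[:, None] - centers[None, :]) ** 2)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def rbf_predict(X, centers, gamma, w):
    Phi = np.exp(-gamma * (X[:, None] - centers[None, :]) ** 2)
    return Phi @ w

# Toy "velocity profile": a smooth curve sampled over time (assumed data)
t = np.linspace(0, 1, 40)
v = np.sin(2 * np.pi * t) + 0.5 * t         # stand-in for pen velocity
centers = np.linspace(0, 1, 10)             # RBF centers on the time axis
w = rbf_fit(t, v, centers, gamma=50.0)
v_hat = rbf_predict(t, centers, gamma=50.0, w=w)
err = np.max(np.abs(v_hat - v))
```

With the hidden layer fixed (centers and width), training reduces to a linear least-squares problem for the output weights, which is what makes RBF networks attractive for this kind of signal modeling.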
Procedia PDF Downloads 441
4615 Evaluation of Tensile Strength of Natural Fibres Reinforced Epoxy Composites Using Fly Ash as Filler Material
Authors: Balwinder Singh, Veerpaul Kaur Mann
Abstract:
A composite material is formed by the combination of two or more phases or materials. Basalt fiber, derived from natural minerals, is a kind of fiber being introduced into the polymer composite industry due to its good mechanical properties, similar to those of synthetic fibers, together with its low cost and environmental friendliness. There is also a rising trend towards the use of industrial wastes as fillers in polymer composites with the aim of improving the properties of the composites. The mechanical properties of fiber-reinforced polymer composites are influenced by various factors such as fiber length, fiber weight %, filler weight % and filler size. Thus, a detailed study has been done on the characterization of short-chopped basalt-fiber-reinforced polymer matrix composites using fly ash as filler. Taguchi’s L9 orthogonal array has been used to develop the composites, considering fiber length (6, 9 and 12 mm), fiber weight % (25, 30 and 35 %) and filler weight % (0, 5 and 10 %) as input parameters with their respective levels, and a thorough analysis of the mechanical characteristics (tensile strength and impact strength) has been done using ANOVA with the help of MINITAB 14 software. The investigation revealed that fiber weight % is the most significant parameter affecting tensile strength, followed by fiber length and filler weight %, while the impact characterization showed that fiber length is the most significant factor, followed by fly ash weight. The introduction of fly ash proved beneficial in both characterizations, with enhanced values up to 5 % fly ash weight. The present study examines natural-fibre-reinforced epoxy composites using fly ash as filler material, analysing the effect of the input parameters on the tensile strength of the composites in order to maximize it.
Composites were fabricated based on a Taguchi L9 orthogonal array design of experiments using three factors, fibre type, fibre weight % and fly ash %, with three levels each. Optimization of the composition via ANOVA for maximum tensile strength revealed that natural fibres together with fly ash can be successfully used with epoxy resin to prepare polymer matrix composites with good mechanical properties. Paddy fibre gives the composite high elasticity owing to the approximately hexagonal structure of the cellulose it contains. Coir fibre gives less tensile strength than paddy fibre because coir is brittle in nature and breaks when pulled. Banana fibre has the least tensile strength of the three due to its lower cellulose content. Higher fibre weight leads to a reduction in tensile strength due to an increased number of air-pocket nuclei. Increasing the fly ash content also reduces tensile strength, as fly ash particles bond poorly with the natural fibre and are weaker than the epoxy resin.Keywords: tensile strength, epoxy resin, basalt fiber, Taguchi, polymer matrix, natural fiber
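The L9(3^4) design referenced above can be written down and analysed for main effects in a few lines; the response values below are invented placeholders, not the measured tensile strengths:

```python
import numpy as np

# Standard L9(3^4) orthogonal array (levels coded 0..2); a study like
# this one would use three of the four columns, e.g. fibre type,
# fibre weight % and fly ash %.
L9 = np.array([
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
])

# Hypothetical tensile-strength responses (MPa) for the nine runs
y = np.array([31.0, 29.5, 27.0, 35.2, 33.1, 30.8, 28.4, 27.9, 25.6])

# Mean response at each level of each factor ("main effects")
for col, name in enumerate(["fibre type", "fibre wt%", "fly ash%"]):
    means = [y[L9[:, col] == lvl].mean() for lvl in range(3)]
    print(name, [round(m, 2) for m in means])
```

Orthogonality means every pair of columns contains each of the nine level combinations exactly once, so each factor's level means can be compared without confounding.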
Procedia PDF Downloads 49
4614 Elastohydrodynamic Lubrication Study Using Discontinuous Finite Volume Method
Authors: Prawal Sinha, Peeyush Singh, Pravir Dutt
Abstract:
Problems in elastohydrodynamic lubrication have attracted a lot of attention in the last few decades, and solving a two-dimensional problem has always been a big challenge. In this paper, a new discontinuous finite volume method (DVM) for the two-dimensional point contact elastohydrodynamic lubrication (EHL) problem has been developed and analyzed, and a complete algorithm has been presented for solving such a problem. The method is robust and easily parallelized in an MPI architecture. The GMRES technique is implemented to solve the linear system obtained after the formulation. A new approach is followed in which discontinuous piecewise polynomials are used for the trial functions. It is natural to assume that the advantages of using discontinuous functions in finite element methods should also apply to finite volume methods. The nature of the discontinuity of the trial functions is such that the elements in the corresponding dual partition have the smallest support compared with classical finite volume methods. The film thickness calculation is done using a singular quadrature approach. The results obtained have been presented graphically and discussed. The method is well suited for solving the EHL point contact problem and could probably serve as the basis of commercial software.Keywords: elastohydrodynamic, lubrication, discontinuous finite volume method, GMRES technique
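The GMRES technique mentioned above can be sketched in its unrestarted Arnoldi form; the tridiagonal test matrix below stands in for (and is far simpler than) the authors' actual DVM/EHL system:

```python
import numpy as np

def gmres(A, b, m=60, tol=1e-10):
    """Minimal unrestarted GMRES via the Arnoldi process with
    modified Gram-Schmidt; at each step the residual is minimized
    by a small least-squares solve on the Hessenberg matrix."""
    n = len(b)
    Q = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(b)
    Q[:, 0] = b / beta
    x = np.zeros(n)
    for k in range(m):
        w = A @ Q[:, k]
        for j in range(k + 1):          # orthogonalize against the basis
            H[j, k] = Q[:, j] @ w
            w = w - H[j, k] * Q[:, j]
        H[k + 1, k] = np.linalg.norm(w)
        if H[k + 1, k] > 1e-14:
            Q[:, k + 1] = w / H[k + 1, k]
        e1 = np.zeros(k + 2)
        e1[0] = beta
        y, *_ = np.linalg.lstsq(H[:k + 2, :k + 1], e1, rcond=None)
        x = Q[:, :k + 1] @ y            # current minimal-residual iterate
        if np.linalg.norm(A @ x - b) < tol or H[k + 1, k] <= 1e-14:
            break
    return x

# Stand-in system: a nonsymmetric, diagonally dominant tridiagonal
# matrix (convection-diffusion-like), not the actual EHL discretization.
n = 60
A = 2.5 * np.eye(n) - 1.2 * np.eye(n, k=-1) - 0.8 * np.eye(n, k=1)
b = np.ones(n)
x = gmres(A, b)
residual = np.linalg.norm(A @ x - b)
```

In practice one would use a library implementation with restarts and preconditioning (e.g. a distributed solver under MPI, as the paper describes), but the minimization-over-Krylov-subspace structure is the same.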
Procedia PDF Downloads 258
4613 The Shannon Entropy and Multifractional Markets
Authors: Massimiliano Frezza, Sergio Bianchi, Augusto Pianese
Abstract:
Introduced by Shannon in 1948 in the field of information theory as the average rate at which information is produced by a stochastic set of data, the concept of entropy has gained much attention as a measure of the uncertainty and unpredictability associated with a dynamical system, eventually depicted by a stochastic process. In particular, the Shannon entropy measures the degree of order/disorder of a given signal and provides useful information about the underlying dynamical process. It has found widespread application in a variety of fields, such as cryptography, statistical physics and finance. In this regard, many contributions have employed different measures of entropy in an attempt to characterize financial time series in terms of market efficiency, market crashes and/or financial crises. The Shannon entropy has also been considered as a measure of the risk of a portfolio or as a tool in asset pricing. This work investigates the theoretical link between the Shannon entropy and the multifractional Brownian motion (mBm), a stochastic process which has recently been the focus of renewed interest in finance as a driving model of stochastic volatility. In particular, after exploring the current state of research in this area and highlighting some of the key results and open questions that remain, we show a well-defined relationship between the Shannon (log)entropy and the memory function H(t) of the mBm. In detail, we allow both the length of the time series and the time scale to change over the analysis in order to study how the relation modifies itself. On the one hand, applications are developed after generating surrogates of mBm trajectories based on different memory functions; on the other hand, an empirical analysis of several international stock indexes, which confirms the previous results, concludes the work.Keywords: Shannon entropy, multifractional Brownian motion, Hurst–Holder exponent, stock indexes
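A histogram-based estimate of the Shannon entropy of a signal, of the kind used to compare order and disorder across series, can be sketched as follows (the bin count and the test signals are arbitrary choices, not the paper's mBm surrogates):

```python
import numpy as np

def shannon_entropy(x, bins=30):
    """Shannon entropy (in bits) of a signal's empirical distribution,
    estimated from a histogram; empty bins contribute nothing."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / len(x)
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(1)
white = rng.normal(size=10_000)   # unpredictable (disordered) signal
const = np.zeros(10_000)          # perfectly ordered signal

h_white = shannon_entropy(white)
h_const = shannon_entropy(const)  # all mass in one bin -> zero entropy
```

The estimate is bounded above by log2(bins) and attains zero only for a degenerate distribution, which is what makes it usable as an order/disorder index for financial series.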
Procedia PDF Downloads 111