Search results for: multidimensional functions
2423 Secure Cryptographic Operations on SIM Card for Mobile Financial Services
Authors: Kerem Ok, Serafettin Senturk, Serdar Aktas, Cem Cevikbas
Abstract:
Mobile technology is very popular nowadays and it provides a digital world where users can experience many value-added services. Service Providers are also eager to offer diverse value-added services to users such as digital identity, mobile financial services and so on. In this context, the security of data storage in smartphones and the security of communication between the smartphone and service provider are critical for the success of these services. In order to provide the required security functions, the SIM card is one acceptable alternative. Since SIM cards include a Secure Element, they are able to store sensitive data, create cryptographically secure keys, encrypt and decrypt data. In this paper, we design and implement a SIM and a smartphone framework that uses a SIM card for secure key generation, key storage, data encryption, data decryption and digital signing for mobile financial services. Our frameworks show that the SIM card can be used as a controlled Secure Element to provide required security functions for popular e-services such as mobile financial services.
Keywords: SIM card, mobile financial services, cryptography, secure data storage
Procedia PDF Downloads 312

2422 Artificial Intelligent Methodology for Liquid Propellant Engine Design Optimization
Authors: Hassan Naseh, Javad Roozgard
Abstract:
This paper presents a methodology based on Artificial Intelligence (AI) applied to Liquid Propellant Engine (LPE) optimization. The AI methodology utilizes the Adaptive Neuro-Fuzzy Inference System (ANFIS). In this methodology, the objective is to achieve maximum performance (specific impulse). The independent design variables in the ANFIS modeling are combustion chamber pressure, combustion chamber temperature, and oxidizer-to-fuel ratio; the output of the modeling is specific impulse, which can be combined with other objective functions in LPE design optimization. To this end, the LPE's parameters have been modeled in the ANFIS methodology by generating the fuzzy inference system structure using grid partitioning, subtractive clustering and Fuzzy C-Means (FCM) clustering for both inference types (Mamdani and Sugeno) and various types of membership functions. The final comparison of the optimization results shows the accuracy and processing run time of the Gaussian ANFIS methodology against all other methods.
Keywords: ANFIS methodology, artificial intelligence, liquid propellant engine, optimization
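As an illustration of the Sugeno-type inference named in the abstract above, the sketch below evaluates two first-order Sugeno rules with Gaussian membership functions to estimate specific impulse from chamber pressure and oxidizer-to-fuel ratio. All rule parameters, centers and widths here are invented for demonstration and are not taken from the paper.

```python
import math

def gauss_mf(x, c, sigma):
    # Gaussian membership function with center c and width sigma
    return math.exp(-((x - c) ** 2) / (2.0 * sigma ** 2))

def sugeno_isp(pc, of_ratio):
    # Two illustrative first-order Sugeno rules (all parameters hypothetical):
    #   R1: IF pc is LOW  AND of is LEAN -> Isp = 200 + 2*pc + 5*of
    #   R2: IF pc is HIGH AND of is RICH -> Isp = 250 + 3*pc + 4*of
    w1 = gauss_mf(pc, 5.0, 2.0) * gauss_mf(of_ratio, 2.0, 0.5)
    w2 = gauss_mf(pc, 10.0, 2.0) * gauss_mf(of_ratio, 3.0, 0.5)
    f1 = 200.0 + 2.0 * pc + 5.0 * of_ratio
    f2 = 250.0 + 3.0 * pc + 4.0 * of_ratio
    # Defuzzification: firing-strength weighted average of rule outputs
    return (w1 * f1 + w2 * f2) / (w1 + w2)
```

In a trained ANFIS, the membership parameters and the linear rule coefficients would be fitted to data (e.g. by grid partitioning or FCM clustering of the inputs) rather than chosen by hand.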
Procedia PDF Downloads 587

2421 Dynamic Reroute Modeling for Emergency Evacuation: Case Study of Brunswick City, Germany
Authors: Yun-Pang Flötteröd, Jakob Erdmann
Abstract:
Human behavior during evacuations is quite complex. One of the critical behaviors affecting the efficiency of an evacuation is route choice; therefore, the respective simulation modeling needs to function properly. In this paper, Simulation of Urban Mobility's (SUMO) current dynamic route modeling during evacuation, i.e. the rerouting functions, is examined with a real case study, and the consistency between the simulation results and reality is checked as well. Four influence factors, (1) time to get information, (2) probability to cancel a trip, (3) probability to use navigation equipment, and (4) rerouting and information updating period, are considered to analyze possible traffic impacts during the evacuation and to examine the rerouting functions in SUMO. Furthermore, some behavioral characteristics of the case study are analyzed using the corresponding detector data and applied in the simulation. The experiment results show that the dynamic route modeling in SUMO can deal with the proposed scenarios properly. Some issues and function needs related to route choice are discussed and further improvements are suggested.
Keywords: evacuation, microscopic traffic simulation, rerouting, SUMO
Procedia PDF Downloads 194

2420 Pragmatic Discoursal Study of Hedging Constructions in English Language
Authors: Mohammed Hussein Ahmed, Bahar Mohammed Kareem
Abstract:
This study is concerned with a pragmatic discoursal study of hedging constructions in the English language. A hedge is a mitigating device used to lessen the impact of an utterance. Hedges can be adverbs, adjectives, or verbs, and sometimes whole clauses. The study aims at finding out the extent to which speakers and participants in discourse use hedging constructions during their conversations. It also aims at finding out whether or not there are any significant differences in the frequency of the types and functions of hedging constructions employed by male and female participants. It is hypothesized that hedging constructions are more frequent in English discourse than in other languages due to its formality, and that the frequency of the types and functions is influenced by the gender of the participants. To achieve the aims of the study, two types of procedures have been followed: theoretical and practical. The theoretical procedure consists of presenting a theoretical background of the topic of hedging, which includes its definitions, etymology and theories. The practical procedure consists of selecting a sample of texts and analyzing them according to an adopted model. A number of conclusions will be drawn based on the findings of the study.
Keywords: hedging, pragmatics, politeness, theoretical
Procedia PDF Downloads 587

2419 Modelling the Effects of External Factors Affecting Concrete Carbonation
Authors: Abhishek Mangal, Kunal Tongaria, S. Mandal, Devendra Mohan
Abstract:
Carbonation of reinforced concrete structures has emerged as one of the major challenges for civil engineers across the world. With increasing emissions from various activities, the carbon dioxide concentration in the atmosphere has been ever rising, enhancing its penetration into porous concrete, where it reaches the steel bars and ultimately leads to premature failure. Several studies have been published dealing with the various interdependent variables related to carbonation; however, given such variability, generalizing these data proves to be a troublesome task. This paper looks into the carbonation anomaly in concrete structures caused by external variables such as relative humidity, concentration of CO2, curing period and ambient temperature. Significant discussions and comparisons are presented on the basis of various studies conducted with the aim of predicting the depth of carbonation as a function of these multidimensional parameters using various numerical and statistical modelling techniques.
Keywords: carbonation, curing, exposure conditions, relative humidity
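The abstract above predicts carbonation depth as a function of several parameters. A much simpler, widely used baseline is the square-root-of-time model x = k·√t, where k lumps together the environmental factors. The sketch below fits k by least squares; this baseline model and the synthetic data are assumptions for illustration, not the paper's method.

```python
import math

def fit_carbonation_k(times, depths):
    # Least-squares fit of the classical square-root model x = k * sqrt(t):
    # minimizing sum (x_i - k*sqrt(t_i))^2 gives
    # k = sum(x_i * sqrt(t_i)) / sum(t_i)
    num = sum(x * math.sqrt(t) for t, x in zip(times, depths))
    return num / sum(times)

def carbonation_depth(k, t):
    # Predicted carbonation depth (e.g. mm) at exposure time t (e.g. years)
    return k * math.sqrt(t)
```

The multiparameter models discussed in the paper would effectively replace the constant k with a function of relative humidity, CO2 concentration, curing period and temperature.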
Procedia PDF Downloads 253

2418 Investigating a Deterrence Function for Work Trips for Perth Metropolitan Area
Authors: Ali Raouli, Amin Chegenizadeh, Hamid Nikraz
Abstract:
The Perth metropolitan area and its surrounding regions have been expanding rapidly in recent decades, and it is expected that this growth will continue in the years to come. With this rapid growth and the resulting increase in population, consideration should be given to strategic planning and modelling for the future expansion of Perth. The accurate estimation of projected traffic volumes has always been a major concern for transport modelers and planners. Development of a reliable strategic transport model depends significantly on the input data and on the calibrated parameters of the model reflecting the existing situation. Trip distribution is the second step in four-step modelling (FSM) and is complex due to its behavioral nature. The gravity model is the most common method for trip distribution. The spatial separation between Origin and Destination (OD) zones is reflected in the gravity model by applying deterrence functions, which provide an opportunity to include people's behavior in choosing their destinations based on the distance, time and cost of their journeys. Deterrence functions play an important role in the distribution of trips within a study area, simulate the trip distances, and should therefore be calibrated for any particular strategic transport model to correctly reflect trip behavior within the modelling area. This paper aims to review the most common deterrence functions and propose a calibrated deterrence function for work trips within the Perth Metropolitan Area based on the information obtained from the latest available household data and Perth and Region Travel Survey (PARTS) data. As part of this study, a four-step transport model using EMME software has been developed for the Perth Metropolitan Area to assist with the analysis and findings.
Keywords: deterrence function, four-step modelling, origin destination, transport model
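The gravity model with a deterrence function described above can be sketched as follows. This minimal version assumes a negative-exponential deterrence f(c) = exp(−βc) and an origin-constrained formulation; the paper's calibrated function for Perth may well take a different form (e.g. power or combined).

```python
import math

def deterrence(cost, beta=0.1):
    # One common deterrence form: negative exponential f(c) = exp(-beta * c)
    return math.exp(-beta * cost)

def distribute_trips(productions, attractions, costs, beta=0.1):
    # Origin-constrained gravity model:
    # T_ij = P_i * A_j * f(c_ij) / sum_k A_k * f(c_ik)
    trips = []
    for i, p in enumerate(productions):
        weights = [a * deterrence(costs[i][j], beta)
                   for j, a in enumerate(attractions)]
        total = sum(weights)
        trips.append([p * w / total for w in weights])
    return trips
```

Calibration then amounts to choosing β (and the functional form) so that the modelled trip length distribution matches the observed one from the household and PARTS survey data.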
Procedia PDF Downloads 168

2417 Effects of Oral L-Carnitine on Liver Functions after Transarterial Chemoembolization in Hepatocellular Carcinoma Patients
Authors: Ali Kassem, Aly Taha, Abeer Hassan, Kazuhide Higuchi
Abstract:
Introduction: Transarterial chemoembolization (TACE) for hepatocellular carcinoma (HCC) is usually followed by hepatic dysfunction that limits its efficacy. L-carnitine has recently been studied as a hepatoprotective agent. Our aim is to evaluate the effects of L-carnitine against the deterioration of liver functions after TACE. Method: 53 patients with intermediate-stage HCC were assigned to two groups: an L-carnitine group (26 patients) who received an L-carnitine 300 mg tablet twice daily from 2 weeks before to 12 weeks after TACE, and a control group (27 patients) without L-carnitine therapy. 28 of the studied patients received branched-chain amino acid granules. Results: There were significant differences between the L-carnitine and control groups in mean serum albumin change from baseline to 1 week and 4 weeks after TACE (p < 0.05). L-carnitine maintained the Child-Pugh score at 1 week after TACE and exhibited improvement at 4 weeks after TACE (p < 0.01 vs. 1 week after TACE). The control group showed significant Child-Pugh score deterioration from baseline to 1 week after TACE (p < 0.05) and 12 weeks after TACE (p < 0.05). There were significant differences between the L-carnitine and control groups in mean Child-Pugh score change from baseline to 4 weeks (p < 0.05) and 12 weeks after TACE (p < 0.05). The L-carnitine group displayed improvement in prothrombin time (PT) from baseline to 1 week, 4 weeks (p < 0.05) and 12 weeks after TACE, whereas PT in the control group declined below baseline at all follow-up intervals. Total bilirubin in the L-carnitine group decreased at 1 week post TACE, while in the control group it significantly increased at 1 week (p = 0.01). ALT and C-reactive protein elevations were suppressed at 1 week after TACE in the L-carnitine group. The hepatoprotective effects of L-carnitine were enhanced by the concomitant use of branched-chain amino acids.
Conclusion: L-carnitine and BCAA combination therapy offer a novel supportive strategy after TACE in HCC patients.
Keywords: hepatocellular carcinoma, L-carnitine, liver functions, transarterial embolization
Procedia PDF Downloads 155

2416 Transition Dynamic Analysis of the Urban Disparity in Iran “Case Study: Iran Provinces Center”
Authors: Marzieh Ahmadi, Ruhullah Alikhan Gorgani
Abstract:
The usual methods of measuring regional inequalities cannot reflect the internal changes of the country in terms of displacement between different development groups, and indicators of inequality are not effective in demonstrating the dynamics of the distribution of inequality. For this purpose, this paper examines the dynamics of urban disparity transition in the country during the period 2006-2016 using the multidimensional CIRD index and the stochastic kernel density method. It firstly selects 25 indicators in five dimensions, including macroeconomic conditions, science and innovation, environmental sustainability, human capital and public facilities, and a two-stage Principal Component Analysis methodology is developed to create a composite index of inequality. Then, in the second stage, using a nonparametric analytical approach to internal distribution dynamics and the stochastic kernel density method, the convergence hypothesis of the CIRD index of the Iranian province centers is tested, and then, based on the ergodic density, the long-run equilibrium is shown. Also, at this stage, for the purpose of adopting accurate regional policies, the distribution dynamics and the process of convergence or divergence of the Iranian provinces are examined for each of the five dimensions. According to the results of the first stage, in both 2006 and 2016 the highest level of development is related to Tehran, while Zahedan is at the lowest level of development. The results show that the central cities of the country are at the highest level of development due to the effects of Tehran's knowledge spillover, and the country's border cities are at the lowest level of development; the main reason for this may be the lack of access to markets in the border provinces.
Based on the results of the second stage, which examines the dynamics of regional inequality transmission in the country during 2006-2016, the distribution in the first year (2006) is not multimodal: according to the kernel density graph, the CIRD index of about 70% of the provinces lies between -1.1 and -0.1, and the rest of the distribution on the right lies above -0.1. In the kernel distribution, a convergence process is observed and the graph points to a single peak; there tends to be a small peak at about 3, but the main peak is at about -0.6. In the final year (2016), there is no mobility in the lower-level groups, but at the higher level the CIRD index places about 45% of the provinces at about -0.4. This year clearly shows a twin-peak density pattern, which indicates that the provinces tend to separate into groups that are close to each other in terms of development. Also, according to the distribution dynamics results, the provinces of Iran follow a single-peak density pattern in 2006 and a double-peak density pattern in 2016 at low and moderate inequality index levels, and in terms of the development index the country diverges during the years 2006 to 2016.
Keywords: urban disparity, CIRD index, convergence, distribution dynamics, stochastic kernel density
Procedia PDF Downloads 124

2415 Modified Weibull Approach for Bridge Deterioration Modelling
Authors: Niroshan K. Walgama Wellalage, Tieling Zhang, Richard Dwight
Abstract:
State-based Markov deterioration models (SMDM) sometimes fail to find accurate transition probability matrix (TPM) values, and hence lead to invalid future condition predictions or incorrect average deterioration rates, mainly due to drawbacks of existing nonlinear optimization-based algorithms and/or the subjective function types used for regression analysis. Furthermore, a set of separate functions of age for each condition state cannot be directly derived using a Markov model for a given bridge element group, which however is of interest to industrial partners. This paper presents a new approach for generating homogeneous SMDM model output, namely the Modified Weibull approach, which consists of a set of appropriate functions to describe the percentage condition prediction of bridge elements in each state. These functions are combined with a Bayesian approach and a Metropolis-Hastings Algorithm (MHA) based Markov Chain Monte Carlo (MCMC) simulation technique for quantifying the uncertainty in the model parameter estimates. In this study, factors contributing to rail bridge deterioration were identified. The inspection data for 1,000 Australian railway bridges over 15 years were reviewed and filtered accordingly based on real operational experience. A network-level deterioration model for a typical bridge element group was developed using the proposed Modified Weibull approach. The condition state predictions obtained from this method were validated using statistical hypothesis tests with a test data set. Results show that the proposed model is able not only to predict conditions at the network level accurately but also to capture the model uncertainties within a given confidence interval.
Keywords: bridge deterioration modelling, Modified Weibull approach, MCMC, Metropolis-Hastings algorithm, Bayesian approach, Markov deterioration models
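The Metropolis-Hastings based MCMC step mentioned above can be illustrated with a generic random-walk sampler: propose a perturbed parameter, then accept it with probability min(1, p(x')/p(x)). The standard-normal target used below is purely illustrative and unrelated to the paper's Weibull parameter posteriors.

```python
import math
import random

def metropolis_hastings(log_post, x0, n_samples, step=0.5, seed=42):
    # Random-walk Metropolis-Hastings: propose x' ~ N(x, step^2) and
    # accept when log(u) < log p(x') - log p(x), u ~ Uniform(0, 1).
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_samples):
        xp = x + rng.gauss(0.0, step)
        lpp = log_post(xp)
        if math.log(rng.random()) < lpp - lp:
            x, lp = xp, lpp
        samples.append(x)  # rejected proposals repeat the current state
    return samples
```

In the paper's setting, `log_post` would be the log-posterior of the Modified Weibull parameters given the filtered inspection data, and the retained samples would give the credible intervals reported for the condition predictions.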
Procedia PDF Downloads 727

2414 Modelling of Heat Generation in a 18650 Lithium-Ion Battery Cell under Varying Discharge Rates
Authors: Foo Shen Hwang, Thomas Confrey, Stephen Scully, Barry Flannery
Abstract:
Thermal characterization plays an important role in battery pack design. Lithium-ion batteries have to be maintained between 15-35 °C to operate optimally. Heat is generated (Q) internally within the batteries during both the charging and discharging phases. This can be quantified using several standard methods. The most common method of calculating a battery's heat generation is through the addition of both the joule heating effects and the entropic changes across the battery. Such values can be derived by identifying the open-circuit voltage (OCV), nominal voltage (V), operating current (I), battery temperature (T) and the rate of change of the open-circuit voltage with respect to temperature (dOCV/dT). This paper focuses on the experimental characterization and comparative modelling of the heat generation rate (Q) across several current discharge rates (0.5C, 1C, and 1.5C) of an 18650 cell. The analysis is conducted utilizing several non-linear mathematical functions, including polynomial, exponential, and power models. Parameter fitting is carried out over the respective function orders: polynomial (n = 3~7), exponential (n = 2) and power function. The generated parameter fitting functions are then used as heat source functions in a 3-D computational fluid dynamics (CFD) solver under natural convection conditions. The generated temperature profiles are analyzed for errors based on experimental discharge tests conducted at standard room temperature (25 °C). Initial experimental results display low deviation between the experimental and CFD temperature plots. As such, the heat generation function formulated could be more easily utilized for larger battery applications than other available methods.
Keywords: computational fluid dynamics, curve fitting, lithium-ion battery, voltage drop
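The heat generation rate described above, joule heating plus entropic changes derived from OCV, V, I, T and dOCV/dT, is commonly written as Q = I(OCV − V) + I·T·(dOCV/dT). A direct transcription (note that sign conventions for the entropic term vary between authors):

```python
def heat_generation(current, ocv, v_terminal, temp_k, docv_dt):
    # Irreversible (joule) heating from the overpotential OCV - V [W]
    joule = current * (ocv - v_terminal)
    # Reversible (entropic) heating from the temperature coefficient dOCV/dT
    entropic = current * temp_k * docv_dt
    return joule + entropic
```

For example, a cell discharging at 2 A with OCV 4.2 V sagging to 4.0 V at the terminals at 300 K produces 0.4 W of joule heat, plus or minus the entropic contribution depending on the sign of dOCV/dT at that state of charge.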
Procedia PDF Downloads 95

2413 A Deterministic Approach for Solving the Hull and White Interest Rate Model with Jump Process
Authors: Hong-Ming Chen
Abstract:
This work considers the resolution of the Hull and White interest rate model with a jump process. A deterministic process is adopted to model the random behavior of interest rate variation as deterministic perturbations depending on the time t. The Brownian motion and jump uncertainties are denoted by the piecewise constant function w(t) and the point function θ(t), respectively. It is shown that the interest rate function and the yield function of the Hull and White interest rate model with jump process can be obtained by solving a nonlinear semi-infinite programming problem. A relaxed cutting plane algorithm is then proposed for solving the resulting optimization problem. The method is calibrated on 3-month U.S. treasury securities data and is used to analyze several effects on interest rate prices, including interest rate variability and the negative correlation between stock returns and interest rates. The numerical results illustrate that our approach essentially generates yield functions with minimal fitting errors and small oscillation.
Keywords: optimization, interest rate model, jump process, deterministic
Procedia PDF Downloads 161

2412 Loss Function Optimization for CNN-Based Fingerprint Anti-Spoofing
Authors: Yehjune Heo
Abstract:
As biometric systems become widely deployed, identification systems can easily be attacked with various spoof materials. This paper contributes to finding a reliable and practical anti-spoofing method using Convolutional Neural Networks (CNNs), based on the choice of loss functions and optimizers. The CNNs used in this paper include AlexNet, VGGNet, and ResNet. By using various loss functions, including Cross-Entropy, Center Loss, Cosine Proximity, and Hinge Loss, and various optimizers, which include Adam, SGD, RMSProp, Adadelta, Adagrad, and Nadam, we obtained significant performance changes. We find that choosing the correct loss function for each model is crucial, since different loss functions lead to different errors on the same evaluation. By using a subset of the LivDet 2017 database, we validate our approach to compare generalization power. It is important to note that we use a subset of LivDet and the database is the same across all training and testing for each model. This way, we can compare the performance, in terms of generalization, on the unseen data across all the different models. The best CNN (AlexNet) with the appropriate loss function and optimizer results in more than a 3% performance gain over the other CNN models with the default loss function and optimizer. In addition to the highest generalization performance, this paper also reports the models' accuracy together with parameter counts and mean average error rates, to find the model that consumes the least memory and computation time for training and testing. Although AlexNet has less complexity than the other CNN models, it is proven to be very efficient. For practical anti-spoofing systems, the deployed version should use a small amount of memory and should run very fast with high anti-spoofing performance.
For our deployed version on smartphones, additional processing steps, such as quantization and pruning algorithms, have been applied in our final model.
Keywords: anti-spoofing, CNN, fingerprint recognition, loss function, optimizer
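The observation that different loss functions yield different errors on the same prediction can be seen with a tiny example: both losses below are generic textbook forms (not the paper's exact implementations), yet they rank the same "correct" predictions differently.

```python
import math

def cross_entropy(p_true):
    # Cross-entropy loss given the probability assigned to the true class
    return -math.log(p_true)

def hinge(score_true, score_competing):
    # Hinge loss with unit margin against a single competing class
    return max(0.0, 1.0 - (score_true - score_competing))
```

A prediction that already satisfies the margin incurs zero hinge loss and stops contributing gradient, whereas cross-entropy keeps penalizing any confidence below 1.0; this is one mechanism by which the loss choice changes what a given CNN ends up learning.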
Procedia PDF Downloads 136

2411 Key Frame Based Video Summarization via Dependency Optimization
Authors: Janya Sainui
Abstract:
With the rapid growth of digital videos and data communications, video summarization, which provides a shorter version of a video for fast browsing and retrieval, is necessary. Key frame extraction is one of the mechanisms for generating a video summary. In general, the extracted key frames should both represent the entire video content and contain minimum redundancy. However, most of the existing approaches heuristically select key frames; hence, the selected key frames may not be the most distinct frames and/or may not cover the entire content of the video. In this paper, we propose a method of video summarization which provides reasonable objective functions for selecting key frames. In particular, we apply a statistical dependency measure called quadratic mutual information as our objective function for maximizing the coverage of the entire video content as well as minimizing the redundancy among the selected key frames. The proposed key frame extraction algorithm finds key frames by solving an optimization problem. Through experiments, we demonstrate the success of the proposed video summarization approach, which produces video summaries with better coverage of the entire video content and less redundancy among key frames compared to the state-of-the-art approaches.
Keywords: video summarization, key frame extraction, dependency measure, quadratic mutual information
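A discrete version of the quadratic mutual information measure mentioned above can be sketched as follows. The Euclidean-distance form used here is one common definition and may differ in detail from the paper's estimator (which would operate on frame features rather than a small joint table).

```python
def quadratic_mutual_information(joint):
    # Discrete quadratic mutual information (Euclidean-distance form):
    # QMI = sum_ij (p(x_i, y_j) - p(x_i) * p(y_j))^2
    # joint: 2-D list of joint probabilities summing to 1.
    px = [sum(row) for row in joint]          # marginal over rows
    py = [sum(col) for col in zip(*joint)]    # marginal over columns
    return sum((joint[i][j] - px[i] * py[j]) ** 2
               for i in range(len(joint))
               for j in range(len(joint[0])))
```

QMI is zero exactly when the two variables are independent and grows with dependence, so a key frame selector can maximize QMI between the summary and the full video while minimizing it among the selected frames.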
Procedia PDF Downloads 266

2410 Path Integrals and Effective Field Theory of Large Scale Structure
Authors: Revant Nayar
Abstract:
In this work, we recast the equations describing large scale structure, and by extension all nonlinear fluids, in the path integral formalism. We first calculate the well-known two- and three-point functions using the Schwinger-Keldysh formalism, commonly used to perturbatively solve path integrals in non-equilibrium systems. Then we include EFT corrections due to pressure, viscosity, and noise as effects on the time-dependent propagator. We are able to express results for arbitrary two- and three-point correlation functions in LSS in terms of differential operators acting on a triple-K master integral. We also, for the first time, obtain analytical results for more general initial conditions deviating from the usual power law P∝kⁿ by introducing a mass scale in the initial conditions. This robust field-theoretic formalism empowers us with tools from strongly coupled QFT, such as the OPE and holographic duals, to study the strongly non-linear regime of LSS and turbulent fluid dynamics. These could be used to fully capture the strongly non-linear dynamics of fluids and to move towards solving the open problem of classical turbulence.
Keywords: quantum field theory, cosmology, effective field theory, renormalisation
Procedia PDF Downloads 135

2409 Self-denigration in Doctoral Defense Sessions: Scale Development and Validation
Authors: Alireza Jalilifar, Nadia Mayahi
Abstract:
The dissertation defense as a complicated conflict-prone context entails the adoption of elegant interactional strategies, one of which is self-denigration. This study aimed to develop and validate a self-denigration model that fits the context of doctoral defense sessions in applied linguistics. Two focus group discussions provided the basis for developing this conceptual model, which assumed 10 functions for self-denigration, namely good manners, modesty, affability, altruism, assertiveness, diffidence, coercive self-deprecation, evasion, diplomacy, and flamboyance. These functions were used to design a 40-item questionnaire on the attitudes of applied linguists concerning self-denigration in defense sessions. The confirmatory factor analysis of the questionnaire indicated the predictive ability of the measurement model. The findings of this study suggest that self-denigration in doctoral defense sessions is the social representation of the participants’ values, ideas and practices adopted as a negotiation strategy and a conflict management policy for the purpose of establishing harmony and maintaining resilience. This study has implications for doctoral students and academics and illuminates further research on self-denigration in other contexts.
Keywords: academic discourse, politeness, self-denigration, grounded theory, dissertation defense
Procedia PDF Downloads 137

2408 A Practical and Efficient Evaluation Function for 3D Model Based Vehicle Matching
Authors: Yuan Zheng
Abstract:
3D model-based vehicle matching provides a new way for vehicle recognition, localization and tracking. Its key is to construct an evaluation function, also called a fitness function, to measure the degree of vehicle matching. The existing fitness functions often perform poorly when clutter and occlusion exist in traffic scenarios. In this paper, we present a practical and efficient fitness function. Unlike the existing evaluation functions, the proposed fitness function studies the vehicle matching problem from both local and global perspectives, exploiting the pixel gradient information as well as the silhouette information. In view of the discrepancy between a 3D vehicle model and a real vehicle, a weighting strategy is introduced to treat the fitting of the model's wireframes differently. Additionally, a normalization operation for the model's projection is performed to improve the accuracy of the matching. Experimental results on real traffic videos reveal that the proposed fitness function is efficient and robust to cluttered backgrounds and partial occlusion.
Keywords: 3D-2D matching, fitness function, 3D vehicle model, local image gradient, silhouette information
Procedia PDF Downloads 399

2407 An Interpolation Tool for Data Transfer in Two-Dimensional Ice Accretion Problems
Authors: Marta Cordero-Gracia, Mariola Gomez, Olivier Blesbois, Marina Carrion
Abstract:
One of the difficulties in icing simulations arises for extended periods of exposure, when very large ice shapes are created. As well as being large, they can have complex shapes, such as a double horn. For icing simulations, these configurations are currently computed in several steps. The icing step is stopped when the ice shapes become too large, at which point a new mesh has to be created to allow further CFD and ice growth simulations to be performed. This can be very costly and is a limiting factor in the simulations that can be performed. A way to avoid the costly human intervention in the re-meshing step of multistep icing computation is to use mesh deformation instead of re-meshing. The aim of the present work is to apply an interpolation method based on Radial Basis Functions (RBF) to transfer deformations from the surface mesh to the volume mesh. This deformation tool has been developed specifically for icing problems. It is able to deal with localized, sharp and large deformations, unlike the tools traditionally used for smoother wing deformations. This tool will be presented along with validation on typical two-dimensional icing shapes.
Keywords: ice accretion, interpolation, mesh deformation, radial basis functions
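The RBF interpolation underlying such a deformation tool can be sketched in one dimension: solve Φw = d for weights at the surface (control) points, then evaluate the same basis anywhere in the volume. The Gaussian kernel and 1D scalar displacements below are simplifications of the paper's 2D surface-to-volume tool.

```python
import math

def gauss_rbf(r, eps=1.0):
    # Gaussian radial basis function phi(r) = exp(-(eps * r)^2)
    return math.exp(-(eps * r) ** 2)

def rbf_weights(centers, displacements, eps=1.0):
    # Solve Phi w = d, where Phi_ij = phi(|x_i - x_j|) at the control points,
    # via Gaussian elimination with partial pivoting (fine for small systems).
    n = len(centers)
    A = [[gauss_rbf(abs(centers[i] - centers[j]), eps) for j in range(n)]
         for i in range(n)]
    b = list(displacements)
    for col in range(n):
        piv = max(range(col, n), key=lambda k: abs(A[k][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for row in range(col + 1, n):
            f = A[row][col] / A[col][col]
            for c in range(col, n):
                A[row][c] -= f * A[col][c]
            b[row] -= f * b[col]
    w = [0.0] * n
    for row in range(n - 1, -1, -1):
        w[row] = (b[row] - sum(A[row][c] * w[c]
                               for c in range(row + 1, n))) / A[row][row]
    return w

def rbf_eval(x, centers, w, eps=1.0):
    # Evaluate the interpolant anywhere (e.g. at volume mesh nodes)
    return sum(wi * gauss_rbf(abs(x - ci), eps) for wi, ci in zip(w, centers))
```

By construction, the interpolant reproduces the prescribed displacements exactly at the control points, which is what lets the surface deformation be transferred smoothly into the volume mesh without re-meshing.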
Procedia PDF Downloads 313

2406 Development of a Fuzzy Logic Based Model for Monitoring Child Pornography
Authors: Mariam Ismail, Kazeem Rufai, Jeremiah Balogun
Abstract:
A study was conducted to apply fuzzy logic to the development of a monitoring model for child pornography based on associated risk factors, which can be used by forensic experts or integrated into forensic systems for the early detection of child pornographic activities. A number of methods were adopted in the study: an extensive review of related works was done in order to identify the factors that are associated with child pornography, following which they were validated by an expert sex psychologist and guidance counselor, and relevant data was collected. Fuzzy membership functions were used to fuzzify the identified variables alongside the risk of the occurrence of child pornography based on the inference rules provided by the experts consulted, and the fuzzy logic expert system was simulated using the Fuzzy Logic Toolbox available in MATLAB Release 2016. The results of the study showed that there were 4 categories of risk factors required for assessing the risk of a suspect committing child pornography offenses. Two and three triangular membership functions were used to formulate the risk factors, based on the 2 and 3 labels assigned, respectively. Five fuzzy logic models were formulated, such that the first 4 were used to assess the impact of each category on child pornography, while the last one takes the 4 outputs of those models as the inputs required for assessing the risk of child pornography. The following conclusion was made: factors related to personal traits, social traits, history of child pornography crimes, and self-regulatory deficiency traits of the suspects are required for the assessment of the risk of child pornography crimes committed by a suspect.
Using the values of the risk factors selected for this study, the risk of child pornography can easily be assessed in order to determine the likelihood of a suspect perpetrating the crime.
Keywords: fuzzy, membership functions, pornography, risk factors
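The triangular membership functions mentioned above follow the standard triangular form. The sketch below is generic, with hypothetical break-points on a 0-10 scale; the paper's actual label boundaries come from the consulted experts, not from this code.

```python
def tri_mf(x, a, b, c):
    # Triangular membership function: feet at a and c, peak (degree 1) at b
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fuzzify(x):
    # Three hypothetical labels over a 0-10 input scale (illustrative only)
    return {
        "low": tri_mf(x, -5.0, 0.0, 5.0),
        "medium": tri_mf(x, 0.0, 5.0, 10.0),
        "high": tri_mf(x, 5.0, 10.0, 15.0),
    }
```

A variable with 2 labels would simply use two overlapping triangles; the fuzzified degrees then feed the expert-provided inference rules before defuzzification into a single risk value.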
Procedia PDF Downloads 129

2405 Trainability of Executive Functions during Preschool Age: Analysis of Inhibition of 5-Year-Old Children
Authors: Christian Andrä, Pauline Hähner, Sebastian Ludyga
Abstract:
Introduction: In the recent past, discussions on the importance of physical activity for child development have contributed to a growing interest in executive functions, which refer to cognitive processes. By controlling, modulating and coordinating sub-processes, they make it possible to achieve superior goals. Major components include working memory, inhibition and cognitive flexibility. While executive functions can be trained easily in school children, there are still research deficits regarding their trainability during preschool age. Methodology: This quasi-experimental study with pre- and post-design analyzes 23 children [age: 5.0 (mean value) ± 0.7 (standard deviation)] from four different sports groups. The intervention group was made up of 13 children (IG: 4.9 ± 0.6), while the control group consisted of ten children (CG: 5.1 ± 0.9). Between the pre-test and post-test, children from the intervention group participated in special games that train executive functions (i.e., changing rules of the game, introduction of new stimuli in familiar games) for ten units of their weekly sports program. The sports program of the control group was not modified. A computer-based version of the Eriksen Flanker Task was employed in order to analyze the participants' inhibition ability. In two rounds, the participants had to respond 50 times and as fast as possible to a certain target (direction of sight of a fish; the target was always placed in a central position between five fish). Congruent (all fish have the same direction of sight) and incongruent (central fish faces the opposite direction) stimuli were used. The relevant parameters were response time and accuracy. The main objective was to investigate whether children from the intervention group show more improvement in the two parameters than children from the control group.
Major findings: The intervention group revealed significant improvements in congruent response time (pre: 1.34 s, post: 1.12 s, p<.01), while the control group did not show any statistically relevant difference (pre: 1.31 s, post: 1.24 s). The comparison of incongruent response times indicates a comparable result (IG: pre: 1.44 s, post: 1.25 s, p<.05 vs. CG: pre: 1.38 s, post: 1.38 s). In terms of accuracy for congruent stimuli, the intervention group showed significant improvements (pre: 90.1 %, post: 95.9 %, p<.01). In contrast, no significant improvement was found for the control group (pre: 88.8 %, post: 92.9 %). Conversely, the intervention group did not display any significant results for incongruent stimuli (pre: 74.9 %, post: 83.5 %), while the control group revealed a significant difference (pre: 68.9 %, post: 80.3 %, p<.01). The analysis of three out of four criteria demonstrates that children who took part in a special sports program improved more than children who did not. The contrary result for the last criterion could be caused by the control group’s low pre-test scores. Conclusion: The findings illustrate that inhibition can be trained as early as preschool age. The combination of familiar games with increased requirements for attention and control processes appears to be particularly suitable.
Keywords: executive functions, flanker task, inhibition, preschool children
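The two parameters reported above, mean response time and accuracy per stimulus condition, can be tallied in a few lines. A minimal sketch; the trial records, field names, and values are hypothetical, not the study's data:

```python
# Hypothetical flanker-task trial records; compute mean RT and accuracy
# per condition, as in the abstract (illustrative sketch only).
from statistics import mean

def summarize(trials, condition):
    """Mean response time (s) over correct trials and accuracy (%) for one condition."""
    subset = [t for t in trials if t["condition"] == condition]
    rt = mean(t["rt"] for t in subset if t["correct"])   # RT over correct responses only
    acc = 100.0 * sum(t["correct"] for t in subset) / len(subset)
    return round(rt, 2), round(acc, 1)

trials = [
    {"condition": "congruent", "rt": 1.10, "correct": True},
    {"condition": "congruent", "rt": 1.14, "correct": True},
    {"condition": "congruent", "rt": 1.50, "correct": False},
    {"condition": "incongruent", "rt": 1.30, "correct": True},
    {"condition": "incongruent", "rt": 1.20, "correct": False},
]

print(summarize(trials, "congruent"))
print(summarize(trials, "incongruent"))
```

In a real analysis each of the 50 responses per round would be one such record, and the pre/post comparison would be run per group.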
Procedia PDF Downloads 253
2404 Urbanization in Delhi: A Multiparameter Study
Authors: Ishu Surender, M. Amez Khair, Ishan Singh
Abstract:
Urbanization is a multidimensional phenomenon. It is an indication of the long-term process by which an economy shifts from rural to industrial. The significance of urbanization for modernization, socio-economic development, and poverty eradication remains relevant in modern times. This paper aims to study an urbanization index model for the capital of India, Delhi, using demographic, infrastructural development, and economic development aspects. The urbanization index of all nine districts of Delhi will be determined using multiple parameters, such as population density and the availability of health and education facilities. The definition of an urban area varies from city to city and requires periodic reclassification, which makes direct comparisons difficult. The urbanization index calculated in this paper can be employed to measure the urbanization of a district and to compare the level of urbanization across districts.
Keywords: multiparameter, population density, multiple regression, normalized urbanization index
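One common way to combine several parameters into a single normalized index, min-max normalization followed by an equal-weight mean, can be sketched as follows. The district names, parameter names, and figures are hypothetical, not the paper's data for Delhi:

```python
# Illustrative composite index: min-max normalize each parameter across
# districts, then average the normalized values per district.

def normalize(values):
    """Min-max normalize a list of district values to [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def urbanization_index(districts, parameters):
    """Equal-weight mean of normalized parameter values for each district."""
    normalized = [normalize([d[p] for d in districts]) for p in parameters]
    return [round(sum(col) / len(parameters), 3) for col in zip(*normalized)]

districts = [
    {"name": "A", "pop_density": 12000, "hospitals_per_lakh": 4, "schools_per_lakh": 20},
    {"name": "B", "pop_density": 30000, "hospitals_per_lakh": 9, "schools_per_lakh": 35},
    {"name": "C", "pop_density": 21000, "hospitals_per_lakh": 6, "schools_per_lakh": 25},
]

print(urbanization_index(districts, ["pop_density", "hospitals_per_lakh", "schools_per_lakh"]))
```

The paper's keywords also mention multiple regression; weights could equally be estimated from a regression rather than set equal.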
Procedia PDF Downloads 113
2403 Handwriting Velocity Modeling by Artificial Neural Networks
Authors: Mohamed Aymen Slim, Afef Abdelkrim, Mohamed Benrejeb
Abstract:
Handwriting is a physical demonstration of a complex cognitive process learnt from childhood. People with disabilities or various neurological diseases face many difficulties resulting from problems located at the level of the muscle stimuli (EMG) or the brain signals (EEG), which arise at the stage of writing. The handwriting velocity of the same writer, or of different writers, varies according to different criteria: age, attitude, mood, writing surface, etc. It is therefore of interest to build an experimental base of recordings, taking the writing speed of different writers as the primary reference, which allows the global system to be studied during the handwriting process. This paper deals with a new approach to modeling the handwriting system based on the velocity criterion, using artificial neural networks, specifically Radial Basis Function (RBF) neural networks. The obtained simulation results show a satisfactory agreement between the responses of the developed neural model and the experimental data for various letters and forms, and thus the efficiency of the proposed approach.
Keywords: Electro Myo Graphic (EMG) signals, experimental approach, handwriting process, Radial Basis Functions (RBF) neural networks, velocity modeling
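The general shape of an RBF model of this kind, Gaussian basis functions at fixed centers with output weights fitted by least squares, can be sketched on a toy one-dimensional "velocity profile". This is not the authors' network or data; centers, width, and the sine stand-in curve are illustrative:

```python
# Fixed-center Gaussian RBF regression fitted to a toy velocity-like curve.
import numpy as np

def rbf_design(x, centers, width):
    """Gaussian RBF design matrix: phi[i, j] = exp(-(x_i - c_j)^2 / (2*width^2))."""
    return np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * width ** 2))

def fit_rbf(x, y, centers, width):
    """Least-squares output weights for a fixed-center RBF network."""
    phi = rbf_design(x, centers, width)
    w, *_ = np.linalg.lstsq(phi, y, rcond=None)
    return w

def predict(x, centers, width, w):
    return rbf_design(x, centers, width) @ w

x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x)            # stand-in for a measured velocity curve
centers = np.linspace(0.0, 1.0, 10)  # 10 evenly spaced RBF centers
w = fit_rbf(x, y, centers, width=0.1)
err = float(np.max(np.abs(predict(x, centers, 0.1, w) - y)))
print(err)                           # maximum fit error at the training points
```

In practice the centers and widths would themselves be chosen from the data (e.g. by clustering), and the inputs would be the recorded EMG/position signals rather than a closed-form curve.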
Procedia PDF Downloads 440
2402 Stress and Social Support as Predictors of Quality of Life: A Case among Flood Victims in Malaysia
Authors: Najib Ahmad Marzuki, Che Su Mustaffa, Johana Johari, Nur Haffiza Rahaman
Abstract:
The purpose of this paper is to examine the effects and relationship of stress and social support on the quality of life among flood victims in Malaysia. A total of 764 respondents took part in the survey via random sampling. The Depression, Anxiety, and Stress Scales were utilized to measure stress, while the Multidimensional Scale of Perceived Social Support was used to measure social support. The findings of this study indicate that there were significant correlations between the variables in the study. The findings show a significant negative relationship between stress and quality of life, and significant positive correlations between support from family, as well as support from friends, and the quality of life. Stress and support from family were found to be significant predictors of the quality of life among flood victims.
Keywords: stress, social support, quality of life, flood victims
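The correlational analysis reported here boils down to Pearson correlations between the scale scores. A minimal sketch with hypothetical scores (six respondents invented for illustration, not the survey's 764):

```python
# Pearson correlations between stress / social support and quality of life,
# on made-up scores; the signs mirror the pattern the abstract reports.
from statistics import mean, pstdev

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    mx, my = mean(xs), mean(ys)
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (pstdev(xs) * pstdev(ys))

stress  = [30, 25, 40, 35, 20, 45]   # hypothetical DASS stress scores
support = [12, 15, 8, 10, 18, 7]     # hypothetical family-support scores
qol     = [60, 70, 45, 50, 80, 40]   # hypothetical quality-of-life scores

print(round(pearson_r(stress, qol), 2))   # negative: more stress, lower QoL
print(round(pearson_r(support, qol), 2))  # positive: more support, higher QoL
```

The predictor analysis in the paper would additionally regress quality of life on stress and the support subscales together.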
Procedia PDF Downloads 557
2401 Elastohydrodynamic Lubrication Study Using Discontinuous Finite Volume Method
Authors: Prawal Sinha, Peeyush Singh, Pravir Dutt
Abstract:
Problems in elastohydrodynamic lubrication have attracted a lot of attention in the last few decades. Solving a two-dimensional problem has always been a big challenge. In this paper, a new discontinuous finite volume method (DVM) for the two-dimensional point contact Elastohydrodynamic Lubrication (EHL) problem has been developed and analyzed. A complete algorithm has been presented for solving such a problem. The method presented is robust and easily parallelized in an MPI architecture. The GMRES technique is implemented to solve the matrix obtained after the formulation. A new approach is followed in which discontinuous piecewise polynomials are used for the trial functions. It is natural to assume that the advantages of using discontinuous functions in finite element methods should also apply to finite volume methods. The nature of the discontinuity of the trial functions is such that the elements in the corresponding dual partition have the smallest support compared with classical finite volume methods. The film thickness calculation is done using a singular quadrature approach. The results obtained have been presented graphically and discussed. This method is well suited for solving the EHL point contact problem and can probably be used in commercial software.
Keywords: elastohydrodynamic, lubrication, discontinuous finite volume method, GMRES technique
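The linear-solve step mentioned here, GMRES on the assembled system, can be illustrated with a textbook unrestarted GMRES applied to a small 1-D Poisson-type matrix; this is a sketch of the technique only, not the authors' EHL solver, which would use a parallel, likely preconditioned library implementation:

```python
# Textbook (unrestarted) GMRES: build an Arnoldi basis of the Krylov
# subspace and minimize the residual via least squares on the Hessenberg.
import numpy as np

def gmres_full(A, b, tol=1e-10):
    n = len(b)
    Q = np.zeros((n, n + 1))
    H = np.zeros((n + 1, n))
    beta = np.linalg.norm(b)
    Q[:, 0] = b / beta                       # x0 = 0, so r0 = b
    x = np.zeros(n)
    for k in range(n):
        v = A @ Q[:, k]
        for j in range(k + 1):               # modified Gram-Schmidt
            H[j, k] = Q[:, j] @ v
            v -= H[j, k] * Q[:, j]
        H[k + 1, k] = np.linalg.norm(v)
        if H[k + 1, k] > 1e-14:
            Q[:, k + 1] = v / H[k + 1, k]
        # minimize || beta*e1 - H_k y || over the Krylov subspace built so far
        e1 = np.zeros(k + 2)
        e1[0] = beta
        y, *_ = np.linalg.lstsq(H[:k + 2, :k + 1], e1, rcond=None)
        x = Q[:, :k + 1] @ y
        if np.linalg.norm(A @ x - b) < tol or H[k + 1, k] <= 1e-14:
            break
    return x

# Toy system: tridiagonal 1-D Poisson matrix, stand-in for the assembled system.
n = 20
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = gmres_full(A, b)
print(np.linalg.norm(A @ x - b))             # residual norm after convergence
```

Production codes restart the iteration every few dozen steps to bound memory, which is exactly where preconditioning becomes important for matrices like the EHL system.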
Procedia PDF Downloads 257
2400 Argument Representation in Non-Spatial Motion Bahasa Melayu Based Conceptual Structure Theory
Authors: Nurul Jamilah Binti Rosly
Abstract:
The typology of motion must be understood as a change from one location to another. From a conceptual point of view, however, motion can also occur in non-spatial contexts associated with human and social factors. Conceptually, therefore, non-spatial motion involves the movement of time, ownership, identity, state, and existence. Accordingly, this study focuses on the lexical items shared, accept, be, store, and exist as the study material. The data in this study were extracted from the Languages and Literature Corpus Database, Malaysia, and were analyzed using semantic and syntactic concepts within Conceptual Structure Theory (Ray Jackendoff, 2002). Semantic representations are expressed in the form of conceptual structures in argument functions that include the functions [events], [situations], [objects], [paths] and [places]. The findings show that the mapping of these arguments comprises three main stages, namely mapping the argument structure, mapping the tree, and mapping the thematic roles. Accordingly, this study shows the representation of non-spatial motion in Malay.
Keywords: arguments, concepts, constituencies, events, situations, thematics
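A conceptual structure of this kind is essentially a tree of functions over arguments; as a loose illustration (this encoding and the example are mine, not the author's notation), an ownership transfer can be represented as a GO event in the possessional field:

```python
# Toy encoding of a Jackendoff-style conceptual structure as nested dicts,
# with a printer that renders the familiar bracketed notation.

def cs(function, *args):
    """One conceptual-structure node: a function applied to arguments."""
    return {"function": function, "args": list(args)}

def show(node):
    """Render a node as bracketed notation, e.g. [FROM ALI]."""
    if isinstance(node, dict):
        return "[" + node["function"] + " " + ", ".join(show(a) for a in node["args"]) + "]"
    return node

# "give": the BOOK moves along a possessional path from ALI to SITI.
give = cs("GO_Poss", "BOOK", cs("Path", cs("FROM", "ALI"), cs("TO", "SITI")))
print(show(give))
```

Mapping the argument structure, the tree, and the thematic roles, the three stages the abstract names, would correspond to building such nodes, nesting them, and labeling the arguments (here BOOK as theme, ALI as source, SITI as goal).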
Procedia PDF Downloads 129
2399 Prediction of Childbearing Orientations According to Couples' Sexual Review Component
Authors: Razieh Rezaeekalantari
Abstract:
Objective: The purpose of this study was to investigate the prediction of childbearing orientations in terms of the components of couples' sexual review. Methods: This was a descriptive correlational study. The population consisted of 500 couples referring to the Sari Health Center. Two hundred and fifteen (215) people were selected randomly using the Krejcie and Morgan sample size table. For data collection, the childbearing orientations scale and the Multidimensional Sexual Self-Concept Questionnaire were used. Results: For data analysis, the mean and standard deviation were used, and to test the research hypothesis, regression, correlation, and inferential statistics were used. Conclusion: The findings indicate that there is no significant relationship between the tendency toward childbearing and the predictive value of sexual review (r = 0.84, sig = 219.19) (P < 0.05). So, with 95% confidence, we conclude that there is no meaningful relationship between sexual orientation and the tendency toward child-rearing.
Keywords: couples referring, health center, sexual review component, parenting orientations
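The Krejcie and Morgan (1970) table used for the sampling is generated by a standard formula (chi-square 3.841 for 1 df at 95% confidence, P = 0.5, margin of error d = 0.05); for a population of 500 it yields 217, close to the 215 respondents reported. A sketch of that calculation:

```python
# Krejcie & Morgan (1970) required sample size n for population size N:
# n = chi2 * N * P * (1-P) / (d^2 * (N-1) + chi2 * P * (1-P))

def krejcie_morgan(N, chi2=3.841, P=0.5, d=0.05):
    """Required sample size for population N at 95% confidence, d = 0.05."""
    return round(chi2 * N * P * (1 - P) / (d ** 2 * (N - 1) + chi2 * P * (1 - P)))

print(krejcie_morgan(500))
```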
Procedia PDF Downloads 219
2398 Developing New Media Credibility Scale: A Multidimensional Perspective
Authors: Hanaa Farouk Saleh
Abstract:
The main purposes of this study are to develop a scale that reflects emerging theoretical understandings of new media credibility, based on the evolution of credibility studies in Western research; to identify the determinants of credibility in the media and its components by comparing traditional and new media credibility scales; and to build a cumulative scale to test new media credibility. This approach was built on Western research using conceptualizations of media credibility that focus on four principal components: source (journalist), message (article), medium (newspaper, radio, TV, web, etc.), and organization (owner of the medium), adding the user and the cultural context as key components for assessing new media credibility in particular. This study's value lies in its contribution to the conceptualization and development of new media credibility through the creation of a theoretical measurement tool. Future studies should explore this scale to test new media credibility, which represents a promising new approach in the efforts to define and measure the credibility of all media types.
Keywords: credibility scale, media credibility components, new media credibility scale, scale development
Procedia PDF Downloads 321
2397 The Influence of Modern Islamic Thought Liberalization to the Improvement of Science
Authors: Muhammad Ilham Agus Salim
Abstract:
The liberalization of Islamic thought has had an impact not only on the Muslim community's views regarding its worldview but has also touched the reconstruction of contemporary general science. This can be seen in the emergence of Western and Eastern intellectual movements that try to reconstruct contemporary science, arguing that the current scientific culture is not able to lead its audience to change society for the better. Such liberalization of Islamic thought has a huge influence on the multidimensional crisis in various sectors, such as the economic, cultural, political, and ecological sectors. Therefore, this paper examines the effects of the liberalization of contemporary Islamic thought on the development of modern science. The method used in this paper is a textual study of the Qur'an, the Hadith (prophetic tradition), and the history of contemporary Islamic thought, comparing them with the reality of the development of science today. It is thus shown that the liberalization of Islamic thought has created a crisis and a stagnation in the development of scientific disciplines.
Keywords: liberalization, science, Islam, al-Qur’an textual studies
Procedia PDF Downloads 398
2396 Conduction Transfer Functions for the Calculation of Heat Demands in Heavyweight Facade Systems
Authors: Mergim Gasi, Bojan Milovanovic, Sanjin Gumbarevic
Abstract:
Better energy performance of the building envelope is one of the most important aspects of energy savings if the goals set by the European Union are to be achieved in the future. Dynamic heat transfer simulations are used for the calculation of building energy consumption because they give more realistic energy demands than stationary calculations, which do not take the building's thermal mass into account. Software used for these dynamic simulations relies on methods based on analytical models, since numerical models are insufficient for longer periods. The analytical models used in this research fall into the category of conduction transfer functions (CTFs). The two methods for calculating the CTFs covered by this research are the Laplace method and the State-Space method. The literature review showed that the main disadvantage of these methods is that they are inadequate for heavyweight façade elements and for the shorter time steps used in the calculation. The algorithms for both the Laplace and State-Space methods are implemented in Mathematica, and the results are compared to the results from EnergyPlus and TRNSYS, since these programs use similar algorithms for the calculation of the building's energy demand. This research aims to check the efficiency of the Laplace and State-Space methods for calculating the building's energy demand for heavyweight building elements and shorter sampling times, and it also provides the means for improving the algorithms used by these methods. As the reference point for the boundary heat flux density, the finite difference method (FDM) is used. Even though dynamic heat transfer simulations are superior to calculations based on stationary boundary conditions, they have their limitations and will give unsatisfactory results if not properly used.
Keywords: Laplace method, state-space method, conduction transfer functions, finite difference method
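The FDM reference calculation described above can be sketched for the simplest case: explicit finite differences for 1-D transient conduction through a wall, with the boundary heat flux density recovered from the temperature field. Material properties, discretization, and boundary temperatures below are illustrative, not the paper's:

```python
# Explicit 1-D transient conduction; recover the inner-surface heat flux
# density. Hypothetical heavyweight-wall data.
import numpy as np

k_cond = 1.5                 # thermal conductivity, W/(m K)
rho_c = 2.0e6                # volumetric heat capacity, J/(m^3 K)
L, nx = 0.3, 31              # wall thickness (m) and number of nodes
dx = L / (nx - 1)
alpha = k_cond / rho_c       # thermal diffusivity, m^2/s
dt = 0.4 * dx**2 / alpha     # within the explicit stability limit dt <= dx^2/(2*alpha)

T = np.full(nx, 20.0)        # initial temperature, deg C
T[0], T[-1] = 20.0, 0.0      # prescribed surface temperatures (Dirichlet)

for _ in range(10_000):      # march far enough to approach steady state
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])

q_in = -k_cond * (T[1] - T[0]) / dx   # inner-surface heat flux density, W/m^2
print(round(q_in, 1))        # steady-state value is k * dT / L = 100 W/m^2
```

The CTF methods being benchmarked reproduce the same boundary flux as a short weighted history of surface temperatures and fluxes, which is what makes them fast enough for whole-year simulations; the FDM run above is the slow but direct yardstick.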
Procedia PDF Downloads 132
2395 Optimal Load Factors for Seismic Design of Buildings
Authors: Juan Bojórquez, Sonia E. Ruiz, Edén Bojórquez, David de León Escobedo
Abstract:
A life-cycle optimization procedure to establish the best load factor combinations for the seismic design of buildings is proposed. The expected cost of damage from future earthquakes within the life of the structure is estimated, and realistic cost functions are assumed. The functions include: repair cost, cost of contents damage, cost associated with loss of life, cost of injuries, and economic loss. The loads considered are dead, live, and earthquake loads. The study is performed for reinforced concrete buildings located in Mexico City. The buildings are modeled as multiple-degree-of-freedom frame structures. The parameter selected to measure structural damage is the maximum inter-story drift. The structural models are subjected to 31 soft-soil ground motions recorded in the Lake Zone of Mexico City. In order to obtain the annual structural failure rates, a numerical integration method is applied.
Keywords: load factors, life-cycle analysis, seismic design, reinforced concrete buildings
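The trade-off behind such a procedure can be shown schematically: a larger seismic load factor raises the initial cost but lowers the expected cost of earthquake damage, so the total expected life-cycle cost has an interior minimum. All cost parameters and the failure-rate model below are hypothetical placeholders, not the paper's functions:

```python
# Schematic life-cycle cost vs. seismic load factor; all numbers invented.

def life_cycle_cost(load_factor, c0=100.0, a=40.0, nu=0.02, c_damage=5000.0):
    """Initial cost grows with the load factor; expected damage cost decays.

    c0, a    : hypothetical initial-cost parameters
    nu       : hypothetical annual failure-rate scale
    c_damage : hypothetical cost given failure (repair, contents, injuries,
               loss of life, economic loss, lumped together)
    """
    initial = c0 + a * load_factor
    expected_damage = c_damage * nu * load_factor ** -4   # failure rate falls with design level
    return initial + expected_damage

candidates = [round(0.8 + 0.1 * i, 1) for i in range(12)]  # load factors 0.8 .. 1.9
best = min(candidates, key=life_cycle_cost)
print(best)
```

In the paper the expected damage term comes from the annual failure rates integrated numerically from the drift responses to the 31 ground motions, rather than from a closed-form decay.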
Procedia PDF Downloads 617
2394 Effect of Risperidone and Haloperidol on Clinical Picture and Some Biochemical Parameters of Schizophrenic Libyan Patients
Authors: Mabruka S. Elashheb, Adullah Ali Bakush
Abstract:
Schizophrenia is referred to as a disorder, not a disease, because no clear, reliable, and specific etiological factor has been identified. Even though schizophrenia is not a very frequent disease, it is among the most burdensome and costly illnesses worldwide. Prevention of relapse is a major goal of maintenance treatment in patients with psychotic disorders. We performed a comparison of a newer, atypical antipsychotic drug, Risperidone, and an older, conventional neuroleptic drug, Haloperidol, in terms of their effect on the usual kidney and liver functions and on negative and positive symptoms in patients with schizophrenia and schizoaffective disorder after three and five weeks of treatment. It is apparent from the comparative data on Haloperidol and Risperidone treatments in schizophrenic patients that Risperidone yielded superior improvement of the patients' negative and positive symptoms, no harmful effect on liver and kidney functions, and greater efficacy and faster recovery from schizophrenic symptoms. On the basis of the findings of the present study, we conclude that treatment with Risperidone is superior to Haloperidol in reducing the risk of relapse among outpatients with schizophrenic disorders.
Keywords: schizophrenia, Haloperidol, Risperidone, positive and negative symptoms
Procedia PDF Downloads 378