Search results for: classical reasoning
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1283

893 Molecular Dynamics Simulations on Richtmyer-Meshkov Instability of Li-H2 Interface at Ultra High-Speed Shock Loads

Authors: Weirong Wang, Shenghong Huang, Xisheng Luo, Zhenyu Li

Abstract:

Material mixing processes and related dynamic issues under extreme compression conditions have attracted growing attention over the last decade because of their engineering appeal in inertial confinement fusion (ICF) and hypervelocity aircraft development. To date, however, there is a lack of models and methods that can handle fully coupled turbulent material mixing and complex fluid evolution in the high-energy-density regime. At the level of macroscopic hydrodynamics, three numerical approaches, namely direct numerical simulation (DNS), large eddy simulation (LES), and Reynolds-averaged Navier-Stokes (RANS) modelling, have reached a reasonably accepted consensus in the low-energy-density regime. In the high-energy-density regime, however, they cannot be applied directly because of dissociation, ionization, and dramatic changes in the equation of state and thermodynamic properties, which may render the governing equations invalid in some coupled situations. At the micro/meso scale, by contrast, methods based on molecular dynamics (MD) and Monte Carlo (MC) models have proved to be promising and effective ways to investigate such issues. In this study, both classical MD and first-principles-based electron force field MD (eFF-MD) methods are applied to investigate the Richtmyer-Meshkov instability (RMI) of a metal lithium and gaseous hydrogen (Li-H2) interface at shock loading speeds ranging from 3 km/s to 30 km/s. It is found that: 1) the classical MD method, based on predefined potential functions, has limits in its application to extreme conditions, since it cannot simulate the ionization process and its potential functions are not suitable for all conditions, while the eFF-MD method correctly simulates the ionization process thanks to its 'ab initio' character; 2) because of computational cost, the eFF-MD results are also influenced by the simulation domain dimensions, boundary conditions, relaxation time choices, etc., and a series of tests has been conducted to determine optimized parameters; 3) ionization induced by strong shock compression has important effects on the evolution of the Li-H2 interface, indicating a new micro-mechanism of RMI in the high-energy-density regime.
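
To make the distinction concrete: classical MD evaluates forces from a fixed, predefined pair potential. The sketch below shows a Lennard-Jones pair interaction in Python purely as a generic example of such a potential; the actual Li-H2 potentials used in the study are not specified in the abstract.

```python
# A generic "predefined potential function" of the kind classical MD relies on.
# The Lennard-Jones form is shown for illustration only; it is an assumption,
# not the potential used in the paper.
import numpy as np

def lj_potential_and_force(r, epsilon=1.0, sigma=1.0):
    """Return the LJ pair energy U(r) and the scalar force F(r) = -dU/dr."""
    sr6 = (sigma / r) ** 6
    u = 4.0 * epsilon * (sr6 ** 2 - sr6)
    f = 24.0 * epsilon * (2.0 * sr6 ** 2 - sr6) / r
    return u, f

for r in np.linspace(1.0, 2.5, 4):
    u, f = lj_potential_and_force(r)
    print(f"r = {r:.2f}  U = {u:+.4f}  F = {f:+.4f}")
```

Because the functional form is fixed in advance, it cannot change when atoms ionize under strong shock compression, which is exactly the limitation that motivates the eFF-MD approach.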

Keywords: first-principle, ionization, molecular dynamics, material mixture, Richtmyer-Meshkov instability

Procedia PDF Downloads 219
892 Implementation of the Recursive Formula for Evaluation of the Strength of Daniels' Bundle

Authors: Vaclav Sadilek, Miroslav Vorechovsky

Abstract:

The paper deals with the classical fiber bundle model of equal load sharing, sometimes referred to as the Daniels' bundle or the democratic bundle. Daniels formulated a multidimensional integral and also a recursive formula for the evaluation of the strength cumulative distribution function. This paper describes three algorithms for evaluating the recursive formula, together with their implementations, with source code in the high-level programming language Python. A comparison of the algorithms is provided with respect to execution time. An analysis of the orders of magnitude of the addends in the recursion is also provided.
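
The abstract does not reproduce the recursion itself; the sketch below assumes Daniels' classical form, G_n(x) = sum_{k=1}^{n} (-1)^{k+1} C(n,k) F(x)^k G_{n-k}(nx/(n-k)) with G_0 = 1, where F is the fiber strength CDF and G_n the bundle strength CDF. It uses mpmath (one of the paper's keywords), since the alternating sum loses precision rapidly as n grows.

```python
# Minimal sketch of Daniels' recursion for the bundle-strength CDF G_n(x),
# assuming the classical form with G_0 = 1. mpmath provides the arbitrary
# precision needed because the alternating sum cancels heavily for large n.
import mpmath as mp

mp.mp.dps = 50  # working precision, in decimal digits

def F(x):
    """Fiber strength CDF; a standard-normal CDF is assumed here for illustration."""
    return mp.ncdf(x)

def G(n, x):
    """Strength CDF of a Daniels bundle of n fibers under equal load sharing."""
    if n == 0:
        return mp.mpf(1)
    total = mp.mpf(0)
    for k in range(1, n + 1):
        term = (-1) ** (k + 1) * mp.binomial(n, k) * F(x) ** k
        if k < n:  # for k = n the G_0 factor is 1
            term *= G(n - k, mp.mpf(n) * x / (n - k))
        total += term
    return total

print(G(10, mp.mpf("0.5")))  # P(bundle strength per fiber <= 0.5), n = 10
```

Note that this naive recursion re-evaluates subproblems exponentially; making the evaluation fast and numerically stable is precisely what the paper's three algorithms and the analysis of the addends' magnitudes address.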

Keywords: equal load sharing, mpmath, python, strength of Daniels' bundle

Procedia PDF Downloads 395
891 Co-Evolutionary Fruit Fly Optimization Algorithm and Firefly Algorithm for Solving Unconstrained Optimization Problems

Authors: R. M. Rizk-Allah

Abstract:

This paper presents a co-evolutionary fruit fly optimization algorithm based on the firefly algorithm (CFOA-FA) for solving unconstrained optimization problems. The proposed algorithm integrates the merits of the fruit fly optimization algorithm (FOA), the firefly algorithm (FA), and an elite strategy to refine the performance of the classical FOA. Moreover, the co-evolutionary mechanism applies FA procedures to ensure the diversity of the swarm. Finally, the proposed CFOA-FA algorithm is tested on several benchmark problems from the literature, and the numerical results demonstrate its superiority in finding the global optimal solution.
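
The abstract does not give the update rules, so the sketch below is only a rough, hypothetical illustration of the hybrid idea on the sphere function: an FOA-style random "smell" search around the current best, followed by firefly-style attraction moves that maintain swarm diversity. All parameter values are assumptions, and this is not the authors' CFOA-FA.

```python
# Schematic sketch of a hybrid FOA/FA step; an illustration of the general
# idea only, not the authors' CFOA-FA. All parameter values are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    return float(np.sum(x ** 2))

dim, n_flies, iters = 5, 30, 200
beta0, gamma, alpha = 1.0, 1.0, 0.1  # firefly attraction parameters (assumed)
swarm = rng.uniform(-5, 5, (n_flies, dim))

for _ in range(iters):
    best = swarm[np.argmin([sphere(x) for x in swarm])]
    # FOA-style step: random search around the current best location ("smell" search)
    swarm = best + rng.uniform(-1, 1, swarm.shape)
    fitness = np.array([sphere(x) for x in swarm])
    # FA-style step: each fly moves toward every brighter (better) fly
    for i in range(n_flies):
        for j in range(n_flies):
            if fitness[j] < fitness[i]:
                r2 = np.sum((swarm[i] - swarm[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)
                swarm[i] += beta * (swarm[j] - swarm[i]) + alpha * rng.uniform(-0.5, 0.5, dim)
        fitness[i] = sphere(swarm[i])

print("best value:", min(sphere(x) for x in swarm))
```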

Keywords: firefly algorithm, fruit fly optimization algorithm, unconstrained optimization problems

Procedia PDF Downloads 524
890 Some Aspects on Formation Initialization and Its Maintenance of LEO Satellites

Authors: Y. Johnson

Abstract:

The study of multi-satellite formation flight systems has recently drawn wide attention because of its many potential advantages. The present work models the relative motion dynamics, in terms of the changes in the classical orbital parameters between the two satellites (chief and deputy), under the Earth's oblateness effect. The impulsive thrust control required to minimize these orbital parameter changes is calculated. The formation configuration is initialized by selecting a set of orbital parameters for the chief and deputy satellites such that bounded motion is maintained for a long time in a J2-invariant, non-circular relative orbit between the satellites. The solution of the J2-modified Hill's equations is also derived in this paper.
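
For reference, the classical unperturbed Hill (Clohessy-Wiltshire) equations that the J2-modified version extends are, with n the chief's mean motion and (x, y, z) the radial, along-track, and cross-track relative coordinates:

\[ \ddot{x} - 2n\dot{y} - 3n^{2}x = 0, \qquad \ddot{y} + 2n\dot{x} = 0, \qquad \ddot{z} + n^{2}z = 0. \]

The J2-modified form derived in the paper adds the terms produced by the Earth's oblateness to these equations; the J2-invariant initial conditions are chosen so that the differential drift those terms cause between chief and deputy cancels.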

Keywords: satellite, formation flight, j2 effect, control

Procedia PDF Downloads 261
889 Evaluation of the Digitalization in Graphic Design in Turkey

Authors: Veysel Seker

Abstract:

Graphic design and virtual reality have been shaped by digital and technological development over recent decades. This study aims to compare and evaluate digitalization and virtual reality against the traditional and classical methods of the graphic design sector in Turkey. Qualitative and quantitative studies were discussed and identified according to the results of the literature surveys. Moreover, the study showed that the competency gap between graphic design schools and the field should be determined and well studied. The competencies of traditional graphic designers will face a major challenge in the transition into the evolving digital graphic design world.

Keywords: digitalization, evaluation, graphic designing, virtual reality

Procedia PDF Downloads 131
888 Study of Human Position in Architecture with Contextual Approach

Authors: E. Zarei, M. Bazaei, A. Seifi, A. Keshavarzi

Abstract:

Contextualism has always been a main component of urban science. It not only has a great direct and indirect impact on behaviors, events, and interactions, but is also one of the basic factors of urban values and identity. Nowadays, cities may show deficiencies in this respect. In theories of environmental design, humanistic orientations focused on culture and cultural variables enable us to transfer information. To communicate with the context in which they live, people need shared memories, understandable symbols, and daily activities in that context. The configuration of a place can affect human behavior. The goal of this research is to review seven projects with various uses in different parts of the world; factors such as 'sense of place', 'sense of belonging', and 'social and cultural relations' are discussed for each project. The research method is descriptive-analytic, library resources and the Internet are the main sources of information, and the reasoning used is inductive. The outcome of this research is a set of tables of data extracted from the reviewed projects.

Keywords: contextualism with humanistic approach, sense of place, sense of belonging, social and cultural relations

Procedia PDF Downloads 385
887 Implementation of an Associative Memory Using a Restricted Hopfield Network

Authors: Tet H. Yeap

Abstract:

An analog restricted Hopfield network is presented in this paper. It consists of two layers of nodes, visible and hidden, connected by directional weighted paths forming a bipartite graph with no intralayer connections. An energy (Lyapunov) function is derived to show that the proposed network converges to stable states. By introducing hidden nodes, the proposed network can be trained to store patterns and has increased memory capacity. When trained as an associative memory, simulation results show that it outperforms a classical Hopfield network, giving better recall when the input is noisy.
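
The abstract does not state the derived energy function. For a bipartite visible-hidden architecture of this kind, one natural candidate, shown here purely as an assumed illustration, is the standard restricted-Boltzmann-machine form, with v the visible states, h the hidden states, W the interlayer weights, and a, b the biases:

\[ E(\mathbf{v},\mathbf{h}) = -\mathbf{v}^{\top}W\mathbf{h} - \mathbf{a}^{\top}\mathbf{v} - \mathbf{b}^{\top}\mathbf{h}. \]

Proving convergence to stable states then amounts to showing that the network dynamics never increase E, which is the role the Lyapunov function plays in the paper.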

Keywords: restricted Hopfield network, Lyapunov function, simultaneous perturbation stochastic approximation

Procedia PDF Downloads 118
886 Sportomics Analysis of Metabolic Responses in Olympic Sprint Canoeists

Authors: A. Magno-França, A. M. Magalhães-Neto, F. Bachini, E. Cataldi, A. Bassini, L. C. Cameron

Abstract:

Sprint canoeing (SC) has been part of the Olympic Games since 1936. Athletes compete in solo or double races over 200 m and 1000 m (40 s and 240 s, respectively). Because of its high intensity and duration, SC is extremely useful for studying the blood kinetics of metabolites under high energy demand. Sportomics is a field of study combining '-omics' sciences with classical biochemical analyses in order to understand sports-induced systemic changes. Here, we compare sportomics findings recorded during SC training sessions to describe the metabolic responses of five top-level canoeists. Five Olympic world-class male athletes were evaluated during two days of training.

Keywords: biochemistry of exercise, metabolomics, injury markers, sportomics

Procedia PDF Downloads 511
885 Particle Swarm Optimization and Quantum Particle Swarm Optimization to Multidimensional Function Approximation

Authors: Diogo Silva, Fadul Rodor, Carlos Moraes

Abstract:

This work compares the results of multidimensional function approximation using two algorithms: the classical Particle Swarm Optimization (PSO) and the Quantum Particle Swarm Optimization (QPSO). Both algorithms were tested on three functions with different characteristics (the Rosenbrock, Rastrigin, and sphere functions) while increasing the number of dimensions. The study shows that the larger the function dimension, the more evident the advantages of the QPSO method over the PSO method, in terms of both performance and the number of iterations needed to reach the stopping criterion.
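
For reference, the classical PSO velocity and position update that QPSO replaces is sketched below on the sphere function; the inertia and acceleration coefficients are typical textbook values, not necessarily those used in the paper.

```python
# Minimal classical PSO on the sphere function; w, c1, c2 are typical textbook
# values, not necessarily those used in the paper.
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):
    return np.sum(x ** 2, axis=-1)

dim, n_particles, iters = 10, 30, 500
w, c1, c2 = 0.7, 1.5, 1.5

x = rng.uniform(-5, 5, (n_particles, dim))
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), sphere(x)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(iters):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)  # velocity update
    x = x + v
    val = sphere(x)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("best value found:", pbest_val.min())
```

QPSO drops the velocity term altogether: each particle's new position is sampled around an attractor formed from its personal and global bests, using a contraction-expansion coefficient, which is what changes the scaling behaviour in high dimensions.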

Keywords: PSO, QPSO, function approximation, AI, optimization, multidimensional functions

Procedia PDF Downloads 565
884 Compressive Strength and Microstructure of Hybrid Alkaline Cements

Authors: Z. Abdollahnejad, P. Torgal, J. Barroso Aguiar

Abstract:

Publications in the field of alkali-activated binders state that this new material is likely to have high potential to become an alternative to Portland cement. Classical alkali-activated cements could be made more eco-efficient if the use of sodium silicate were avoided. Moreover, most alkali-activated cements suffer from severe efflorescence, which originates from the fact that the alkaline and/or soluble silicates added during processing cannot be totally consumed. This paper presents experimental results on hybrid alkaline cements. Compressive strength results and efflorescence observations show that the new mixes analyzed so far are promising. SEM results show that no traditional porous interfacial transition zone (ITZ) was detected in these binders.

Keywords: hybrid alkaline cements, compressive strength, efflorescence, SEM, ITZ

Procedia PDF Downloads 281
883 Experimental Investigations of a Modified Taylor-Couette Flow

Authors: Ahmed Esmael, Ali El Shrif

Abstract:

In this study, the instability problem of a modified Taylor-Couette flow between two vertical coaxial cylinders of radii R1 and R2 is considered. The modification is based on the wavy shape of the inner cylinder surface, and inner cylinders with different surface amplitudes and wavelengths are used. The study aims to discover the effect of the inner surface geometry on the instabilities that the Taylor-Couette flow undergoes. It reveals that the transition process depends strongly on the amplitude and wavelength of the inner cylinder surface, resulting in flow instabilities strongly different from those encountered in the classical Taylor-Couette flow.
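
In the classical configuration, the onset of Taylor vortices is governed by the Taylor number. Under one common definition, with Omega_1 the inner cylinder's angular velocity, d = R2 - R1 the gap width, and nu the kinematic viscosity,

\[ Ta = \frac{\Omega_{1}^{2}R_{1}d^{3}}{\nu^{2}}, \]

instability sets in near Ta of about 1.7 x 10^3 for a stationary outer cylinder in the small-gap limit. The wavy inner surface studied here effectively makes the gap width a periodic function of the axial coordinate, which is why the transition process departs from this classical picture.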

Keywords: hydrodynamic instability, modified Taylor-Couette flow, turbulence, Taylor vortices

Procedia PDF Downloads 424
882 Empowering the Sustainability of Community Health: An Application of the Theory of Maqasid Al-Shariah

Authors: Ahasanul Haque, Noor Hazilah Abd Manaf, Zohurul Anis, Tarekol Islam

Abstract:

Sustainable community health (SCH) is an example of a new healthcare concept formed by applying the Maqasid al-Shariah principle to hospital management and delivery services. Because the idea is novel, it needs comprehensive and ongoing investigation to be improved. However, there is a lack of research on the necessity of developing sustainable community health (SCH), particularly its organizational structure. Furthermore, there is a misconception about the order of the components in Maqasid al-Shariah, particularly in a hospital setting, and questions remain about how the use of medicines and treatment under conventional recommendations can be carried out in accordance with Maqasid al-Shariah. As such, this study focuses on the essential prerequisites for establishing a sustainable community health system based on Maqasid al-Shariah, and it discusses the use of Maqasid al-Shariah in administration and treatment. In this qualitative research approach, a literature search and interviews with specialists were conducted, and the gathered data were examined using content analysis, emphasizing inductive and deductive reasoning. The research reveals that a Shariah Advisory Council and a Shariah Critical Point are necessary for sustainable community health. In conclusion, by discussing the reasons for each instance, this research contributes to the creation of methods for determining the level of Maqasid al-Shariah in hospital care.

Keywords: empowering, sustainability, community health, maqasid al-shariah, hospital, Malaysia

Procedia PDF Downloads 70
881 Deep Learning Based on Image Decomposition for Restoration of Intrinsic Representation

Authors: Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Kensuke Nakamura, Dongeun Choi, Byung-Woo Hong

Abstract:

Artefacts are commonly encountered in the imaging process of clinical computed tomography (CT), where an artefact refers to any systematic discrepancy between the reconstructed observation and the true attenuation coefficient of the object. CT images are inherently prone to artefacts because of the image formation process, in which a large number of independent detectors are involved and assumed to yield consistent measurements. There are a number of different artefact types, including noise, beam hardening, scatter, pseudo-enhancement, motion, helical, ring, and metal artefacts, which cause serious difficulties in reading images. It is therefore desirable to remove nuisance factors from the degraded image, leaving the fundamental intrinsic information that allows better interpretation of the anatomical and pathological characteristics. This is a difficult task due to the high dimensionality and variability of the data to be recovered, which naturally motivates the use of machine learning techniques. We propose an image restoration algorithm based on a deep neural network framework in which denoising auto-encoders are stacked to build multiple layers. The denoising auto-encoder is a variant of the classical auto-encoder that takes input data and maps it to a hidden representation through a deterministic mapping using a non-linear activation function. The latent representation is then mapped back into a reconstruction of the same size as the input data. The reconstruction error is measured by the traditional squared error, assuming the residual follows a normal distribution. In addition to the designed loss function, an effective regularization scheme using residual-driven dropout is determined based on the gradient at each layer. The optimal weights are computed by the classical stochastic gradient descent algorithm combined with the back-propagation algorithm. In our algorithm, we initially decompose an input image into its intrinsic representation and the nuisance factors, including artefacts, based on the classical total variation (TV) problem, which can be efficiently optimized by convex optimization algorithms such as the primal-dual method. The intrinsic forms of the input images are provided to the deep denoising auto-encoders together with their original forms in the training phase. In the testing phase, a given image is first decomposed into its intrinsic form and then provided to the trained network to obtain its reconstruction. We apply our algorithm to the restoration of CT images corrupted by artefacts. It is shown that our algorithm improves readability and enhances the anatomical and pathological properties of the object. The quantitative evaluation is performed in terms of PSNR, and the qualitative evaluation shows significant improvement in reading images despite the degrading artefacts. The experimental results indicate the potential of our algorithm as a preliminary step for image interpretation tasks in a variety of medical imaging applications. This work was supported by the MISP (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001) supervised by the IITP (Institute for Information and Communications Technology Promotion).
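
A minimal sketch of one denoising auto-encoder layer of the kind described (a deterministic nonlinear encoder, a decoder mapping back to the input size, and a squared-error loss trained by stochastic gradient descent with back-propagation) is given below in PyTorch. The layer sizes, noise level, and optimizer settings are illustrative assumptions; the paper's actual architecture and its residual-driven dropout scheme are not reproduced.

```python
# Minimal denoising auto-encoder sketch in PyTorch. Layer sizes, noise level,
# and optimizer settings are illustrative assumptions, not the paper's values.
import torch
import torch.nn as nn

class DenoisingAutoEncoder(nn.Module):
    def __init__(self, n_in=4096, n_hidden=1024):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_in, n_hidden), nn.Sigmoid())
        self.decoder = nn.Linear(n_hidden, n_in)  # reconstruction has input size

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = DenoisingAutoEncoder()
opt = torch.optim.SGD(model.parameters(), lr=0.1)  # classical stochastic gradient descent
loss_fn = nn.MSELoss()                             # squared error (Gaussian residual)

clean = torch.rand(32, 4096)                       # stand-in for intrinsic image patches
noisy = clean + 0.1 * torch.randn_like(clean)      # corrupted input
opt.zero_grad()
loss = loss_fn(model(noisy), clean)                # reconstruct clean from noisy
loss.backward()                                    # back-propagation
opt.step()
print(float(loss))
```

Stacking several such layers and feeding them the TV-extracted intrinsic component rather than raw patches gives the training pipeline the abstract outlines.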

Keywords: auto-encoder neural network, CT image artefact, deep learning, intrinsic image representation, noise reduction, total variation

Procedia PDF Downloads 181
880 Multiple-Lump-Type Solutions of the 2D Toda Equation

Authors: Jian-Ping Yu, Wen-Xiu Ma, Yong-Li Sun, Chaudry Masood Khalique

Abstract:

In this paper, a 2D Toda equation is studied, which is a classical integrable system that plays a vital role in mathematics, physics, and other areas. A new lump-type solution is constructed using the Hirota bilinear method. One interesting feature of this research is that the lump-type solution possesses two types of multiple-lump-type waves, namely one- and two-lump-type waves. Moreover, the corresponding 3D plots, density plots, and contour plots are given to show the dynamical features of the obtained multiple-lump-type solutions.
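
The abstract does not write the equation out. In one standard form, the 2D Toda lattice reads

\[ \frac{\partial^{2}q_{n}}{\partial x\,\partial y} = e^{\,q_{n-1}-q_{n}} - e^{\,q_{n}-q_{n+1}}, \]

and the Hirota bilinear method proceeds by a dependent-variable transformation to tau functions, with lump-type solutions arising from suitable rational (polynomial) choices of the tau function.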

Keywords: 2D Toda equation, Hirota bilinear method, lump-type solution, multiple-lump-type solution

Procedia PDF Downloads 209
879 Secure Image Encryption via Enhanced Fractional Order Chaotic Map

Authors: Ismail Haddad, Djamel Herbadji, Aissa Belmeguenai, Selma Boumerdassi

Abstract:

In this paper, we provide a novel approach to image encryption that employs the Fibonacci matrix and an enhanced fractional-order chaotic map. The enhanced map overcomes the drawbacks of the classical map, especially its limited chaotic range and the non-uniform distribution of its chaotic sequences, resulting in a larger encryption key space. As a result, this strategy improves the security of the encryption system. Our experimental results demonstrate that the proposed algorithm encrypts grayscale images effectively and with exceptional efficiency. Furthermore, our technique is resistant to a wide range of potential attacks, including statistical and entropy attacks.
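
The keywords identify the classical map as the logistic map, x_{n+1} = r*x_n*(1 - x_n). The sketch below shows the kind of keystream such a map yields and how it is applied by XOR; it uses the classical map for illustration only, and the paper's enhanced fractional-order map and Fibonacci-matrix stage are not reproduced.

```python
# Keystream generation with the classical logistic map, for illustration only;
# the paper's enhanced fractional-order map and Fibonacci-matrix stage are not shown.
import numpy as np

def logistic_keystream(x0, r, n, burn_in=1000):
    """Iterate x_{k+1} = r*x_k*(1-x_k) and quantize the orbit to bytes."""
    x = x0
    for _ in range(burn_in):          # discard the transient
        x = r * x * (1.0 - x)
    out = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = int(x * 256) % 256   # quantize to one byte
    return out

img = np.random.randint(0, 256, (4, 4), dtype=np.uint8)   # stand-in grayscale image
ks = logistic_keystream(x0=0.4001, r=3.99, n=img.size)
cipher = img ^ ks.reshape(img.shape)                        # XOR encryption
assert np.array_equal(img, cipher ^ ks.reshape(img.shape))  # XOR is its own inverse
```

The classical map is strongly chaotic only for r near 4, and its orbit density is non-uniform, which is precisely the limited chaotic range and non-uniform distribution that the enhanced map is designed to overcome.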

Keywords: image encryption, logistic map, fibonacci matrix, grayscale images

Procedia PDF Downloads 299
878 Bidirectional Pendulum Vibration Absorbers with Homogeneous Variable Tangential Friction: Modelling and Design

Authors: Emiliano Matta

Abstract:

Passive resonant vibration absorbers are among the most widely used dynamic control systems in civil engineering. They typically consist of a single-degree-of-freedom mechanical appendage of the main structure, tuned to one structural target mode through frequency and damping optimization. One classical scheme is the pendulum absorber, whose mass is constrained to move along a curved trajectory and is damped by viscous dashpots. Even though the principle is well known, the search for improved arrangements is still under way. In recent years, this investigation inspired a type of bidirectional pendulum absorber (BPA), consisting of a mass constrained to move along an optimal three-dimensional (3D) concave surface. For such a BPA, the surface principal curvatures are designed to ensure bidirectional tuning of the absorber to both principal modes of the main structure, while damping is produced either by horizontal viscous dashpots or by vertical friction dashpots connecting the BPA to the main structure. In this paper, a variant of the BPA is proposed, in which damping originates from the variable tangential friction force that develops between the pendulum mass and the 3D surface as a result of a spatially varying friction coefficient pattern. Namely, a friction coefficient is proposed that varies along the pendulum surface in proportion to the modulus of the 3D surface gradient. With this assumption, the dissipative model of the absorber can be proven to be nonlinear homogeneous in the small-displacement domain. The resulting homogeneous BPA (HBPA) has a fundamental advantage over conventional friction-type absorbers, because its equivalent damping ratio turns out to be independent of the amplitude of oscillations, and therefore its optimal performance does not depend on the excitation level. On the other hand, the HBPA is more compact than viscously damped BPAs because it does not require the installation of dampers. This paper presents the analytical model of the HBPA and a methodology for its optimal design. Numerical simulations of single- and multi-story building structures under wind and earthquake loads are presented to compare the HBPA with classical viscously damped BPAs. It is shown that the HBPA is a promising alternative to existing BPA types and that homogeneous tangential friction is an effective means of realizing systems with amplitude-independent damping.
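
In symbols, if h(x, y) describes the concave 3D surface, the proposed friction pattern is of the form

\[ \mu(x,y) = \mu_{0}\,\lvert \nabla h(x,y)\rvert, \]

with mu_0 a constant design parameter (the notation is ours; the paper states the proportionality in words). For small displacements on a smooth concave surface, the gradient modulus grows linearly with the distance from the lowest point, so the tangential friction force scales with displacement in the same way as the restoring force, and the equivalent damping ratio becomes independent of the oscillation amplitude.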

Keywords: amplitude-independent damping, homogeneous friction, pendulum nonlinear dynamics, structural control, vibration resonant absorbers

Procedia PDF Downloads 137
877 The Use of Language as a Cognitive Tool in French Immersion Teaching

Authors: Marie-Josée Morneau

Abstract:

A literacy-based approach, centred on the use of the language of instruction as a cognitive tool, can increase the L2 communication skills of French immersion students. Academic subject areas such as science and mathematics offer an authentic language-learning context where students can become more proficient speakers while using specific vocabulary and language structures to learn, interact, and communicate their reasoning, when provided the opportunities and guidance to do so. In this Canadian quasi-experimental study, the effects of teaching specific language elements during mathematics classes through literacy-based activities in Early French Immersion programming were compared between two Grade 7/8 groups: the experimental group, which received literacy-based teaching for a six-week period, and the control group, which received regular teaching instruction. The results showed that the participants in the experimental group made more progress in their mathematical communication skills, which suggests that targeting the L2 as a cognitive tool can be beneficial to immersion learners studying mathematical concepts, and reminds us that all L2 teachers are language teachers.

Keywords: mathematics, French immersion, literacy-based, oral communication, L2

Procedia PDF Downloads 65
876 Design and Validation of Different Steering Geometries for an All-Terrain Vehicle

Authors: Prabhsharan Singh, Rahul Sindhu, Piyush Sikka

Abstract:

The steering system is an integral part of the vehicle and the medium through which the driver communicates with the vehicle and the terrain; hence, the steering geometry best suited to the requirements must be chosen. The function of the chosen steering geometry of an All-Terrain Vehicle (ATV) is to provide the desired understeer gradient, minimum tire slippage, and the expected weight transfer during turning, as these are the requirements of a good steering geometry for a BAJA ATV. This paper focuses on choosing the steering geometry best suited to BAJA ATV tracks by reasoning about the working principles and using fundamental trigonometric relations to obtain the candidate geometries, namely Ackermann, anti-Ackermann, and parallel Ackermann, on the same vehicle. Full-vehicle analysis was carried out in the Adams Car analysis software, and graphical results were obtained for various parameters. The steering geometries were realized by using a single versatile knuckle allowing forward and rearward tie-rod placement and were practically tested with the help of data acquisition systems set up on the ATV. Each geometry had certain characteristics and setup; their parameters were observed on the BAJA ATV, and correlations were established between analytical and practical values.
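
For reference, the ideal Ackermann condition follows from requiring both front wheels to share a single turning centre. With wheelbase L, kingpin track width w, and inner and outer road-wheel angles delta_i and delta_o, it reads

\[ \cot\delta_{o} - \cot\delta_{i} = \frac{w}{L}. \]

Parallel steering sets the two angles equal, while anti-Ackermann reverses the sense of the difference, which is why the three geometries give different tire-slip and weight-transfer behaviour on the same vehicle.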

Keywords: all-terrain vehicle, Ackermann, Adams car, Baja Sae, steering geometry, steering system, tire slip, traction, understeer gradient

Procedia PDF Downloads 139
875 Milk Curd Obstruction as a Mimic of Necrotising Enterocolitis (NEC)

Authors: Sofia Baldelli, Aman More

Abstract:

Milk curd obstruction is commonly reported as being misdiagnosed as NEC, since the two conditions mimic each other in clinical presentation, including abdominal distension, vomiting, constipation, feeding intolerance, and frank or occult blood PR. Using the case of a preterm neonate diagnosed with necrotising enterocolitis who in fact had milk curd obstruction, we compare the two diagnoses, discuss why they are hard to differentiate, and review the risk factors for clinicians to consider and the different management options. The main diagnostic tool for these conditions remains the plain radiograph; here we present the original radiograph of the neonate and discuss the classical radiological features of both diagnoses. We conclude that further imaging techniques, such as ultrasound, might be used to improve diagnosis when the X-ray is inconclusive.

Keywords: milk curd obstruction, Necrotising Enterocolitis, radiology, pediatric surgery

Procedia PDF Downloads 85
874 Representational Issues in Learning Solution Chemistry at Secondary School

Authors: Lam Pham, Peter Hubber, Russell Tytler

Abstract:

Students' conceptual understanding of chemistry concepts and phenomena involves the capability to coordinate across the three levels of Johnston's triangle model. This triplet model is based on reasoning about chemical phenomena across the macro, sub-micro, and symbolic levels. In chemistry education, there is a need to further examine inquiry-based approaches that enhance students' conceptual learning and problem-solving skills. This research adopted a directed inquiry pedagogy, based on students constructing and coordinating representations, to investigate senior school students' capabilities to move flexibly across Johnston's levels when learning dilution and molar concentration concepts. The participants comprised 50 grade 11 students, 20 grade 10 students, and 4 chemistry teachers selected from 4 secondary schools located in metropolitan Melbourne, Victoria. This research into classroom practices used an ethnographic methodology and involved teachers working collaboratively with the research team to develop representational activities and lesson sequences for the instruction of a unit on solution chemistry. The representational activities included challenges (Representational Challenges, RCs) that used 'representational tools' to assist students to move across Johnston's three levels for dilution phenomena. In this report, the 'representational tool' called the 'cross and portion' model was developed and used in teaching and learning the molar concentration concept. Students' conceptual understanding and problem-solving skills when learning with this model are analysed through group case studies of year 10 and year 11 chemistry students. In learning dilution concepts, students in both group case studies actively conducted a practical experiment and used their own language and visualisation skills to represent dilution phenomena at the macroscopic level (RC1). At the sub-microscopic level, students generated and negotiated representations of the chemical interactions between solute and solvent underpinning the dilution process. At the symbolic level, students demonstrated their understanding of dilution concepts by drawing chemical structures and performing mathematical calculations. When learning molar concentration with the 'cross and portion' model (RC2), students coordinated across visual and symbolic representational forms and across Johnston's levels to construct representations. The analysis showed that in RC1, year 10 students needed more scaffolding when being inducted into representations to make the form and function of sub-microscopic representations explicit. In RC2, year 11 students showed clarity in using visual representations (drawings) to link to the mathematics needed to solve representational challenges about molar concentration. In contrast, year 10 students struggled to match up the two systems: the symbolic system of moles per litre (the 'cross and portion' model) and the visual representation (the drawing). These conceptual problems lie not in the students' mathematical calculation capability but rather in their capability to align visual representations with the symbolic mathematical formulations. This research also found that students in both group case studies were able to coordinate representations when probed about the use of the 'cross and portion' model (in RC2) to demonstrate the molar concentration of the diluted solutions (in RC1). Students mostly succeeded in constructing 'cross and portion' models to represent the reduction of molar concentration along the concentration gradients.
In conclusion, this research demonstrates how the strategic introduction and coordination of chemical representations, across modes and across the macro, sub-micro, and symbolic levels, supported student reasoning and problem solving in chemistry.

Keywords: cross and portion, dilution, Johnston's triangle, molar concentration, representations

Procedia PDF Downloads 130
873 Ontology for Cross-Site-Scripting (XSS) Attack in Cybersecurity

Authors: Jean Rosemond Dora, Karol Nemoga

Abstract:

In this work, we tackle a problem that frequently occurs in the cybersecurity field: the exploitation of websites by XSS attacks, which are nowadays considered complicated attacks. These attacks aim to execute malicious scripts in the client's web browser by injecting code into a legitimate web page. The matter becomes serious when a website accepts user input. Attackers can exploit the web application (if it is vulnerable) and then steal sensitive data (session cookies, passwords, credit cards, etc.) from the server and/or from the client; the difficulty of exploitation, however, varies from website to website. Our focus is on the usage of ontology in cybersecurity against XSS attacks, on the importance of the ontology, and on its core meaning for cybersecurity. We explain how a vulnerable website can be exploited and how different JavaScript payloads can be used to detect vulnerabilities. We also enumerate some tools to use for an efficient analysis. We present detailed reasoning on what can be done to improve the security of a website in order to resist attacks, and we provide supporting examples. We then apply an ontology model against XSS attacks to strengthen the protection of a web application. We note, however, that the existence of an ontology does not improve security by itself; it has to be properly used and combined with as many security layers as possible.
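
As a hedged illustration of how 'different JavaScript payloads can be used to detect vulnerabilities', the sketch below probes a URL parameter for naive reflection. The payloads are classic textbook probes, the endpoint and parameter name are hypothetical, and a real assessment (or the paper's ontology-driven approach) would be far more thorough.

```python
# Naive reflected-XSS probe, for illustration only. Test only systems you are
# authorized to test. The URL and parameter name below are hypothetical.
import requests

PAYLOADS = [
    "<script>alert(1)</script>",
    "\"><img src=x onerror=alert(1)>",
    "'><svg onload=alert(1)>",
]

def probe(url, param):
    """Report payloads that come back unencoded in the response body."""
    for payload in PAYLOADS:
        resp = requests.get(url, params={param: payload}, timeout=10)
        if payload in resp.text:  # reflected verbatim -> likely not escaped
            print(f"possible XSS with payload: {payload!r}")

probe("http://testsite.example/search", "q")  # hypothetical endpoint
```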

Keywords: cybersecurity, web application vulnerabilities, cyber threats, ontology model

Procedia PDF Downloads 159
872 Element-Independent Implementation for Method of Lagrange Multipliers

Authors: Gil-Eon Jeong, Sung-Kie Youn, K. C. Park

Abstract:

The treatment of non-matching interfaces is an important computational issue. To handle this problem, the method of Lagrange multipliers, in its classical and localized versions, is the most popular technique. It imposes the interface compatibility conditions by introducing Lagrange multipliers. However, the numerical system becomes unstable and inefficient because of the Lagrange multipliers. An interface element-independent formulation that does not involve Lagrange multipliers can be obtained by mathematically modifying the independent variables. Through this modification, a more efficient and stable system is achieved while retaining accuracy equivalent to that of the conventional method. A numerical example is presented to verify the validity of the proposed method.
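
In the classical version, imposing the interface compatibility condition B u = 0 through a multiplier field lambda leads to a saddle-point system of the familiar form

\[ \begin{pmatrix} K & B^{\top} \\ B & 0 \end{pmatrix} \begin{pmatrix} \mathbf{u} \\ \boldsymbol{\lambda} \end{pmatrix} = \begin{pmatrix} \mathbf{f} \\ \mathbf{0} \end{pmatrix}, \]

whose zero diagonal block is the usual source of the instability and inefficiency mentioned above. Eliminating the multipliers through the mathematical change of independent variables is what yields the element-independent formulation.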

Keywords: element-independent formulation, interface coupling, methods of Lagrange multipliers, non-matching interface

Procedia PDF Downloads 397
871 A Flexible Pareto Distribution Using α-Power Transformation

Authors: Shumaila Ehtisham

Abstract:

In statistical distribution theory, adding an extra parameter to classical distributions is a usual practice. In this study, a new distribution referred to as the α-Power Pareto distribution is introduced by including an extra parameter. Several properties of the proposed distribution, including explicit expressions for the moment generating function, mode, quantiles, entropies, and order statistics, are obtained. The unknown parameters are estimated using the maximum likelihood estimation technique. Two real datasets are considered to examine the usefulness of the proposed distribution. It is observed that the α-Power Pareto distribution outperforms several variants of the Pareto distribution on the basis of model selection criteria.
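
The abstract does not state the transformation explicitly. In the α-power transformation as it is usually defined, a baseline CDF F(x) is mapped to

\[ G(x) = \frac{\alpha^{F(x)} - 1}{\alpha - 1}, \qquad \alpha > 0,\ \alpha \neq 1, \]

so taking F to be the Pareto CDF, F(x) = 1 - (x_m/x)^{\theta} for x >= x_m, presumably yields the α-Power Pareto distribution studied here.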

Keywords: α-power transformation, maximum likelihood estimation, moment generating function, Pareto distribution

Procedia PDF Downloads 210
870 Construction of an Assessment Tool for Early Childhood Development in the World of Discovery™ Curriculum

Authors: Divya Palaniappan

Abstract:

Early childhood assessment tools must measure the quality and appropriateness of a curriculum with respect to the culture and age of the children. Preschool assessment tools often lack psychometric properties and have been developed to measure only a few areas of development, such as specific skills in music, art, and adaptive behavior. Existing preschool assessment tools in India are predominantly informal and fraught with observer bias. The World of Discovery™ curriculum focuses on accelerating the physical, cognitive, language, social, and emotional development of pre-schoolers in India through various activities. The curriculum caters to every child irrespective of their dominant intelligence, following Gardner's theory of multiple intelligences, which concluded that 'even students as young as four years old present quite distinctive sets and configurations of intelligences'. The curriculum introduces a new theme every week in which concepts are explained through various activities so that children with different dominant intelligences can understand them. For example, the 'Insects' theme is explained through rhymes, craft, and counting corners, so that children whose dominant intelligence is musical, bodily-kinesthetic, or logical-mathematical can grasp the concept. The child's progress is evaluated using an assessment tool that measures a cluster of interdependent developmental areas: physical, cognitive, language, social, and emotional development, which for the first time provides a multi-domain approach. The assessment tool is a 5-point rating scale measuring these developmental aspects: cognitive, language, physical, social, and emotional. Each activity strengthens one or more of the developmental aspects. During the cognitive corner activity, the child's perceptual reasoning, pre-math abilities, hand-eye coordination, and fine motor skills can be observed and evaluated. The tool differs from traditional assessment methodologies by providing a framework that allows teachers to assess a child's continuous development with respect to specific activities objectively and in real time. A pilot study of the tool was conducted with sample data from 100 children in the age group 2.5 to 3.5 years. The data were collected over a period of 3 months across 10 centers in Chennai, India, scored by the class teacher once a week. The teachers were trained by psychologists on age-appropriate developmental milestones to minimize observer bias. The norms were calculated from the mean and standard deviation of the observed data. The results indicated high internal consistency among the parameters and showed that cognitive development improved with physical development; a significant positive relationship between physical and cognitive development in children was also observed in a study conducted by Sibley and Etnier. In the children, 'comprehension' ability was found to be greater than 'reasoning' and pre-math abilities, consistent with the preoperational stage of Piaget's theory of cognitive development. The average scores of the various parameters obtained through the tool thus corroborate psychological theories of child development, offering strong face validity. The study provides a comprehensive mechanism to assess a child's development and differentiate high performers from the rest. Based on the average scores, the difficulty level of activities could be increased or decreased to nurture the development of pre-schoolers, and appropriate teaching methodologies could be devised.

Keywords: child development, early childhood assessment, early childhood curriculum, quantitative assessment of preschool curriculum

Procedia PDF Downloads 351
869 Use of EPR in Experimental Mechanics

Authors: M. Sikoń, E. Bidzińska

Abstract:

An attempt to apply EPR (electron paramagnetic resonance) spectroscopy to the experimental analysis of the mechanical state of a loaded material is considered in this work. The theory concerns the participation of electrons in the transfer of mechanical action. The measurement model is presented using both classical mechanics and quantum mechanics. The theoretical analysis is verified using EPR spectroscopy twice, once for the free specimen and once for the mechanically loaded specimen. Positive results, in the form of different spectra for the free and loaded materials, are used to describe the mechanical state in the continuum on the basis of statistical mechanics. The perturbation of the optical electrons in the field of the mechanical interactions inspires us to propose new optical properties of materials under mechanical stress.

Keywords: Cosserat medium, EPR spectroscopy, optical active electrons, optical activity

Procedia PDF Downloads 365
868 Ideological Passing: A Study of Tawfiq Al-Hakim’s The River of Madness

Authors: Yasser Khamis Ragab Aman

Abstract:

Tawfiq Al-Hakim (1898-1987) celebrated the 1919 Revolution by writing The Return of the Spirit, published in 1933, a novel which portrayed national awakening and illustrated the cult of a nationalist leader such as Saad Zaghloul, so much so that it influenced Egypt's first president, Gamal Abdel Nasser. However, in 1974, out of an excruciating sense of disappointment, Al-Hakim wrote The Return of Consciousness. Between losing and regaining consciousness, Al-Hakim wrote The River of Madness, a short play published in 1937. It portrays an old kingdom established in a distant place, where there is a conflict between the King and his minister, who have not drunk from the river of madness, on the one hand, and, on the other, the inhabitants of the kingdom, who think the King and the minister have gone mad because they refused to drink from the river: each party doubts the sanity of the other. By philosophical reasoning, the minister convinces the King that it is safer to go mad with the majority than to be treated as an unwanted minority. It is argued that in The River of Madness, Al-Hakim deftly portrays an example of ideological passing as an alternative solution that can save the country from the woes of the aftermath of revolution and civil war.

Keywords: ideological passing, Al Hakim, The River of Madness, Arabic literature

Procedia PDF Downloads 111
867 Semi-Empirical Modeling of Heat Inactivation of Enterococci and Clostridia During the Hygienisation in Anaerobic Digestion Process

Authors: Jihane Saad, Thomas Lendormi, Caroline Le Marechal, Anne-Marie Pourcher, Céline Druilhe, Jean-Louis Lanoiselle

Abstract:

Agricultural anaerobic digestion consists of the conversion of animal slurry and manure into biogas and digestate. According to the European regulations (EC n°1069/2009 & EU n°142/2011), these materials must, however, be treated at 70 ºC for 60 min before anaerobic digestion. The impact of such heat treatment on the fate of bacteria has been poorly studied up to now. Moreover, a recent study¹ has shown that enterococci and clostridia are still detected despite the application of this thermal treatment, questioning the relevance of this approach for the hygienisation of digestate. The aim of this study is to establish the heat inactivation kinetics of two species of enterococci (Enterococcus faecalis and Enterococcus faecium) and two species of clostridia (Clostridioides difficile and Clostridium novyi, as a non-toxic model for Clostridium botulinum of group III). A pure culture of each strain was prepared in a specific sterile medium at a concentration of 10⁴-10⁷ MPN/mL (most probable number), depending on the bacterial species. The bacterial suspensions were then filled into sterilized capillary tubes and placed in a water or oil bath at the desired temperature for a specific period of time. Each bacterial suspension was enumerated using an MPN approach, and the tests were repeated three times for each temperature/time couple. The inactivation kinetics of the four indicator bacteria are described using the Weibull model and the classical Bigelow model of first-order kinetics. The Weibull model takes biological variation with respect to thermal inactivation into account and is essentially a statistical model of the distribution of inactivation times; the classical first-order approach is a special case of the Weibull model. The heat treatment at 70 ºC / 60 min achieves a reduction greater than 5 log10 for E. faecium and E. faecalis. However, it results in a reduction of only about 0.7 log10 for C. difficile and an increase of 0.5 log10 for C. novyi. Treatments at higher temperatures are required to reach a reduction greater than or equal to 3 log10 for C. novyi (such as 30 min / 100 ºC, 13 min / 105 ºC, 3 min / 110 ºC, or 1 min / 115 ºC), raising the question of the relevance of the 70 ºC / 60 min heat treatment for these spore-forming bacteria. To conclude, the heat treatment (70 ºC / 60 min) defined by the European regulation is sufficient to inactivate non-sporulating bacteria, whereas higher temperatures (> 100 ºC) are required for spore-forming bacteria to reach a 3 log10 reduction (sporicidal activity).
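
For reference, the two survival models compared here are usually written as follows, with N(t) the surviving concentration, D the decimal reduction time, δ the time to the first decimal reduction, and p the shape parameter (p = 1 recovers the first-order model):

\[ \log_{10}\frac{N(t)}{N_{0}} = -\frac{t}{D} \quad \text{(Bigelow)}, \qquad \log_{10}\frac{N(t)}{N_{0}} = -\left(\frac{t}{\delta}\right)^{p} \quad \text{(Weibull)}. \]

In the Weibull reading, p different from 1 captures upward or downward curvature of the survival curve, reflecting the biological variation in inactivation times mentioned above.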

Keywords: heat treatment, enterococci, clostridia, inactivation kinetics

Procedia PDF Downloads 98
866 Solution of Some Boundary Value Problems of the Generalized Theory of Thermo-Piezoelectricity

Authors: Manana Chumburidze

Abstract:

We consider a non-classical model of dynamical problems for a conjugated system of differential equations arising in thermo-piezoelectricity, as formulated by Toupin and Mindlin. The basic concepts and the general theory of solvability for isotropic homogeneous elastic media are considered. The problems are treated using the Laplace integral transform, the potential method, and singular integral equations. Approximate solutions of mixed boundary value problems for a finite domain bounded by a closed surface are constructed and solved explicitly using the generalized Fourier series method.

Keywords: thermo-piezoelectricity, boundary value problems, Fourier's series, isotropic homogeneous elastic media

Procedia PDF Downloads 452
865 Association Rules Mining and NOSQL Oriented Document in Big Data

Authors: Sarra Senhadji, Imene Benzeguimi, Zohra Yagoub

Abstract:

Big Data refers to the recent technology for manipulating voluminous and unstructured data sets over multiple sources, and NoSQL has emerged to handle the problem of unstructured data. Association rule mining is one of the popular data mining techniques for extracting hidden relationships from transactional databases, and the algorithm for finding association dependencies lends itself well to MapReduce. The goal of our work is to reduce the time needed to generate frequent itemsets by using MapReduce and a document-oriented NoSQL database. A comparative study is given to evaluate the performance of our algorithm against the classical Apriori algorithm.
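
A minimal sketch of the MapReduce-style counting step at the heart of Apriori's frequent-itemset generation is given below, with plain Python stand-ins for the map and reduce phases; the paper's Hadoop/MongoDB pipeline is not reproduced.

```python
# One Apriori pass expressed as map/reduce steps, in plain Python for illustration;
# in the paper's setting the same phases would run on Hadoop over MongoDB documents.
from itertools import combinations
from collections import Counter

transactions = [  # stand-in for documents in a NoSQL collection
    {"bread", "milk"},
    {"bread", "diapers", "beer", "eggs"},
    {"milk", "diapers", "beer", "cola"},
    {"bread", "milk", "diapers", "beer"},
    {"bread", "milk", "diapers", "cola"},
]
min_support = 3  # absolute support threshold (assumed)

def map_phase(txn, k):
    """Emit (itemset, 1) pairs for every k-itemset in one transaction."""
    return [(frozenset(c), 1) for c in combinations(sorted(txn), k)]

def reduce_phase(pairs):
    """Sum the counts per itemset and keep the frequent ones."""
    counts = Counter()
    for itemset, one in pairs:
        counts[itemset] += one
    return {s: n for s, n in counts.items() if n >= min_support}

pairs = [p for txn in transactions for p in map_phase(txn, 2)]
print(reduce_phase(pairs))  # frequent 2-itemsets with their supports
```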

Keywords: Apriori, Association rules mining, Big Data, Data Mining, Hadoop, MapReduce, MongoDB, NoSQL

Procedia PDF Downloads 150
864 Role-Governed Categorization and Category Learning as a Result from Structural Alignment: The RoleMap Model

Authors: Yolina A. Petrova, Georgi I. Petkov

Abstract:

The paper presents a symbolic model for category learning and categorization, called RoleMap. Unlike other models, which implement learning in a separate working mode, role-governed category learning and categorization emerge in RoleMap while it performs its usual reasoning. The model is based on several basic mechanisms known to reflect the sub-processes of analogy-making. It rests on the assumption that, in their everyday life, people constantly compare what they experience with what they know. Various commonalities between the incoming information (the current experience) and the stored information (long-term memory) emerge from those comparisons. Some of those commonalities are considered highly important and are transformed into concepts for further use; this process constitutes category learning. When knowledge is missing from the incoming information (i.e., the perceived object is not yet recognized), the model forms anticipations about what is missing, based on similar episodes from its long-term memory. Various such anticipations may emerge for different reasons, but with time only one of them wins and is transformed into a category member; this process constitutes the act of categorization.

Keywords: analogy-making, categorization, category learning, cognitive modeling, role-governed categories

Procedia PDF Downloads 131