Search results for: MODELICA simulation language
4639 Solar Cell Degradation by Electron Irradiation: Effect of Irradiation Fluence
Authors: H. Mazouz, A. Belghachi, F. Hadjaj
Abstract:
Solar cells used in orbit are exposed to a radiation environment consisting mainly of protons and high-energy electrons. These particles degrade the output parameters of the solar cell. The aim of this work is to characterize the effect of electron irradiation fluence on the J(V) characteristic and output parameters of a GaAs solar cell by numerical simulation. The results obtained demonstrate that the electron irradiation-induced degradation of cell performance mainly affects the short-circuit current.
Keywords: GaAs solar cell, MeV electron irradiation, irradiation fluence, short circuit
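As an illustration of how short-circuit-current degradation is commonly parameterized against fluence, the sketch below evaluates the widely used semi-empirical law Jsc(φ) = Jsc0·[1 − C·log10(1 + φ/φx)]; the coefficients and the pre-irradiation current are assumed placeholders, not values fitted in this work.

```python
import numpy as np

def jsc_remaining_factor(fluence, c=0.12, phi_x=5e13):
    """Semi-empirical remaining factor for the short-circuit current after
    electron irradiation; c and phi_x are illustrative placeholders."""
    return 1.0 - c * np.log10(1.0 + fluence / phi_x)

jsc_0 = 30.0  # mA/cm^2, assumed pre-irradiation short-circuit current of the GaAs cell
for phi in (1e13, 1e14, 1e15, 1e16):  # electron fluence, e/cm^2
    jsc = jsc_0 * jsc_remaining_factor(phi)
    print(f"fluence {phi:.0e} e/cm^2 -> Jsc ~ {jsc:.1f} mA/cm^2")
```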
Procedia PDF Downloads 479
4638 Two Strain Dengue Dynamics Incorporating Temporary Cross Immunity with ADE Effect
Authors: Sunita Gakkhar, Arti Mishra
Abstract:
In this paper, a nonlinear host-vector model is proposed and analyzed for two-strain dengue dynamics incorporating the antibody-dependent enhancement (ADE) effect. The model assumes that asymptomatic infected people are more responsible for secondary infection than symptomatic ones and differentiates between them. Existence conditions are obtained for the various equilibrium points. The basic reproduction number is computed and analyzed to explore the effect of the secondary-infection enhancement parameter on dengue infection. Stability analyses of the equilibrium states are performed, and numerical simulations illustrate the stability of the endemic state.
Keywords: dengue, ADE, stability, threshold, asymptomatic, infection
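For readers unfamiliar with host-vector formulations, the sketch below integrates a minimal single-strain host-vector model with SciPy; it illustrates the model class only, it is not the two-strain ADE system analyzed by the authors, and every parameter value is assumed.

```python
from scipy.integrate import solve_ivp

# Minimal single-strain host-vector model with normalized populations
# (illustrative parameter values only, not those of the paper).
beta_h, beta_v = 0.4, 0.3    # transmission rates: vector->host, host->vector
gamma, mu_v = 0.14, 0.07     # host recovery rate, vector birth/death rate

def rhs(t, y):
    sh, ih, rh, sv, iv = y
    new_h = beta_h * sh * iv           # new host infections
    new_v = beta_v * sv * ih           # new vector infections
    return [-new_h,
            new_h - gamma * ih,
            gamma * ih,
            mu_v - new_v - mu_v * sv,  # vector recruitment minus infection and death
            new_v - mu_v * iv]

sol = solve_ivp(rhs, (0.0, 365.0), [0.99, 0.01, 0.0, 0.95, 0.05])
print(f"infectious host fraction after one year: {sol.y[1, -1]:.4f}")
```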
Procedia PDF Downloads 431
4637 Design of Compact UWB Multilayered Microstrip Filter with Wide Stopband
Authors: N. Azadi-Tinat, H. Oraizi
Abstract:
The design of a compact UWB multilayered microstrip filter with E-shape resonators is presented, which provides a wide stopband up to 20 GHz and arbitrary impedance matching. The design procedure is developed based on the method of least squares and the theory of N-coupled transmission lines. The dimensions of the designed filter are about 11 mm × 11 mm, and the three E-shape resonators are placed among four dielectric layers. The average insertion loss in the passband is less than 1 dB, and the stopband attenuation is about 30 dB up to 20 GHz. The group delay in the UWB region is about 0.5 ns. The performance of the optimized filter design agrees well with microwave simulation software.
Keywords: method of least squares, multilayer microstrip filter, N-coupled transmission lines, ultra-wideband
Procedia PDF Downloads 393
4636 Widely Diversified Macroeconomies in the Super-Long Run Cast Doubt on the Path-Independent Equilibrium Growth Model
Authors: Ichiro Takahashi
Abstract:
One of the major assumptions of mainstream macroeconomics is the path independence of the capital stock. This paper challenges this assumption by employing an agent-based approach. The simulation results showed the existence of multiple "quasi-steady state" equilibria of the capital stock, which may cast serious doubt on the validity of the assumption. The finding would give a better understanding of many phenomena that involve hysteresis, including the causes of poverty. The "market-clearing view" has been widely shared among major schools of macroeconomics. According to this view, the capital stock, the labor force, and technology determine the "full-employment" equilibrium growth path, and demand/supply shocks can move the economy away from the path only temporarily: the dichotomy between short-run business cycles and the long-run equilibrium path. The view thus implicitly assumes the long-run capital stock to be independent of how the economy has evolved. In contrast, "Old Keynesians" have recognized fluctuations in output as arising largely from fluctuations in real aggregate demand. An interesting question, then, is whether an agent-based macroeconomic model, which is known to exhibit path dependence, can generate multiple full-employment equilibrium trajectories of the capital stock in the super-long run. If the answer is yes, the equilibrium level of the capital stock, an important supply-side factor, would no longer be independent of the business cycle phenomenon. This paper attempts to answer the above question by using the agent-based macroeconomic model developed by Takahashi and Okada (2010). The model serves this purpose well because it has neither population growth nor technological progress. The objective of the paper is twofold: (1) to explore the causes of the long-term business cycle, and (2) to examine the super-long-run behavior of the capital stock of full-employment economies. (1) The simulated behaviors of the key macroeconomic variables such as output, employment, and real wages showed widely diversified macroeconomies. They were often remarkably stable but exhibited both short-term and long-term fluctuations. The long-term fluctuations occur through two adjustments: the quantity and relative-cost adjustments of the capital stock. The first is obvious and assumed by many business cycle theorists. As for the second, reduced aggregate demand lowers prices, which raises real wages, thereby decreasing the relative cost of capital stock with respect to labor. (2) The long-term business cycles/fluctuations were synthesized with the hysteresis of real wages, interest rates, and investments. In particular, a sequence of simulation runs with a super-long simulation period generated a wide range of perfectly stable paths, many of which achieved full employment: all the macroeconomic trajectories, including capital stock, output, and employment, were perfectly horizontal over 100,000 periods. Moreover, the full-employment level of capital stock was influenced by the history of unemployment, which was itself path-dependent. Thus, an experience of severe unemployment in the past kept the real wage low, which discouraged relatively costly investment in capital stock. Meanwhile, a history of good performance sometimes brought about a low capital stock due to a high interest rate that was consistent with strong investment.
Keywords: agent-based macroeconomic model, business cycle, hysteresis, stability
Procedia PDF Downloads 213
4635 Modelling of the Europay, MasterCard, and Visa Card Authentications by Using the AVISPA Tool
Authors: Ossama Al-Maliki
Abstract:
Europay, MasterCard, and Visa (EMV) is the transaction protocol used in most of the world, especially in Europe and the UK. The EMV protocol consists of three main stages: card authentication, cardholder verification, and transaction authorization. This paper details the EMV card authentication stage in full. We have used the AVISPA and SPAN tools to model the EMV card authentications, with the code for each type of card authentication written in the CAS+ language. The results show that our models successfully address all the steps of the EMV card authentications and that the entire EMV card authentication process is secure. Our models also address all the main goals behind the EMV card authentications according to the EMV specifications.
Keywords: EMV, card authentication, contactless card, SDA, DDA, CDA, AVISPA
Procedia PDF Downloads 181
4634 Construction of an Assessment Tool for Early Childhood Development in the World of Discovery™ Curriculum
Authors: Divya Palaniappan
Abstract:
Early childhood assessment tools must measure the quality and appropriateness of a curriculum with respect to the culture and age of the children. Many preschool assessment tools lack psychometric properties and were developed to measure only a few areas of development, such as specific skills in music, art and adaptive behavior. Existing preschool assessment tools in India are predominantly informal and are fraught with the judgmental bias of observers. The World of Discovery™ curriculum focuses on accelerating the physical, cognitive, language, social and emotional development of pre-schoolers in India through various activities. The curriculum caters to every child irrespective of their dominant intelligence, as per Gardner's theory of multiple intelligences, which concluded that "even students as young as four years old present quite distinctive sets and configurations of intelligences". The curriculum introduces a new theme every week in which concepts are explained through various activities so that children with different dominant intelligences can understand them. For example, the 'Insects' theme is explained through rhymes, craft and a counting corner, so that children whose dominant intelligence is musical, bodily-kinesthetic or logical-mathematical can each grasp the concept. The child's progress is evaluated using an assessment tool that measures a cluster of inter-dependent developmental areas: physical, cognitive, language, social and emotional development, which for the first time offers a multi-domain approach. The assessment tool is a 5-point rating scale that covers these developmental aspects: cognitive, language, physical, social and emotional. Each activity strengthens one or more of the developmental aspects. During the cognitive corner, the child's perceptual reasoning, pre-math abilities, hand-eye coordination and fine motor skills can be observed and evaluated. The tool differs from traditional assessment methodologies by providing a framework that allows teachers to assess a child's continuous development with respect to specific activities in real time and objectively. A pilot study of the tool was conducted with a sample of 100 children in the age group 2.5 to 3.5 years. The data were collected over a period of 3 months across 10 centers in Chennai, India, scored by the class teacher once a week. The teachers were trained by psychologists on age-appropriate developmental milestones to minimize observer bias. The norms were calculated from the mean and standard deviation of the observed data. The results indicated high internal consistency among parameters and that cognitive development improved with physical development. A significant positive relationship between physical and cognitive development has also been observed among children in a study conducted by Sibley and Etnier. In the children studied, 'comprehension' ability was found to be greater than 'reasoning' and pre-math abilities, as indicated by the preoperational stage of Piaget's theory of cognitive development. The average scores of the various parameters obtained through the tool corroborate psychological theories of child development, offering strong face validity. The study provides a comprehensive mechanism to assess a child's development and differentiate high performers from the rest.

Based on the average scores, the difficulty level of activities can be increased or decreased to nurture the development of pre-schoolers, and appropriate teaching methodologies can be devised.
Keywords: child development, early childhood assessment, early childhood curriculum, quantitative assessment of preschool curriculum
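As a rough illustration of how norms can be derived from such ratings (the procedure described is the authors'; the numbers below are invented), a minimal sketch of computing a domain norm and flagging high performers:

```python
import numpy as np

# Hypothetical weekly ratings (1-5) of one developmental domain for 100 children.
rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=100).astype(float)  # placeholder data, not the study's

mean, sd = scores.mean(), scores.std(ddof=1)
z_scores = (scores - mean) / sd          # standardized position of each child
high_performers = np.sum(z_scores > 1)   # e.g. children more than one SD above the norm
print(f"norm: mean={mean:.2f}, sd={sd:.2f}, children above +1 SD: {high_performers}")
```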
Procedia PDF Downloads 363
4633 SCR-Based Advanced ESD Protection Device for Low Voltage Application
Authors: Bo Bae Song, Byung Seok Lee, Hyun young Kim, Chung Kwang Lee, Yong Seo Koo
Abstract:
This paper proposes a silicon controlled rectifier (SCR) based ESD protection device to protect low-voltage integrated circuits from ESD. The proposed device has a low trigger voltage and a high holding voltage compared with conventional SCR-based ESD protection devices. The proposed ESD protection circuit is verified and compared by TCAD simulation. This paper verifies effective low-voltage ESD characteristics, with a low trigger voltage of 5.79 V and a high holding voltage of 3.5 V, obtained through optimization of the design variables (D1, D2, D3, and D4).
Keywords: ESD, SCR, holding voltage, latch-up
Procedia PDF Downloads 579
4632 High Purity Germanium Detector Characterization by Means of Monte Carlo Simulation through Application of Geant4 Toolkit
Authors: Milos Travar, Jovana Nikolov, Andrej Vranicar, Natasa Todorovic
Abstract:
Over the years, High Purity Germanium (HPGe) detectors have proved to be an excellent practical tool and, as such, have established their wide use today in low-background γ-spectrometry. One of the advantages of gamma-ray spectrometry is its easy sample preparation, as chemical processing and separation of the studied subject are not required. Thus, with a single measurement, one can simultaneously perform both qualitative and quantitative analysis. One of the most prominent features of HPGe detectors, besides their excellent efficiency, is their superior resolution. This feature virtually allows a researcher to perform a thorough analysis by discriminating photons of similar energies in the studied spectra where they would otherwise superimpose within a single-energy peak and, as such, could compromise the analysis and produce wrongly assessed results. Naturally, this feature is of great importance when the identification of radionuclides, as well as their activity concentrations, is practiced and high precision is a necessity. In measurements of this nature, in order to reproduce good and trustworthy results, one has to have initially performed an adequate full-energy peak (FEP) efficiency calibration of the equipment used. However, experimental determination of the response, i.e., the efficiency curves for a given detector-sample configuration and geometry, is not always easy and requires a certain set of reference calibration sources in order to cover the broader energy ranges of interest. With the goal of overcoming these difficulties, many researchers have turned towards the application of different software toolkits that implement the Monte Carlo method (e.g., MCNP, FLUKA, PENELOPE, Geant4, etc.), as it has proven time and time again to be a very powerful tool. In the process of creating a reliable model, one has to have well-established and described specifications of the detector. Unfortunately, the documentation that manufacturers provide alongside the equipment is rarely sufficient for this purpose. Furthermore, certain parameters tend to evolve and change over time, especially with older equipment. Deterioration of these parameters consequently decreases the active volume of the crystal and can thus affect the efficiencies by a large margin if it is not properly taken into account. In this study, the optimisation of models of two HPGe detectors through the implementation of the Geant4 toolkit developed at CERN is described, with the goal of further improving simulation accuracy in calculations of FEP efficiencies by investigating the influence of certain detector variables (e.g., crystal-to-window distance, dead-layer thicknesses, inner crystal void dimensions, etc.). The detectors on which the optimisation procedures were carried out were a standard traditional co-axial extended range detector (XtRa HPGe, CANBERRA) and a broad energy range planar detector (BEGe, CANBERRA). The optimised models were verified through comparison with experimentally obtained data from measurements of a set of point-like radioactive sources. The acquired results for both detectors displayed good agreement with the experimental data, falling within an average statistical uncertainty of ∼4.6% for the XtRa and ∼1.8% for the BEGe detector within the energy ranges of 59.4–1836.1 keV and 59.4–1212.9 keV, respectively.
Keywords: HPGe detector, γ spectrometry, efficiency, Geant4 simulation, Monte Carlo method
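For context, the experimental FEP efficiency against which such Geant4 models are validated is typically the net peak area divided by the number of photons emitted during the acquisition; a minimal sketch with invented numbers (not data from this study):

```python
import math

def fep_efficiency(net_peak_counts, activity_bq, emission_prob, live_time_s,
                   half_life_s=None, decay_elapsed_s=0.0):
    """Full-energy-peak efficiency = detected peak counts / photons emitted,
    with an optional decay correction of the certified source activity."""
    if half_life_s is not None:
        activity_bq *= math.exp(-math.log(2.0) * decay_elapsed_s / half_life_s)
    emitted = activity_bq * emission_prob * live_time_s
    return net_peak_counts / emitted

# Illustrative numbers only: a hypothetical Cs-137 point source, 661.7 keV line.
eff = fep_efficiency(net_peak_counts=120_000, activity_bq=5_000,
                     emission_prob=0.851, live_time_s=3600)
print(f"FEP efficiency at 661.7 keV: {eff:.4f}")
```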
Procedia PDF Downloads 125
4631 Convergence Analysis of Reactive Power Based Schemes Used in Sensorless Control of Induction Motors
Authors: N. Ben Si Ali, N. Benalia, N. Zerzouri
Abstract:
Many electronic drives for induction motor control are based on sensorless technologies. Speed and torque control is usually attained by means of a speed or position sensor, which requires additional mounting space, reduces the reliability and increases the cost. This paper analyzes the dynamic performance and the sensitivity to motor parameter changes of a reactive-power-based technique used in the sensorless control of induction motors. The validity of the theoretical results is verified by simulation.
Keywords: adaptive observers, model reference adaptive system, RP-based estimator, sensorless control, stability analysis
Procedia PDF Downloads 550
4630 FEM Analysis of an Occluded Ear Simulator with Narrow Slit Pathway
Authors: Manabu Sasajima, Takao Yamaguchi, Yoshio Koike, Mitsuharu Watanabe
Abstract:
This paper discusses the propagation of sound waves in air, specifically in the narrow rectangular pathways of an occluded-ear simulator for acoustic measurements. In narrow pathways, both the speed of sound and the phase of the sound waves are affected by the damping due to air viscosity. Herein, we propose a new finite-element method (FEM) that considers the effects of air viscosity. The method was developed as an extension of existing FEMs for porous, sound-absorbing materials. The results of a numerical calculation for a three-dimensional ear-simulator model using the proposed FEM were validated by comparison with a theoretical lumped-parameter model and with standard values.
Keywords: ear simulator, FEM, simulation, viscosity
Procedia PDF Downloads 447
4629 2D Simulation of the Flare Test on Steel Tubes
Authors: B. Daheche, M. T. Hannachi, H. Djebaili
Abstract:
In this work, we describe the flare test for tubes welded by high-frequency (HF) induction and its experimental application. The tests were carried out at ENTTPP (National company of pipe mill and processing of flat products). The finished products (tubes) usually undergo a series of destructive tests in order to assess the quality of the weld. The flare test is performed on pipe sections of a length defined in the specification, under a prescribed load (pressure) that depends, among other parameters, on mechanical properties (fracture resistance) and geometry (tube thickness, outside diameter); the variation of this load is carefully monitored and recorded.
Keywords: flare, destructive testing, pressure, tube blanks, finished tube
Procedia PDF Downloads 319
4628 Design and Development of a Prototype Vehicle for Shell Eco-Marathon
Authors: S. S. Dol
Abstract:
Improvements in vehicle efficiency can reduce global fossil fuel consumption. For that reason, Shell Global Corporation introduced the Shell Eco-marathon, in which student teams are required to design, build and test energy-efficient vehicles. This paper therefore focuses on the design process and the development of a fuel-efficient vehicle that satisfies the requirements of the competition. In this project, three components are designed and analyzed: the body, the chassis and the powertrain of the vehicle. An optimum design for each component is produced through simulation analysis and theoretical calculation, with improvements made as the project progresses.
Keywords: energy efficient, drag force, chassis, powertrain
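As a quick example of the kind of theoretical calculation involved in body design (the figures below are assumptions, not the team's), the aerodynamic drag and the power needed to overcome it can be estimated as follows:

```python
RHO_AIR = 1.225  # kg/m^3 at sea level

def drag_force(cd, frontal_area_m2, speed_mps):
    """Aerodynamic drag F = 0.5 * rho * Cd * A * v^2."""
    return 0.5 * RHO_AIR * cd * frontal_area_m2 * speed_mps ** 2

# Hypothetical prototype: Cd = 0.12, frontal area = 0.35 m^2, 25 km/h cruise speed.
v = 25.0 / 3.6
f_drag = drag_force(0.12, 0.35, v)
print(f"drag force: {f_drag:.2f} N, drag power at cruise: {f_drag * v:.1f} W")
```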
Procedia PDF Downloads 340
4627 The Diary of Dracula, by Marin Mincu: Inquiries into a Romanian 'Book of Wisdom' as a Fictional Counterpart for Corpus Hermeticum
Authors: Lucian Vasile Bagiu, Paraschiva Bagiu
Abstract:
The novel, written in Italian and published in Italy in 1992 by the Romanian scholar Marin Mincu, is meant for the foreign reader, aiming at a better knowledge of the historical character of Vlad the Impaler (Vlad Dracul) within the European cultural, political and historical context of 1463. Throughout the very well written tome, one comes to realize that one of the underlying levels of the fiction is the exposure of various fundamental features of Romanian culture and civilization. The author of the diary, Dracula, mentions Corpus Hermeticum no fewer than fifteen times, suggesting that his own diary is some sort of philosophical counterpart. The essay focuses on several 'truths' and items of 'wisdom' revealed in the fictional teachings of Dracula. The boycott of History by the Romanians is identified as an echo of the philosophical approach of the famous Romanian scholar and writer Lucian Blaga. The orality of Romanian culture is a landmark opposed to the written culture of Western Europe. The religion of the ancient Dacian god Zalmoxis is seen as the basis for the Romanian existential and/or metaphysical ethnic philosophy (a feature tackled by the famous Romanian historian of religion Mircea Eliade), with a suggestion that Hermes Trismegistus may have written his Corpus Hermeticum under the influence of Zalmoxis. The historical figure of the last Dacian king, Decebalus (d. 106 AD), is a good pretext for a tantalizing Indo-European suggestion that the prehistoric Thraco-Dacian people may have been the ancestors of the first Romans settled in Latium. The lost diary of the Emperor Trajan, De Bello Dacico, may have proved that the unknown language of the Dacians was very much like the Latin language (a secret well hidden by the Vatican). The attitude of the Dacians towards death, as described by Herodotus, may have later inspired Pythagoras, Socrates, the Eleusinian and Orphic Mysteries, etc. All of this sits within the humanist and Renaissance European context of the epoch, Dracula having close relationships with scholars such as Nicolaus Cusanus, Cosimo de' Medici, Marsilio Ficino, Pope Pius II, etc. Thus The Diary of Dracula turns out to be as exciting and stupefying as Corpus Hermeticum, a book impossible to assimilate entirely, yet a reference not wise to ignore.
Keywords: Corpus Hermeticum, Dacians, Dracula, Zalmoxis
Procedia PDF Downloads 162
4626 An Alternative Proof for the Topological Entropy of the Motzkin Shift
Authors: Fahad Alsharari, Mohd Salmi Md. Noorani
Abstract:
A Motzkin shift is a mathematical model for constraints on genetic sequences. In terms of the theory of symbolic dynamics, the Motzkin shift is nonsofic, and therefore we cannot use Perron-Frobenius theory to calculate its topological entropy. The Motzkin shift M(M,N), which comes from language theory, is defined to be the shift system over an alphabet A that consists of N negative symbols, N positive symbols and M neutral symbols. For an x in the full shift A^Z, x is in M(M,N) if and only if every finite block appearing in x has a non-zero reduced form. Therefore, the constraint on x cannot be bounded in length. K. Inoue has shown that the entropy of the Motzkin shift M(M,N) is log(M + N + 1). In this paper, we give a new method of calculating the topological entropy of the Motzkin shift M(M,N) without any measure-theoretical discussion.
Keywords: entropy, Motzkin shift, mathematical model, theory
Procedia PDF Downloads 479
4625 Early Detection of Damages in Railway Steel Truss Bridges from Measured Dynamic Responses
Authors: Dinesh Gundavaram
Abstract:
This paper presents an investigation of bridge damage detection based on dynamic responses estimated from a passing vehicle. A numerical model of a railway steel truss bridge was used in this investigation. The bridge response at different locations is obtained using the CSiBridge software. Several damage scenarios are considered, including different locations and severities. The possibility of using the dynamic properties of the global modes to identify structural changes in truss bridges is discussed based on the results.
Keywords: bridge, damage, dynamic responses, detection
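Damage indicators of this kind are often based on shifts in the natural frequencies of the global modes; the sketch below shows one generic way to extract a dominant frequency from an acceleration record (here a synthesized signal), and is not the authors' detection procedure.

```python
import numpy as np

fs = 200.0                         # sampling rate in Hz (assumed)
t = np.arange(0.0, 20.0, 1.0 / fs)
# Synthetic acceleration: a 4.2 Hz "global mode" plus noise (placeholder signal).
acc = np.sin(2 * np.pi * 4.2 * t) + 0.3 * np.random.default_rng(1).normal(size=t.size)

spectrum = np.abs(np.fft.rfft(acc))
freqs = np.fft.rfftfreq(acc.size, d=1.0 / fs)
dominant = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
print(f"identified dominant frequency: {dominant:.2f} Hz")
# A drop in this frequency between a baseline record and a later record can flag damage.
```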
Procedia PDF Downloads 277
4624 Effective Teaching of Thermofluid Practical Courses during COVID-19
Authors: Opeyemi Fadipe, Masud Salimian
Abstract:
The COVID-19 pandemic has introduced a new normal into the world; online teaching is now used more widely than face-to-face teaching. Online teaching has improved over time as more technological tools have been introduced. Practical courses are more demanding to teach because they require the physical presence of the student as well as a demonstration of the equipment. In this study, the thermofluid practical course at Lagos State University was taken as a case study. A survey was prepared and given to a sample of students to fill in. The results showed that a blended approach is better for teaching practical courses. Software simulation of the equipment used to conduct the practicals should be encouraged in the future.
Keywords: COVID-19, online teaching, t-distribution, thermofluid
Procedia PDF Downloads 181
4623 Study of Hybrid Cells Based on Perovskite Materials Using OghmaNano Simulation
Authors: Nadia Bachir (Dahmani), Fatima Zohra Otmani
Abstract:
Due to its interesting optoelectronic properties, the methylammonium perovskite CH3NH3PbI3 is used as the active layer in the development of several solar cells. In this work, a hybrid (organic-inorganic) cell with the architecture FTO/PEDOT:PSS/CH3NH3PbI3/PCDTBT/Al is simulated using the Organic and Hybrid Material Nano Simulation Tool (OghmaNano). We studied the influence of certain parameters, such as layer thickness, on the characteristics of the solar cell. The effect of the device temperature was also investigated. The photovoltaic characteristic curves, such as current-voltage (J-V) curves, are presented in this work. The optimized final parameters are Voc = 0.947 V, FF = 0.8034, and PCE = 23.16%.
Keywords: OghmaNano software, hybrid perovskite cell, CH3NH3PbI3, conversion efficiency
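For reference, the reported figures of merit are related by PCE = Voc · Jsc · FF / Pin. Since Jsc is not quoted in the abstract, the sketch below back-calculates the Jsc consistent with the reported Voc, FF and PCE under standard AM1.5G illumination, which is an assumption rather than a value from the paper:

```python
P_IN = 100.0                          # mW/cm^2, standard AM1.5G illumination (assumed)
voc, ff, pce = 0.947, 0.8034, 23.16   # reported values (PCE in %)

# PCE [%] = 100 * Voc [V] * Jsc [mA/cm^2] * FF / P_in [mW/cm^2]
jsc = (pce / 100.0) * P_IN / (voc * ff)
print(f"implied short-circuit current density: {jsc:.2f} mA/cm^2")
```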
Procedia PDF Downloads 19
4622 Engineering Analysis for Fire Safety Using Computational Fluid Dynamics (CFD)
Authors: Munirajulu M, Srikanth Modem
Abstract:
A large cricket stadium with the capacity to accommodate several thousand spectators has a seating arena consisting of a two-tier arrangement, with an upper and a lower bowl and an intermediate concourse podium level for pedestrian movement to access the bowls. The uniqueness of the stadium is that spectators have an unobstructed view from all around the podium towards the field of play. The upper and lower bowls are connected by stairs. The stair landings are precast slabs supported by cantilevered steel beams, which are fixed to the precast columns supporting the stadium structure. The stair slabs are precast concrete supported on a landing slab and cantilevered steel beams. In the event of a fire at podium level between two staircases, the fire resistance of the steel beams is critical to life safety. If a steel beam loses its strength due to lack of fire resistance, it will no longer adequately support the stair slabs, which may create a hazard when evacuating occupants from the upper bowl to the lower bowl. In this study, to ascertain the fire rating and life safety, a performance-based design using CFD analysis is used to evaluate the fire resistance of the steel beams. A fire size of 3.5 MW (convective heat output of the fire) with a wind speed of 2.57 m/s is considered for the fire and smoke simulation. The CFD results show that the smoke temperature near and around the staircases does not exceed 150 °C for the fire duration considered. The surface temperature of the cantilevered steel beams is found to be less than or equal to 150 °C. Since this temperature is much less than the critical failure temperature of steel (520 °C), it is concluded that the design of the structural steel supports of the staircase is adequate and does not need additional fire protection such as a fire-resistant coating. The CFD analysis provided an engineering basis for the performance-based design of the steel structural elements and an opportunity to optimize the fire protection requirements. Thus, performance-based design using CFD modeling and simulation of fire and smoke is an innovative way to evaluate fire rating requirements, ascertain life safety and optimize the design with regard to fire protection of structural steel elements.
Keywords: fire resistance, life safety, performance-based design, CFD analysis
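As a coarse hand check on such CFD results, the Heskestad free-plume correlation can be evaluated for the 3.5 MW convective fire; note that a free-plume centerline estimate ignores the wind and the open podium geometry and so sits above the off-centerline CFD values reported here, and the height used below is an assumption.

```python
# Heskestad centerline plume temperature rise (SFPE hand calculation), SI units.
T_INF, G, CP, RHO = 293.0, 9.81, 1.0, 1.2   # ambient K, m/s^2, kJ/(kg K), kg/m^3

def plume_delta_t(q_conv_kw, height_m):
    """Centerline temperature rise at a given height above the fire;
    the virtual-origin offset is neglected here for simplicity."""
    k = 9.1 * (T_INF / (G * CP ** 2 * RHO ** 2)) ** (1.0 / 3.0)
    return k * q_conv_kw ** (2.0 / 3.0) * height_m ** (-5.0 / 3.0)

dt = plume_delta_t(3500.0, 6.0)   # 3.5 MW convective output, 6 m above the fire (assumed)
print(f"free-plume centerline temperature: {20.0 + dt:.0f} °C")
```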
Procedia PDF Downloads 195
4621 On Parameter Estimation of Simultaneous Linear Functional Relationship Model for Circular Variables
Authors: N. A. Mokhtar, A. G. Hussin, Y. Z. Zubairi
Abstract:
This paper proposes a new simultaneous simple linear functional relationship model under the assumption of equal error variances. We derive the maximum likelihood estimates of the parameters in the simultaneous model and their covariance. A simulation study shows small bias values for the parameter estimates, suggesting the suitability of the estimation method. As an illustration, the proposed simultaneous model is applied to real data of wind direction and wave direction measured by two different instruments.
Keywords: simultaneous linear functional relationship model, Fisher information matrix, parameter estimation, circular variables
Procedia PDF Downloads 370
4620 [Keynote Speech]: An Overview of the Effectiveness of Critical Thinking on Knowledge
Authors: Solehah Yaacob
Abstract:
The study focuses on revisiting the effectiveness of critical thinking as a major faculty of the human mind in human life. The tools used to measure this ability of knowing comprise several processes, including experience and educational background. To emphasize the 'overview' concept, the researcher highlights two major philosophical approaches: the concept of Divine Revelation and modern scientific theory. The research compares the two in order to introduce Divine Revelation into modern scientific theory. An analytical and critical study of both concepts constitutes the methodology of the discussion.
Keywords: critical thinking, knowledge, intellectual, language
Procedia PDF Downloads 441
4619 Developing Innovations in Classroom Teaching: Process or Product
Authors: Mani Ram Sharma
Abstract:
We live in a busy world with sudden distractions and many things to think about. The rapid pace of science and technology keeps our world in constant motion. Students leaving the classroom after being taught are thinking about a thousand things: "Did I understand what the teacher taught?" Yet when they come into the classroom, as teachers, we expect them to be ready to learn, ready to receive information, and ready to retain it. The question is how learners can do this with so much going on in their learning process. The answer, obviously, lies in the use of innovation in the classroom, which fosters innovative learning and establishes learner autonomy. This article outlines the role, need, and process of innovation in the language classroom and in language teaching.
Keywords: distraction, foster, innovation, learner's autonomy, retainment
Procedia PDF Downloads 271
4618 Description of a Structural Health Monitoring and Control System Using Open Building Information Modeling
Authors: Wahhaj Ahmed Farooqi, Bilal Ahmad, Sandra Maritza Zambrano Bernal
Abstract:
From the point of view of structural engineering, monitoring structural responses over time is of great importance with respect to recent developments in construction technologies. Recently, advanced computing tools have enabled researchers to better execute structural health monitoring (SHM) and control systems. In the last decade, building information modeling (BIM) has substantially enhanced the workflow of planning and operating engineering structures. Typically, building information can be stored and exchanged via model files that are based on the Industry Foundation Classes (IFC) standard. In this study, an approach for the semantic modeling of SHM and control systems is integrated into the BIM methodology using the IFC standard. For validation of the modeling approach, a laboratory test structure, a four-story shear frame, is modeled using a conventional BIM software tool. An IFC schema extension is applied to describe information related to monitoring and control of a prototype SHM and control system installed on the laboratory test structure. The SHM and control system is described by a semantic model using the Unified Modeling Language (UML). Subsequently, the semantic model is mapped into the IFC schema. The test structure is composed of four aluminum slabs, and the plate-to-column connections are fully fixed. In the center of the top story, a semi-active tuned liquid column damper (TLCD) is installed. The TLCD is used to reduce structural responses in terms of dynamic vibration and displacement. The wireless prototype SHM and control system is composed of wireless sensor nodes. For testing the SHM and control system, acceleration responses are automatically recorded by the sensor nodes equipped with accelerometers and analyzed using embedded computing. As a result, SHM and control systems can be described within open BIM, and dynamic responses and damage information can be stored, documented, and exchanged on the formal basis of the IFC standard.
Keywords: structural health monitoring, open building information modeling, industry foundation classes, unified modeling language, semi-active tuned liquid column damper, nondestructive testing
Procedia PDF Downloads 156
4617 Experimental Quantification of the Intra-Tow Resin Storage Evolution during RTM Injection
Authors: Mathieu Imbert, Sebastien Comas-Cardona, Emmanuelle Abisset-Chavanne, David Prono
Abstract:
Short-cycle-time Resin Transfer Molding (RTM) applications appear to be of great interest for the mass production of automotive or aeronautical lightweight structural parts. During the RTM process, the two components of a resin are mixed on-line and injected into the cavity of a mold in which a fibrous preform has been placed. Injection and polymerization occur simultaneously in the preform, inducing evolutions of temperature, degree of cure and viscosity that in turn affect flow and curing. In order to adjust the processing conditions and reduce the cycle time, it is therefore essential to understand and quantify the physical mechanisms occurring in the part during injection. In a previous study, a dual-scale simulation tool was developed to help determine the optimum injection parameters. This tool allows fine tracking of the distribution of the resin and the evolution of its properties during reactive injections with on-line mixing. Tows and channels of the fibrous material are considered separately to deal with the consequences of the dual-scale morphology of continuous-fiber textiles. The simulation tool reproduces the unsaturated area at the flow front, generated by the tow/channel difference in permeability. Resin "storage" in the tows after saturation is also taken into account, as it may significantly affect the distribution and evolution of the temperature, degree of cure and viscosity in the part during reactive injections. The aim of the current study is, through experiments, to understand and quantify the "storage" evolution in the tows in order to adjust and validate the numerical tool. The presented study is based on four experimental repeats conducted on three different types of textiles: a unidirectional non-crimp fabric (NCF), a triaxial NCF and a satin weave. Model fluids, dyes and image analysis are used to study quantitatively the resin flow in the saturated area of the samples. The textile characteristics affecting the resin "storage" evolution in the tows are also analyzed. Finally, fully coupled on-line mixing reactive injections are conducted to validate the numerical model.
Keywords: experimental, on-line mixing, high-speed RTM process, dual-scale flow
Procedia PDF Downloads 169
4616 Iterative Linear Quadratic Regulator (iLQR) vs LQR Controllers for Quadrotor Path Tracking
Authors: Wesam Jasim, Dongbing Gu
Abstract:
This paper presents an iterative linear quadratic regulator (iLQR) optimal control technique to solve the problem of quadrotor path tracking. The dynamic motion equations are based on a unit quaternion representation and include some modelled aerodynamic effects as a nonlinear part. Simulation results prove the ability and effectiveness of iLQR to stabilize the quadrotor and successfully track different paths. They also show that the iLQR controller outperforms the LQR controller in terms of fast convergence and tracking errors.
Keywords: iLQR controller, optimal control, path tracking, quadrotor UAVs
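For readers comparing the two controllers, the baseline LQR gain for a linearized model is obtained from the continuous-time algebraic Riccati equation, as sketched below on a generic double integrator (not the quaternion-based quadrotor model of the paper); iLQR repeats a similar Riccati backward pass around successively re-linearized trajectories.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy linearized plant: a double integrator along one axis (not the paper's model).
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
Q = np.diag([10.0, 1.0])   # state weights (position, velocity)
R = np.array([[0.1]])      # control weight

P = solve_continuous_are(A, B, Q, R)   # solves A'P + PA - P B R^-1 B'P + Q = 0
K = np.linalg.solve(R, B.T @ P)        # LQR gain, u = -K x
print("LQR gain K:", K)

# iLQR, in outline: linearize the nonlinear dynamics about the current trajectory,
# run an LQR-like backward pass for feedback/feedforward updates, roll the new
# controls forward, and iterate until the trajectory cost converges.
```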
Procedia PDF Downloads 454
4615 Occasional Word-Formation in Postfeminist Fiction: Cognitive Approach
Authors: Kateryna Nykytchenko
Abstract:
Modern fiction and non-fiction writers commonly use their own lexical and stylistic devices to capture the reader's attention and convey certain thoughts and feelings. Among such devices is one of the neological notions: individual authorial formations, that is, occasionalisms or nonce words. To a significant extent, a host of examples of such new words occurs in the chick lit genre, which has experienced exponential growth in recent years. Chick lit is new-millennial postfeminist fiction which focuses primarily on twenty- to thirtysomething middle-class women. It brings into focus the image of 'a new woman' of the 21st century, who is always fallible and funny. This paper aims to investigate different types of occasional word-formation which reflect cognitive mechanisms of conveying women's perception of the world. The chick lit novels of the Irish author Marian Keyes present a genuinely innovative mixture of forms, both literary and non-literary, which is displayed in different types of occasional word-formation processes such as blending, compounding and creative respelling. Crossing existing mental and linguistic boundaries and adapting herself to new and overlapping linguistic spaces, the chick lit author creates new words which demonstrate the development and progress of language and the relationship between language, thought and a new reality, ultimately resulting in hybrid word-formation (e.g., affixation or pseudo-borrowing). Moreover, this article attempts to present the main characteristics of the chick lit genre with the help of Marian Keyes's novels and their influence on occasionalisms. There has been a lack of research concerning the cognitive nature of occasionalisms. The current paper intends to account for occasional word-formation as a set of interconnected cognitive mechanisms, operations and procedures that meld together to create a new word. The results of the analysis support the argument that the kind of new knowledge an occasionalism manifests is inextricably linked with the cognitive procedure underlying it, which results in a corresponding type of word-formation process. In addition, the findings of the study reveal that the necessity of creating occasionalisms in postmodern fiction arises from the need to write in a new way that keeps up with a perpetually developing world, and thus with the evolution of the speaker herself and her perception of the world.
Keywords: chick lit, occasionalism, occasional word-formation, cognitive linguistics
Procedia PDF Downloads 184
4614 "Laws Drifting Off While Artificial Intelligence Thriving" – A Comparative Study with Special Reference to Computer Science and Information Technology
Authors: Amarendar Reddy Addula
Abstract:
Definition of Artificial Intelligence: Artificial intelligence is the simulation of human intelligence processes by machines, especially computer systems. Specific applications of AI include expert systems, natural language processing, speech recognition, and machine vision. Artificial Intelligence (AI) is an original medium for digital business, according to a new report by Gartner. The last 10 years represent a period of advances in AI's development, spurred by a confluence of factors, including the rise of big data, advancements in computing infrastructure, new machine learning techniques, the emergence of cloud computing, and the vibrant open-source ecosystem. AI is extending its influence to a broader set of use cases and users and is gaining popularity because this improves AI's versatility, effectiveness, and adaptability. Edge AI will enable digital moments by employing AI for real-time analytics closer to data sources. Gartner predicts that by 2025, more than 50% of all data analysis by deep neural networks will occur at the edge, up from less than 10% in 2021. Responsible AI is an umbrella term for making suitable business and ethical choices when adopting AI. It requires considering business and societal value, risk, trust, transparency, fairness, bias mitigation, explainability, accountability, safety, privacy, and regulatory compliance. Responsible AI is ever more significant amidst growing regulatory oversight, consumer expectations, and rising sustainability goals. Generative AI is the use of AI to generate new artifacts and produce innovative products. To date, generative AI efforts have concentrated on creating media content such as photorealistic images of people and objects, but it can also be used for code generation, creating synthetic data, and designing pharmaceuticals and materials with specific properties. AI is the subject of a wide-ranging debate in which there is growing concern about its ethical and legal aspects. Frequently, the two are conflated and confused despite being different issues and areas of knowledge. The ethical debate raises two main problems: the first, conceptual, relates to the idea and content of ethics; the second, functional, concerns its relationship with the law. Both set up models of social behaviour, but they are different in scope and nature. The juridical analysis is grounded in a non-formalistic scientific methodology. This means that it is essential to consider the nature and characteristics of AI as a primary step towards the description of its legal paradigm. In this regard, there are two main issues: the relationship between artificial and human intelligence and the question of the unitary or diverse nature of AI. From that theoretical and practical base, the study of the legal system is carried out by examining its foundations, the governance model, and the regulatory bases. According to this analysis, throughout the work and in the conclusions, International Law is identified as the primary legal framework for the regulation of AI.
Keywords: artificial intelligence, ethics & human rights issues, laws, international laws
Procedia PDF Downloads 98
4613 Leveraging Unannotated Data to Improve Question Answering for French Contract Analysis
Authors: Touila Ahmed, Elie Louis, Hamza Gharbi
Abstract:
State-of-the-art question answering models have recently shown impressive performance, especially in the zero-shot setting. This approach is particularly useful when confronted with a highly diverse domain such as the legal field, in which it is increasingly difficult to build a dataset covering every notion and concept. In this work, we propose a flexible generative question answering approach to contract analysis, as well as a weakly supervised procedure to leverage unannotated data and boost our models' performance in general, and their zero-shot performance in particular.
Keywords: question answering, contract analysis, zero-shot, natural language processing, generative models, self-supervision
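To make the zero-shot setting concrete, the sketch below runs an off-the-shelf extractive QA pipeline from Hugging Face Transformers on a contract-style snippet; the checkpoint name and the example clause are placeholders, and the authors' generative, weakly supervised system is not reproduced here.

```python
from transformers import pipeline

# Any multilingual extractive QA checkpoint can be plugged in here; this name is a
# commonly used placeholder, not the model trained by the authors.
qa = pipeline("question-answering", model="deepset/xlm-roberta-large-squad2")

context = ("Le présent contrat prend effet le 1er janvier 2023 et demeure en vigueur "
           "pour une durée de vingt-quatre mois.")
result = qa(question="Quelle est la durée du contrat ?", context=context)
print(result["answer"], f"(score={result['score']:.2f})")
```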
Procedia PDF Downloads 198
4612 Cluster-Based Multi-Path Routing Algorithm in Wireless Sensor Networks
Authors: Si-Gwan Kim
Abstract:
Small, low-power sensors with sensing, signal processing and wireless communication capabilities are well suited to wireless sensor networks. Due to the limited resources and battery constraints, the complex routing algorithms used in ad-hoc networks cannot be employed in sensor networks. In this paper, we propose node-disjoint multi-path hexagon-based routing algorithms for wireless sensor networks. We present the details of the algorithm and compare it with other works. Simulation results show that the proposed scheme achieves better performance in terms of efficiency and message delivery ratio.
Keywords: clustering, multi-path, routing protocol, sensor network
Procedia PDF Downloads 406
4611 An Improved Photovoltaic System Balancer Architecture
Authors: Chih-Chiang Hua, Yi-Hsiung Fang, Cyuan-Jyun Wong
Abstract:
An improved PV balancer for photovoltaic applications is proposed in this paper. The proposed PV balancer senses the voltage and current of the PV module and adjusts the output voltage of the converter. Thus, the PV system can implement maximum power point tracking (MPPT) independently for each module, whether it is under shading, different irradiation or degradation of the PV cells. In addition, the cost of the PV balancer can be reduced due to the low power rating of the converter. To assess the effectiveness of the proposed system, two PV balancers are designed and verified through simulation under different shading conditions. The proposed PV balancers can provide more energy than a traditional PV balancer.
Keywords: MPPT, partial shading, PV system, converter
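Per-module MPPT of the kind described here is often realized with a perturb-and-observe loop; the sketch below runs the generic algorithm on a toy P-V curve and is not the control law of the proposed balancer.

```python
def pv_power(v):
    """Toy module P-V curve with a maximum around 16 V (illustrative shape only)."""
    i = 5.0 * (1.0 - (v / 21.0) ** 8)   # crude I-V shape with Voc near 21 V
    return max(v * i, 0.0)

def perturb_and_observe(v0=12.0, step=0.2, iters=200):
    """Classic P&O: keep stepping the operating voltage while power rises,
    reverse the step direction when power falls."""
    v, p_prev, direction = v0, pv_power(v0), +1
    for _ in range(iters):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:
            direction = -direction
        p_prev = p
    return v, p_prev

v_mpp, p_mpp = perturb_and_observe()
print(f"P&O settled near V = {v_mpp:.1f} V, P = {p_mpp:.1f} W")
```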
Procedia PDF Downloads 296
4610 A Corpus-Based Contrastive Analysis of Directive Speech Act Verbs in English and Chinese Legal Texts
Authors: Wujian Han
Abstract:
In the process of human interaction and communication, speech act verbs are considered to be the most active component and the main means of information transmission, and are also taken as an indication of the structure of linguistic behavior. The theoretical value and practical significance of such everyday built-in metalanguage have long been recognized. This paper, which is part of a bigger study, aims to provide useful insights for a more precise and systematic application of speech act verb translation between English and Chinese, especially with regard to the degree to which generic integrity is maintained in the practice of translating legal documents. In this study, the corpus, i.e., Chinese legal texts and their English translations, English legal texts, ordinary Chinese texts, and ordinary English texts, serves as a testing ground for contrastively examining the usage of English and Chinese directive speech act verbs in the legal genre. The scope of this paper is relatively wide and essentially covers all directive speech act verbs used in ordinary English and Chinese, such as order, command, request, prohibit, threaten, advise, warn and permit. The researcher, by combining the corpus methodology with a contrastive perspective, explored a range of characteristics of English and Chinese directive speech act verbs, including their semantic, syntactic and pragmatic features, and then contrasted them in a structured way. It has been found that there are similarities between English and Chinese directive speech act verbs in the legal genre, such as similar semantic components between English speech act verbs and their translation equivalents in Chinese, and formal and accurate usage of English and Chinese directive speech act verbs in legal contexts. Notable differences have, however, been identified in their usage in the original Chinese and English legal texts, such as valency patterns and frequency of occurrence. For example, the subjects of some directive speech act verbs are very frequently omitted in Chinese legal texts, but this is not the case in English legal texts. One practicable method of achieving adequacy and conciseness in speech act verb translation from Chinese into English in the legal genre is to repeat the subjects or the message where there is a discrepancy, and vice versa. In addition, translation effects such as overuse and underuse of certain directive speech act verbs are also found in the translated English texts compared with the original English texts. Legal texts constitute particularly valuable material for the study of speech act verbs. Building up such a contrastive picture of Chinese and English speech act verbs in legal language yields results of value and interest to legal translators and to students of language for legal purposes, and has practical application to legal translation between English and Chinese.
Keywords: contrastive analysis, corpus-based, directive speech act verbs, legal texts, translation between English and Chinese
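For illustration, frequencies of directive speech act verbs are usually compared across corpora of different sizes per million words; below is a minimal, hypothetical sketch in which the verb list comes from the abstract but the input text does not come from the study's corpus:

```python
import re
from collections import Counter

DIRECTIVE_VERBS = {"order", "command", "request", "prohibit",
                   "threaten", "advise", "warn", "permit"}

def per_million(text):
    """Frequency of each directive verb form per million tokens, using naive
    tokenization and no lemmatization (for illustration only)."""
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter(t for t in tokens if t in DIRECTIVE_VERBS)
    return {verb: n * 1_000_000 / len(tokens) for verb, n in counts.items()}

sample = "The licensee shall request approval, and the court may order disclosure."
print(per_million(sample))   # toy input, not the study's corpus
```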
Procedia PDF Downloads 502