Search results for: Extended Park's vector approach
14952 Cognitive STAP for Airborne Radar Based on Slow-Time Coding
Authors: Fanqiang Kong, Jindong Zhang, Daiyin Zhu
Abstract:
Space-time adaptive processing (STAP) techniques have been recognized as a key enabling technology for advanced airborne radar applications. In this paper, the notion of cognitive radar is extended to the STAP technique, and cognitive STAP is discussed. The principle for improving the signal-to-clutter-plus-noise ratio (SCNR) based on slow-time coding is given, and the corresponding optimization algorithm based on cyclic and power-like algorithms is presented. Numerical examples show the effectiveness of the proposed method.
Keywords: space-time adaptive processing (STAP), airborne radar, signal-to-clutter ratio, slow-time coding
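As background only (standard STAP theory, not this paper's slow-time code optimization), the SCNR obtained by a space-time weight vector w for a target steering vector s and clutter-plus-noise covariance R is a generalized Rayleigh quotient, maximized by the whitened matched filter:

```latex
\mathrm{SCNR}(\mathbf{w}) = \frac{|\mathbf{w}^{H}\mathbf{s}|^{2}}{\mathbf{w}^{H}\mathbf{R}\mathbf{w}},
\qquad
\mathbf{w}_{\mathrm{opt}} \propto \mathbf{R}^{-1}\mathbf{s},
\qquad
\mathrm{SCNR}_{\max} = \mathbf{s}^{H}\mathbf{R}^{-1}\mathbf{s}.
```

The paper's contribution is to improve this ratio further by designing the slow-time code itself.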
Procedia PDF Downloads 273
14951 An Analytical Approach to Calculate Thermo-Mechanical Stresses in Integral Abutment Bridge Piles
Authors: Jafar Razmi
Abstract:
Integral abutment bridges are bridges that do not have joints. When these bridges are subject to large seasonal and daily temperature variations, the expansion and contraction of the bridge slab is transferred to the piles. Since the piles are deep in the soil, the displacement induced by the slab can cause bending and stresses in the piles. These stresses cause fatigue and failure of the piles. A complex mechanical interaction exists between the slab, pile, soil, and abutment. This interaction needs to be understood in order to calculate the stresses in the piles. This paper uses a mechanical approach to develop analytical equations for this complex structure and determine the stresses in the piles. The analytical solution is developed and compared with finite element analysis results and experimental data. Our comparison shows that the analytical approach can accurately predict the displacement in the piles. This approach offers a simplified technique that can be utilized without the need for a computationally expensive finite element model.
Keywords: integral abutment bridges, piles, thermo-mechanical stress, stress and strains
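As a rough worked example of the thermal demand behind this problem (the numbers are illustrative, not taken from the paper), the free thermal expansion of a slab of length L under a uniform temperature change ΔT is

```latex
\Delta L = \alpha\, L\, \Delta T
         \approx \left(1.0\times10^{-5}\ /^{\circ}\mathrm{C}\right)\left(100\ \mathrm{m}\right)\left(40\ ^{\circ}\mathrm{C}\right)
         = 0.04\ \mathrm{m},
```

so roughly 20 mm of longitudinal displacement is imposed at each end (and hence on the pile heads) if the slab expands symmetrically about its center.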
Procedia PDF Downloads 241
14950 Defects Estimation of Embedded Systems Components by a Bond Graph Approach
Authors: I. Gahlouz, A. Chellil
Abstract:
The paper concerns the estimation of system component faults by using an unknown input observer. To reach this goal, we used the Bond Graph approach to physical modelling. We showed that this graphical tool allows the representation of system component faults as unknown inputs within the state representation of the considered physical system. The study of the causal and structural features of the system (controllability, observability, finite structure, and infinite structure) based on the Bond Graph approach was then carried out in order to design an unknown input observer, which is used for system component fault estimation.
Keywords: estimation, bond graph, controllability, observability
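For context (a generic formulation rather than the authors' bond-graph-derived equations), treating component faults as unknown inputs usually leads to a state representation of the form

```latex
\dot{\mathbf{x}}(t) = A\,\mathbf{x}(t) + B\,\mathbf{u}(t) + F\,\mathbf{d}(t),
\qquad
\mathbf{y}(t) = C\,\mathbf{x}(t),
```

where d(t) collects the faults; the unknown input observer reconstructs x(t) and d(t) from the known input u(t) and the measurement y(t), provided the usual rank and observability conditions on (A, C, F) are satisfied.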
Procedia PDF Downloads 414
14949 A Generic Approach to Reuse Unified Modeling Language Components Following an Agile Process
Authors: Rim Bouhaouel, Naoufel Kraïem, Zuhoor Al Khanjari
Abstract:
Unified Modeling Language (UML) is considered one of the widespread modeling languages standardized by the Object Management Group (OMG). Therefore, the model-driven engineering (MDE) community attempts to provide reuse of UML diagrams rather than constructing them from scratch. A UML model appears according to a specific software development process. Existing model generation methods have focused on the different transformation techniques without considering the development process. Our work aims to construct a UML component from fragments of UML diagrams based on an agile method. We define a UML fragment as a portion of a UML diagram which expresses a business target. To guide the generation of fragments of UML models using an agile process, we need a flexible approach which adapts to agile changes and covers all of its activities. We use the software product line (SPL) to derive a process fragment of the agile method. This paper explains our approach, named RECUP, for generating UML fragments following an agile process, and overviews its different aspects. We present the approach and define its different phases and artifacts.
Keywords: UML, component, fragment, agile, SPL
Procedia PDF Downloads 398
14948 Argumentation Frameworks and Theories of Judging
Authors: Sonia Anand Knowlton
Abstract:
With the rise of artificial intelligence, computer science is becoming increasingly integrated in virtually every area of life. Of course, the law is no exception. Through argumentation frameworks (AFs), computer scientists have used abstract algebra to structure the legal reasoning process in a way that allows conclusions to be drawn from a formalized system of arguments. In AFs, arguments compete against each other for logical success and are related to one another through the binary operation of the attack. The prevailing arguments make up the preferred extension of the given argumentation framework, telling us what set of arguments must be accepted from a logical standpoint. There have been several developments of AFs since their original conception in the early 1990s in efforts to make them more aligned with the human reasoning process. Generally, these developments have sought to add nuance to the factors that influence the logical success of competing arguments (e.g., giving an argument more logical strength based on the underlying value it promotes). The most cogent development was that of the Extended Argumentation Framework (EAF), in which attacks can themselves be attacked by other arguments, and the promotion of different competing values can be formalized within the system. This article applies the logical structure of EAFs to current theoretical understandings of judicial reasoning to contribute to theories of judging and to the evolution of AFs simultaneously. The argument is that the main limitation of EAFs, when applied to judicial reasoning, is that they require judges to themselves assign values to different arguments and then lexically order these values to determine the given framework’s preferred extension. Drawing on John Rawls’ Theory of Justice, the examination that follows asks whether values are lexical and commensurable to this extent. The analysis that follows then suggests a potential extension of the EAF system with an approach that formalizes different “planes of attack” for competing arguments that promote lexically ordered values. This article concludes with a summary of how these insights contribute to theories of judging and of legal reasoning more broadly, specifically in indeterminate cases where judges must turn to value-based approaches.
Keywords: computer science, mathematics, law, legal theory, judging
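To make the framework concrete, the sketch below (a basic Dung-style AF, not the extended EAF with attacks on attacks discussed above) shows arguments, a binary attack relation, and the grounded extension computed as the least fixed point of the characteristic function.

```python
# Minimal sketch of a basic argumentation framework: the grounded extension is
# built by repeatedly collecting every argument whose attackers are all
# counter-attacked by the current set (the characteristic function).
from typing import Dict, Set, Tuple

def grounded_extension(arguments: Set[str], attacks: Set[Tuple[str, str]]) -> Set[str]:
    attackers: Dict[str, Set[str]] = {a: set() for a in arguments}
    for src, dst in attacks:
        attackers[dst].add(src)

    extension: Set[str] = set()
    while True:
        acceptable = {
            a for a in arguments
            if all(any((d, b) in attacks for d in extension) for b in attackers[a])
        }
        if acceptable == extension:
            return extension
        extension = acceptable

# "c" attacks "b", "b" attacks "a": the unattacked "c" prevails and defends "a".
print(grounded_extension({"a", "b", "c"}, {("b", "a"), ("c", "b")}))  # {'a', 'c'}
```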
Procedia PDF Downloads 60
14947 Ecopsychological Approach to Enhance Space Consciousness Toward Environment
Authors: Tiwi Kamidin
Abstract:
After years of effort to integrate environmental education, studies keep revealing that Malaysians have still not reached the desired level of commitment toward the environment. Some researchers have noted that the health of our planet depends on our mental health, especially where our psychological and spiritual life is split from the natural world. Therefore, this study discusses an ecopsychological approach to enhance space consciousness toward the environment. Space consciousness represents not only freedom from ego but also freedom from dependency on the things of this world, from materialism and materiality. It is the spiritual dimension which alone can give transcendent and true meaning to this world. If pupils can balance this internal awareness, they will come to respect the environment as part of themselves and their family, rather than merely as a contributor to the continuance of human life. Qualitative findings showed that the informants considered that their consciousness toward the environment had changed.
Keywords: ecopsychological approach, space consciousness, environmental education, environment
Procedia PDF Downloads 309
14946 A Novel Machine Learning Approach to Aid Agrammatism in Non-fluent Aphasia
Authors: Rohan Bhasin
Abstract:
Agrammatism in non-fluent aphasia can be defined as a language disorder wherein a patient can only use content words (nouns, verbs, and adjectives) for communication and their speech is devoid of functional word types like conjunctions and articles, generating speech with extremely rudimentary grammar. Past approaches involve speech therapy of some order, with conversation analysis used to analyse pre-therapy speech patterns and qualitative changes in conversational behaviour after therapy. We describe a novel method to generate functional words (prepositions, articles) around content words (nouns, verbs, and adjectives) using a combination of natural language processing and deep learning algorithms. The approach the paper investigates is an LSTM-based sequence-to-sequence (seq2seq) model, which takes in a sequence of inputs and outputs a sequence. This approach needs a significant amount of training data, with each training example being a pair of the form (content words, complete sentence). We generate such data by starting with complete sentences from a text source and removing functional words to get just the content words. However, this approach would require a lot of training data to produce coherent output. The assumption of this approach is that the content words received as input are to be preserved, i.e., they won't be altered after the functional grammar is slotted in. This is a potential limitation in cases of severe agrammatism where such order might not be inherently correct. The applications of this approach can be used to assist communication in cases of mild agrammatism in non-fluent aphasia. Thus, by generating these function words around the content words, we can provide meaningful sentence options to the patient for articulate conversations. Our project thereby translates the use case of generating sentences from content-specific words into an assistive technology for non-fluent aphasia patients.
Keywords: aphasia, expressive aphasia, assistive algorithms, neurology, machine learning, natural language processing, language disorder, behaviour disorder, sequence to sequence, LSTM
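A minimal sketch of the data-generation step described above (the function-word list and sentences are illustrative only, not the paper's corpus): complete sentences are stripped of functional words to form (content words, complete sentence) training pairs for the seq2seq model.

```python
# Build (content words, complete sentence) pairs by removing function words.
FUNCTION_WORDS = {
    "a", "an", "the", "and", "or", "but", "of", "in", "on", "at",
    "to", "for", "with", "is", "are", "was", "were",
}

def make_training_pair(sentence: str) -> tuple:
    tokens = sentence.lower().rstrip(".").split()
    content_words = [t for t in tokens if t not in FUNCTION_WORDS]
    return " ".join(content_words), sentence

corpus = [
    "The patient is drinking a glass of water.",
    "She walked to the hospital in the morning.",
]
for source, target in map(make_training_pair, corpus):
    print(f"{source!r} -> {target!r}")
```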
Procedia PDF Downloads 164
14945 Segmentation of Korean Words on Korean Road Signs
Authors: Lae-Jeong Park, Kyusoo Chung, Jungho Moon
Abstract:
This paper introduces an effective method of segmenting Korean text (place names in Korean) from a Korean road sign image. A Korean advanced directional road sign is composed of several types of visual information such as arrows, place names in Korean and English, and route numbers. Automatic classification of the visual information and extraction of Korean place names from road sign images make it possible to avoid a great deal of manual input to a database system for nationwide management of road signs. We propose a series of problem-specific heuristics that correctly segment Korean place names, which are the most crucial information, from the other information by effectively leaving out non-text information. The experimental results with a dataset of 368 road sign images show a detection rate of 96% per Korean place name and 84% per road sign image.
Keywords: segmentation, road signs, characters, classification
Procedia PDF Downloads 444
14944 Parameter Interactions in the Cumulative Prospect Theory: Fitting the Binary Choice Experiment Data
Authors: Elzbieta Babula, Juhyun Park
Abstract:
Tversky and Kahneman’s cumulative prospect theory assumes symmetric probability cumulation with regard to the reference point within decision weights. Theoretically, this model should be invariant under a change of the direction of probability cumulation. In the present study, this phenomenon is investigated by creating a reference model that allows verifying the parameter interactions in cumulative prospect theory specifications. Simultaneous parametric fitting of the utility and weighting functions is applied to binary choice data from the experiment. The results show that the flexibility of the probability weighting function is a crucial characteristic for preventing parameter interactions when estimating cumulative prospect theory.
Keywords: binary choice experiment, cumulative prospect theory, decision weights, parameter interactions
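As a reference point for the functional forms being fitted (the parameter values below are the well-known Tversky-Kahneman estimates, used here purely for illustration and not the estimates obtained in this study), the cumulative prospect theory value of a mixed binary prospect (gain x with probability p, loss y otherwise) combines a power value function with separate probability weighting functions for gains and losses.

```python
# Illustrative CPT evaluation of a mixed binary prospect (gain x with prob. p,
# loss y otherwise) using Tversky-Kahneman functional forms.
def value(x: float, alpha: float = 0.88, beta: float = 0.88, lam: float = 2.25) -> float:
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def weight(p: float, gamma: float) -> float:
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def cpt_binary(x_gain: float, p: float, y_loss: float,
               gamma_gain: float = 0.61, gamma_loss: float = 0.69) -> float:
    return weight(p, gamma_gain) * value(x_gain) + weight(1 - p, gamma_loss) * value(y_loss)

# A 50/50 gamble: win 100 or lose 50 (negative CPT value reflects loss aversion).
print(cpt_binary(100, 0.5, -50))
```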
Procedia PDF Downloads 216
14943 An Efficient Automated Radiation Measuring System for Plasma Monopole Antenna
Authors: Gurkirandeep Kaur, Rana Pratap Yadav
Abstract:
This experimental study aims to examine the radiation characteristics of different plasma structures of a surface wave-driven plasma antenna using an automated measuring system. In this study, a 30 cm long plasma column of argon gas with a diameter of 3 cm is excited by a surface wave discharge mechanism operating at 13.56 MHz with an RF power level of up to 100 W and gas pressure between 0.01 and 0.05 mb. The study reveals that a single structured plasma monopole can be modified into an array of plasma antenna elements by forming multiple striations or plasma blobs inside the discharge tube by altering plasma properties such as working pressure, operating frequency, input RF power, and discharge tube dimensions, i.e., length, radius, and thickness. It is also reported that plasma length, electron density, and conductivity are functions of the operating plasma parameters and are controlled by changing the working pressure and input power. To investigate the antenna radiation efficiency in the far-field region, an automation-based radiation measuring system has been fabricated and is presented in detail. This automated system combines a controller, DC servo motors, a vector network analyzer, and a computing device to evaluate the radiation intensity, directivity, gain, and efficiency of the plasma antenna. In this system, the controller is connected to multiple motors for moving aluminum shafts in both the elevation and azimuthal planes, whereas radiation from the plasma monopole antenna is measured by a vector network analyzer (VNA), which is connected to the computing device to display the radiation pattern as polar plots. Here, the radiation characteristics of both continuous and array plasma monopole antennas have been studied for various working plasma parameters. The experimental results clearly indicate that the plasma antenna is as efficient as a metallic antenna. The radiation from the plasma monopole antenna is significantly influenced by the plasma properties, which provides a wide range of radiation patterns in which desired radiation parameters like beam-width, direction of radiation, radiation intensity, and antenna efficiency can be achieved in a single monopole. Due to this wide range of selectivity in the radiation pattern, the antenna can meet the demand for wider bandwidth and high data speeds in communication systems. Moreover, the developed system provides an efficient and cost-effective solution for measuring the radiation pattern in the far-field zone for any kind of antenna system.
Keywords: antenna radiation characteristics, dynamically reconfigurable, plasma antenna, plasma column, plasma striations, surface wave
Procedia PDF Downloads 119
14942 Communicative Language Teaching Technique: A Neglected Approach in Reading Comprehension Instruction
Authors: Olumide Yusuf Jimoh
Abstract:
Reading comprehension is an interactive and purposeful process of getting meaning from and bringing meaning to a text. Over the years, teachers of the English language (in Nigeria) have been glued to the monotonous method of making students read comprehension passages silently and then answer the questions that follow such passages, without making the reading session interactive. Hence, students often find such exercises monotonous and boring. Consequently, students' interest in language learning continues to dwindle, and this often affects their overall academic performance. Relying on Communication Accommodation Theory, therefore, the study employed a qualitative research design to examine the Communicative Language Teaching Approach (CLTA) in reading comprehension. Moreover, techniques such as the Genuinely Collaborative Reading Approach (GCRA), jigsaw reading, pre-reading, and post-reading tasks were examined. The researcher submitted that effective reading comprehension cannot be done passively. Students must respond to what they read; they must interact not only with the materials being read but also with one another and with the teacher; this can be achieved by developing communicative and interactive reading programs.
Keywords: collaborative reading approach, communicative teaching, interactive reading program, pre-reading task, reading comprehension
Procedia PDF Downloads 109
14941 The Implementation of Organizational Ecoinnovativeness as an Expression of a Strategic Approach of an Organization
Authors: Marzena Hajduk-Stelmachowicz
Abstract:
This paper presents the reasons why the implementation of organizational eco-innovation (based on the requirements of the international standard ISO 14001) can be an expression of a strategic approach of an organization. An elaboration of different issues associated with Environmental Management Systems is given.
Keywords: environmental management system, ISO 14001, organizational ecoinnovativeness, ecoinnovation
Procedia PDF Downloads 315
14940 Coupled Space and Time Homogenization of Viscoelastic-Viscoplastic Composites
Authors: Sarra Haouala, Issam Doghri
Abstract:
In this work, a multiscale computational strategy is proposed for the analysis of structures which are described at a refined level both in space and in time. The proposal is applied to two-phase viscoelastic-viscoplastic (VE-VP) reinforced thermoplastics subjected to large numbers of cycles. The main aim is to predict the effective long-time response while considerably reducing the computational cost. The proposed computational framework is a combination of mean-field space homogenization, based on the generalized incrementally affine formulation for VE-VP composites, and the asymptotic time homogenization approach for coupled isotropic VE-VP homogeneous solids under large numbers of cycles. The time homogenization method is based on the definition of micro- and macro-chronological time scales and on asymptotic expansions of the unknown variables. First, the original anisotropic VE-VP initial-boundary value problem of the composite material is decomposed into coupled micro-chronological (fast time scale) and macro-chronological (slow time scale) problems. The former is purely VE and solved once for each macro time step, whereas the latter problem is nonlinear and solved iteratively using fully implicit time integration. Second, mean-field space homogenization is used for both the micro- and macro-chronological problems to determine the micro- and macro-chronological effective behavior of the composite material. The response of the matrix material is VE-VP with J2 flow theory assuming small strains. The formulation exploits the return-mapping algorithm for the J2 model, with its two steps: viscoelastic predictor and plastic corrections. The proposal is implemented for an extended Mori-Tanaka scheme and verified against finite element simulations of representative volume elements for a number of polymer composite materials subjected to large numbers of cycles.
Keywords: asymptotic expansions, cyclic loadings, inclusion-reinforced thermoplastics, mean-field homogenization, time homogenization
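For readers unfamiliar with asymptotic time homogenization, the usual two-time-scale ansatz (given here in generic textbook form, under the assumption of a small ratio ζ between the loading period and the macroscopic observation time, and not reproducing the authors' specific equations) is

```latex
u^{\zeta}(t) = u_{0}(t,\tau) + \zeta\, u_{1}(t,\tau) + \zeta^{2} u_{2}(t,\tau) + \cdots,
\qquad
\tau = \frac{t}{\zeta},
\qquad
\frac{d}{dt} = \frac{\partial}{\partial t} + \frac{1}{\zeta}\frac{\partial}{\partial \tau}.
```

Collecting terms by powers of ζ and averaging over the fast period yields the coupled micro-chronological (fast) and macro-chronological (slow) problems referred to above.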
Procedia PDF Downloads 371
14939 Development of Transgenic Tomato Immunity to Pepino Mosaic Virus and Tomato Yellow Leaf Curl Virus by Gene Silencing Approach
Authors: D. Leibman, D. Wolf, A. Gal-On
Abstract:
Viral diseases of tomato crops result in heavy yield losses and may even jeopardize the production of these crops. Classical tomato breeding for disease resistance against Tomato yellow leaf curl virus (TYLCV) leads to partial resistance associated with a number of recessive genes. To the authors' best knowledge, genetic resistance to Pepino mosaic virus (PepMV) is not yet available. The generation of viral resistance by means of genetic engineering has been reported and implemented for many crops, including tomato. Transgenic resistance against viruses is based, in most cases, on Post Transcriptional Gene Silencing (PTGS), an endogenous mechanism which destroys the virus genome. In this work, we developed immunity against PepMV and TYLCV in tomato based on a PTGS mechanism. Tomato plants were transformed with a hairpin construct expressing transgene-derived double-stranded RNA (tr-dsRNA). In the case of PepMV, the binary construct harbored three consecutive fragments of the replicase gene from three different PepMV strains (Italian, Spanish, and American) to provide resistance against a range of virus strains. In the case of TYLCV, the binary vector included three consecutive fragments of the IR, V2, and C2 viral genes constructed in a hairpin configuration. Selected transgenic lines (T0) showed a high accumulation of transgene siRNA of 21-24 bases, and T1 transgenic lines showed complete immunity to PepMV and TYLCV. Graft inoculation demonstrated immunity of the transgenic scion against PepMV and TYLCV. The study presents the engineering of resistance in tomato against two serious diseases, which will help in the production of high-quality tomatoes. Unfortunately, however, these resistant plants have not been implemented due to public ignorance of and opposition to breeding by genetic engineering.
Keywords: PepMV, PTGS, TYLCV, tr-dsRNA
Procedia PDF Downloads 134
14938 Evaluation of Progressive Collapse of Transmission Tower
Authors: Jeong-Hwan Choi, Hyo-Sang Park, Tae-Hyung Lee
Abstract:
The transmission tower is one of the crucial lifeline structures in a modern society, and it needs to be protected against extreme loading conditions. However, the transmission tower is a very complex structure and, therefore, it is very difficult to simulate its actual damage and collapse behavior. In this study, the actual collapse behavior of the transmission tower due to lateral loading conditions such as wind load is evaluated through computational simulation. To that end, a progressive collapse procedure is applied in the simulation. In this procedure, after running the simulation, if a member of the tower structure fails, the failed member is removed and the simulation is run again. The 154 kV transmission tower is selected for this study. The simulation is performed by a nonlinear static analysis procedure, namely pushover analysis, using OpenSEES, an earthquake simulation platform. Three-dimensional finite element models of the tower are developed.
Keywords: transmission tower, OpenSEES, pushover, progressive collapse
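The progressive collapse loop described above can be summarized as follows; this is a hedged sketch in which build_tower_model, run_pushover, find_failed_members, and remove_member are hypothetical placeholders for the actual OpenSEES model-building and analysis calls, not the authors' code.

```python
# Sketch of the iterative member-removal (progressive collapse) procedure:
# run the pushover analysis, remove any members that fail, and repeat until
# no further members fail or the iteration budget is exhausted.
def progressive_collapse(lateral_load_pattern, max_iterations=100):
    model = build_tower_model()                              # hypothetical: 3-D FE tower model
    removed = []
    results = None
    for _ in range(max_iterations):
        results = run_pushover(model, lateral_load_pattern)  # hypothetical: nonlinear static analysis
        failed = find_failed_members(model, results)         # hypothetical: members exceeding capacity
        if not failed:
            break                                            # stable: no further failures
        for member in failed:
            remove_member(model, member)                     # hypothetical: drop the failed member
            removed.append(member)
    return removed, results
```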
Procedia PDF Downloads 358
14937 Gene Expression Meta-Analysis of Potential Shared and Unique Pathways Between Autoimmune Diseases Under anti-TNFα Therapy
Authors: Charalabos Antonatos, Mariza Panoutsopoulou, Georgios K. Georgakilas, Evangelos Evangelou, Yiannis Vasilopoulos
Abstract:
The extended tissue damage and severe clinical outcomes of autoimmune diseases, accompanied by the high annual costs to the overall health care system, highlight the need for efficient therapy. Increasing knowledge of the pathophysiology of specific chronic inflammatory diseases, namely Psoriasis (PsO), Inflammatory Bowel Diseases (IBD), consisting of Crohn’s disease (CD) and Ulcerative colitis (UC), and Rheumatoid Arthritis (RA), has provided insights into the underlying mechanisms that lead to the maintenance of inflammation, such as Tumor Necrosis Factor alpha (TNF-α). Hence, anti-TNFα biological agents pose an ideal therapeutic approach. Despite the efficacy of anti-TNFα agents, several clinical trials have shown that 20-40% of patients do not respond to treatment. Nowadays, high-throughput technologies have been recruited in order to elucidate the complex interactions in multifactorial phenotypes, the most ubiquitous being transcriptome quantification analyses. In this context, a random-effects meta-analysis of available gene expression cDNA microarray datasets was performed between responders and non-responders to anti-TNFα therapy in patients with IBD, PsO, and RA. Publicly available datasets were systematically searched from inception to the 10th of November 2020 and selected for further analysis if they assessed the response to anti-TNFα therapy with clinical score indexes from inflamed biopsies. Specifically, 4 IBD (79 responders/72 non-responders), 3 PsO (40 responders/11 non-responders), and 2 RA (16 responders/6 non-responders) datasets were selected. After separate pre-processing of each dataset, 4 separate meta-analyses were conducted: three disease-specific and a single combined meta-analysis on the disease-specific results. The MetaVolcano R package (v.1.8.0) was utilized for a random-effects meta-analysis through the Restricted Maximum Likelihood (REML) method. The top 1% of the most consistently perturbed genes in the included datasets was highlighted through the TopConfects approach while maintaining a 5% False Discovery Rate (FDR). Genes were considered Differentially Expressed (DEGs) if they had P ≤ 0.05 and |log2(FC)| ≥ log2(1.25) and were perturbed in at least 75% of the included datasets. Over-representation analysis was performed using Gene Ontology and Reactome pathways for both up- and down-regulated genes in all 4 performed meta-analyses. Protein-protein interaction networks were also incorporated in the subsequent analyses with STRING v11.5 and Cytoscape v3.9. The disease-specific meta-analyses detected multiple distinct pro-inflammatory and immune-related down-regulated genes for each disease, such as NFKBIA, IL36, and IRAK1, respectively. Pathway analyses revealed unique and shared pathways between the diseases, such as Neutrophil Degranulation and Signaling by Interleukins. The combined meta-analysis unveiled 436 DEGs, 86 of which were up-regulated and 350 down-regulated, confirming the aforementioned shared pathways and genes, as well as uncovering genes that participate in anti-inflammatory pathways, namely IL-10 signaling. The identification of key biological pathways and regulatory elements is imperative for the accurate prediction of the patient’s response to biological drugs. Meta-analysis of such gene expression data could aid the challenging effort to unravel the complex interactions implicated in the response to anti-TNFα therapy in patients with PsO, IBD, and RA, as well as distinguish gene clusters and pathways that are altered in this heterogeneous phenotype.
Keywords: anti-TNFα, autoimmune, meta-analysis, microarrays
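A minimal sketch of the DEG selection rule quoted above (P ≤ 0.05, |log2(FC)| ≥ log2(1.25), perturbed in at least 75% of datasets); the column names and the toy table are hypothetical, not the authors' meta-analysis output.

```python
# Filter meta-analysis results down to differentially expressed genes (DEGs).
import math
import pandas as pd

def filter_degs(df: pd.DataFrame) -> pd.DataFrame:
    fc_threshold = math.log2(1.25)
    mask = (
        (df["p_value"] <= 0.05)
        & (df["log2_fc"].abs() >= fc_threshold)
        & (df["fraction_perturbed"] >= 0.75)
    )
    return df[mask]

toy = pd.DataFrame({
    "gene": ["NFKBIA", "IRAK1", "GENE_X"],
    "p_value": [0.01, 0.03, 0.20],
    "log2_fc": [-0.60, -0.40, 0.10],
    "fraction_perturbed": [1.00, 0.75, 0.50],
})
print(filter_degs(toy)["gene"].tolist())  # ['NFKBIA', 'IRAK1']
```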
Procedia PDF Downloads 183
14936 An Optimized RDP Algorithm for Curve Approximation
Authors: Jean-Pierre Lomaliza, Kwang-Seok Moon, Hanhoon Park
Abstract:
It is well known that the Ramer-Douglas-Peucker (RDP) algorithm depends greatly on the method of choosing starting points. Therefore, this paper focuses on finding starting points that will optimize the results of the RDP algorithm. Specifically, this paper proposes a curve approximation algorithm that finds flat points, called essential points, of an input curve, divides the curve into corner-like sub-curves using the essential points, and applies the RDP algorithm to the sub-curves. The number of essential points plays a role in optimizing the approximation results by balancing the degree of shape information loss and the amount of data reduction. Through experiments with curves of various types and shape complexities, we compared the performance of the proposed algorithm with three other methods, i.e., the RDP algorithm itself and its variants. As a result, the proposed algorithm outperformed the others in terms of maintaining the original shape of the input curve, which is important in various applications like pattern recognition.
Keywords: curve approximation, essential point, RDP algorithm
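For reference, the baseline RDP algorithm against which the proposed method is compared recursively keeps the point farthest from the chord between the endpoints whenever that distance exceeds a tolerance ε; a minimal sketch (not the authors' optimized variant) is shown below.

```python
import math

def point_line_distance(p, a, b):
    # Perpendicular distance from point p to the line through a and b.
    (x, y), (x1, y1), (x2, y2) = p, a, b
    dx, dy = x2 - x1, y2 - y1
    norm = math.hypot(dx, dy)
    if norm == 0.0:
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / norm

def rdp(points, epsilon):
    # Find the point farthest from the chord joining the two endpoints.
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = point_line_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax > epsilon:
        # Split at the farthest point, simplify both halves, and merge.
        left = rdp(points[: index + 1], epsilon)
        right = rdp(points[index:], epsilon)
        return left[:-1] + right
    return [points[0], points[-1]]

curve = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(rdp(curve, 1.0))
```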
Procedia PDF Downloads 539
14935 Theoretical BER Analyzing of MPSK Signals Based on the Signal Space
Authors: Jing Qing-feng, Liu Danmei
Abstract:
Based on optimum detection, signal projection, and the Maximum A Posteriori (MAP) rule, Proakis deduced the theoretical BER equation of Gray-coded MPSK signals. Proakis analyzed the theoretical BER equations mainly based on the projection of signals, which is difficult to understand. This article solves the same problem based on the signal space, which explains the vector relations among the transmitted signals, received signals, and noise. A more explicit and easily deduced derivation is illustrated in this article based on the signal space, which clearly illustrates the relations among the signals and the noise. This kind of deduction has a univocal geometric meaning. It can explain the correlation between the production and calculation of BER at the vector level.
Keywords: MPSK, MAP, signal space, BER
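For orientation, the standard closed-form result this derivation targets (the usual high-SNR approximation for Gray-coded M-PSK over AWGN, stated here as textbook background rather than the article's own derivation) can be evaluated as below.

```python
import math

def q_function(x: float) -> float:
    # Gaussian tail probability Q(x).
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def mpsk_ber(ebn0_db: float, M: int) -> float:
    k = math.log2(M)
    es_n0 = k * 10.0 ** (ebn0_db / 10.0)           # Es/N0 = k * Eb/N0
    if M == 2:
        return q_function(math.sqrt(2.0 * es_n0))  # exact for BPSK
    # Approximate symbol error probability, then ~1 bit error per symbol error (Gray coding).
    p_symbol = 2.0 * q_function(math.sqrt(2.0 * es_n0) * math.sin(math.pi / M))
    return p_symbol / k

for M in (2, 4, 8, 16):
    print(M, mpsk_ber(10.0, M))
```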
Procedia PDF Downloads 346
14934 A Pragmatic Study of Falnama Texts Based on Critical Discourse Analysis Approach
Authors: Raziyeh Mashhadi Moghadam
Abstract:
Persian writings in the form of stories, scientific articles, historiographies, biographies, and philosophical, religious, and poetic arguments have established their presence in the past and present. Any piece of text is composed in a unique style depending on its content and subject. In this paper, a manuscript called the Falnama of the Prophet is reviewed. Only a few scattered pages of this version are extant, and the author, using the names of twenty-four prophets, seeks to explore the present and future of the reader. This version is analyzed based on Norman Fairclough’s Critical Discourse Analysis (CDA) approach to unravel the underlying processes in this type of manuscript. The spelling of some words and sentences differs from that of modern written Persian.
Keywords: application of Falnama texts, critical discourse analysis, Fairclough’s approach
Procedia PDF Downloads 109
14933 Ensuring Safe Operation by Providing an End-To-End Field Monitoring and Incident Management Approach for Autonomous Vehicle Based on ML/Dl SW Stack
Authors: Lucas Bublitz, Michael Herdrich
Abstract:
By achieving the first commercialization approval in San Francisco, the Autonomous Driving (AD) industry has proven the technological maturity of SAE L4 AD systems and the corresponding software and hardware stack. This milestone reflects the upcoming phase in the industry, where the focus is now on scaling and supervising larger autonomous vehicle (AV) fleets in different operation areas. This requires an operation framework which organizes and assigns responsibilities to the relevant AV technology and operation stakeholders: the AV system provider, the Remote Intervention Operator, the MaaS provider, and the regulatory and approval authority. This holistic operation framework consists of technological, processual, and organizational activities to ensure safe operation of fully automated vehicles. Regarding the supervision of large autonomous vehicle fleets, a major focus is on continuous field monitoring. The field monitoring approach must reflect the safety and security criticality of incidents in the field during driving operation. This includes an automatic containment approach, with the overall goal of avoiding safety-critical incidents and reducing downtime caused by a malfunction of the AD software stack. An end-to-end (E2E) field monitoring approach detects critical faults in the field, uses a knowledge-based approach for evaluating the safety criticality, and supports the automatic containment of these E/E faults. Applying such an approach will ensure the scalability of AV fleets, which is determined by the handling of incidents in the field and the continuous regulatory compliance of the technology after enhancing the Operational Design Domain (ODD) or the function scope via Functions on Demand (FoD) over the entire digital product lifecycle.
Keywords: field monitoring, incident management, multicompliance management for AI in AD, root cause analysis, database approach
Procedia PDF Downloads 77
14932 Stability Analysis for an Extended Model of the Hypothalamus-Pituitary-Thyroid Axis
Authors: Beata Jackowska-Zduniak
Abstract:
We formulate and analyze a mathematical model describing the dynamics of the hypothalamus-pituitary-thyroid homoeostatic mechanism in the endocrine system. We introduce two types of couplings and a delay into this system. In our model, feedback controls the secretion of thyroid hormones, and the delay reflects the time lags required for transportation of the hormones. The influence of the delayed feedback on the stability behaviour of the system is discussed. Analytical results are illustrated by numerical examples of the model dynamics. This system of equations describes normal activity of the thyroid and also a couple of types of malfunction (e.g. hyperthyroidism).
Keywords: mathematical modeling, ordinary differential equations, endocrine system, delay differential equation
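Purely as an illustration of delayed negative feedback (a generic Goodwin-type equation, not the authors' two-coupling model), a thyroid hormone concentration T governed by delayed inhibitory feedback can be written as

```latex
\frac{dT(t)}{dt} = \frac{k}{1 + \bigl(T(t-\tau)/\theta\bigr)^{n}} - d\,T(t),
```

where τ is the transport delay; increasing τ in such equations typically destabilizes the steady state through a Hopf bifurcation, which is the kind of delay-dependent stability question discussed in the abstract.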
Procedia PDF Downloads 336
14931 A Study of Shigeru Ban's Environmentally-Sensitive Design Approach
Authors: Duygu Merve Bulut, Fehime Yesim Gurani
Abstract:
The Japanese architect Shigeru Ban has succeeded in bringing a different understanding to the modern architectural design approach, both through his material selection and through the techniques he uses to combine material with design. Ban, who reflects his respect for people and nature in his designs, has advocated that design should be done with economical materials that are easily accessible and understandable for everyone. Because of this, Ban has attracted attention and been appreciated in the architectural world for his environmentally-sensitive design ideology and humanitarian projects. In order to understand Ban’s environmentally-sensitive design approach, this article examines and analyzes Ban’s projects that use natural materials: the Japanese Pavilion in Germany, the Papertainer Museum in South Korea, the Centre Pompidou-Metz in France, and the Cardboard Cathedral in New Zealand. In the following parts, the 'paper tube' technology developed and applied by Ban, which has created awareness in the architectural field, is examined in terms of building material and the structure of sustainable space design. As a result of this review, Ban’s approach is evaluated in terms of its contribution to the understanding of sustainable design.
Keywords: ecological design, environmentally-sensitive design, paper tube, Shigeru Ban, sustainability
Procedia PDF Downloads 502
14930 Simulation of Lean Principles Impact in a Multi-Product Supply Chain
Authors: Matteo Rossini, Alberto Portioli Staudacher
Abstract:
Market competition is moving from the single firm to the whole supply chain because of increasing competition and the growing need for operational efficiency and customer orientation. Supply chain management allows companies to look beyond their organizational boundaries to develop and leverage the resources and capabilities of their supply chain partners. This creates competitive advantages in the marketplace, and because of this, SCM has acquired strategic importance. The lean approach is a management strategy that focuses on reducing every type of waste present in an organization. This approach is becoming more and more popular among supply chain managers. However, application of the lean approach at the supply chain level is still not widespread, and the impacts of lean principles in a supply chain context are not well studied. In the literature, there are only a few studies simulating lean performance in single-product supply chains. This research work studies the impacts of implementing lean principles along a supply chain. To achieve this, a simulation model of a three-echelon multi-product supply chain has been built. A Kanban system (with several priority policies) and different degrees of setup-time reduction are implemented in the lean-configured supply chain to apply the pull and lot-size reduction principles, respectively. To evaluate the benefits of the lean approach, the lean supply chain is compared with an EOQ-configured supply chain. The simulation results show that the Kanban system and setup-time reduction improve inventory levels. They also show that logistics efforts are affected by the degree of lean implementation. The paper concludes by describing the performance of the lean supply chain in different contexts.
Keywords: inventory policy, Kanban, lean supply chain, simulation study, supply chain management, planning
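As a reminder of the benchmark policy (a textbook formula with illustrative numbers, not the paper's simulation parameters), the EOQ-configured supply chain orders in lots given by the classic economic order quantity:

```python
import math

def eoq(annual_demand: float, order_cost: float, holding_cost_per_unit: float) -> float:
    # Classic economic order quantity: Q* = sqrt(2 * D * S / H).
    return math.sqrt(2.0 * annual_demand * order_cost / holding_cost_per_unit)

# Illustrative values: D = 12,000 units/year, S = 50 per order, H = 2.5 per unit/year.
print(round(eoq(12000, 50.0, 2.5)))  # ~693 units per order
```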
Procedia PDF Downloads 358
14929 Numerical Investigation of the Effect of the Spark Plug Gap on Engine-Like Conditions
Authors: Fernanda Pinheiro Martins, Pedro Teixeira Lacava
Abstract:
The objective of this research is to analyze the effects of different spark plug conditions under engine-like conditions by applying computational fluid dynamics analysis. The 3D models applied consist of the 3-Zones Extended Coherent Flame Model (ECFM-3Z) and the Imposed Stretch Spark Ignition Model (ISSIM) for the combustion and spark plug modelling, respectively. For this study, a direct-injection fuel system was applied in a single-cylinder engine operating with E0. The application of realistic operating conditions (load and speed) to the different cases studied provides a deeper understanding of the effects of the spark plug gap, which in most cases results from part wear, on the development of combustion under engine-like conditions.
Keywords: engine, CFD, direct injection, combustion, spark plug
Procedia PDF Downloads 130
14928 Defining New Limits in Hybrid Perovskites: Single-Crystal Solar Cells with Exceptional Electron Diffusion Length Reaching Half Millimeters
Authors: Bekir Turedi
Abstract:
Exploiting the potential of perovskite single-crystal solar cells in optoelectronic applications necessitates overcoming a significant challenge: the low charge collection efficiency at increased thickness, which has restricted their deployment in radiation detectors and nuclear batteries. Our research details a promising approach to this problem, wherein we have successfully fabricated single-crystal MAPbI3 solar cells employing a space-limited inverse temperature crystallization (ITC) methodology. Remarkably, these cells, up to 400-fold thicker than current-generation perovskite polycrystalline films, maintain a high charge collection efficiency even without external bias. The crux of this achievement lies in the long electron diffusion length within these cells, estimated to be around 0.45 mm. This extended diffusion length ensures the conservation of high charge collection and power conversion efficiencies, even as the thickness of the cells increases. Fabricated cells at 110, 214, and 290 µm thickness manifested power conversion efficiencies (PCEs) of 20.0, 18.4, and 14.7% respectively. The single crystals demonstrated nearly optimal charge collection, even when their thickness exceeded 200 µm. Devices of thickness 108, 214, and 290 µm maintained 98.6, 94.3, and 80.4% of charge collection efficiency relative to their maximum theoretical short-circuit current value, respectively. Additionally, we have proposed an innovative, self-consistent technique for ascertaining the electron-diffusion length in perovskite single crystals under operational conditions. The computed electron-diffusion length approximated 446 µm, significantly surpassing previously reported values for this material. In conclusion, our findings underscore the feasibility of fabricating halide perovskite single-crystal solar cells of hundreds of micrometers in thickness while preserving high charge extraction efficiency and PCE. This advancement paves the way for developing perovskite-based optoelectronics necessitating thicker active layers, such as X-ray detectors and nuclear batteries.
Keywords: perovskite, solar cell, single crystal, diffusion length
Procedia PDF Downloads 53
14927 Towards a Business Process Model Deriving from an Intentional Perspective
Authors: Omnia Saidani Neffati, Rim Samia Kaabi, Naoufel Kraiem
Abstract:
In this paper, we propose an approach aiming at (i) representing services at two levels, the intentional level and the organizational level, and (ii) establishing mechanisms that allow a transition from the first level to the second in order to execute intentional services. An example is used to validate our approach.
Keywords: intentional service, business process, BPMN, MDE, intentional service execution
Procedia PDF Downloads 396
14926 Comparative Efficacy of Gas Phase Sanitizers for Inactivating Salmonella, Escherichia coli O157:H7 and Listeria monocytogenes on Intact Lettuce Heads
Authors: Kayla Murray, Andrew Green, Gopi Paliyath, Keith Warriner
Abstract:
Introduction: It is now acknowledged that control of human pathogens associated with fresh produce requires an integrated approach of several interventions, as opposed to relying on post-harvest washes to remove field-acquired contamination. To this end, current research is directed towards identifying such interventions that can be applied at different points in leafy green processing. Purpose: In the following, the efficacy of different gas phase treatments to decontaminate whole lettuce heads during pre-processing storage was evaluated. Methods: Whole Cos lettuce heads were spot inoculated with L. monocytogenes, E. coli O157:H7 or Salmonella spp. The inoculated lettuce heads were then placed in a treatment chamber and exposed to ozone, chlorine dioxide or hydroxyl radicals for different time periods under a range of relative humidity. Survivors of the treatments were enumerated, and sensory analysis was performed on the treated lettuce. Results: Ozone gas reduced L. monocytogenes by 2 log10 after ten minutes of exposure, with Salmonella and E. coli O157:H7 being decreased by 0.66 and 0.56 log cfu, respectively. Chlorine dioxide gas treatment reduced L. monocytogenes and Salmonella on lettuce heads by 4 log cfu but only supported a 0.8 log cfu reduction in E. coli O157:H7 numbers. In comparison, hydroxyl radicals supported a 2.9-4.8 log cfu reduction of model human pathogens inoculated onto lettuce heads but required extended exposure times and relative humidity < 0.8. Significance: Of the gas phase sanitizers tested, chlorine dioxide and hydroxyl radicals are the most effective. The latter process holds the most promise based on ease of delivery, worker safety, and preservation of lettuce sensory characteristics. Although the exposure time for hydroxyl radicals was relatively long (24 h), this should not be considered a limitation given that the intervention is applied in store rooms or in transport containers during transit.
Keywords: gas phase sanitizers, iceberg lettuce heads, leafy green processing
Procedia PDF Downloads 409
14925 A New Approach to Increase Consumer Understanding of Meal’s Quality – Food Focus Instead of Nutrient Focus
Authors: Elsa Lamy, Marília Prada, Ada Rocha, Cláudia Viegas
Abstract:
The traditional and widely used nutrition-focused approach to communicating with consumers is reductionist and makes it difficult for consumers to assess their food intake. Without sufficient nutrition knowledge and understanding, it would be difficult to choose a healthful diet based only on nutritional recommendations. This study aimed to evaluate Portuguese consumers' understanding of how food/nutritional information is presented in menus, comparing the nutrient-focused approach (the currently used Nutrition Declaration) and the new food-focused approach (the infographic). For data collection, a questionnaire was distributed online using social media channels. A main effect of format on ratings of meal balance and completeness was found (F_balance(1,79) = 18.26, p < .001, ηp² = .188; F_completeness(1,67) = 27.18, p < .001, ηp² = .289). Overall, dishes paired with the nutritional information were rated as more balanced (M_balance = 3.70, SE = .11; M_completeness = 4.00, SE = .14) than meals with the infographic representation (M_balance = 3.14, SE = .11; M_completeness = 3.29, SE = .13). We also observed a main effect of meal, F(3,237) = 48.90, p < .001, ηp² = .382, such that M1 and M2 were perceived as less balanced than M3 and M4, all p < .001. The use of a food-focused approach (infographic) helped participants identify the lack of balance in the less healthful meals (dishes M1 and M2), allowing for a better understanding of meals' compliance with recommendations and contributing to better food choices and a healthier lifestyle.
Keywords: food labelling, food and nutritional recommendations, infographics, portions based information
Procedia PDF Downloads 80
14924 Dynamic Analysis of Double Deck Tunnel
Authors: C. W. Kwak, I. J. Park, D. I. Jang
Abstract:
The importance of cost-effective application and construction is increasing due to the surge in traffic volume in metropolitan cities. Accordingly, the necessity of tunnels with large sections becomes more critical. A double deck tunnel can be one of the most appropriate solutions to this need. The dynamic stability of a double deck tunnel against seismic load is essential since it has a large section and a connection between the perimeter lining and the interim slab. In this study, a 3-dimensional dynamic numerical analysis was conducted based on the Finite Difference Method to investigate the seismic behavior of the double deck tunnel. A seismic joint for dynamic stability and for mitigation of the seismic impact on the lining was considered in the modeling and analysis. Consequently, the mitigation of acceleration, lining displacement, and stress was verified successfully.
Keywords: double deck tunnel, interim slab, 3-dimensional dynamic numerical analysis, seismic joint
Procedia PDF Downloads 382
14923 Robust Numerical Solution for Flow Problems
Authors: Gregor Kosec
Abstract:
A simple and robust numerical approach for solving flow problems is presented, where the involved physical fields are represented through local approximation functions, i.e., the considered field is approximated over a local support domain. The approximation functions are then used to evaluate the partial differential operators. The type of approximation, the size of the support domain, and the type and number of basis functions can be general. The solution procedure is formulated entirely through local computational operations. Besides the local numerical method, the pressure-velocity coupling is also performed locally while retaining the correct temporal transient. The complete locality of the introduced numerical scheme has several beneficial effects. One of the most attractive is simplicity, since the method can be understood as a generalized Finite Difference Method, although a much more powerful one. The presented methodology offers many possibilities for treating challenging cases, e.g. nodal adaptivity to address regions with sharp discontinuities or p-adaptivity to treat obscure anomalies in the physical field. The stability versus computational complexity and accuracy can be regulated by changing the number of support nodes, etc. All these features can be controlled on the fly during the simulation. The presented methodology is relatively simple to understand and implement, which makes it a potentially powerful tool for engineering simulations. Besides simplicity and straightforward implementation, there are many opportunities to fully exploit modern computer architectures through different parallel computing strategies. The performance of the method is presented on the lid-driven cavity problem, the backward-facing step problem, and the de Vahl Davis natural convection test, extended also to a low-Prandtl-number fluid and Darcy porous flow. Results are presented in terms of velocity profiles, convergence plots, and stability analyses. Results for all cases are also compared against published data.
Keywords: fluid flow, meshless, low Pr problem, natural convection
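To illustrate the "generalized Finite Difference" interpretation mentioned above (a 1-D toy construction, not the authors' solver), derivative weights over a local support of nodes can be obtained by requiring exactness on a monomial basis:

```python
import math
import numpy as np

def derivative_weights(nodes, center, order=1, degree=2):
    # Weights w such that sum_i w[i] * u(nodes[i]) approximates the 'order'-th
    # derivative of u at 'center': enforce exactness on (x - center)^j, j = 0..degree.
    shifted = np.asarray(nodes, dtype=float) - center
    A = np.vstack([shifted ** j for j in range(degree + 1)])   # A[j, i] = (x_i - c)^j
    d = np.zeros(degree + 1)
    d[order] = math.factorial(order)                           # exact monomial derivative at c
    weights, *_ = np.linalg.lstsq(A, d, rcond=None)
    return weights

h = 0.1
w = derivative_weights([-h, 0.0, h], 0.0)          # recovers the central-difference stencil
u = np.sin(np.array([-h, 0.0, h]))
print(w)                                           # ~[-5, 0, 5]
print(w @ u)                                       # ~cos(0) = 1
```

The same least-squares construction extends to scattered multi-dimensional supports, which is the sense in which such local meshless methods generalize finite differences.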
Procedia PDF Downloads 234