Search results for: contextualism with humanistic approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13907

12167 A Linear Autoregressive and Non-Linear Regime Switching Approach in Identifying the Structural Breaks Caused by Anti-Speculation Measures: The Case of Hong Kong

Authors: Mengna Hu

Abstract:

This paper examines the impact of an anti-speculation tax policy on trading activities and home price movements in the housing market in Hong Kong. The study focuses on the secondary residential property market, where transactions dominate. The policy intervention substantially raised the transaction cost to speculators as well as to genuine homeowners who dispose of their homes within a certain period. Through the demonstration of structural breaks, our empirical results show that the rise in transaction cost effectively reduced speculative trading activities. However, it accelerated price increases in the small-sized segment by vastly demotivating existing homeowners from trading up to better homes, causing congestion in the lower-end market, where demand from first-time buyers is still strong. In addition, by employing a regime switching approach, we further show that the unintended consequences are likely to be persistent due to this policy together with other strengthened cooling measures.
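
The regime switching side of such an analysis can be sketched with standard tools; the minimal Python example below fits a two-regime Markov switching model with statsmodels on synthetic returns. The series, regime count, and settings are assumptions for illustration, not the paper's data or exact specification.

```python
# Minimal sketch: two-regime Markov switching model on synthetic monthly
# returns (a calm regime followed by a more volatile post-policy regime).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
returns = np.concatenate([rng.normal(0.2, 0.5, 120), rng.normal(0.8, 1.5, 60)])

model = sm.tsa.MarkovRegression(returns, k_regimes=2, trend='c',
                                switching_variance=True)
res = model.fit()
print(res.summary())
probs = res.smoothed_marginal_probabilities  # per-date regime probabilities
```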

Keywords: transaction costs, housing market, structural breaks, regime switching

Procedia PDF Downloads 263
12166 Evaluation of a Method for the Virtual Design of a Software-based Approach for Electronic Fuse Protection in Automotive Applications

Authors: Dominic Huschke, Rudolf Keil

Abstract:

New driving functionalities like highly automated driving have a major impact on the electrics/electronics architecture of future vehicles and inevitably lead to higher safety requirements. Partly due to these increased requirements, the vehicle industry is increasingly looking at semiconductor switches as an alternative to conventional melting fuses. The protective functionality of semiconductor switches can be implemented in hardware as well as in software. A current approach discussed in science and industry is the implementation of a model of the protected low voltage power cable on a microcontroller to calculate its temperature. Here, the information regarding the current is provided by the continuous current measurement of the semiconductor switch. The signal to open the semiconductor switch is provided by the microcontroller when a previously defined limit for the temperature of the low voltage power cable is exceeded. A setup for the testing of the described principle for electronic fuse protection of a low voltage power cable is built and successfully validated with experiments afterwards. Here, the evaluation criterion is the deviation of the measured temperature of the low voltage power cable from the specified limit temperature when the semiconductor switch is opened. The analysis is carried out with an assumed ambient temperature as well as with a measured ambient temperature. Subsequently, the experimentally performed investigations are simulated in a virtual environment. The explicit focus is on the simulation of the behavior of the microcontroller with an implemented model of a low voltage power cable in a real-time environment. Subsequently, the generated results are compared with those of the experiments. Based on this, the completely virtual design of the described approach is assumed to be valid.
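
As a rough illustration of the principle, a lumped-parameter thermal model of the cable can be integrated from the measured current on each controller cycle, opening the switch at a temperature limit. All electrical and thermal parameters below are assumptions, not values from the study.

```python
# Minimal software-fuse sketch: integrate a lumped thermal model of the cable
# from measured current and report the time at which the switch would open.
R_ELEC = 0.01      # cable resistance [ohm] (assumed)
R_TH = 8.0         # thermal resistance to ambient [K/W] (assumed)
C_TH = 20.0        # thermal capacitance [J/K] (assumed)
T_LIMIT = 90.0     # trip temperature [degC] (assumed)
DT = 0.01          # controller cycle time [s]

def run_fuse(currents, t_ambient=25.0):
    t_cable = t_ambient
    for step, i_meas in enumerate(currents):
        p_loss = i_meas ** 2 * R_ELEC                            # Joule heating [W]
        t_cable += DT * (p_loss - (t_cable - t_ambient) / R_TH) / C_TH
        if t_cable > T_LIMIT:
            return step * DT   # time at which the semiconductor switch opens
    return None

trip_time = run_fuse([120.0] * 200_000)  # sustained overload current [A]
print('trip after', trip_time, 's')
```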

Keywords: automotive wire harness, electronic fuse protection, low voltage power cable, semiconductor-based fuses, software-based validation

Procedia PDF Downloads 105
12165 Sentiment Analysis of Ensemble-Based Classifiers for E-Mail Data

Authors: Muthukumarasamy Govindarajan

Abstract:

Detection of unwanted, unsolicited mail, called spam, from email is an interesting area of research. It is necessary to evaluate the performance of any new spam classifier using standard data sets. Recently, ensemble-based classifiers have gained popularity in this domain. In this research work, an efficient email filtering approach based on ensemble methods is addressed for developing an accurate and sensitive spam classifier. The proposed approach employs Naive Bayes (NB), Support Vector Machine (SVM) and Genetic Algorithm (GA) as base classifiers along with different ensemble methods. The experimental results show that the ensemble classifier performed with greater accuracy than the individual classifiers, and the hybrid model results were found to be better than those of the combined models for the e-mail dataset. The proposed ensemble-based classifiers turn out to be good in terms of classification accuracy, which is considered an important criterion for building a robust spam classifier.
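
A minimal scikit-learn sketch of the ensemble idea is shown below, with NB and SVM base learners combined by bagging and soft voting; scikit-learn has no built-in genetic-algorithm classifier, so GA is omitted, and the features are synthetic stand-ins for e-mail vectors.

```python
# Minimal ensemble spam-filter sketch: bagged Naive Bayes plus a linear SVM,
# combined by soft voting. Data are synthetic stand-ins for TF-IDF vectors.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, VotingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=50, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

nb_bag = BaggingClassifier(GaussianNB(), n_estimators=25, random_state=42)
svm = SVC(kernel='linear', probability=True, random_state=42)
ensemble = VotingClassifier([('nb_bag', nb_bag), ('svm', svm)], voting='soft')
ensemble.fit(X_tr, y_tr)
print('ensemble accuracy:', accuracy_score(y_te, ensemble.predict(X_te)))
```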

Keywords: accuracy, arcing, bagging, genetic algorithm, Naive Bayes, sentiment mining, support vector machine

Procedia PDF Downloads 142
12164 Determination of the Phosphate Activated Glutaminase Localization in the Astrocyte Mitochondria Using Kinetic Approach

Authors: N. V. Kazmiruk, Y. R. Nartsissov

Abstract:

Phosphate activated glutaminase (GA, E.C. 3.5.1.2) plays a key role in glutamine/glutamate homeostasis in the mammalian brain, catalyzing the hydrolytic deamidation of glutamine to glutamate and ammonium ions. GA is mainly localized in mitochondria, where it exists in a catalytically active form on the inner mitochondrial membrane (IMM) and in a soluble form, which is supposed to be dormant. At present, the exact localization of the membrane glutaminase active site remains a controversial and unresolved issue. The first hypothesis, called c-side localization, suggests that the catalytic site of GA faces the inter-membrane space, so that products of the deamidation reaction have immediate access to cytosolic metabolism. According to the alternative m-side localization hypothesis, GA orients toward the matrix, making glutamate and ammonium directly available for tricarboxylic acid cycle metabolism in mitochondria. In our study, we used a multi-compartment kinetic approach to simulate the metabolism of glutamate and glutamine in the astrocytic cytosol and mitochondria. We used the physiologically important ratio between the concentration of glutamine inside the mitochondrial matrix, [Gln_mit], and that in the cytosol, [Gln_cyt], as a marker for precise functioning of the system. Since this ratio directly depends on the flow parameters of the mitochondrial glutamine carrier (MGC), a key step was to investigate the dependence of the [Gln_mit]/[Gln_cyt] ratio on the maximal velocity of the MGC at different initial concentrations of mitochondrial glutamate. Another important task was to observe the same dependence at different inhibition constants of the soluble GA. The simulation results confirmed the experimental c-side localization hypothesis, in which the glutaminase active site faces the outer surface of the IMM. Moreover, for such a localization of the enzyme, a 3-fold decrease in ammonium production was predicted.
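
A minimal two-compartment sketch of such a kinetic model is given below: glutamine exchanges between cytosol and matrix through a carrier flux (MGC) and is consumed by a Michaelis-Menten glutaminase term. All rate constants are illustrative, not the study's fitted values.

```python
# Two-compartment toy model of Gln exchange and deamidation; parameters are
# illustrative assumptions only.
from scipy.integrate import solve_ivp

V_MGC, K_MGC = 1.0, 2.0   # carrier max velocity and affinity (assumed)
V_GA, K_GA = 0.8, 5.0     # glutaminase kinetics (assumed)
SYNTH = 0.5               # cytosolic Gln synthesis rate (assumed)

def rhs(t, y):
    gln_cyt, gln_mit = y
    j_carrier = V_MGC * gln_cyt / (K_MGC + gln_cyt)   # cytosol -> matrix
    j_ga = V_GA * gln_mit / (K_GA + gln_mit)          # deamidation in matrix
    return [SYNTH - j_carrier, j_carrier - j_ga]

sol = solve_ivp(rhs, (0, 200), [4.0, 1.0])
gln_cyt, gln_mit = sol.y[:, -1]
print('steady [Gln_mit]/[Gln_cyt] ratio:', gln_mit / gln_cyt)
```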

Keywords: glutamate metabolism, glutaminase, kinetic approach, mitochondrial membrane, multi-compartment modeling

Procedia PDF Downloads 120
12163 Econophysical Approach on Predictability of Financial Crisis: The 2001 Crisis of Turkey and Argentina Case

Authors: Arzu K. Kamberli, Tolga Ulusoy

Abstract:

Technological developments and the resulting global communication have made the 21st century an era in which large amounts of capital can be moved from one end of the world to the other at the push of a button. As a result, capital inflows have accelerated and have brought with them crisis-related contagion. Combined with irrational human behavior, financial crises have become a fundamental problem for countries and have increased researchers' interest in the causes of crises and the periods in which they occur. The complex nature of financial crises, and their structure that cannot be explained linearly, have therefore also been taken up by the new discipline of econophysics. As is known, although financial crises have prediction mechanisms, there is no definite information. In this context, this study develops an early econophysical approach to global financial crises using the concept of the electric field from electrostatics. The aim is to define a model that can operate ahead of financial crises, identify financial fragility at an earlier stage, and help public- and private-sector actors, policy makers and economists with an econophysical approach. The 2001 Turkey crisis was assessed with data from the Turkish Central Bank covering 1992 to 2007, and for the 2001 Argentina crisis, data were taken from the IMF and the Central Bank of Argentina covering 1997 to 2007. As an econophysical method, an analogy is drawn between Gauss's law, used in the calculation of the electric field, and the forecasting of financial crises. Taking advantage of this analogy, which is based on currency movements and money mobility, the concept of Φ (Financial Flux) is adopted for pre-warning of a crisis. The Φ (Financial Flux) values, obtained for the first time in this study, were analyzed with Matlab software, and in this context the Φ (Financial Flux) values were confirmed to give pre-warning of the 2001 crises in Turkey and Argentina.
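
For reference, the electrostatic relation on which the analogy is built is Gauss's law, shown below; the mapping of each symbol onto financial quantities is the paper's own construction and is not reproduced here.

```latex
% Gauss's law: the electric flux through a closed surface S equals the
% enclosed charge divided by the vacuum permittivity.
\Phi_E = \oint_{S} \mathbf{E} \cdot \mathrm{d}\mathbf{A}
       = \frac{Q_{\text{enc}}}{\varepsilon_0}
```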

Keywords: econophysics, financial crisis, Gauss's Law, physics

Procedia PDF Downloads 153
12162 Indexing and Incremental Approach Using Map Reduce Bipartite Graph (MRBG) for Mining Evolving Big Data

Authors: Adarsh Shroff

Abstract:

Big data are collections of datasets so large and complex that they become difficult to process using database management tools. Operations such as search, analysis and visualization on big data are performed using data mining, the process of extracting patterns or knowledge from large datasets. As the underlying data evolve, the results of data mining applications become stale and obsolete over time. Incremental processing is a promising approach to refreshing mining results: it utilizes previously saved states to avoid the expense of re-computation from scratch. This project uses i2MapReduce, an incremental processing extension to MapReduce, the most widely used framework for mining big data. i2MapReduce performs key-value pair level incremental processing rather than task-level re-computation, supports not only one-step computation but also the more sophisticated iterative computation widely used in data mining applications, and incorporates a set of novel techniques to reduce I/O overhead for accessing preserved fine-grain computation states. To optimize the mining results, i2MapReduce is evaluated using a one-step algorithm and three iterative algorithms with diverse computation characteristics for efficient mining.
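
The key-value-pair-level idea can be illustrated with a toy incremental word count that preserves fine-grain state and applies only the delta, rather than recomputing from scratch; this mirrors the concept only, not the i2MapReduce API.

```python
# Toy incremental processing: keep a preserved word-count state and update it
# with added/removed records instead of recomputing the whole count.
from collections import Counter

state = Counter()  # preserved fine-grain computation state (word -> count)

def initial_run(docs):
    for doc in docs:
        state.update(doc.split())

def incremental_update(added_docs, removed_docs):
    for doc in added_docs:
        state.update(doc.split())     # apply new key-value pairs
    for doc in removed_docs:
        state.subtract(doc.split())   # retract stale key-value pairs

initial_run(["big data mining", "data mining tools"])
incremental_update(added_docs=["incremental data processing"],
                   removed_docs=["big data mining"])
print(state)
```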

Keywords: big data, map reduce, incremental processing, iterative computation

Procedia PDF Downloads 351
12161 Production Line Layout Planning Based on Complexity Measurement

Authors: Guoliang Fan, Aiping Li, Nan Xie, Liyun Xu, Xuemei Liu

Abstract:

Mass customization production increases the difficulty of production line layout planning. The material distribution process for a variety of parts is very complex, which greatly increases the cost of material handling and logistics. In response to this problem, this paper presents an approach to production line layout planning based on complexity measurement. Firstly, by analyzing the influencing factors of equipment layout, a complexity model of the production line is established using information entropy theory. Then, the cost of part logistics is derived, considering the different part variants. Furthermore, an optimization function with two objectives, lowest cost and least configuration complexity, is built. Finally, the validity of the function is verified in a case study. The results show that the proposed approach can find the layout scheme with the lowest logistics cost and the least complexity. Optimized production line layout planning can effectively improve production efficiency and equipment utilization at the lowest cost and complexity.
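
The entropy-based complexity measure can be illustrated as follows: treat the mix of part variants routed to a station as a probability distribution and score it with Shannon entropy. The part mixes are invented for illustration.

```python
# Shannon-entropy complexity score for a station's part-variant mix.
import math

def complexity(part_mix):
    total = sum(part_mix)
    probs = [n / total for n in part_mix if n > 0]
    return -sum(p * math.log2(p) for p in probs)  # bits

# Station A handles one dominant variant; station B a balanced mix of four.
print(complexity([90, 5, 5]))        # low complexity
print(complexity([25, 25, 25, 25]))  # high complexity: 2.0 bits
```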

Keywords: production line, layout planning, complexity measurement, optimization, mass customization

Procedia PDF Downloads 393
12160 Let’s Work It Out: Effects of a Cooperative Learning Approach on EFL Students’ Motivation and Reading Comprehension

Authors: Shiao-Wei Chu

Abstract:

In order to enhance the ability of their graduates to compete in an increasingly globalized economy, the majority of universities in Taiwan require students to pass Freshman English in order to earn a bachelor's degree. However, many college students show low motivation in English class for several important reasons, including exam-oriented lessons, unengaging classroom activities, a lack of opportunities to use English in authentic contexts, and low levels of confidence in using English. Students’ lack of motivation in English classes is evidenced when students doze off, work on assignments from other classes, or use their phones to chat with others, play video games or watch online shows. Cooperative learning aims to address these problems by encouraging language learners to use the target language to share individual experiences, cooperatively complete tasks, and to build a supportive classroom learning community whereby students take responsibility for one another’s learning. This study includes approximately 50 student participants in a low-proficiency Freshman English class. Each week, participants will work together in groups of between 3 and 4 students to complete various in-class interactive tasks. The instructor will employ a reward system that incentivizes students to be responsible for their own as well as their group mates’ learning. The rewards will be based on points that team members earn through formal assessment scores as well as assessment of their participation in weekly in-class discussions. The instructor will record each team’s week-by-week improvement. Once a team meets or exceeds its own earlier performance, the team’s members will each receive a reward from the instructor. This cooperative learning approach aims to stimulate EFL freshmen’s learning motivation by creating a supportive, low-pressure learning environment that is meant to build learners’ self-confidence. Students will practice all four language skills; however, the present study focuses primarily on the learners’ reading comprehension. Data sources include in-class discussion notes, instructor field notes, one-on-one interviews, students’ midterm and final written reflections, and reading scores. Triangulation is used to determine themes and concerns, and an instructor-colleague analyzes the qualitative data to build interrater reliability. Findings are presented through the researcher’s detailed description. The instructor-researcher has developed this approach in the classroom over several terms, and its apparent success at motivating students inspires this research. The aims of this study are twofold: first, to examine the possible benefits of this cooperative approach in terms of students’ learning outcomes; and second, to help other educators to adapt a more cooperative approach to their classrooms.

Keywords: freshman English, cooperative language learning, EFL learners, learning motivation, zone of proximal development

Procedia PDF Downloads 145
12159 Agent/Group/Role Organizational Model to Simulate an Industrial Control System

Authors: Noureddine Seddari, Mohamed Belaoued, Salah Bougueroua

Abstract:

The modeling of complex systems is generally based on the decomposition of their components into sub-systems that are easier to handle. This division has to be made in a methodical way. In this paper, we introduce an industrial control system modeling and simulation based on the Multi-Agent System (MAS) methodology AALAADIN and, more particularly, its underlying conceptual model Agent/Group/Role (AGR). In this division using the AGR model, the overall system is decomposed into sub-systems in order to improve the understanding of regulation and control systems and to simplify the implementation of the obtained agents and their groups, which are implemented using the Multi-Agent Development Kit (MAD-KIT) platform. This approach appears to us to be the most appropriate for modeling this type of system because, with MAS, it is possible to model real systems in which very complex behaviors emerge from relatively simple and local interactions between many different individuals. A MAS is therefore well adapted to describe a system from the standpoint of the activity of its components, that is to say, when the behavior of the individuals is complex (difficult to describe with equations). The main aim of this approach is to take advantage of the performance, scalability and robustness that are intuitively provided by MAS.
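
The AGR decomposition can be mocked up in a few lines: agents join groups and play roles within them. MAD-KIT itself is a Java platform, so the Python sketch below mirrors only the conceptual model, not the real API.

```python
# Conceptual Agent/Group/Role mock-up: agents hold roles inside groups and
# communicate within a group. Names and roles are illustrative.
class Group:
    def __init__(self, name):
        self.name, self.members = name, {}   # agent -> role

class Agent:
    def __init__(self, name):
        self.name = name

    def join(self, group, role):
        group.members[self] = role

    def broadcast(self, group, message):
        for agent, role in group.members.items():
            if agent is not self:
                print(f'[{group.name}] {self.name} -> {agent.name} ({role}): {message}')

regulation = Group('regulation-loop')
sensor, controller = Agent('sensor-1'), Agent('pid-1')
sensor.join(regulation, 'measurer')
controller.join(regulation, 'regulator')
sensor.broadcast(regulation, 'temperature=81.5')
```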

Keywords: complex systems, modeling and simulation, industrial control system, MAS, AALAADIN, AGR, MAD-KIT

Procedia PDF Downloads 240
12158 Optimizing Telehealth Internet of Things Integration: A Sustainable Approach through Fog and Cloud Computing Platforms for Energy Efficiency

Authors: Yunyong Guo, Sudhakar Ganti, Bryan Guo

Abstract:

The swift proliferation of telehealth Internet of Things (IoT) devices has sparked concerns regarding energy consumption and the need for streamlined data processing. This paper presents an energy-efficient model that integrates telehealth IoT devices into a platform based on fog and cloud computing. This integrated system provides a sustainable and robust solution to address the challenges. Our model strategically utilizes fog computing as a localized data processing layer and leverages cloud computing for resource-intensive tasks, resulting in a significant reduction in overall energy consumption. The incorporation of adaptive energy-saving strategies further enhances the efficiency of our approach. Simulation analysis validates the effectiveness of our model in improving energy efficiency for telehealth IoT systems, particularly when integrated with localized fog nodes and both private and public cloud infrastructures. Subsequent research endeavors will concentrate on refining the energy-saving model, exploring additional functional enhancements, and assessing its broader applicability across various healthcare and industry sectors.
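
A toy sketch of the placement logic may help: route small telehealth readings to a fog node and heavy jobs to the cloud, then compare energy against a cloud-only baseline. All energy figures and the threshold below are illustrative assumptions, not values from the study.

```python
# Toy fog/cloud placement by job size, with a cloud-only energy baseline.
FOG_J_PER_KB, CLOUD_J_PER_KB = 0.02, 0.11   # assumed transmit+process costs
THRESHOLD_KB = 64                           # jobs above this go to the cloud

def energy(jobs_kb, fog_enabled=True):
    total = 0.0
    for size in jobs_kb:
        if fog_enabled and size <= THRESHOLD_KB:
            total += size * FOG_J_PER_KB    # handled at the local fog node
        else:
            total += size * CLOUD_J_PER_KB  # shipped to the cloud
    return total

jobs = [4, 8, 4, 512, 8, 4, 2048, 8]        # mostly small vitals, two imaging jobs
print('fog+cloud:', energy(jobs), 'J  cloud-only:',
      energy(jobs, fog_enabled=False), 'J')
```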

Keywords: energy-efficient, fog computing, IoT, telehealth

Procedia PDF Downloads 76
12157 Hybrid Control Mode Based on Multi-Sensor Information by Fuzzy Approach for Navigation Task of Autonomous Mobile Robot

Authors: Jonqlan Lin, C. Y. Tasi, K. H. Lin

Abstract:

This paper addresses the autonomous mobile robot (AMR) navigation task based on hybrid control modes. A novel hybrid control mode, based on multi-sensor information and a fuzzy approach, is presented in this research. The system operates in real time, is robust, enables the robot to operate with imprecise knowledge, and takes into account the physical limitations of the environment in which the robot moves, obtaining satisfactory responses in a large number of different situations. An experiment is simulated and carried out with a Pioneer mobile robot. The experimental results confirm the effectiveness and usefulness of the proposed AMR obstacle avoidance and navigation scheme, demonstrate its feasibility, and show that the control system improves navigation accuracy. The implementation of the controller is robust, has a low execution time, and allows easy design and tuning of the fuzzy knowledge base.
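
A minimal hand-rolled sketch of the fuzzy blending idea is given below: a goal-seeking command and an obstacle-avoidance command are mixed according to the fuzzy degree to which the obstacle is near. The membership shape, commands and gains are assumptions, not the paper's rule base.

```python
# Fuzzy blending of two behaviours by weighted-average defuzzification.
def near_deg(dist, d_safe=1.0):
    """Fuzzy degree of 'obstacle is near' (1 at contact, 0 beyond d_safe)."""
    return max(0.0, min(1.0, 1.0 - dist / d_safe))

def steer(obstacle_dist, goal_heading_err):
    near = near_deg(obstacle_dist)       # membership of 'obstacle near'
    far = 1.0 - near                     # membership of 'path clear'
    avoid_cmd = 45.0                     # turn away hard [deg]
    seek_cmd = -0.8 * goal_heading_err   # proportional goal seeking [deg]
    return near * avoid_cmd + far * seek_cmd

print(steer(obstacle_dist=0.3, goal_heading_err=20.0))  # avoidance dominates
print(steer(obstacle_dist=2.5, goal_heading_err=20.0))  # pure goal seeking
```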

Keywords: autonomous mobile robot, obstacle avoidance, MEMS, hybrid control mode, navigation control

Procedia PDF Downloads 466
12156 The Use of Artificial Intelligence to Harmonization in the Lawmaking Process

Authors: Supriyadi, Andi Intan Purnamasari, Aminuddin Kasim, Sulbadana, Mohammad Reza

Abstract:

The development of the Industrial Revolution 4.0 era has brought a significant influence on the administration of countries in all parts of the world, including Indonesia; not only the administrative and economic sectors but also the ways and methods of forming laws should be adjusted. Until now, the process of making laws carried out by the Parliament with the Government has used classical methods. The law-making process still relies on manual work, such as typing the harmonization of regulations, so errors such as writing mistakes or miscopied articles are not uncommon; these are tasks that require a high level of accuracy yet rely on inventory and harmonization carried out manually by humans. This method therefore often creates problems due to errors and inaccuracies on the part of the officers who harmonize laws after discussion and approval, which has a very serious impact on the system of law formation in Indonesia. The use of artificial intelligence in the process of forming laws appears justified and becomes the answer in order to minimize the disharmony of various laws and regulations. This research is normative research using the Legislative Approach and the Conceptual Approach, and it focuses on the question of how to use Artificial Intelligence for harmonization in the lawmaking process.

Keywords: artificial intelligence, harmonization, laws, intelligence

Procedia PDF Downloads 162
12155 Numerical Evaluation of the Flow Behavior inside the Scrubber Unit with Engine Exhaust Pipe

Authors: Kumaresh Selvakumar, Man Young Kim

Abstract:

A wet scrubber is an air pollution control device that removes particulate matter and acid gases from the waste gas streams found in marine engine exhaust. Including the actual flue gas composition in the CFD simulation would complicate the problem because of the emissions involved. For this reason, the scrubber system in this paper is handled with an appropriate simplification: it is designed with a hot air flow and water droplet injections to evaluate the flow behavior inside the system. Since the wet scrubber is capable of operating over a wide range of mixture compositions, the current scrubber model with this design approach does not deviate from the actual behavior of the system. The scrubber design is constructed together with the engine exhaust pipe so that the influence of the exhaust pipe characteristics on the flow properties inside the scrubber can be measured. The flow properties are computed from the thermodynamic variables, temperature and pressure, together with the flow velocity. In this work, numerical analyses of the fluid flow in the scrubber system have been conducted using the CFD technique.

Keywords: wet scrubber, water droplet injections, thermodynamic variables, CFD technique

Procedia PDF Downloads 345
12154 Linear Complementarity Based Approach for Unilateral Frictional Contact between Wheel and Beam

Authors: Muskaan Sethi, Arnab Banerjee, Bappaditya Manna

Abstract:

The present paper aims to investigate a suitable contact model for a wheel rolling over a flexible beam. A Linear Complementarity Problem (LCP) based approach has been adopted to simulate the contact dynamics of a rigid wheel traversing a flexible, simply supported Euler-Bernoulli beam. The adopted methodology is able to incorporate the effect of the frictional force acting at the wheel-beam interface. Moreover, the possibility of a gap forming between the two bodies has also been considered. The present method is based on a unilateral contact assumption, i.e., no penetration occurs when the two bodies come into contact. This assumption helps to predict the contact between wheels and beams in a more practical sense. The proposed methodology is validated against previously published results and is found to be in good agreement. Further, the method is applied to simulate the contact between wheels and beams for various railway configurations, and different parametric studies are conducted to investigate the wheel-beam contact dynamics more thoroughly.
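
The LCP machinery behind such unilateral contact can be sketched with a projected Gauss-Seidel solver for w = Mz + q, z ≥ 0, w ≥ 0, zᵀw = 0, where z plays the role of contact force and w of the gap; the matrices below are illustrative, not a wheel-beam discretisation.

```python
# Projected Gauss-Seidel solver for a small LCP: w = M z + q with
# z >= 0, w >= 0, z'w = 0. Suitable for symmetric positive-definite M.
import numpy as np

def pgs_lcp(M, q, iters=200):
    z = np.zeros_like(q)
    for _ in range(iters):
        for i in range(len(q)):
            r = q[i] + M[i] @ z - M[i, i] * z[i]
            z[i] = max(0.0, -r / M[i, i])   # project onto z >= 0
    return z, M @ z + q

M = np.array([[2.0, 0.5], [0.5, 1.5]])      # illustrative contact matrix
q = np.array([-1.0, 0.3])                   # illustrative free accelerations
z, w = pgs_lcp(M, q)
print('contact force z:', z, ' gap w:', w)  # complementary: z_i * w_i ~ 0
```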

Keywords: contact dynamics, linear complementarity problem, railway dynamics, unilateral contact

Procedia PDF Downloads 102
12153 Unraveling the Threads of Madness: Henry Russell’s 'The Maniac' as an Advocate for Deinstitutionalization in the Nineteenth Century

Authors: T. J. Laws-Nicola

Abstract:

Henry Russell was best known as a composer of more than 300 songs. Many of his compositions were popular for their sentimental texts, as in ‘The Old Armchair,’ and others for texts of a more political nature, such as ‘Woodman, Spare That Tree!’ Indeed, Russell had written songs of advocacy associated with abolitionism (‘The Slave Ship’) and environmentalism (‘Woodman, Spare That Tree!’). ‘The Maniac’ is his only composition addressing the issue of institutionalization. The text is borrowed and adapted from the monodrama The Captive by M. G. ‘Monk’ Lewis. Through an analysis of form, harmony, melody, text, and thematic development, and of the interactions between text and music, we can approach a clearer understanding of ‘The Maniac’ and of how its text and music interact. Select periodicals, such as The London Times, provide contemporary critical reviews of ‘The Maniac.’ Additional nineteenth-century songs whose texts focus on madness and/or institutionalization assist in building a stylistic and cultural context for ‘The Maniac.’ Through comparative analyses of ‘The Maniac’ with a body of songs on similar topics, we can approach a clear understanding of the song as a vehicle for deinstitutionalization.

Keywords: 19th century song, institutionalization, M. G. Lewis, Henry Russell

Procedia PDF Downloads 535
12152 Autonomous Flight Performance Improvement of Load-Carrying Unmanned Aerial Vehicles by Active Morphing

Authors: Tugrul Oktay, Mehmet Konar, Mohamed Abdallah Mohamed, Murat Aydin, Firat Sal, Murat Onay, Mustafa Soylak

Abstract:

In this paper, we aim to improve the autonomous flight performance of a load-carrying (payload: 3 kg, total: 6 kg) unmanned aerial vehicle (UAV) through active wing and horizontal tail morphing, together with the integrated design of autopilot system parameters (i.e., P, I, D gains) and UAV parameters (i.e., extension ratios of the wing and horizontal tail during flight). For this purpose, a load-carrying UAV (ZANKA-II), manufactured at Erciyes University, College of Aviation, Model Aircraft Laboratory, is used. Optimum values of the UAV parameters and autopilot parameters are obtained using a stochastic optimization method. Using this approach, the autonomous flight performance of the UAV is substantially improved, and safe flight is also made possible in some adverse weather conditions. The active morphing and integrated design approach offers the reliability, high performance and ease of use that UAV users demand.
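
A hedged sketch of the integrated stochastic design idea follows: random search over (P, I, D) gains and a morphing parameter, scored on a crude first-order response whose dynamics shift with the extension ratio. The plant model, bounds and cost are invented for illustration and are not ZANKA-II data.

```python
# Random-search co-design of PID gains and a wing-extension ratio on a toy
# first-order pitch model. All dynamics and bounds are assumed.
import math
import random

def track_cost(p, i, d, ext):
    y = integ = prev_err = 0.0
    cost, dt = 0.0, 0.02
    for _ in range(500):
        err = 1.0 - y                          # unit step command
        integ += err * dt
        u = p * err + i * integ + d * (err - prev_err) / dt
        prev_err = err
        # Assumed: extension ratio `ext` shifts the plant's gain and lag.
        y += dt * (-y / (0.5 + 0.3 * ext) + (1.0 + 0.5 * ext) * u)
        if not math.isfinite(y) or abs(y) > 1e6:
            return float('inf')                # unstable gain set
        cost += abs(err) * dt                  # integral absolute error
    return cost

best = min(
    ((random.uniform(0, 2), random.uniform(0, 1),
      random.uniform(0, 0.2), random.uniform(0, 1)) for _ in range(2000)),
    key=lambda g: track_cost(*g),
)
print('best (P, I, D, extension):', best)
```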

Keywords: unmanned aerial vehicles, morphing, autopilots, autonomous performance

Procedia PDF Downloads 673
12151 The Basics of Cognitive Behavioral Family Therapy and the Treatment of Various Physical and Mental Diseases

Authors: Mahta Mohamadkashi

Abstract:

The family is the most important source of security and health for the people of a society and, at the same time, the main field in which all kinds of social and psychological problems arise. On the one hand, a family is a natural group with many goals and roles that are important and necessary for all family members. On the other hand, the family is a strong and organized group that engages the therapist because of the goals concealed in its policies and procedures. The relationship between the environment and family background and mental illness has long been a focus of researchers, and the research and experiments that have been conducted show that the functioning of the family is related to the mental health of its members. Currently, several theoretical perspectives with different approaches seek to explain and resolve psychological problems and family conflicts. This research investigates "cognitive-behavioral family therapy" using a descriptive-analytical method and library-based data collection, relying especially on Persian and Latin books and articles, in order to examine this important family therapy approach together with its requirements and limitations. To this end, a brief background and introduction on family and family therapy are first presented, and then the basics of cognitive-behavioral family therapy, its implementation process and its various techniques are discussed in detail. After that, the application of this approach to the treatment of various physical and mental diseases is examined through related research, along with the difficulties, limitations and future directions of its implementation. In general, this study emphasizes the role of the family system in the occurrence of psychological diseases and disorders and validates the role of the family system in their treatment. Cognitive-behavioral family therapy has also been endorsed as an effective treatment approach for a variety of mental disorders.

Keywords: cognitive-behavioral, family, family therapy, cognitive-behavioral family therapy

Procedia PDF Downloads 101
12150 Image Transform Based on Integral Equation-Wavelet Approach

Authors: Yuan Yan Tang, Lina Yang, Hong Li

Abstract:

The harmonic model is a very important approximation for the image transform. The harmonic model converts an image into an arbitrary shape; however, this model cannot be described by any fixed function in mathematics. In fact, it is represented by a partial differential equation (PDE) with boundary conditions. Therefore, developing an efficient method to solve such a PDE is extremely significant for the image transform. In this paper, a novel integral equation-wavelet based method is presented, which consists of three steps: (1) The partial differential equation is converted into a boundary integral equation and representation by an indirect method. (2) The boundary integral equation and representation are changed to a plane integral equation and representation by the boundary measure formula. (3) The plane integral equation and representation are then solved by a method we call wavelet collocation. Our approach has two main advantages: the shape of an image is arbitrary, and the program code is independent of the boundary. The performance of our method is evaluated by numerical experiments.
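
For concreteness, the harmonic model referred to above can be written in its standard Laplace/Dirichlet form; the symbols below are generic, not the paper's notation.

```latex
% Standard harmonic (Laplace/Dirichlet) model: u is harmonic inside the
% target region Omega and matches the prescribed mapping g on its boundary.
\Delta u(x,y) = 0, \quad (x,y) \in \Omega, \qquad
u(x,y) = g(x,y), \quad (x,y) \in \partial\Omega
```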

Keywords: harmonic model, partial differential equation (PDE), integral equation, integral representation, boundary measure formula, wavelet collocation

Procedia PDF Downloads 558
12149 On Stochastic Models for Fine-Scale Rainfall Based on Doubly Stochastic Poisson Processes

Authors: Nadarajah I. Ramesh

Abstract:

Much of the research on stochastic point process models for rainfall has focused on Poisson cluster models constructed from either the Neyman-Scott or Bartlett-Lewis processes. The doubly stochastic Poisson process provides a rich class of point process models, especially for fine-scale rainfall modelling. This paper provides an account of recent development on this topic and presents the results based on some of the fine-scale rainfall models constructed from this class of stochastic point processes. Amongst the literature on stochastic models for rainfall, greater emphasis has been placed on modelling rainfall data recorded at hourly or daily aggregation levels. Stochastic models for sub-hourly rainfall are equally important, as there is a need to reproduce rainfall time series at fine temporal resolutions in some hydrological applications. For example, the study of climate change impacts on hydrology and water management initiatives requires the availability of data at fine temporal resolutions. One approach to generating such rainfall data relies on the combination of an hourly stochastic rainfall simulator, together with a disaggregator making use of downscaling techniques. Recent work on this topic adopted a different approach by developing specialist stochastic point process models for fine-scale rainfall aimed at generating synthetic precipitation time series directly from the proposed stochastic model. One strand of this approach focused on developing a class of doubly stochastic Poisson process (DSPP) models for fine-scale rainfall to analyse data collected in the form of rainfall bucket tip time series. In this context, the arrival pattern of rain gauge bucket tip times N(t) is viewed as a DSPP whose rate of occurrence varies according to an unobserved finite state irreducible Markov process X(t). Since the likelihood function of this process can be obtained, by conditioning on the underlying Markov process X(t), the models were fitted with maximum likelihood methods. The proposed models were applied directly to the raw data collected by tipping-bucket rain gauges, thus avoiding the need to convert tip-times to rainfall depths prior to fitting the models. One advantage of this approach was that the use of maximum likelihood methods enables a more straightforward estimation of parameter uncertainty and comparison of sub-models of interest. Another strand of this approach employed the DSPP model for the arrivals of rain cells and attached a pulse or a cluster of pulses to each rain cell. Different mechanisms for the pattern of the pulse process were used to construct variants of this model. We present the results of these models when they were fitted to hourly and sub-hourly rainfall data. The results of our analysis suggest that the proposed class of stochastic models is capable of reproducing the fine-scale structure of the rainfall process, and hence provides a useful tool in hydrological modelling.
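
As a hedged illustration of the DSPP idea, the sketch below simulates bucket-tip times from a two-state Markov-modulated Poisson process; the rates and switching intensities are invented for demonstration and do not come from the fitted models.

```python
# Two-state Markov-modulated Poisson process: a hidden state ('dry spell' /
# 'storm') modulates the rate of rain-gauge bucket tips.
import numpy as np

rng = np.random.default_rng(1)
RATES = np.array([0.05, 2.0])     # tips per minute in state 0 / state 1
SWITCH = np.array([0.01, 0.08])   # leave-rate of state 0 / state 1 [1/min]

def simulate_mmpp(t_end):
    t, state, tips = 0.0, 0, []
    while t < t_end:
        dwell = rng.exponential(1.0 / SWITCH[state])  # time until state switch
        t_next = min(t + dwell, t_end)
        n = rng.poisson(RATES[state] * (t_next - t))  # tips in this segment
        tips.extend(np.sort(rng.uniform(t, t_next, n)))
        t, state = t_next, 1 - state
    return np.array(tips)

tips = simulate_mmpp(24 * 60.0)   # one day, in minutes
print(f'{tips.size} bucket tips simulated')
```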

Keywords: fine-scale rainfall, maximum likelihood, point process, stochastic model

Procedia PDF Downloads 278
12148 Differential Approach to Technology Aided English Language Teaching: A Case Study in a Multilingual Setting

Authors: Sweta Sinha

Abstract:

Rapid evolution of technology has changed language pedagogy as well as perspectives on language use, leading to strategic changes in discourse studies. We are now firmly embedded in a time when digital technologies have become an integral part of our daily lives. This has led to generalized approaches to English Language Teaching (ELT), which raises two concerns in linguistically diverse settings: a) the diverse linguistic backgrounds of the learners might interfere with the learning process, and b) differing levels of already acquired knowledge of the target language might make classroom practices too easy or too difficult for the target group of learners. ELT needs a more systematic and differential pedagogical approach for greater efficiency and accuracy. The present research analyses the need for identifying learner groups based on different levels of target language proficiency, drawing on a longitudinal study of 150 undergraduate students. The learners were divided into five groups based on their performance on a twenty-point scale in Listening, Speaking, Reading and Writing (LSRW). The groups were then subjected to varying durations of technology-aided language learning sessions, and their performance was recorded again on the same scale. Identifying groups and introducing differential teaching and learning strategies led to better results compared to generalized teaching strategies. Language teaching includes different aspects: the organizational, the technological, the sociological, the psychological, the pedagogical and the linguistic. A facilitator must account for all these aspects in a carefully devised differential approach to meet the challenge of learner diversity. Apart from justifying the formation of differential groups, the paper attempts to devise a framework that accounts for all these aspects in order to make ELT in a multilingual setting much more effective.

Keywords: differential groups, English language teaching, language pedagogy, multilingualism, technology aided language learning

Procedia PDF Downloads 391
12147 Toward Automatic Chest CT Image Segmentation

Authors: Angely Sim Jia Wun, Sasa Arsovski

Abstract:

Numerous studies have been conducted on the segmentation of medical images, and segmenting the lungs is one of the common research topics among them. Our research stemmed from the lack of solutions for automatic bone, airway, and vessel segmentation, despite the existence of multiple lung segmentation techniques. Consequently, currently available software tools used for medical image segmentation do not provide automatic lung, bone, airway, and vessel segmentation. This paper presents segmentation techniques along with an interactive software tool architecture for segmenting bone, lung, airway, and vessel tissues. Additionally, we propose a method for creating binary masks from automatically generated segments. The key contribution of our approach is the technique for automatic image thresholding using adjustable Hounsfield values and binary mask extraction. Generated binary masks can be successfully used as a training dataset for deep-learning solutions in medical image segmentation. In this paper, we also examine the current software tools used for medical image segmentation, discuss our approach, and identify its advantages.
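
A minimal sketch of the adjustable-Hounsfield thresholding step is given below; the HU windows are common textbook ranges, not necessarily the tool's defaults.

```python
# Derive binary bone/lung masks from a CT volume in Hounsfield units (HU).
import numpy as np

def masks_from_hu(volume_hu):
    bone = volume_hu >= 300                            # dense bone and above
    lung = (volume_hu >= -1000) & (volume_hu <= -500)  # aerated lung tissue
    return bone.astype(np.uint8), lung.astype(np.uint8)

# Synthetic 3-voxel 'scan': aerated lung, soft tissue, cortical bone.
scan = np.array([[[-850.0, 40.0, 700.0]]])
bone_mask, lung_mask = masks_from_hu(scan)
print(bone_mask, lung_mask)
```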

Keywords: lung segmentation, binary masks, U-Net, medical software tools

Procedia PDF Downloads 98
12146 A New Approach to the Boom Welding Technique by Determining Seam Profile Tracking

Authors: Muciz Özcan, Mustafa Sacid Endiz, Veysel Alver

Abstract:

In this paper, we present a new approach to boom welding in mobile crane manufacturing, implementing a new method to obtain homogeneous welding quality and reduced energy usage during boom production. We aim to achieve the same welding quality in every region of the boom throughout the manufacturing process and to detect possible welding errors, so that they can be eliminated, using laser sensors. Our system determines the position of the welding region directly, and with the help of the welding oscillator we are able to perform a proper boom weld. Errors that may occur in the welding process can be observed by monitoring and eliminated by an operator. The major modification in the production of the crane booms will be the form of the booms themselves. Although conventionally more than one weld is required for this process, with the suggested concept only one particular weld is sufficient, which is more energy- and environment-friendly. Consequently, as only one weld is needed for the manufacturing of the boom, the quality of that particular weld becomes more essential. To satisfy the welding quality, a welding manipulator was designed and fabricated. By using this welding manipulator, the risks posed by the dangerous gases formed during the welding process to the operator and the surroundings are diminished as much as possible.

Keywords: boom welding, seam tracking, energy saving, global warming

Procedia PDF Downloads 346
12145 Vehicle Activity Characterization Approach to Quantify On-Road Mobile Source Emissions

Authors: Hatem Abou-Senna, Essam Radwan

Abstract:

Transportation agencies and researchers have in the past estimated emissions using one average speed and volume on a long stretch of roadway. Other methods provided better accuracy by utilizing annual average estimates, and travel demand models provided an intermediate level of detail through average daily volumes. Currently, higher accuracy can be achieved through microscopic analyses, by splitting the network links into sub-links and utilizing second-by-second trajectories to calculate emissions. The need to accurately quantify transportation-related emissions from vehicles is essential. This paper presents an examination of four different approaches to capturing the environmental impacts of vehicular operations on a 10-mile stretch of Interstate 4 (I-4), an urban limited-access highway in Orlando, Florida. First, at the most basic level, emissions were estimated for the entire 10-mile section 'by hand' using one average traffic volume and average speed. Then, three more advanced levels of detail were studied using VISSIM/MOVES to analyze smaller links: average speeds and volumes (AVG), second-by-second link drive schedules (LDS), and second-by-second operating mode distributions (OPMODE). This paper analyzes how the various approaches affect predicted emissions of CO, NOx, PM2.5, PM10, and CO2. The results demonstrate that obtaining precise and comprehensive operating mode distributions on a second-by-second basis provides more accurate emission estimates. Specifically, emission rates are highly sensitive to stop-and-go traffic and the associated driving cycles of acceleration, deceleration, and idling. Using the AVG or LDS approach may overestimate or underestimate emissions, respectively, compared to an operating mode distribution approach.
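
As a hedged illustration of the operating-mode idea, the sketch below computes vehicle specific power (VSP) from a 1 Hz speed trace using the widely cited generic light-duty coefficients (after Jimenez-Palacios); MOVES' own operating mode bins are more detailed than the toy bins shown.

```python
# Second-by-second VSP and a toy operating-mode binning from a speed trace.
def vsp(v, a, grade=0.0):
    """v in m/s, a in m/s^2; returns VSP in kW/tonne (m^2/s^3)."""
    return v * (1.1 * a + 9.81 * grade + 0.132) + 0.000302 * v ** 3

def op_mode(v, a):
    if v < 0.5:
        return 'idling'
    if a < -0.5:
        return 'deceleration'
    return 'cruise/acceleration, VSP=%.1f' % vsp(v, a)

speeds = [0.0, 3.0, 8.0, 14.0, 14.0, 9.0]   # 1 Hz speed trace [m/s]
for t in range(1, len(speeds)):
    v, a = speeds[t], speeds[t] - speeds[t - 1]
    print(t, op_mode(v, a))
```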

Keywords: limited access highways, MOVES, operating mode distribution (OPMODE), transportation emissions, vehicle specific power (VSP)

Procedia PDF Downloads 339
12144 Selective Circular Dichroism Sensor Based on the Generation of Quantum Dots for Cadmium Ion Detection

Authors: Pradthana Sianglam, Wittaya Ngeontae

Abstract:

A new approach to the fabrication of a cadmium ion (Cd2+) sensor is demonstrated. The detection principle is based on the in-situ generation of cadmium sulfide quantum dots (CdS QDs) in the presence of a chiral thiol-containing compound, with detection by circular dichroism (CD) spectroscopy. Basically, CdS QDs can be generated in the presence of Cd2+, sulfide ions and suitable capping compounds. A strong CD signal can be recorded if the generated QDs possess chiral properties (from the chiral capping molecule). Thus, the degree of CD signal change depends on the number of generated CdS QDs, which can be related to the concentration of Cd2+ (with the other components in excess). In this work, we use a mixture of cysteamine (Cys) and L-penicillamine (LPA) as the capping molecules. A strong CD signal can be observed when the solution contains sodium sulfide, Cys, LPA, and Cd2+. Moreover, the CD signal is linearly related to the concentration of Cd2+. This approach shows excellent selectivity towards the detection of Cd2+ compared to other cations. The proposed CD sensor provides a detection limit of around 70 µM and can be used with real water samples with satisfactory results.

Keywords: circular dichroism sensor, quantum dots, enantiomer, in-situ generation, chemical sensor, heavy metal ion

Procedia PDF Downloads 363
12143 Development of High Strength Filler Consumables by Means of Calculations and Microstructural Characterization

Authors: S. Holly, R. Schnitzer, P. Haslberger, D. Zügner

Abstract:

The development of new filler consumables requires considerable effort in terms of samples and experiments to achieve the required mechanical properties and chemistry. Within the scope of the development of a metal-cored wire with a target tensile strength of 1150 MPa and acceptable impact toughness, thermodynamic and kinetic calculations via MatCalc were used to reduce the experimental work and the resources required. Micro-alloying elements were used to reach the high strength, as an alternative approach to conventional solid solution hardening. In order to understand the influence of different micro-alloying elements in more detail, their effect on the precipitation behavior in the weld metal was evaluated. Investigations of the microstructure were made via atom probe and EBSD to understand the effect of the micro-alloying elements. The calculated results are in accordance with the results obtained by experiments and can be explained by the microstructural investigations. The approach is exemplified for aluminium, illustrating this efficient development route.

Keywords: alloy development, high strength steel, MatCalc, metal-cored wire

Procedia PDF Downloads 237
12142 Market Solvency Capital Requirement Minimization: How Non-linear Solvers Provide Portfolios Complying with Solvency II Regulation

Authors: Abraham Castellanos, Christophe Durville, Sophie Echenim

Abstract:

In this article, a portfolio optimization problem is performed in a Solvency II context: it illustrates how advanced optimization techniques can help to tackle complex operational pain points around the monitoring, control, and stability of the Solvency Capital Requirement (SCR). The market SCR of a portfolio is calculated as a combination of SCR sub-modules. These sub-modules are the results of stress-tests on interest rate, equity, property, credit and FX factors, as well as concentration on counter-parties. The market SCR is non-convex and non-differentiable, which does not make it a natural candidate as an optimization criterion. In the SCR formulation, correlations between sub-modules are fixed, whereas risk-driven portfolio allocation is usually driven by the dynamics of the actual correlations. Implementing a portfolio construction approach that is efficient from both a regulatory and an economic standpoint is not straightforward. Moreover, the challenge for insurance portfolio managers is not only to achieve a minimal SCR to reduce non-invested capital but also to ensure stability of the SCR. Some optimizations have already been performed in the literature, simplifying the standard formula into a quadratic function. But to our knowledge, it is the first time that the standard formula of the market SCR is used in an optimization problem. Two solvers are combined: a bundle algorithm for convex non-differentiable problems, and a BFGS (Broyden-Fletcher-Goldfarb-Shanno)-SQP (Sequential Quadratic Programming) algorithm to cope with non-convex cases. A market SCR minimization is then performed with historical data. This approach results in a significant reduction of the capital requirement compared to a classical Markowitz approach based on historical volatility. A comparative analysis of different optimization models (equi-risk-contribution portfolio, minimum-volatility portfolio and minimum value-at-risk portfolio) is performed, and the impact of these strategies on risk measures, including the market SCR and its sub-modules, is evaluated. A lack of diversification of the market SCR is observed, especially for equities. This was expected, since the market SCR strongly penalizes this type of financial instrument. It was shown that this direct effect of the regulation can be attenuated by implementing constraints in the optimization process or by minimizing the market SCR together with the historical volatility, proving the interest of having a portfolio construction approach that can incorporate such features. The present results are further explained by the market SCR modelling.
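
The aggregation at the core of the problem can be illustrated with the standard-formula combination of sub-module charges, SCR = sqrt(sᵀ C s). The sketch below minimizes this aggregate over portfolio weights under a toy linearisation of the sub-modules; the correlation matrix, charges and solver choice (SLSQP rather than the paper's bundle/BFGS-SQP combination) are illustrative assumptions.

```python
# Toy market-SCR minimization: sub-module charges taken as linear in the
# weights, aggregated with a fixed correlation matrix.
import numpy as np
from scipy.optimize import minimize

CORR = np.array([[1.0, 0.5, 0.25],     # illustrative sub-module correlations
                 [0.5, 1.0, 0.75],
                 [0.25, 0.75, 1.0]])
CHARGE = np.array([0.10, 0.39, 0.15])  # assumed per-asset stress charges

def market_scr(w):
    s = CHARGE * w                      # sub-module SCRs (toy linearisation)
    return float(np.sqrt(s @ CORR @ s))

cons = ({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},)
res = minimize(market_scr, x0=np.full(3, 1 / 3),
               bounds=[(0, 1)] * 3, constraints=cons)
print('weights:', res.x.round(3), ' market SCR:', round(res.fun, 4))
```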

Keywords: financial risk, numerical optimization, portfolio management, solvency capital requirement

Procedia PDF Downloads 117
12141 Fracture Energy Corresponding to the Puncture/Cutting of Nitrile Rubber by Pointed Blades

Authors: Ennouri Triki, Toan Vu-Khanh

Abstract:

Resistance to combined puncture/cutting by pointed blades is an important property of glove materials. The purpose of this study is to propose an approach derived from fracture mechanics theory to calculate the fracture energy associated with the puncture/cutting of nitrile rubber. The proposed approach is also based on pre-straining the sample during the puncture/cutting test in order to remove the contribution of friction. It was validated with two different pointed blade angles, 22.5° and 35°. Results show that the applied total fracture energy corresponding to puncture/cutting is controlled by three energies: the fracture energy, i.e., the intrinsic strength of the material; the friction energy between the pointed blade and the material; and the applied pre-strain (tearing) energy. For an applied pre-strain energy (or tearing energy) of high value, the friction energy is completely removed. Without friction, the total fracture energy is constant, and in that case the fracture contribution of the tearing energy is marginal. Growth of the crack is thus caused entirely by the puncture/cutting action of the pointed blade. Finally, the results suggest that the value of the fracture energy corresponding to puncture/cutting by pointed blades is obtained at a frictional contribution of zero.

Keywords: elastomer, energy, fracture, friction, pointed blades

Procedia PDF Downloads 305
12140 An Attempt to Improve Students' Understanding of Thermal Conductivity Using Thermal Cameras

Authors: Mariana Faria Brito Francisquini

Abstract:

Many thermal phenomena are present and play a substantial role in our daily lives. This presence makes this area of study, at both the high school and university levels, a very widely explored topic in the literature. However, many concepts important to a meaningful understanding of the world are neglected in favor of a traditional approach built on rote algebraic problems. In this work, we intend to show how the introduction of new technologies in the classroom, namely thermal cameras, can work in our favor to foster a clearer understanding of many of these concepts, such as thermal conductivity. The use of thermal cameras in the classroom tends to diminish the everlasting abstractness of thermal phenomena, as they enable us to visualize something that happens right before our eyes, yet cannot be seen. In our study, we provide the same amount of heat to metallic cylindrical rods of the same length but of different materials, in order to study the thermal conductivity of each one. In this sense, the thermal camera allows us to visualize the increase in temperature along each rod in real time, enabling us to infer how heat is being transferred from one part of the rod to another. Therefore, we intend to show how this approach can expose students to more enriching, intellectually prolific scenarios than those provided by traditional approaches.

Keywords: teaching physics, thermal cameras, thermal conductivity, thermal physics

Procedia PDF Downloads 282
12139 Machine Learning Model to Predict TB Bacteria-Resistant Drugs from TB Isolates

Authors: Rosa Tsegaye Aga, Xuan Jiang, Pavel Vazquez Faci, Siqing Liu, Simon Rayner, Endalkachew Alemu, Markos Abebe

Abstract:

Tuberculosis (TB) is a major cause of disease globally. In most cases, TB is treatable and curable, but only with the proper treatment. Drug-resistant TB occurs when the bacteria become resistant to the drugs that are used to treat TB. Current strategies to identify drug-resistant TB bacteria are laboratory-based, and it takes a long time to identify the drug-resistant bacteria and treat the patient accordingly. Machine learning (ML) and data science approaches, however, can offer new solutions to the problem. In this study, we propose to develop an ML-based model that predicts the antibiotic resistance phenotypes of TB isolates in minutes, so that the right treatment can be given to the patient immediately. The study uses the whole genome sequences (WGS) of TB isolates, extracted from the NCBI repository, as training data. Samples from different countries were included in order to represent the large, worldwide population of TB isolates; this exposes the model to the different behaviors of the TB bacteria and makes it robust. The model training considered three pieces of information extracted from the WGS data: all variants found within candidate genes (F1), predetermined resistance-associated variants (F2), and resistance-associated gene information for the particular drug only. Two major datasets were constructed from this information: F1 and F2 were treated as two independent datasets, and the third piece of information was used as the class label for both. Five machine learning algorithms were considered to train the model: Support Vector Machine (SVM), Random Forest (RF), Logistic Regression (LR), Gradient Boosting, and AdaBoost. The models were trained on the datasets F1, F2, and F1F2, i.e., the F1 and F2 datasets merged. Additionally, an ensemble approach was used: the F1 and F2 datasets were each run through the Gradient Boosting algorithm, and the outputs were combined into a single dataset, called the F1F2 ensemble dataset, on which models were trained using the five algorithms. As the experiments show, the model trained on the F1F2 ensemble dataset (built with Gradient Boosting) using Random Forest outperformed the rest of the models. In conclusion, this study suggests the ensemble approach, that is, the RF + Gradient Boosting model, for predicting the antibiotic resistance phenotypes of TB isolates.
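
A hedged sketch of the described stacking step: gradient-boosting models on stand-in F1 and F2 matrices produce out-of-fold probabilities that are merged into an 'F1F2 ensemble' dataset for a final random forest. The synthetic data and exact wiring are assumptions for illustration.

```python
# Stacking sketch: merge out-of-fold gradient-boosting probabilities from two
# feature sets into an ensemble dataset for a final classifier. The matrices
# are synthetic stand-ins for the WGS-derived variant features.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_predict

X_f1, y = make_classification(n_samples=600, n_features=80, random_state=0)
X_f2, _ = make_classification(n_samples=600, n_features=20, random_state=1)

# Out-of-fold predictions avoid leaking labels into the stacked features.
p1 = cross_val_predict(GradientBoostingClassifier(), X_f1, y, cv=5,
                       method='predict_proba')
p2 = cross_val_predict(GradientBoostingClassifier(), X_f2, y, cv=5,
                       method='predict_proba')
X_ens = np.hstack([p1, p2])               # the 'F1F2 ensemble' dataset

final = RandomForestClassifier(random_state=0).fit(X_ens, y)
print('training accuracy of stacked model:', final.score(X_ens, y))
```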

Keywords: machine learning, MTB, WGS, drug resistant TB

Procedia PDF Downloads 52
12138 Long-Term Economic-Ecological Assessment of Optimal Local Heat-Generating Technologies for the German Unrefurbished Residential Building Stock on the Quarter Level

Authors: M. A. Spielmann, L. Schebek

Abstract:

In order to reach the long-term national climate goals of the German government for the building sector, substantial energetic measures have to be executed. Historically, those measures were primarily energy efficiency measures at the buildings' shells. Advanced technologies for the on-site generation of heat (or other types of energy) are often not feasible at the small spatial scale of a single building. Therefore, the present approach uses the spatially larger dimension of a quarter. The main focus of the present paper is the long-term economic-ecological assessment of available decentralized heat-generating technologies (CHP plants and electrical heat pumps) at the quarter level for the German unrefurbished residential buildings. Three distinct terms have to be described methodologically: i) the quarter approach, ii) the economic assessment, iii) the ecological assessment. The quarter approach is used to enable synergies and scaling effects beyond a single building. For the present study, generic quarters that are differentiated according to significant parameters concerning their heat demand are used; the core differentiation of those quarters is the construction time period of the buildings. The economic assessment, as the second crucial element, is executed with the following structure: full costs are quantified for each technology combination and quarter. The investment costs are analyzed on an annual basis and are modeled with the acquisition of debt, assuming annuity loans. Consequently, for each generic quarter, an optimal technology combination for decentralized heat generation is provided in each year of the temporal boundaries (2016-2050). The ecological assessment elaborates a life cycle assessment (LCA) for each technology combination and each quarter. The impact category measured is GWP 100. The technology combinations for heat production can therefore be compared against each other concerning their long-term climatic impacts. Core results of the approach can be differentiated into an economic and an ecological dimension. With an annual resolution, the investment and running costs of different energetic technology combinations are quantified, and for each quarter an optimal technology combination for local heat supply and/or energetic refurbishment of the buildings within the quarter is provided. Consistent with the economic assessment, the climatic impacts of the technology combinations are quantified and compared against each other.
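
The debt-financing element of the cost model can be illustrated with the standard annuity formula; the loan terms below are assumptions, not the study's inputs.

```python
# Standard annuity formula: equal annual debt-service payments on the
# investment share financed by a loan.
def annuity_payment(principal, rate, years):
    if rate == 0:
        return principal / years
    return principal * rate / (1 - (1 + rate) ** -years)

# e.g. a 500 kEUR CHP investment financed over 20 years at 3 % interest:
print(round(annuity_payment(500_000, 0.03, 20), 2), 'EUR per year')
```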

Keywords: building sector, economic-ecological assessment, heat, LCA, quarter level

Procedia PDF Downloads 224