Search results for: dynamic capability approach
15685 Application of Fourier Series Based Learning Control on Mechatronic Systems
Authors: Sandra Baßler, Peter Dünow, Mathias Marquardt
Abstract:
A Fourier series based learning control (FSBLC) algorithm for tracking trajectories of mechanical systems with unknown nonlinearities is presented. Two processes to which the FSBLC with a PD controller is applied are introduced: one is a simplified service robot capable of climbing stairs thanks to special wheels, and the other is a propeller-driven pendulum with nearly the same control requirements. In addition to investigating the learning of the feedforward for the desired trajectories, some considerations on implementing such an algorithm on low-cost microcontroller hardware are made. Simulations of the service robot as well as practical experiments on the pendulum show the capability of the FSBLC algorithm to improve the control behavior of such mechanical systems for repetitive tasks.
Keywords: climbing stairs, FSBLC, ILC, service robot
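The learning law behind an FSBLC can be illustrated with a minimal sketch (assumptions mine: a single-input repetitive process, a truncated Fourier basis, and a simple proportional learning gain; the `plant` below is a hypothetical static gain, not the robot or pendulum from the paper):

```python
import numpy as np

def fourier_ilc(ref, plant, n_harm=5, gain=0.5, iters=20):
    """Fourier-series-based learning control sketch: the feedforward is a
    truncated Fourier series whose coefficients are corrected after every
    repetition by projecting the tracking error onto the same basis."""
    n = len(ref)
    t = np.arange(n) / n                        # one normalised period
    basis = [np.ones(n)]
    for k in range(1, n_harm + 1):
        basis.append(np.cos(2 * np.pi * k * t))
        basis.append(np.sin(2 * np.pi * k * t))
    basis = np.array(basis)
    coeffs = np.zeros(len(basis))
    rms = []
    for _ in range(iters):
        u = coeffs @ basis                      # feedforward for this run
        e = ref - plant(u)                      # tracking error of the run
        rms.append(float(np.sqrt(np.mean(e ** 2))))
        update = 2.0 * (basis @ e) / n          # Fourier projection of e
        update[0] /= 2.0                        # DC term has half weight
        coeffs += gain * update
    return rms

# hypothetical repetitive process: an unknown static gain of 0.8
ref = np.sin(2 * np.pi * np.arange(200) / 200)
rms = fourier_ilc(ref, lambda u: 0.8 * u)
print(rms[0] > rms[-1])   # True -- the error shrinks run after run
```

Because the update only stores a handful of Fourier coefficients rather than a full sampled trajectory, a scheme of this shape is cheap enough for low-cost microcontrollers, which is the implementation concern the abstract raises.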
Procedia PDF Downloads 314
15684 Secure Proxy Signature Based on Factoring and Discrete Logarithm
Authors: H. El-Kamchouchi, Heba Gaber, Fatma Ahmed, Dalia H. El-Kamchouchi
Abstract:
A digital signature is an electronic signature form used by an original signer to sign a specific document. When the original signer is out of the office or travelling, he or she delegates the signing capability to a proxy signer, who then generates signatures on behalf of the original signer. The two parties must be able to authenticate one another and agree on a secret encryption key in order to communicate securely over an unreliable public network. Authenticated key agreement protocols play an important role in building a secure communications channel between the two parties. In this paper, we present a secure proxy signature scheme built on an efficient and secure authenticated key agreement protocol based on the factoring and discrete logarithm problems.
Keywords: discrete logarithm, factoring, proxy signature, key agreement
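To make the delegation idea concrete, here is a toy Schnorr-style proxy delegation over a deliberately tiny discrete-log group. It is purely illustrative and is not the scheme proposed in the paper: the parameters, keys, warrant string, and hash construction are all invented, and the group is far too small for real security.

```python
import hashlib

# Toy parameters -- far too small for real security, illustration only.
q = 1019                 # prime order of the subgroup
p = 2 * q + 1            # 2039, also prime
g = 4                    # generator of the order-q subgroup mod p

def H(*parts):
    """Hash arbitrary parts to an exponent mod q."""
    digest = hashlib.sha256("|".join(map(str, parts)).encode()).digest()
    return int.from_bytes(digest, "big") % q

# --- delegation: original signer -> proxy signer -------------------
x = 123                              # original signer's secret key
y = pow(g, x, p)                     # original signer's public key
warrant = "proxy may sign invoices"  # delegation warrant
k = 77                               # delegation nonce
K = pow(g, k, p)
s = (k + x * H(warrant, K)) % q      # proxy signing key (sent securely)

# the proxy verifies the delegation: g^s == K * y^H(warrant, K) mod p
assert pow(g, s, p) == (K * pow(y, H(warrant, K), p)) % p

# --- the proxy signs on behalf of the original signer --------------
def proxy_sign(msg, s, nonce):
    R = pow(g, nonce, p)
    return R, (nonce + s * H(msg, R)) % q

def proxy_verify(msg, R, z, K, y, warrant):
    P = (K * pow(y, H(warrant, K), p)) % p     # implied proxy public key
    return pow(g, z, p) == (R * pow(P, H(msg, R), p)) % p

R, z = proxy_sign("invoice #42", s, nonce=55)
print(proxy_verify("invoice #42", R, z, K, y, warrant))   # True
# a different message almost surely fails verification
print(proxy_verify("invoice #43", R, z, K, y, warrant))
```

The verifier never needs the proxy key `s`: it recomputes the proxy public key from the original signer's public key, the warrant, and the delegation value `K`, which is what binds the proxy's signatures to the original signer.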
Procedia PDF Downloads 309
15683 Object Oriented Fault Tree Analysis Methodology
Abstract:
Traditional safety, risk and reliability analysis approaches are problem-oriented, which creates a great workload when analyzing complicated, large-scale systems; moreover, much repetitive work is required when the analyzed system is composed of many similar components. An object- and function-oriented approach is urgently needed to maintain consistency with the problem domain. A new approach is proposed to overcome these shortcomings of traditional approaches: the concepts of class, abstraction, inheritance, polymorphism and encapsulation are introduced into fault tree analysis (FTA), and a professional class library is established whose classes are abstractions of physical objects in the real world, with four areas of relevant information proposed as a guide to its construction. Interaction between classes is completed by internal or external methods that map attributes to basic events through a full search of the knowledge base, which yields good encapsulation. An object-oriented fault tree analysis system is set up that analyzes and evaluates system safety and reliability according to the original appearance of the problem, mapping directly from classes and objects to the problem domain of fault tree analysis. All system failure situations can be analyzed through this bottom-up fault tree construction approach. Under this architecture, an FTA approach is developed that avoids the analyst's influence on the analysis results. It reveals the inherent safety problems of the analyzed system itself and provides a new way of thinking about and developing safety analysis, which is conducive to innovation in the application and development of object-oriented technology in the safety field.
Keywords: FTA, knowledge base, object-oriented technology, reliability analysis
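The class-library idea can be sketched in a few lines: gates and basic events share one abstract interface, so a fault tree is assembled by composition and evaluated polymorphically. This is a minimal sketch assuming independent basic events; the pump-system example and its probabilities are invented for illustration.

```python
from abc import ABC, abstractmethod

class Event(ABC):
    """Base class: every fault-tree node exposes a failure probability."""
    @abstractmethod
    def probability(self) -> float: ...

class BasicEvent(Event):
    def __init__(self, p):
        self.p = p
    def probability(self):
        return self.p

class AndGate(Event):
    """Output fails only if ALL inputs fail (independence assumed)."""
    def __init__(self, *inputs):
        self.inputs = inputs
    def probability(self):
        prob = 1.0
        for e in self.inputs:
            prob *= e.probability()
        return prob

class OrGate(Event):
    """Output fails if ANY input fails (independence assumed)."""
    def __init__(self, *inputs):
        self.inputs = inputs
    def probability(self):
        survive = 1.0
        for e in self.inputs:
            survive *= 1.0 - e.probability()
        return 1.0 - survive

# hypothetical system: fails if power fails OR both redundant pumps fail
top = OrGate(BasicEvent(0.01),
             AndGate(BasicEvent(0.1), BasicEvent(0.1)))
print(round(top.probability(), 6))   # 0.0199
```

Inheritance lets new gate types (voting gates, common-cause classes, etc.) be added without touching the evaluation code, which is exactly the encapsulation benefit the abstract argues for.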
Procedia PDF Downloads 248
15682 Cost-Effective, Accuracy Preserving Scalar Characterization for mmWave Transceivers
Authors: Mohammad Salah Abdullatif, Salam Hajjar, Paul Khanna
Abstract:
The development of instrument-grade mmWave transceivers comes with many challenges. A general rule of thumb is that the performance of the instrument must exceed the performance of the unit under test in terms of accuracy and stability. The calibration and characterization of mmWave transceivers are important pillars for testing commercial products. Using a Vector Network Analyzer (VNA) with a mixer option has proven to be a high-performance approach to calibrating mmWave transceivers. However, this approach comes at a high cost. In this work, a reduced-cost method to calibrate mmWave transceivers is proposed. A comparison between the proposed method and the VNA technology is provided, significant challenges are discussed, and an approach to meet the requirements is proposed.
Keywords: mmWave transceiver, scalar characterization, coupler connection, magic tee connection, calibration, VNA, vector network analyzer
Procedia PDF Downloads 107
15681 Worm Gearing Design Improvement by Considering Varying Mesh Stiffness
Authors: A. H. Elkholy, A. H. Falah
Abstract:
A new approach has been developed to estimate the load share and stress distribution of worm gear sets. The approach is based on considering the instantaneous tooth meshing stiffness: the worm gear drive is modelled as a series of spur gear slices, and each slice is analyzed separately using the well-established formulae of spur gears. By combining the results obtained for all slices, the loading and stressing of the entire involute worm gear set is obtained. The geometric modelling method presented allows tooth elastic deformation and tooth root stresses of worm gear drives under different load conditions to be investigated. On the basis of the method introduced in this study, the instantaneous meshing stiffness and load share were obtained. In comparison with existing methods, this approach offers both good analysis accuracy and less computing time.
Keywords: gear, load/stress distribution, worm, wheel, tooth stiffness, contact line
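The slicing idea reduces, at each mesh position, to a parallel-spring argument: slices in simultaneous contact deflect together, so each carries load in proportion to its instantaneous stiffness. A minimal sketch (the stiffness values are invented for illustration):

```python
def load_share(slice_stiffness, total_load):
    """Spur-gear slices in simultaneous contact behave like springs in
    parallel: equal deflection, so load splits in proportion to stiffness."""
    k_total = sum(slice_stiffness)
    return [total_load * k / k_total for k in slice_stiffness]

# three slices along the contact line, stiffest in the middle (N/um)
shares = load_share([120.0, 200.0, 80.0], total_load=1000.0)
print(shares)   # [300.0, 500.0, 200.0]
```

Repeating this at each rotation angle, with slice stiffnesses from the spur-gear formulae, yields the varying load share over a mesh cycle that the paper analyzes.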
Procedia PDF Downloads 345
15680 Educational Equity in Online Art Education: The Reggio Emilia Approach in White Ant Atelier for Persian-Speaking Children
Authors: Mahsa Mohammadhosseini
Abstract:
This study investigates the effectiveness of adapting the Reggio Emilia approach to online art education, specifically through White Ant Atelier (W.A.A), a virtual art initiative for Persian-speaking children. Employing an action research framework, the study examines the implementation of Reggio Emilia principles via the "Home" art project, which spanned four months and included 16 sessions. The analysis covers 50 artworks produced by participants, including 17 pieces created collaboratively by mothers and their children. The results demonstrate that integrating the Reggio Emilia approach into online platforms significantly improves children's creative expression and engagement. This finding illustrates that virtual education, when integrated with child-centered methodologies like Reggio Emilia, can effectively address and reduce educational inequities among Persian-speaking children.
Keywords: Reggio Emilia, online education, art education, educational equity
Procedia PDF Downloads 18
15679 On Direct Matrix Factored Inversion via Broyden's Updates
Authors: Adel Mohsen
Abstract:
A direct method based on the good Broyden's updates for evaluating the inverse of a nonsingular square matrix of full rank, and for solving related systems of linear algebraic equations, is studied. For a matrix A of order n whose LU-decomposition is A = LU, the multiplication count is O(n³). This includes the evaluation of the LU-decompositions of the inverse, the lower triangular decomposition of A, as well as a “reduced matrix inverse”. If an explicit value of the inverse is not needed, the count reduces to O(n³/2) to compute inv(U) and the reduced inverse. For a symmetric matrix, only O(n³/3) operations are required to compute inv(L) and the reduced inverse. An example is presented to demonstrate the capability of using the reduced matrix inverse in treating ill-conditioned systems. Besides the simplicity of Broyden's update, the method provides a means to exploit possible sparsity in the matrix and to derive a suitable preconditioner.
Keywords: Broyden's updates, matrix inverse, inverse factorization, solution of linear algebraic equations, ill-conditioned matrices, preconditioning
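The flavour of the good Broyden update, shown here in its classical inverse (Sherman-Morrison) form rather than the paper's factored variant, can be sketched as follows; the linear test system is an invented example:

```python
import numpy as np

def broyden_solve(F, x0, tol=1e-10, max_iter=50):
    """'Good' Broyden iteration maintaining an approximate *inverse*
    Jacobian H via a rank-one (Sherman-Morrison) update, so no linear
    system is solved inside the loop."""
    x = np.asarray(x0, float)
    H = np.eye(len(x))               # initial inverse-Jacobian guess
    f = F(x)
    for _ in range(max_iter):
        dx = -H @ f
        x_new = x + dx
        f_new = F(x_new)
        if np.linalg.norm(f_new) < tol:
            return x_new
        df = f_new - f
        Hdf = H @ df
        denom = dx @ Hdf
        if abs(denom) < 1e-14:       # guard against breakdown
            break
        # good Broyden inverse update: H += (dx - H df) dx^T H / (dx^T H df)
        H += np.outer(dx - Hdf, dx @ H) / denom
        x, f = x_new, f_new
    return x

# invented linear test system A x = b with solution (2, 3)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([9.0, 8.0])
root = broyden_solve(lambda v: A @ v - b, [0.0, 0.0])
print(np.round(root, 6))   # [2. 3.]
```

On a linear system the good Broyden iteration terminates in finitely many steps, which is one reason rank-one updates are attractive as a basis for the direct inversion scheme the abstract describes.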
Procedia PDF Downloads 479
15678 An Effective Approach to Knowledge Capture in Whole Life Costing in Construction Projects
Authors: Ndibarafinia Young Tobin, Simon Burnett
Abstract:
In spite of the benefits of the whole life costing technique as a valuable approach for comparing alternative building designs, allowing operational cost benefits to be evaluated against any initial cost increases, and as part of procurement in the construction industry, its adoption has been relatively slow. This is due to the lack of tangible evidence and of 'know-how' skills and knowledge of the practice, i.e. the lack of professionals in many establishments with knowledge of, and training in, the use of the whole life costing technique. The situation is compounded by the absence of available whole life costing data from relevant projects, the lack of data collection mechanisms, and so on. This has proved very challenging to those who have shown some willingness to employ the technique in a construction project. The knowledge generated from a project can be considered as best practices learned on how to carry out tasks more efficiently, or as negative lessons learned which have led to losses and slowed down the progress and performance of the project. Knowledge management in whole life costing practice can enhance the execution of whole life costing analysis in a construction project, as lessons learned from one project can be carried over to future projects, resulting in continuous improvement and providing knowledge that can be used in the operation and maintenance phases of an asset's life span. Purpose: The purpose of this paper is to report an effective approach which can be utilised in capturing knowledge in whole life costing practice in a construction project. Design/methodology/approach: An extensive literature review was first conducted on the concepts of knowledge management and whole life costing. This was followed by semi-structured interviews to explore existing and good-practice knowledge management in whole life costing practice in a construction project.
The data gathered from the semi-structured interviews were analyzed using content analysis and used to structure an effective knowledge-capturing approach. Findings: The results obtained in the study show that project review is the common method used for capturing knowledge; it should be undertaken in an organized and accurate manner, and results should be presented in the form of instructions or in a checklist format, forming short and precise insights. The approach developed advises that irrespective of how effective the approach to knowledge capture is, the absence of an environment for sharing knowledge would render it ineffective. An open culture and resources are critical for providing a knowledge-sharing setting, and leadership has to sustain whole life costing knowledge capture, giving full support for its implementation. The knowledge-capturing approach has been evaluated by practitioners who are experts in the area of whole life costing practice. The results have indicated that the approach to knowledge capture is suitable and efficient.
Keywords: whole life costing, knowledge capture, project review, construction industry, knowledge management
Procedia PDF Downloads 260
15677 Multi-Plane Wrist Movement: Pathomechanics and Design of a 3D-Printed Splint
Authors: Sigal Portnoy, Yael Kaufman-Cohen, Yafa Levanon
Abstract:
Introduction: Rehabilitation following wrist fractures often includes exercising flexion-extension movements with a dynamic splint. However, during daily activities, we combine most of our wrist movements with radial and ulnar deviations. Moreover, multi-plane wrist motion, named the 'dart throw motion' (DTM), was found to be a more stable motion in healthy individuals, in terms of the motion of the proximal carpal bones, compared with sagittal wrist motion. The aim of this study was therefore to explore the pathomechanics of the wrist in a common multi-plane movement pattern (DTM) and to design a novel splint for rehabilitation following distal radius fractures. Methods: First, a multi-axis electro-goniometer was used to quantify the plane angle of motion of the dominant and non-dominant wrists during various activities, e.g. drinking from a glass of water and answering a phone, in 43 healthy individuals. The following protocols were then implemented with a population following distal radius fracture. Two dynamic scans, one of sagittal wrist motion and one of DTM, were performed bilaterally in a 3T magnetic resonance imaging (MRI) device. The scaphoid and lunate carpal bones, as well as the surface of the distal radius, were manually segmented in SolidWorks, and the angles of motion of the scaphoid and lunate bones were calculated. Subsequently, a patient-specific splint was designed using 3D scans of the hand. The brace design comprises a proximal attachment to the arm and a distal envelope of the palm. An axle with two wheels is attached to the proximal part. Two wires attach the proximal part to the medial-palmar and lateral-ventral aspects of the distal part: when the wrist extends, the first wire is released and the second wire is strained towards the radius; the opposite occurs when the wrist flexes. The splint was attached to the wrist using Velcro and constrained the wrist movement to the desired calculated multi-plane of motion.
Results: No significant differences were found between the multi-plane angles of the dominant and non-dominant wrists. The most common daily activities occurred at a plane angle of approximately 20° to 45° from the sagittal plane, and the MRI studies showed individual angles of the plane of motion. The printed splint fitted the wrists of the subjects and constrained movement to the desired multi-plane of motion. Hooks were inserted on each part to allow the addition of springs or rubber bands for resistance training towards muscle strengthening in the rehabilitation setting. Conclusions: It has been hypothesized that activation of the wrist in a multi-plane movement pattern following distal radius fractures will accelerate the recovery of the patient. Our results show that this motion can be determined from either the dominant or the non-dominant wrist. The design of the patient-specific dynamic splint is the first step towards assessing whether splinting to induce combined movement is beneficial to the rehabilitation process, compared to conventional treatment. The evaluation of the clinical benefits of this method, compared to conventional rehabilitation methods following wrist fracture, is part of PhD work currently being conducted by an occupational therapist.
Keywords: distal radius fracture, rehabilitation, dynamic magnetic resonance imaging, dart throw motion
Procedia PDF Downloads 299
15676 Influences of Separation of the Boundary Layer on the Reservoir Pressure in the Shock Tube
Authors: Bruno Coelho Lima, Joao F.A. Martos, Paulo G. P. Toro, Israel S. Rego
Abstract:
The shock tube is a ground facility widely used in aerospace and aeronautics science and technology for studies of gas-dynamic and chemical-physical processes in gases at high temperature, explosions, and the dynamic calibration of pressure sensors. A shock tube in its simplest form is comprised of two tubes of equal cross-section separated by a diaphragm. The function of the diaphragm is to separate the two reservoirs, which are at different pressures. The reservoir containing high pressure is called the driver; the low-pressure reservoir is called the driven section. When the diaphragm is ruptured by the pressure difference, a normal, non-stationary shock wave (named the incident shock wave) is formed at the position of the diaphragm and propagates toward the closed end of the driven section. When this shock wave reaches the closed end of the driven section, it is completely reflected. The reflected shock wave then interacts with the boundary layer that was created by the flow induced by the passage of the incident shock wave. The interaction between the boundary layer and the shock wave forces the separation of the boundary layer. The aim of this paper is to analyze the influence of boundary layer separation on the reservoir pressure in the shock tube. A comparison among CFD (Computational Fluid Dynamics), experimental tests and analytical analysis was performed. For the analytical analysis, routines in Python were created; for the numerical simulations (CFD), Ansys Fluent was used; and the experimental tests were conducted in the T1 shock tube located at IEAv (Institute of Advanced Studies).
Keywords: boundary layer separation, moving shock wave, shock tube, transient simulation
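Analytical routines for this kind of study typically start from the classical moving-normal-shock relations. A minimal sketch of the pressure and temperature jumps across the incident shock (these are the standard Rankine-Hugoniot results for a perfect gas, not the paper's actual Python routines):

```python
def incident_shock_jump(Ms, gamma=1.4):
    """Pressure and temperature ratios across a moving normal shock of
    Mach number Ms (standard Rankine-Hugoniot relations, perfect gas)."""
    p_ratio = 1.0 + 2.0 * gamma / (gamma + 1.0) * (Ms ** 2 - 1.0)
    T_ratio = ((2.0 * gamma * Ms ** 2 - (gamma - 1.0))
               * ((gamma - 1.0) * Ms ** 2 + 2.0)) / ((gamma + 1.0) ** 2 * Ms ** 2)
    return p_ratio, T_ratio

# an incident shock at Ms = 2 in air
p21, T21 = incident_shock_jump(2.0)
print(round(p21, 4), round(T21, 4))   # 4.5 1.6875
```

Chaining such jump relations through the incident and reflected shocks gives the ideal reservoir pressure behind the reflected shock; the boundary-layer separation studied in the paper is precisely what makes the measured reservoir pressure deviate from this ideal value.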
Procedia PDF Downloads 315
15675 Model-Based Software Regression Test Suite Reduction
Authors: Shiwei Deng, Yang Bao
Abstract:
In this paper, we present a model-based regression test suite reduction approach that uses EFSM model dependence analysis and a probability-driven greedy algorithm to reduce software regression test suites. The approach automatically identifies the difference between the original model and the modified model as a set of elementary model modifications. EFSM dependence analysis is performed for each elementary modification to reduce the regression test suite, and the probability-driven greedy algorithm is then adopted to select the minimum set of test cases from the reduced regression test suite that covers all interaction patterns. Our initial experience shows that the approach may significantly reduce the size of regression test suites.
Keywords: dependence analysis, EFSM model, greedy algorithm, regression test
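The greedy selection step can be sketched as a weighted set-cover pass over the reduced suite (a minimal sketch: the test names, pattern sets, and the optional probability weights below are invented for illustration, and the real algorithm operates on EFSM interaction patterns):

```python
def greedy_reduce(test_coverage, weights=None):
    """Greedy (set-cover style) selection of a small test subset that
    still covers every interaction pattern.  `weights` stands in for the
    probability values of a probability-driven greedy choice."""
    weights = weights or {}
    remaining = set().union(*test_coverage.values())
    chosen = []
    while remaining:
        # most newly-covered patterns first; weight breaks ties
        best = max(test_coverage,
                   key=lambda t: (len(test_coverage[t] & remaining),
                                  weights.get(t, 0.0)))
        gained = test_coverage[best] & remaining
        if not gained:
            break                    # leftover patterns are uncoverable
        chosen.append(best)
        remaining -= gained
    return chosen

# invented reduced suite: test name -> interaction patterns it covers
suite = {"t1": {"a", "b"}, "t2": {"b", "c", "d"},
         "t3": {"d"}, "t4": {"a", "c"}}
reduced = greedy_reduce(suite)
print(reduced)   # ['t2', 't1'] -- two tests cover all four patterns
```

Greedy set cover does not guarantee the true minimum, but it gives a logarithmic approximation at low cost, which is why it is a common choice for test-suite reduction.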
Procedia PDF Downloads 427
15674 Cryptographic Resource Allocation Algorithm Based on Deep Reinforcement Learning
Authors: Xu Jie
Abstract:
As a key network security method, cryptographic services must fully cope with problems such as the wide variety of cryptographic algorithms, high concurrency requirements, random job crossovers, and instantaneous surges in workloads. Their complexity and dynamics also make it difficult for traditional static security policies to cope with ever-changing cyber threats and environments. Traditional resource scheduling algorithms are inadequate when facing complex decision-making problems in dynamic environments. A network cryptographic resource allocation algorithm based on reinforcement learning is proposed, aiming to optimize task energy consumption, migration cost, and the fitness of differentiated services (including user, data, and task security). By modeling the multi-job collaborative cryptographic service scheduling problem as a multi-objective optimized job flow scheduling problem and using a multi-agent reinforcement learning method, efficient scheduling and optimal configuration of cryptographic service resources are achieved. By introducing reinforcement learning, resource allocation strategies can be adjusted in real time in a dynamic environment, improving resource utilization and achieving load balancing. Experimental results show that this algorithm has significant advantages in path planning length, system delay and network load balancing, and that it effectively solves the problem of complex resource scheduling in cryptographic services.
Keywords: cloud computing, cryptography on-demand service, reinforcement learning, workflow scheduling
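The core loop of a learning-based scheduler can be illustrated with a deliberately tiny tabular sketch. This is not the paper's multi-agent, multi-objective algorithm: the state, the single imbalance-penalty reward, and the job/node numbers below are all invented stand-ins.

```python
import random

def q_scheduler(loads, jobs, episodes=2000, alpha=0.1, gamma=0.9,
                eps=0.2, seed=0):
    """Tabular Q-learning sketch: state = position in the job sequence,
    action = node receiving the job, reward = negative load imbalance
    (a single-objective stand-in for a multi-objective reward)."""
    rng = random.Random(seed)
    n_nodes = len(loads)
    Q = [[0.0] * n_nodes for _ in jobs]
    for _ in range(episodes):
        node_load = list(loads)
        for s, cost in enumerate(jobs):
            if rng.random() < eps:                      # explore
                a = rng.randrange(n_nodes)
            else:                                       # exploit
                a = max(range(n_nodes), key=lambda i: Q[s][i])
            node_load[a] += cost
            reward = -(max(node_load) - min(node_load))
            nxt = max(Q[s + 1]) if s + 1 < len(jobs) else 0.0
            Q[s][a] += alpha * (reward + gamma * nxt - Q[s][a])
    # greedy rollout with the learned policy
    node_load, plan = list(loads), []
    for s, cost in enumerate(jobs):
        a = max(range(n_nodes), key=lambda i: Q[s][i])
        plan.append(a)
        node_load[a] += cost
    return plan, node_load

plan, final = q_scheduler(loads=[0, 0], jobs=[4, 4, 2, 2])
```

The practical point carried over from the abstract is that the reward signal, rather than a fixed policy, encodes the scheduling objectives, so the allocation strategy adapts as the observed workload changes.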
Procedia PDF Downloads 15
15673 Household Wealth and Portfolio Choice When Tail Events Are Salient
Authors: Carlson Murray, Ali Lazrak
Abstract:
Robust experimental evidence of systematic violations of expected utility (EU) establishes that individuals facing risk overweight utility from low-probability gains and losses when making choices. These findings motivated the development of models of preferences with probability weighting functions, such as rank-dependent utility (RDU). We solve for the optimal investing strategy of an RDU investor in a dynamic binomial setting, from which we derive implications for investing behavior. We show that, relative to EU investors with constant relative risk aversion, commonly measured probability weighting functions produce optimal RDU terminal wealth with significant downside protection and upside exposure. We additionally find that, in contrast to EU investors, RDU investors optimally choose a portfolio that contains fair bets providing payoffs that can be interpreted as lottery outcomes or exposure to idiosyncratic returns. In a calibrated version of the model, we calculate that RDU investors would be willing to pay 5% of their initial wealth for the freedom to trade away from an optimal EU wealth allocation. The dynamic trading strategy that supports the optimal wealth allocation implies portfolio weights that are independent of initial wealth but requires a higher risky share after good stock return histories. Optimal trading also implies the possibility of non-participation when historical returns are poor. Our model fills a gap in the literature by providing new quantitative and qualitative predictions that can be tested experimentally or using data on household wealth and portfolio choice.
Keywords: behavioral finance, probability weighting, portfolio choice
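The overweighting mechanism can be made concrete with one commonly measured weighting function. A minimal sketch, assuming the Tversky-Kahneman form with curvature 0.65 and a square-root utility (both parameter choices are illustrative, not the paper's calibration):

```python
def w(p, gamma=0.65):
    """Tversky-Kahneman inverse-S probability weighting function."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def rdu_binary(x_low, x_high, p_high, u=lambda x: x ** 0.5, gamma=0.65):
    """Rank-dependent utility of a two-outcome gamble: the better
    outcome receives decision weight w(p_high) rather than p_high."""
    return u(x_low) + w(p_high, gamma) * (u(x_high) - u(x_low))

# a 1% chance at 10000: the small probability is overweighted,
# so the RDU value exceeds the expected utility 0.01 * sqrt(10000)
print(round(w(0.01), 3))                                 # ~0.047
print(rdu_binary(0, 10000, 0.01) > 0.01 * 10000 ** 0.5)  # True
```

The same overweighting of low-probability gains is what makes lottery-like fair bets attractive to the RDU investor in the paper's dynamic setting.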
Procedia PDF Downloads 420
15672 Energy Efficient Massive Data Dissemination Through Vehicle Mobility in Smart Cities
Authors: Salman Naseer
Abstract:
One of the main challenges of operating a smart city (SC) is collecting the massive data generated from multiple data sources (DS) and transmitting them to the control units (CU) for further processing and analysis. These ever-increasing data demands not only require more and more capacity in the transmission channels but also result in resource over-provisioning to meet resilience requirements, and thus in unavoidable waste because of the data fluctuations throughout the day. In addition, the high energy consumption (EC) and carbon discharges from these data transmissions pose serious issues for the environment we live in. Therefore, to overcome the issues of intensive EC and carbon emissions (CE) of massive data dissemination in smart cities, we propose an energy-efficient, carbon-reducing approach that utilizes the daily mobility of existing vehicles as an alternative communication channel to accommodate data dissemination in smart cities. To illustrate the effectiveness and efficiency of our approach, we take Auckland City in New Zealand as an example, assuming massive data generated by various sources geographically scattered throughout the Auckland region must reach control centres located in the city centre. The numerical results show that our proposed approach can provide up to 5 times lower delay when transferring large volumes of data by utilizing existing daily vehicle mobility than the conventional transmission network. Moreover, our proposed approach incurs about 30% less EC and CE than the conventional network transmission approach.
Keywords: smart city, delay tolerant network, infrastructure offloading, opportunistic network, vehicular mobility, energy consumption, carbon emission
Procedia PDF Downloads 142
15671 Neighborhood-Scape as a Methodology for Enhancing Gulf Region Cities' Quality of Life: Case of Doha, Qatar
Authors: Eman AbdelSabour
Abstract:
Sustainability is increasingly considered a critical aspect in shaping the urban environment and serves as a basis for innovative development in global urban growth. Currently, different models and structures impact the means of interpreting the criteria that would be included in defining a sustainable city. There is a collective need to improve the growth path towards a more durable one by presenting different suggestions regarding multi-scale initiatives. The global rise in urbanization has led to increased demand and pressure for better urban planning choices and scenarios for better sustainable urban alternatives. The trend towards increasingly sustainable urban development (SUD) prompted the need for an assessment tool at the urban scale. The neighborhood scale is being addressed by a growing research community, since it appears to be a pertinent scale at which economic, environmental, and social impacts can be addressed. Although neighborhood design is a comparatively old practice, it was only in the initial years of the 21st century that environmentalists and planners started developing sustainability assessments at the neighborhood level. Through these, urban reality can be considered at a larger scale, whereby themes beyond the size of a single building can be addressed, while still staying small enough for concrete measures to be analyzed. Neighborhood assessment tools have a crucial role in helping neighborhood sustainability approaches perform and fulfill their objectives through a set of themes and criteria. These tools are also known as neighborhood assessment tools, district assessment tools, and sustainable community rating tools. The primary focus of research has been on sustainability from the economic and environmental aspects, whereas socio-cultural issues are rarely addressed. Therefore, this research is based on Doha, Qatar, and the current urban conditions of its neighborhoods are discussed in this study.
The research problem focuses on spatial features in relation to socio-cultural aspects. The study is outlined in three parts. The first section comprises a review of the latest use of well-being assessment methods to enhance the decision process for retrofitting the physical features of the neighborhood. The second section discusses urban settlement development, regulations and the decision-making process; it includes an analysis of urban development policy with reference to neighborhood development, as well as a historical review of the urban growth of neighborhoods as atoms of the city system in Doha. The last part involves developing quantified indicators of subjective well-being through a participatory approach; additionally, GIS will be applied as a tool for visualizing the apparent quality of life (QoL) that needs to be developed in the neighborhood area, as an assessment approach. Envisaging the present QoL situation in Doha's neighborhoods is a process for improving current conditions; neighborhood functioning involves many day-to-day activities of the residents, due to which these areas are considered dynamic.
Keywords: neighborhood, subjective wellbeing, decision support tools, Doha, retrofitting
Procedia PDF Downloads 138
15670 Damage Assessment Based on Full-Polarimetric Decompositions in the 2017 Colombia Landslide
Authors: Hyeongju Jeon, Yonghyun Kim, Yongil Kim
Abstract:
Synthetic Aperture Radar (SAR) is an effective tool for assessing damage induced by disasters, thanks to its all-weather, night-and-day acquisition capability. In this paper, the 2017 Colombia landslide was observed using full-polarimetric ALOS/PALSAR-2 data. Polarimetric decompositions, including the Freeman-Durden decomposition and the Cloude decomposition, are utilized to analyze the changes in scattering mechanisms before and after the landslide. These analyses are used to detect the areas damaged by the landslide. Experimental results validate the efficiency of full-polarimetric SAR data, since the damaged areas can be well discriminated. Thus, we conclude that the proposed method using full-polarimetric data has great potential for damage assessment of landslides.
Keywords: Synthetic Aperture Radar (SAR), polarimetric decomposition, damage assessment, landslide
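One building block of the Cloude (eigenvalue-based) decomposition is the polarimetric entropy computed from a pixel's coherency matrix; comparing it before and after an event flags scattering-mechanism changes. A minimal sketch (the two coherency matrices below are synthetic illustrations, not PALSAR-2 data):

```python
import numpy as np

def cloude_entropy(T):
    """Polarimetric entropy H from the eigenvalues of a Hermitian
    coherency matrix T (Cloude-Pottier eigen-decomposition)."""
    lam = np.clip(np.linalg.eigvalsh(T).real, 0.0, None)
    P = lam / lam.sum()                    # pseudo-probabilities
    P = P[P > 0]
    return float(-(P * np.log(P)).sum() / np.log(3))

# one dominant mechanism (e.g. surface scattering) -> low entropy;
# three equal mechanisms (random volume scattering) -> H = 1
H_single = cloude_entropy(np.diag([1.0, 0.01, 0.01]))
H_random = cloude_entropy(np.eye(3))
print(round(H_single, 2), round(H_random, 2))   # 0.1 1.0
```

A landslide that strips vegetation tends to move pixels between such regimes, which is why entropy (together with the alpha angle and the Freeman-Durden power fractions) is a useful change indicator.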
Procedia PDF Downloads 390
15669 The Design and Construction of the PV-Wind Autonomous System for Greenhouse Plantations in Central Thailand
Authors: Napat Watjanatepin, Wikorn Wong-Satiean
Abstract:
The objective of this research is to design and construct a PV-wind hybrid autonomous system for a greenhouse plantation and to analyze the technical performance of the PV-wind energy system. The design depends on the water consumption in the greenhouse, which uses 24 fogging-mist nozzles, each with a capacity of 24 liters/min. The system operates 4 times per day, with each round lasting 15 min. The fogging system is driven by a water pump with an AC motor rated at 0.5 hp. The load energy consumed is around 1.125 kWh/d. The design results show that sufficient energy can be generated by the PV-wind hybrid energy system. The results of this study can be applied as a technical data reference for other areas in the central part of Thailand.
Keywords: PV-Wind hybrid autonomous system, greenhouse plantation, fogging system, central part of Thailand
Procedia PDF Downloads 314
15668 Integrated Lateral Flow Electrochemical Strip for Leptospirosis Diagnosis
Authors: Wanwisa Deenin, Abdulhadee Yakoh, Chahya Kreangkaiwal, Orawon Chailapakul, Kanitha Patarakul, Sudkate Chaiyo
Abstract:
LipL32 is an outer membrane protein present only on pathogenic Leptospira species, the causative agent of leptospirosis. Leptospirosis symptoms are often misdiagnosed as other febrile illnesses because the clinical manifestations are non-specific. Therefore, an accurate diagnostic tool for leptospirosis is critical for proper and prompt treatment. Diagnosis is typically performed via serological assays that assess the antibodies produced against Leptospira. However, the delayed antibody response and complicated procedure undoubtedly limit their practical utilization, especially in primary care settings. Here, we demonstrate for the first time early-stage detection of LipL32 by an integrated lateral-flow immunoassay with electrochemical readout (eLFIA). A ferrocene trace tag was monitored via differential pulse voltammetry operated on a smartphone-based device, thus allowing on-field testing. Superior performance was established in terms of the lowest detectable limit of detection (LOD) of 8.53 pg/mL and a broad linear dynamic range (5 orders of magnitude) among the sensors available thus far. Additionally, the developed test strip provided a straightforward yet sensitive approach for the diagnosis of leptospirosis using human sera collected from patients, for which the results were comparable to the real-time polymerase chain reaction technique.
Keywords: leptospirosis, electrochemical detection, lateral flow immunosensor, point-of-care testing, early-stage detection
Procedia PDF Downloads 93
15667 Some New Bounds for a Real Power of the Normalized Laplacian Eigenvalues
Authors: Ayşe Dilek Maden
Abstract:
For a given simple connected graph, we present some new bounds, via a new approach, for a special topological index given by the sum of real-number powers of the non-zero normalized Laplacian eigenvalues. This approach offers the advantage not only of deriving old and new bounds on this topic but also of giving an idea of how some previous results in similar areas can be developed.
Keywords: degree Kirchhoff index, normalized Laplacian eigenvalue, spanning tree, simple connected graph
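The index in question can be computed directly for small graphs, which is handy for checking candidate bounds numerically. A minimal sketch (the path graph P3 is chosen as an example because its normalized Laplacian spectrum, 0, 1 and 2, is known exactly):

```python
import numpy as np

def normalized_laplacian_power_sum(A, alpha):
    """s_alpha(G): sum of the alpha-th powers of the non-zero eigenvalues
    of the normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    Dm = np.diag(1.0 / np.sqrt(d))
    L = np.eye(len(A)) - Dm @ A @ Dm
    lam = np.linalg.eigvalsh(L)
    lam = lam[lam > 1e-9]            # discard the zero eigenvalue
    return float((lam ** alpha).sum())

# path graph P3: normalized Laplacian eigenvalues are 0, 1 and 2
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)
print(round(normalized_laplacian_power_sum(A, 1.0), 6))   # 3.0
print(round(normalized_laplacian_power_sum(A, -1.0), 6))  # 1.5
```

The case alpha = -1 connects to the keywords above: multiplying by twice the edge count gives the degree Kirchhoff index (here 2m * s = 4 * 1.5 = 6 for P3).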
Procedia PDF Downloads 366
15666 Using Multiphysics Simulations and Resistive Pulse Sensing to Study the Effect of Metal and Non-Metal Nanoparticles at Different Salt Concentrations
Authors: Chun-Lin Chiang, Che-Yen Lee, Yu-Shan Yeh, Jiunn-Haur Shaw
Abstract:
Wafer fabrication is a critical part of the semiconductor process, as the finest linewidth continues to decline with improving technology and structures develop from 2D towards 3D. The nanoparticles contained in the slurry, or in the ultrapure water used for cleaning, have a large influence on the manufacturing process. Therefore, the semiconductor industry hopes to find a viable method for on-line detection of nanoparticle size and concentration. Resistive pulse sensing technology is one of the methods that may answer this question. Since the properties of nanoparticle-scale materials differ significantly from their properties at larger length scales, we want to clarify the translocation dynamics of metal and non-metal nanoparticles when using resistive pulse sensing technology. In this study, we use the finite element method with three governing equations to perform multiphysics coupling simulations: the Navier-Stokes equation describes the laminar motion, the Nernst-Planck equation describes the ion transport, and the Poisson equation describes the potential distribution in the flow channel. We explore the ion current changes caused by metal and non-metal nanoparticles in electrolytes of different concentrations as they pass through the nanochannel. The reliability of the simulation results was then verified by resistive pulse sensing tests. The existing results show that the lower the ion concentration, the greater the effect of the nanoparticles on the ion concentration in the nanochannel. The conductive spikes are correlated with the nanoparticle surface charge. We conclude that in the resistive pulse sensing technique, the ion concentration in the nanochannel and the nanoparticle properties are important for the translocation dynamics, and that they interact.
Keywords: multiphysics simulations, resistive pulse sensing, nanoparticles, nanochannel
Procedia PDF Downloads 349
15665 MONDO Neutron Tracker Characterisation by Means of Proton Therapeutical Beams and Monte Carlo Simulation Studies
Authors: G. Traini, V. Giacometti, R. Mirabelli, V. Patera, D. Pinci, A. Sarti, A. Sciubba, M. Marafini
Abstract:
The MONDO (MOnitor for Neutron Dose in hadrOntherapy) project aims at a precise characterisation of the secondary fast and ultrafast neutrons produced in particle therapy treatments. The detector is composed of a matrix of scintillating fibres (250 um) read out by CMOS digital SPAD-based sensors. Recoil protons from n-p elastic scattering are detected and used to track neutrons. A prototype was tested with proton beams at the Trento Proton Therapy Centre: efficiency, light yield, and track-reconstruction capability were studied. The results of a Monte Carlo FLUKA simulation used to evaluate the double-scattering efficiency and the expected backgrounds will be presented.
Keywords: secondary neutrons, particle therapy, tracking, elastic scattering
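The n-p elastic scattering that the tracker exploits has simple non-relativistic kinematics: the recoil proton carries E_p = E_n cos^2(theta), so a measured proton energy and recoil angle determine the incoming neutron energy. A minimal sketch of that relation, with illustrative numbers:

```python
# Sketch: non-relativistic n-p elastic scattering kinematics. The recoil
# proton energy satisfies E_p = E_n * cos^2(theta), so the neutron energy
# follows from the measured proton energy and recoil angle.
import math

def neutron_energy(e_proton_mev, theta_rad):
    """Incoming neutron kinetic energy from recoil-proton energy and angle."""
    c = math.cos(theta_rad)
    if abs(c) < 1e-9:
        raise ValueError("recoil angle too close to 90 degrees")
    return e_proton_mev / (c * c)

# A 40 MeV proton recoiling at 30 degrees implies a ~53.3 MeV neutron.
print(neutron_energy(40.0, math.radians(30.0)))
```

At ultrafast neutron energies a relativistic correction would be needed; this sketch only shows the leading-order relation.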
Procedia PDF Downloads 266
15664 Using Short Learning Programmes to Develop Students’ Digital Literacies in Art and Design Education
Authors: B.J. Khoza, B. Kembo
Abstract:
Global socioeconomic developments and the ever-growing technological advancement of the art and design industry indicate the pivotal importance of lifelong learning. There exists a discrepancy between competencies, personal ambition, and workplace requirements. Few, if any, institutions of higher learning in South Africa offer Short Learning Programmes (SLPs) in art and design education. Traditionally, art and design education is delivered face to face via a hands-on approach; the enduring perception among educators is therefore that art and design education does not lend itself to online delivery. SLPs are a concentrated approach to generating revenue and attracting prospective students to further study, and they are often of considerable value to both students and employers. Higher education institutions use SLPs to generate income in support of their core academic programmes. However, there is a gap in translating art and design studio pedagogy into SLPs that provide quality education, are adaptable, and are delivered via a blended mode. In this paper, we propose a conceptual framework drawing on secondary research to analyse existing research on SLPs for art and design education. We aim to add a new dimension to the process of using a design-based research approach for short learning programmes in art and design education. The study draws on a conceptual framework and a qualitative analysis through the lens of Herrington, McKenney, Reeves and Oliver's (2005) principles of the design-based research approach. The results of this study indicate that design-based research is not only an effective methodological approach for developing and deploying an art and design education curriculum for first-year students in a higher education context, but also has the potential to guide future research.
The findings of this study suggest that the design-based research approach could bring theory and praxis together around a common purpose: designing context-based solutions to educational problems.
Keywords: design education, design-based research, digital literacies, multi-literacies, short learning programme
Procedia PDF Downloads 164
15663 An Improved Robust Algorithm Based on Cubature Kalman Filter for Single-Frequency Global Navigation Satellite System/Inertial Navigation Tightly Coupled System
Authors: Hao Wang, Shuguo Pan
Abstract:
The Global Navigation Satellite System (GNSS) signal received by a dynamic vehicle in a harsh environment is frequently interfered with and blocked, which generates gross errors that degrade the positioning accuracy of GNSS/Inertial Navigation System (INS) integrated navigation. This paper therefore puts forward an improved robust Cubature Kalman Filter (CKF) algorithm for ambiguity resolution in a single-frequency GNSS/INS tightly coupled system. Firstly, the dynamic model and measurement model of the single-frequency GNSS/INS tightly coupled system are established, and a method for INS-aided GNSS integer ambiguity resolution is studied. Then, the influence of pseudo-range observations with gross errors on GNSS/INS integrated positioning accuracy is analysed. To reduce the influence of outliers, the CKF algorithm is improved to select robust strategies intelligently by detecting ill-conditioned matrices. Finally, a field navigation test based on the double-differenced solution mode demonstrates the effectiveness of the proposed algorithm. The experiment proves that the improved robust algorithm can greatly weaken the influence of separate, continuous, and hybrid observation anomalies, enhancing the reliability and accuracy of GNSS/INS tightly coupled navigation solutions.
Keywords: GNSS/INS integrated navigation, ambiguity resolution, Cubature Kalman filter, robust algorithm
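The CKF at the heart of this scheme propagates the state through a fixed set of 2n third-degree spherical-radial cubature points. A minimal sketch of point generation (the robust re-weighting and the GNSS/INS models themselves are outside its scope; a real filter would use a full linear-algebra stack):

```python
# Sketch: the 2n cubature points of a standard third-degree CKF, each with
# weight 1/(2n): x_i = x + sqrt(n) * (column i of chol(P)), for both signs.
import math

def cholesky(P):
    """Lower-triangular Cholesky factor of a small SPD matrix (list of lists)."""
    n = len(P)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(P[i][i] - s)
            else:
                L[i][j] = (P[i][j] - s) / L[j][j]
    return L

def cubature_points(x, P):
    """Return the 2n cubature points for mean x and covariance P."""
    n = len(x)
    L = cholesky(P)
    scale = math.sqrt(n)
    pts = []
    for sign in (+1.0, -1.0):
        for j in range(n):
            pts.append([x[i] + sign * scale * L[i][j] for i in range(n)])
    return pts

pts = cubature_points([1.0, 2.0], [[4.0, 0.0], [0.0, 9.0]])
print(pts)  # 4 points, symmetric about the mean
```

Averaging the points recovers the mean, and averaging their squared deviations recovers the covariance, which is exactly the property the CKF update relies on.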
Procedia PDF Downloads 100
15662 Park’s Vector Approach to Detect an Inter Turn Stator Fault in a Doubly Fed Induction Machine by a Neural Network
Authors: Amel Ourici
Abstract:
An electrical machine failure that is not identified at an early stage may become catastrophic, and the machine may suffer severe damage. Undetected machine faults can thus cascade into complete failure, which in turn may cause production shutdowns. Such shutdowns are costly in terms of lost production time, maintenance costs, and wasted raw materials. Doubly fed induction generators are used mainly for wind energy conversion in MW power plants. This paper presents the detection of an inter-turn stator fault in a doubly fed induction machine whose stator and rotor are supplied by two pulse width modulation (PWM) inverters. The method used to detect this fault is based on Park’s vector approach, using a neural network.
Keywords: doubly fed induction machine, PWM inverter, inter-turn stator fault, Park’s vector approach, neural network
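The Park's vector approach maps the three-phase stator currents into a two-dimensional (id, iq) locus: a healthy, balanced machine traces a circle, while an inter-turn fault distorts it into an ellipse whose features can be fed to a neural network. A minimal sketch of the transform on synthetic balanced currents (the values are invented, not measurements):

```python
# Sketch: Park's vector components computed from three-phase currents.
# For balanced sinusoidal currents, the (id, iq) locus is a circle of
# constant radius sqrt(3/2) * amplitude; a fault distorts that circle.
import math

def parks_vector(ia, ib, ic):
    """Park (Concordia) transform of instantaneous three-phase currents."""
    i_d = math.sqrt(2.0 / 3.0) * ia - ib / math.sqrt(6.0) - ic / math.sqrt(6.0)
    i_q = (ib - ic) / math.sqrt(2.0)
    return i_d, i_q

amp = 10.0
radii = []
for k in range(12):  # sample one electrical period
    wt = 2.0 * math.pi * k / 12.0
    ia = amp * math.cos(wt)
    ib = amp * math.cos(wt - 2.0 * math.pi / 3.0)
    ic = amp * math.cos(wt + 2.0 * math.pi / 3.0)
    i_d, i_q = parks_vector(ia, ib, ic)
    radii.append(math.hypot(i_d, i_q))
print(min(radii), max(radii))  # essentially equal for a healthy machine
```

In a fault-detection pipeline, descriptors of the locus (eccentricity, orientation) would form the input vector of the neural network classifier.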
Procedia PDF Downloads 608
15661 Defect-Based Urgency Index for Bridge Maintenance Ranking and Prioritization
Authors: Saleh Abu Dabous, Khaled Hamad, Rami Al-Ruzouq
Abstract:
Bridge condition assessment and rating provide essential information for bridge management. This paper reviews bridge inspection and condition rating practices and introduces a defect-based urgency index. The index is estimated at the element level based on the extent and severity of the defects typical of the bridge element. The urgency index approach has the following advantages: (1) it facilitates judgment submission, i.e. instead of rating the bridge element with a specific overall linguistic expression (which can be subjective and applied differently by different people), the approach is based on assessing the defects; (2) it captures multiple defects that can be present within a deteriorated element; and (3) it reflects how critical the element is by quantifying critical defects and their severity. The approach can be further developed and validated, and is expected to be useful in practice as an early-warning system for critical bridge elements.
Keywords: condition rating, deterioration, inspection, maintenance
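One plausible way to aggregate defect extent and severity into an element-level index is sketched below. The weighting scheme, scales, and blending of worst-defect and average terms are hypothetical illustrations; the paper's actual formulation may differ.

```python
# Sketch (hypothetical formulation): combine per-defect severity and extent
# into a single 0-1 urgency index for one bridge element. Critical defects
# dominate through the max term; the average term captures accumulation.

def urgency_index(defects):
    """defects: list of (severity 1-4, extent 0-1) pairs for one element."""
    if not defects:
        return 0.0
    scores = [(sev / 4.0) * ext for sev, ext in defects]
    # Blend the worst single defect with the average over all defects.
    return 0.7 * max(scores) + 0.3 * sum(scores) / len(scores)

# A deck element with widespread minor cracking and a localized severe spall:
print(urgency_index([(2, 0.6), (4, 0.1)]))
```

Elements would then be ranked by index for maintenance prioritization, with the blend weights tuned against historical inspection judgments.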
Procedia PDF Downloads 452
15660 Specialized Translation Teaching Strategies: A Corpus-Based Approach
Authors: Yingying Ding
Abstract:
This study presents a methodology for specialized translation with the objective of helping teachers improve their strategies for teaching translation. For students to acquire the skills to translate specialized texts, they need to become familiar with the semantic and syntactic features of the source and target texts. The aim of our study is to use a corpus-based approach in the teaching of specialized translation between Chinese and Italian. We propose to construct a specialized Chinese-Italian comparable corpus consisting of 50 economic contracts from the food domain and, with the help of AntConc, to compile the comparable corpus for translation teaching purposes. This paper attempts to provide insight into how teachers could benefit from a comparable corpus in teaching specialized translation from Italian into Chinese, and shows through examples of passive sentences how students could learn to apply different strategies for translating the voice appropriately.
Keywords: contrastive studies, specialised translation, corpus-based approach, teaching
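The kind of frequency comparison a concordancer like AntConc provides can be reproduced programmatically for two comparable sub-corpora. The sketch below counts term frequencies in two tiny invented stand-ins for contract clauses; the sample sentences are illustrations, not material from the study's corpus.

```python
# Sketch: comparing term frequencies across two comparable sub-corpora,
# e.g. to surface the passive marker "shall" typical of contract language.
# The sentences below are invented stand-ins for real contract clauses.
import re
from collections import Counter

def term_frequencies(texts):
    """Lowercased word-frequency table over a list of documents."""
    counts = Counter()
    for text in texts:
        counts.update(re.findall(r"[a-z']+", text.lower()))
    return counts

source_corpus = ["The goods shall be delivered by the seller.",
                 "Payment shall be made within thirty days."]
target_corpus = ["The seller delivers the goods.",
                 "The buyer makes payment within thirty days."]

src = term_frequencies(source_corpus)
tgt = term_frequencies(target_corpus)
print(src["shall"], tgt["shall"])  # passive-marker modal vs. active phrasing
```

Contrasting such counts is one way students can see, in data, where one language prefers the passive and the other an active rendering.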
Procedia PDF Downloads 371
15659 Validating Texture Analysis as a Tool for Determining Bioplastic (Bio)Degradation
Authors: Sally J. Price, Greg F. Walker, Weiyi Liu, Craig R. Bunt
Abstract:
Plastics, due to their long lifespan, are becoming an environmental concern once their useful life is over. There is a vast array of plastic types; they can be found in almost every ecosystem on earth and are of particular concern in terrestrial environments, where they can become incorporated into the food chain. Bioplastics have therefore recently become of interest to manufacturers and the public, as they can (bio)degrade in commercial and home composting situations. However, tools with which to quantify how they degrade in response to environmental variables are still being developed. One such approach is texture analysis: a TA.XT Texture Analyser (Stable Micro Systems) was used to determine the force required to break or punch holes in standard ASTM D638 Type IV 3D-printed bioplastic "dogbones" of varying thickness. The manufacturer's recommendations for calibrating the Texture Analyser are one approach to standardising results; however, an independent technique using dummy dogbones made of a substitute for the bioplastic was used alongside the samples. This approach proved unexpectedly valuable, as irregular results were later traced to a possible machine malfunction before valuable samples collected from the field were lost. This work shows the value of an independent approach to machine calibration for accurate sample analysis with a Texture Analyser when analysing bioplastic samples.
Keywords: bioplastic, degradation, environment, texture analyzer
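An independent calibration check of the kind described above can be reduced to a simple drift test: repeated break-force readings on the dummy dogbones are compared against a reference mean established at the start of the trial. The readings, reference value, and tolerance below are invented examples, not data from the study.

```python
# Sketch: flagging instrument drift from repeated dummy-dogbone readings.
# Reference force, readings, and tolerance are illustrative assumptions.
import statistics

def calibration_ok(readings_n, reference_n, tolerance=0.05):
    """True if the mean dummy-sample break force (N) lies within
    `tolerance` (fractional) of the baseline reference value."""
    mean = statistics.fmean(readings_n)
    return abs(mean - reference_n) / reference_n <= tolerance

reference = 52.0  # N, assumed baseline break force of the dummy material
print(calibration_ok([51.6, 52.3, 52.1], reference))  # stable instrument
print(calibration_ok([46.0, 45.2, 45.9], reference))  # suspicious drift
```

Running such a check before each measurement session would have caught the kind of machine malfunction reported above before field samples were consumed.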
Procedia PDF Downloads 206
15658 High Performance Electrocardiogram Steganography Based on Fast Discrete Cosine Transform
Authors: Liang-Ta Cheng, Ching-Yu Yang
Abstract:
Based on the fast discrete cosine transform (FDCT), the authors present a high-capacity, high-perceived-quality steganography method for the electrocardiogram (ECG) signal. By applying a simple adjusting policy to the 1-dimensional (1-D) DCT coefficients, a large volume of secret message can be effectively embedded in an ECG host signal and successfully extracted at the intended receiver. Simulations confirmed that the resulting perceived quality is good, while the hiding capacity of the proposed method significantly outperforms that of existing techniques. In addition, the proposed method has a certain degree of robustness. Since its computational complexity is low, the method is feasible for real-time applications.
Keywords: data hiding, ECG steganography, fast discrete cosine transform, 1-D DCT bundle, real-time applications
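The general idea of embedding data by adjusting 1-D DCT coefficients can be sketched with a generic parity-quantization scheme: each secret bit forces the parity of a quantized coefficient, and the receiver recovers the bit from that parity. This is an illustrative scheme, not the paper's exact adjusting policy, and the sine-wave host is a stand-in for a real ECG frame.

```python
# Sketch: bit embedding in 1-D DCT coefficients via parity quantization
# (a generic illustration of coefficient-adjustment steganography).
import math

def dct(x):  # orthonormal DCT-II
    N = len(x)
    return [math.sqrt((1 if k == 0 else 2) / N)
            * sum(x[n] * math.cos(math.pi * (2 * n + 1) * k / (2 * N))
                  for n in range(N))
            for k in range(N)]

def idct(X):  # orthonormal DCT-III, the exact inverse of dct above
    N = len(X)
    return [sum(X[k] * math.sqrt((1 if k == 0 else 2) / N)
                * math.cos(math.pi * (2 * n + 1) * k / (2 * N))
                for k in range(N))
            for n in range(N)]

def embed(signal, bits, step=0.05):
    """Force the parity of quantized coefficients 1..len(bits) to the bits."""
    X = dct(signal)
    for i, b in enumerate(bits):
        q = round(X[i + 1] / step)
        if q % 2 != b:
            q += 1
        X[i + 1] = q * step
    return idct(X)

def extract(stego, n_bits, step=0.05):
    X = dct(stego)
    return [round(X[i + 1] / step) % 2 for i in range(n_bits)]

host = [math.sin(0.3 * n) for n in range(16)]  # stand-in for an ECG frame
stego = embed(host, [1, 0, 1, 1])
print(extract(stego, 4))  # [1, 0, 1, 1]
```

Because the transform is orthonormal, the per-sample distortion is bounded by the quantization step, which is what keeps the perceived quality of the host signal high.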
Procedia PDF Downloads 194
15657 Characterization of Bio-Inspired Thermoelastoplastic Composites Filled with Modified Cellulose Fibers
Authors: S. Cichosz, A. Masek
Abstract:
A new hybrid cellulose modification approach, which is undoubtedly a scientific novelty, is introduced. The study reports the properties of cellulose (Arbocel UFC100, Ultra Fine Cellulose) and characterizes cellulose-filled polymer composites based on an ethylene-norbornene copolymer (TOPAS Elastomer E-140). A two-stage physicochemical cellulose treatment is introduced: solvent exchange (to ethanol or hexane) followed by chemical modification with maleic anhydride (MA). Furthermore, the impact of the drying process on cellulose properties was investigated. Suitable measurements were carried out to characterize the cellulose fibers: spectroscopic investigation (Fourier transform infrared (FT-IR) and near-infrared (NIR) spectroscopy), thermal analysis (differential scanning calorimetry and thermogravimetric analysis), and Karl Fischer titration. It should be emphasized that all UFC100 treatments carried out produced a decrease in moisture content. FT-IR reveals a drop in the intensity of the absorption band at 3334 cm-1, a peak associated with both -OH moieties and water. Similar results were obtained with Karl Fischer titration. Based on these results, it may be claimed that the use of ethanol contributes greatly to lowering the water absorption ability of cellulose (a decrease in moisture content to approximately 1.65%). Additionally, regarding polymer composite properties, crucial data were obtained from the mechanical and thermal analyses. The highest material performance was noted for the composite sample containing cellulose modified with MA after a solvent exchange with ethanol. This specimen exhibited sufficient tensile strength, almost the same as that of the neat polymer matrix, in the region of 40 MPa. Moreover, both the Payne effect and the filler efficiency factor, calculated from dynamic mechanical analysis (DMA), reveal the possibly reinforcing nature of the filler.
Interestingly, according to the Payne effect results, fibers dried before the chemical modification are assumed to allow more regular filler structure development in the polymer matrix (Payne effect maximum at 1.60 MPa) compared with those not dried (Payne effect in the range 0.84-1.26 MPa). Furthermore, the DSC and TGA data show higher thermal stability for the materials filled with fibers dried before treatment (degradation activation energy in the region of 195 kJ/mol) in comparison with the polymer composite samples filled with unmodified cellulose (degradation activation energy of approximately 180 kJ/mol). To the authors' best knowledge, this work introduces a novel hybrid filler treatment approach and provides valuable data on the properties of composites filled with cellulose fibers of various moisture contents. It should be emphasized that the plant fiber-based polymer biomaterials described in this research might contribute significantly to minimizing polymer waste because they are more readily degraded.
Keywords: cellulose fibers, solvent exchange, moisture content, ethylene-norbornene copolymer
Procedia PDF Downloads 115
15656 Gaussian Mixture Model Based Identification of Arterial Wall Movement for Computation of Distension Waveform
Authors: Ravindra B. Patil, P. Krishnamoorthy, Shriram Sethuraman
Abstract:
This work proposes a novel Gaussian Mixture Model (GMM) based approach for accurate tracking of the arterial wall and subsequent computation of the distension waveform from the radio frequency (RF) ultrasound signal. The approach was evaluated on ultrasound RF data acquired with a prototype ultrasound system from an artery-mimicking flow phantom. The effectiveness of the proposed algorithm is demonstrated by comparison with existing wall-tracking algorithms. The experimental results show that the proposed method provides a 20% reduction in the error margin in tracking the arterial wall movement compared to the existing approaches. This approach, coupled with an ultrasound system, can be used to estimate the arterial compliance parameters required for screening of cardiovascular disorders.
Keywords: distension waveform, Gaussian Mixture Model, RF ultrasound, arterial wall movement
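The core of a GMM-based tracker is fitting a mixture to features along the scan line so that, for instance, the near and far wall echoes form separate components. A minimal sketch of a two-component 1-D mixture fitted by expectation-maximization, on synthetic samples standing in for real RF-derived features (the data and initialisation are illustrative, not the paper's pipeline):

```python
# Sketch: a minimal 1-D, two-component Gaussian mixture fitted by EM,
# separating two clusters of positions such as near- and far-wall echoes.
# Synthetic data; deterministic initialisation at the sample extremes.
import math
import random

def em_gmm2(data, iters=50):
    """Fit a two-component 1-D Gaussian mixture; return the sorted means."""
    mu = [min(data), max(data)]
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each sample.
        resp = []
        for x in data:
            p = [w[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, and variances.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(1e-6, sum(r[k] * (x - mu[k]) ** 2
                                   for r, x in zip(resp, data)) / nk)
    return sorted(mu)

random.seed(0)
data = ([random.gauss(0.0, 0.3) for _ in range(200)]    # near-wall cluster
        + [random.gauss(5.0, 0.3) for _ in range(200)])  # far-wall cluster
m1, m2 = em_gmm2(data)
print(round(m1, 1), round(m2, 1))
```

Tracking the fitted component means frame by frame yields the wall trajectories from which a distension waveform (far-wall minus near-wall displacement) can be computed.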
Procedia PDF Downloads 507