Search results for: execution time
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18091

17371 High-Throughput Automated Microfluidic Sample Preparation for Accurate Microbial Genomics

Authors: Pouya Karimi, Ramin Gasemi Shayan, Parsa Sheykhzade

Abstract:

Low-cost shotgun DNA sequencing is transforming the microbial sciences. Sequencing instruments are now so effective that sample preparation has become the key limiting factor. Here, we present a microfluidic sample preparation platform that integrates the key steps of cells-to-sequencing-library sample preparation for up to 96 samples and reduces DNA input requirements 100-fold while maintaining or improving data quality. The general-purpose microarchitecture we demonstrate supports workflows with arbitrary numbers of reaction and clean-up or capture steps. By reducing sample quantity requirements, we enabled low-input (∼10,000 cells) whole-genome shotgun (WGS) sequencing of Mycobacterium tuberculosis and soil micro-colonies with superior results. We also used the enhanced throughput to sequence ∼400 clinical Pseudomonas aeruginosa libraries and demonstrate excellent single-nucleotide polymorphism detection performance that explained phenotypically observed antibiotic resistance. Fully integrated lab-on-chip sample preparation overcomes technical barriers to enable broader deployment of genomics across many basic research and translational applications.

Keywords: clinical microbiology, DNA, microbiology, microbial genomics

Procedia PDF Downloads 109
17370 Analysis of Spatiotemporal Efficiency and Fairness of Railway Passenger Transport Network Based on Space Syntax: Taking Yangtze River Delta as an Example

Authors: Lin Dong, Fei Shi

Abstract:

Based on the railway network and the principles of space syntax, the study attempts to reconstruct the spatial relationships of passenger network connections from both space and time perspectives. Using travel time data for the main stations in the Yangtze River Delta urban agglomeration obtained from the Internet, topological diagrams of the railway network under different time sections are constructed. With a comprehensive index composed of connection and integration, the accessibility and network operation efficiency of the railway network in different time periods are calculated, while the fairness of the network is analyzed with fairness indicators constructed from integration and location entropy, from the perspectives of horizontal and vertical fairness respectively. From the analysis of the efficiency and fairness of the railway passenger transport network, the study finds: (1) There is a strong regularity in regional system accessibility change; (2) The problems of efficiency and fairness differ between time periods; (3) The improvement of efficiency leads to a decline of horizontal fairness to a certain extent, while from the perspective of vertical fairness the supply-demand situation changes smoothly with time; (4) The network connection efficiency of the Shanghai, Jiangsu and Zhejiang regions is higher than that of western regions such as Anqing and Chizhou; (5) The marginalization of Nantong, Yancheng, Yangzhou, and Taizhou is obvious. The study explores the application of space syntax theory in regional traffic analysis in order to provide a reference for the development of urban agglomeration transportation networks.

Keywords: spatial syntax, the Yangtze River Delta, railway passenger time, efficiency and fairness

Procedia PDF Downloads 121
17369 Contextualizing Policing in Local Communities: The Way Forward for Ghana Police Service

Authors: Bernard Owusu Asare

Abstract:

This study investigates the implementation and efficacy of community policing within the Ghana Police Service, with a focus on its impact on local communities. Emphasizing the goal of creating safer environments and improving the overall quality of life, the research engages opinion leaders from selected communities in Ghana, as well as members of the police force stationed within these communities. Employing a semi-structured interview guide as the primary research instrument, data collection involves face-to-face interviews conducted at respondents' residences and policing centers. The preliminary findings underscore the pivotal role of collaborative efforts between community elders and police personnel in the successful execution of community policing initiatives. Furthermore, the study identifies gainful employment for the youth as a key determinant of effective policing, highlighting the interconnectedness of socioeconomic factors with law enforcement outcomes. The study further reveals that access to the internet emerges as a factor influencing both policing practices and the overall quality of life within these communities. By contextualizing the dynamics of community policing in the local Ghanaian context, this research aims to contribute valuable insights to the ongoing discourse on effective law enforcement strategies and their impact on community well-being.

Keywords: community, policing, police service, Ghana

Procedia PDF Downloads 46
17368 Comparison of Due Date Assignment Rules in a Dynamic Job Shop

Authors: Mumtaz Ipek, Burak Erkayman

Abstract:

Due date is assigned as an input for scheduling problems. At the same time, due date is selected as a decision variable for real-time scheduling applications. Correct determination of due dates increases shop floor performance and the number of jobs completed on time. This subject has been discussed widely in the literature, and rules for due date determination have been developed from analytical studies. When a job arrives at the shop floor, a due date is assigned for delivery. Various due date determination methods are used in the literature. In this study, six different due date assignment methods are implemented for a hypothetical dynamic job shop, and the performances of these methods are compared.
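
As an aside for readers unfamiliar with such rules, the sketch below (Python) illustrates three classic due date assignment rules often compared in dynamic job shop studies: total work content (TWK), slack (SLK), and constant allowance (CON). The rules, parameter values, and function names are illustrative assumptions and are not necessarily the six methods evaluated in this paper.

    # Illustrative sketch: three classic due date assignment rules.
    def twk(release_time, processing_times, k=3.0):
        """Total Work Content: due date = release + k * total processing time."""
        return release_time + k * sum(processing_times)

    def slk(release_time, processing_times, slack=20.0):
        """Slack rule: due date = release + total processing time + fixed slack."""
        return release_time + sum(processing_times) + slack

    def con(release_time, allowance=50.0):
        """Constant allowance: due date = release + fixed flow allowance."""
        return release_time + allowance

    # Example: a job released at t=10 with three operations
    ops = [4.0, 7.5, 3.0]
    print(twk(10.0, ops), slk(10.0, ops), con(10.0))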

Keywords: scheduling, dynamic job shop, due date assignment, management engineering

Procedia PDF Downloads 538
17367 The Analysis of Defects Prediction in Injection Molding

Authors: Mehdi Moayyedian, Kazem Abhary, Romeo Marian

Abstract:

This paper presents an evaluation of a plastic defect in injection molding before it occurs in the process, known as the short shot defect. The aim of this paper is the evaluation of the different parameters which affect the possibility of the short shot defect. The analysis of short shot possibility is conducted via SolidWorks Plastics and the Taguchi method to determine the most significant parameters. The Finite Element Method (FEM) is employed to analyze two circular flat polypropylene plates of 1 mm thickness. Filling time, part cooling time, pressure holding time, and melt temperature are chosen as process parameters, and gate type as the geometric parameter. A methodology is presented herein to predict the possibility of short-shot occurrence. The analysis determined that melt temperature is the most influential parameter affecting the possibility of the short shot defect, with a contribution of 74.25%, followed by filling time with a contribution of 22% and gate type with a contribution of 3.69%. The optimum level of each parameter leading to a reduction in the possibility of the short shot was also determined: gate type at level 1, filling time at level 3, and melt temperature at level 3. In summary, the most significant parameters affecting the possibility of the short shot defect are melt temperature, filling time, and gate type.
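
For context, Taguchi-style percentage contributions such as the 74.25%, 22%, and 3.69% figures above are typically obtained by dividing each factor's sum of squares by the total. The Python sketch below illustrates this calculation with made-up sums of squares, not the paper's data.

    # Illustrative sketch: percentage contribution of each factor from its
    # sum of squares (hypothetical values, not the paper's ANOVA results).
    def percent_contributions(ss_by_factor):
        total = sum(ss_by_factor.values())
        return {factor: 100.0 * ss / total for factor, ss in ss_by_factor.items()}

    print(percent_contributions({"factor_A": 60.0, "factor_B": 30.0, "factor_C": 10.0}))
    # -> {'factor_A': 60.0, 'factor_B': 30.0, 'factor_C': 10.0}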

Keywords: injection molding, plastic defects, short shot, Taguchi method

Procedia PDF Downloads 207
17366 Titanium-Aluminium Oxide Coating on Aluminized Steel

Authors: Fuyan Sun, Guang Wang, Xueyuan Nie

Abstract:

In this study, a plasma electrolytic oxidation (PEO) process was used to form a titanium-aluminium oxide coating on aluminized steel. The present work mainly studies the effects of the PEO treatment time on the properties of the coating. A potentiodynamic polarization corrosion test was employed to investigate the corrosion resistance of the coating. The friction coefficient and wear resistance of the coating were studied using a pin-on-disc test. The thermal transfer behaviours of uncoated and PEO-coated aluminized steels were also studied. It could be seen that the treatment time of the PEO process significantly influenced the properties of the titanium oxide coating. Samples with a longer treatment time showed better corrosion and wear protection performance. This paper demonstrates that different treatment times can alter the surface behaviour of the coating material.

Keywords: titanium-aluminum oxide, plasma electrolytic oxidation, corrosion, wear, thermal property

Procedia PDF Downloads 342
17365 A Comparative Study of GTC and PSP Algorithms for Mining Sequential Patterns Embedded in Database with Time Constraints

Authors: Safa Adi

Abstract:

This paper considers the problem of mining sequential patterns embedded in a database while handling the time constraints defined in the GSP algorithm (a level-wise algorithm). We compare two previous approaches, GTC and PSP, that take up the general principles of GSP. Furthermore, this paper discusses the PG-hybrid algorithm, which uses both PSP and GTC. The results show that PSP and GTC are more efficient than GSP. On the other hand, the GTC algorithm performs better than PSP. The PG-hybrid algorithm uses the PSP algorithm for the first two passes on the database and the GTC approach for the following scans. Experiments show that the hybrid approach is very efficient for short, frequent sequences.

Keywords: database, GTC algorithm, PSP algorithm, sequential patterns, time constraints

Procedia PDF Downloads 367
17364 Gender Based Variability Time Series Complexity Analysis

Authors: Ramesh K. Sunkaria, Puneeta Marwaha

Abstract:

Nonlinear methods of heart rate variability (HRV) analysis are becoming more popular. It has been observed that complexity measures quantify the regularity and uncertainty of cardiovascular RR-interval time series. In the present work, SampEn has been evaluated in healthy Normal Sinus Rhythm (NSR) male and female subjects for different data lengths and tolerance levels r. It is demonstrated that SampEn is small for higher values of tolerance r. Also, the SampEn value of the healthy female group is higher than that of the healthy male group for short data lengths, and with increasing data length both groups overlap and become difficult to distinguish. SampEn gives inaccurate results by assigning a higher value to the female group, because male subjects have a more complex HRV pattern than female subjects. Therefore, this traditional algorithm exhibits higher complexity for healthy female subjects than for healthy male subjects, which is a misleading observation. This may be because SampEn does not account for the multiple time scales inherent in physiologic time series, so the hidden spatial and temporal fluctuations remain unexplored.
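
For reference, a minimal brute-force implementation of SampEn(m, r) with the Chebyshev distance is sketched below in Python; the default m = 2 and r = 0.2·SD are common choices and are assumptions here, not necessarily the settings used in this study.

    import numpy as np

    def sample_entropy(x, m=2, r_factor=0.2):
        # Brute-force SampEn(m, r): r = r_factor * std(x); Chebyshev distance.
        x = np.asarray(x, dtype=float)
        n = len(x)
        r = r_factor * np.std(x)

        def matches(length):
            # n - m templates for both lengths, as in the standard definition
            templates = np.array([x[i:i + length] for i in range(n - m)])
            count = 0
            for i in range(len(templates) - 1):
                d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                count += np.sum(d <= r)
            return count

        b = matches(m)       # template pairs similar at length m
        a = matches(m + 1)   # pairs still similar when extended by one sample
        return -np.log(a / b) if a > 0 and b > 0 else float("inf")

    # Example with a synthetic RR-interval-like series
    rng = np.random.default_rng(0)
    rr = 0.8 + 0.05 * rng.standard_normal(300)
    print(sample_entropy(rr, m=2, r_factor=0.2))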

Keywords: heart rate variability, normal sinus rhythm group, RR interval time series, sample entropy

Procedia PDF Downloads 268
17363 Ilorin Traditional Architecture as a Good Example of a Green Building Design

Authors: Olutola Funmilayo Adekeye

Abstract:

Traditional African practice of architecture can be said to be deeply rooted in green architecture in concept, design and execution. A study of the ancient building techniques in the Ilorin Emirate depicts prominent, eco-centric green architecture principles. In the pre-colonial era, before the introduction of modern architecture and Western building materials, Nigerian traditional communities built their houses to meet their cultural, religious and social needs using mainly indigenous building materials such as mud (Amo), cow dung (Boto), straw (Koriko) and palm fronds (Imo-Ope), to mention a few. This research attempts to identify the various techniques of applying the traditional African principles of green architecture to Ilorin traditional buildings. It examines and assesses some case studies to understand the extent to which green architecture principles have been applied to traditional building designs that are still preserved today in Ilorin, Nigeria. Furthermore, this study intends to answer many questions, which can be summarized into two basic ones: (1) What aspects of what today are recognized as important green architecture principles have been applied to Ilorin traditional buildings? (2) To what extent have the principles of green architecture applied to Ilorin traditional buildings been ways of demonstrating a cultural attachment to the earth, as an expression of the African sense of the human being as one with nature?

Keywords: green architecture, Ilorin, traditional buildings, design principles, ecocentric, application

Procedia PDF Downloads 524
17362 Ultra-Sensitive and Real Time Detection of ZnO NW Using QCM

Authors: Juneseok You, Kuewhan Jang, Chanho Park, Jaeyeong Choi, Hyunjun Park, Sehyun Shin, Changsoo Han, Sungsoo Na

Abstract:

Nanomaterials can have toxic effects on human beings and ecological systems. Sensors have been developed to detect toxic materials, and standards for toxic materials have been established. Zinc oxide nanowire (ZnO NW) is a known toxic material: by ionizing in the cell body, Zn ions overexpose cell components, which causes critical damage or cell death. In this paper, we detect ZnO NW in water using a QCM (Quartz Crystal Microbalance) and ssDNA (single-stranded DNA). We achieved a response time of 30 minutes for real-time detection and a limit of detection (LOD) of 100 pg/mL.
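
For context, QCM sensing relies on converting a resonance frequency shift into adsorbed mass. The Python sketch below applies the Sauerbrey relation assuming a 5 MHz AT-cut quartz crystal; the crystal parameters are textbook values used for illustration and are not taken from this paper.

    # Illustrative sketch: Sauerbrey relation for a QCM, assuming a 5 MHz
    # AT-cut quartz crystal (hypothetical parameters, not the paper's setup).
    RHO_Q = 2.648      # quartz density, g/cm^3
    MU_Q = 2.947e11    # quartz shear modulus, g/(cm*s^2)

    def sauerbrey_mass(delta_f_hz, f0_hz=5.0e6):
        """Return adsorbed areal mass (ng/cm^2) for a measured frequency shift (Hz)."""
        # delta_m = -delta_f * sqrt(rho_q * mu_q) / (2 * f0^2), in g/cm^2
        dm_g_per_cm2 = -delta_f_hz * (RHO_Q * MU_Q) ** 0.5 / (2.0 * f0_hz ** 2)
        return dm_g_per_cm2 * 1e9  # convert to ng/cm^2

    print(sauerbrey_mass(-10.0))  # a -10 Hz shift -> roughly 177 ng/cm^2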

Keywords: zinc oxide nanowire, QCM, ssDNA, toxic material, biosensor

Procedia PDF Downloads 413
17361 Effect of Interaction between Colchicine Concentrations and Treatment Time Duration on the Percentage of Chromosome Polyploidy of Crepis capillaris (with and without 2B Chromosome) in vitro Culture

Authors: Payman A. A. Zibari, Mosleh M. S. Duhoky

Abstract:

These experiments were conducted at the Tissue Culture Laboratory, Faculty of Agriculture, University of Duhok, during the period from January 2011 to May 2013. The objective of this study was to examine the effects of the interaction between colchicine concentrations and treatment time durations on chromosome polyploidy of Crepis capillaris (with and without the 2B chromosome) during fifteen passages, until regeneration of plants from the callus. The data showed that a high percentage of chromosome polyploidy can be obtained from a high colchicine concentration combined with a long treatment duration.

Keywords: polyploidy, Crepis capilaris, colchicine, B chromosome

Procedia PDF Downloads 172
17360 Microwave Imaging by Application of Information Theory Criteria in MUSIC Algorithm

Authors: Majid Pourahmadi

Abstract:

The performance of the time-reversal MUSIC algorithm degrades dramatically in the presence of strong noise and multiple scattering (i.e., when scatterers are close to each other). This is due to errors in determining the number of scatterers. The present paper provides a new approach to alleviate this problem using an information theoretic criterion referred to as the minimum description length (MDL). The merits of the novel approach are confirmed by numerical examples. The results indicate that time-reversal MUSIC yields accurate estimates of the target locations with considerable noise and multiple scattering in the received signals.
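
For illustration, the Wax-Kailath form of the MDL criterion estimates the number of sources (here, scatterers) from the eigenvalues of the array covariance matrix, as sketched below in Python; the exact formulation used in the paper may differ.

    import numpy as np

    def mdl_num_sources(eigvals, n_snapshots):
        """Wax-Kailath MDL estimate of the number of sources from covariance
        eigenvalues (illustrative sketch, not the paper's exact formulation)."""
        lam = np.sort(np.asarray(eigvals, float))[::-1]   # descending order
        m = len(lam)
        mdl = []
        for k in range(m):
            tail = lam[k:]                                # presumed noise eigenvalues
            geo = np.exp(np.mean(np.log(tail)))           # geometric mean
            arith = np.mean(tail)                         # arithmetic mean
            ll = -n_snapshots * (m - k) * np.log(geo / arith)
            penalty = 0.5 * k * (2 * m - k) * np.log(n_snapshots)
            mdl.append(ll + penalty)
        return int(np.argmin(mdl))

    # Example: 8-element array, 200 snapshots, 2 strong scatterers above noise
    eigs = [9.0, 6.5, 1.05, 1.02, 0.99, 0.98, 0.97, 0.96]
    print(mdl_num_sources(eigs, 200))  # expected: 2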

Keywords: microwave imaging, time reversal, MUSIC algorithm, minimum description length (MDL)

Procedia PDF Downloads 316
17359 BIM Modeling of Site and Existing Buildings: Case Study of ESTP Paris Campus

Authors: Rita Sassine, Yassine Hassani, Mohamad Al Omari, Stéphanie Guibert

Abstract:

Building Information Modelling (BIM) is the process of creating, managing, and centralizing information during the building lifecycle. BIM can be used throughout a construction project, from the initiation phase to the planning and execution phases to the maintenance and lifecycle management phase. For existing buildings, BIM can be used for specific applications such as lifecycle management. However, most existing buildings do not have a BIM model. Creating a compatible BIM for existing buildings is very challenging. It requires special equipment for data capturing and effort to convert these data into a BIM model. The main difficulties for such projects are defining the data needed, the level of development (LOD), and the methodology to be adopted. In addition to managing information for an existing building, studying the impact of the built environment is a challenging topic. Integrating the existing terrain that surrounds buildings into the digital model is therefore essential to enable several simulations, such as flood simulation, energy simulation, etc. Making a replica of the physical model and updating its information in real time to create its Digital Twin (DT) is very important. The Digital Terrain Model (DTM) represents the ground surface of the terrain by a set of discrete points with unique height values over 2D points based on a reference surface (e.g., mean sea level, geoid, and ellipsoid). In addition, information related to the type of pavement materials, the types and heights of vegetation, and damaged surfaces can be integrated. Our aim in this study is to define the methodology to be used in order to provide a 3D BIM model for the site and the existing buildings, based on the case study of the École Spéciale des Travaux Publics (ESTP Paris) school of engineering campus. The property is located on a hilly site of 5 hectares and is composed of more than 20 buildings with a total area of 32,000 square meters and a height between 50 and 68 meters. In this work, the campus precise levelling grid according to the NGF-IGN69 altimetric system and the grid control points are computed according to the Réseau Géodésique Français (RGF93) – Lambert 93 French system with different methods: (i) land topographic surveying using a robotic total station, (ii) a GNSS (Global Navigation Satellite System) levelling grid with NRTK (Network Real Time Kinematic) mode, (iii) point clouds generated by laser scanning. These technologies allow the computation of multiple building parameters such as boundary limits, the number of floors, the georeferencing of the floors, the georeferencing of the four base corners of each building, etc. Once the entry data are identified, the digital model of each building is created. The DTM is also modeled. The process of altimetric determination is complex and requires effort to collect and analyze multiple data formats. Since many technologies can be used to produce digital models, different file formats such as DraWinG (DWG), LASer (LAS), Comma-Separated Values (CSV), Industry Foundation Classes (IFC) and ReViT (RVT) will be generated. Checking the interoperability between BIM models is very important. In this work, all models are linked together and shared on the 3DEXPERIENCE collaborative platform.
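
As a small aside, the Python sketch below shows one naive way of gridding scattered ground points into a DTM raster by averaging heights per cell; it is an illustration only and not the workflow used in this project.

    import numpy as np

    # Illustrative sketch: average scattered ground point heights per grid cell
    # to obtain a simple DTM raster (real DTM production uses careful filtering).
    def grid_dtm(points_xyz, cell=1.0):
        pts = np.asarray(points_xyz, float)
        ij = np.floor((pts[:, :2] - pts[:, :2].min(axis=0)) / cell).astype(int)
        ni, nj = ij.max(axis=0) + 1
        sums = np.zeros((ni, nj))
        counts = np.zeros((ni, nj))
        for (i, j), z in zip(ij, pts[:, 2]):
            sums[i, j] += z
            counts[i, j] += 1
        dtm = np.full((ni, nj), np.nan)      # cells with no points stay NaN
        mask = counts > 0
        dtm[mask] = sums[mask] / counts[mask]
        return dtm

    pts = [[0.2, 0.3, 101.0], [0.8, 0.1, 101.4], [1.5, 0.4, 102.2], [1.7, 1.2, 103.0]]
    print(grid_dtm(pts, cell=1.0))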

Keywords: building information modeling, digital terrain model, existing buildings, interoperability

Procedia PDF Downloads 94
17358 Effects of Whole Body Vibration on Movement Variability Performing a Resistance Exercise with Different Ballasts and Rhythms

Authors: Sílvia tuyà Viñas, Bruno Fernández-Valdés, Carla Pérez-Chirinos, Monica Morral-Yepes, Lucas del Campo Montoliu, Gerard Moras Feliu

Abstract:

Some researchers have stated that whole body vibration (WBV) generates postural destabilization, although the research is not extensive. Therefore, the aim of this study was to analyze movement variability when performing a half-squat with different types of ballast and rhythms, with (V) and without (NV) WBV, in male athletes using entropy. Twelve males experienced in strength training (age: 21.24 ± 2.35 years, height: 176.83 ± 5.80 cm, body mass: 70.63 ± 8.58 kg) performed a half-squat with a weighted vest (WV), dumbbells (D), and a bar with the weights suspended by elastic bands (B), in V and NV, at 40 bpm and 60 bpm. Subjects performed one set of twelve repetitions of each situation, composed of the combination of the three factors. Movement variability was analyzed by calculating the Sample Entropy (SampEn) of the total acceleration signal recorded at the waist. In V, significant differences were found between D and WV (p<0.001; ES: 2.87 at 40 bpm; p<0.001; ES: 3.17 at 60 bpm) and between B and WV at both rhythms (p<0.001; ES: 3.12 at 40 bpm; p<0.001; ES: 2.93 at 60 bpm), and a higher SampEn was obtained at 40 bpm with all ballasts (p<0.001; ES of WV: 1.22; ES of D: 4.49; ES of B: 4.03). No significant differences were found in NV. WBV is a disturbing and destabilizing stimulus. Strength and conditioning coaches should choose the combination of ballast and rhythm of execution according to the level and objectives of each athlete.

Keywords: accelerometry, destabilization, entropy, movement variability, resistance training

Procedia PDF Downloads 164
17357 Discrete Tracking Control of Nonholonomic Mobile Robots: Backstepping Design Approach

Authors: Alexander S. Andreev, Olga A. Peregudova

Abstract:

In this paper, we propose a discrete tracking control for nonholonomic mobile robots with two degrees of freedom. The electro-mechanical model of a mobile robot moving on a horizontal surface without slipping is considered, with two rear wheels controlled by two independent DC electric motors and one front free-rolling wheel. We present a backstepping design based on the Euler approximate discrete-time model of the continuous-time plant. Theoretical considerations are verified by numerical simulation. The work was supported by RFFI (15-01-08482).
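
As background, such discrete-time designs start from the Euler approximation of the robot kinematics. The Python sketch below discretizes the standard unicycle model with a fixed sampling period; the model and parameter values are illustrative and do not reproduce the paper's full electro-mechanical model.

    import numpy as np

    # Illustrative sketch: Euler discretization of the kinematic unicycle model.
    def euler_step(state, v, omega, dt):
        """state = (x, y, theta); v = forward speed, omega = turn rate."""
        x, y, theta = state
        x_next = x + dt * v * np.cos(theta)
        y_next = y + dt * v * np.sin(theta)
        theta_next = theta + dt * omega
        return np.array([x_next, y_next, theta_next])

    # Simulate a constant-input arc for 2 s with a 10 ms sampling period
    state = np.array([0.0, 0.0, 0.0])
    for _ in range(200):
        state = euler_step(state, v=0.5, omega=0.3, dt=0.01)
    print(state)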

Keywords: actuator dynamics, back stepping, discrete-time controller, Lyapunov function, wheeled mobile robot

Procedia PDF Downloads 395
17356 Cultural Statistics in Governance: A Comparative Analysis between the UK and Finland

Authors: Sandra Toledo

Abstract:

There is an increasing tendency in governments towards more evidence-based policy-making and stricter auditing of public spheres. Especially when budgets are tight and taxpayers demand bigger scrutiny over the use of the available resources, statistics and numbers appear as an effective tool to produce data that supports the investments made, as well as to evaluate public policy performance. This pressure has not exempted the cultural and art fields. Finland, like the rest of the Nordic countries, has kept its welfare-state principles, whilst the UK seems to be going in the opposite direction, relying more and more on private sectors and foundations as the state folds back. The boom of the creative industries, along with a managerial trend introduced by Thatcher in the UK, brought as a result a commodification of the arts within a market logic, where sponsorship and commercial viability were the keynotes. Finland, in spite of following a more protectionist approach to the arts, seems to be heading in a similar direction. Additionally, there is a growing international interest in the application of cultural participation studies and the comparability of their results between countries. Nonetheless, the standardization of the application of cultural surveys has not happened yet. Not only are there differences in the application of these types of surveys in terms of time and frequency, but also regarding those conducting them. Therefore, one hypothesis considered in this research is that behind the differences between countries in the application of cultural surveys and the production and utilization of cultural statistics is the cultural policy model adopted by the government. In other words, the main goal of this research is to answer the following: What are the differences and similarities between Finland and the UK regarding the role cultural surveys have in cultural policy making? Along with other secondary questions such as: How does the cultural policy model followed by each country influence the role of cultural surveys in cultural policy making? And what are the differences at the local level? In order to answer these questions, strategic cultural policy documents and interviews with key informants will be used and analyzed as source data, using content analysis methods. Cultural statistics per se will not be compared, but rather their use as instruments of governing and its relation to the cultural policy model. Aspects such as the execution of cultural surveys, funding, periodicity, and the use of statistics in formal reports and publications will be studied in the written documents, while the interviews will target other elements, such as the perceptions of those involved in collecting cultural statistics or policy making, the distribution of tasks and hierarchies among cultural and statistical institutions, and a general view. A limitation identified beforehand, and expected to be encountered throughout the process, is the language barrier in the case of Finland when it comes to official documents, which will be tackled by interviewing the authors of such papers and choosing key extracts of them for translation.

Keywords: Finland, cultural statistics, cultural surveys, United Kingdom

Procedia PDF Downloads 218
17355 Stochastic Model Predictive Control for Linear Discrete-Time Systems with Random Dither Quantization

Authors: Tomoaki Hashimoto

Abstract:

Recently, feedback control systems using random dither quantizers have been proposed for linear discrete-time systems. However, the constraints imposed on state and control variables have not yet been taken into account for the design of feedback control systems with random dither quantization. Model predictive control is a kind of optimal feedback control in which control performance over a finite future is optimized with a performance index that has a moving initial and terminal time. An important advantage of model predictive control is its ability to handle constraints imposed on state and control variables. Based on the model predictive control approach, the objective of this paper is to present a control method that satisfies probabilistic state constraints for linear discrete-time feedback control systems with random dither quantization. In other words, this paper provides a method for solving the optimal control problems subject to probabilistic state constraints for linear discrete-time feedback control systems with random dither quantization.
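
For illustration, a random dither quantizer adds uniform noise spanning one quantization bin before rounding, which makes the quantizer unbiased on average. The Python sketch below demonstrates this property; the step size and signal value are arbitrary examples, not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(1)

    def random_dither_quantize(x, step):
        """Non-subtractive random dither quantizer: add uniform dither over one
        quantization bin, then round to the nearest quantization level."""
        w = rng.uniform(-step / 2, step / 2, size=np.shape(x))
        return step * np.round((x + w) / step)

    # The dithered quantizer is unbiased on average: E[q(x)] = x
    x = 0.37
    samples = random_dither_quantize(np.full(100000, x), step=0.25)
    print(samples.mean())   # close to 0.37, unlike plain rounding (0.25 or 0.5)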

Keywords: optimal control, stochastic systems, random dither, quantization

Procedia PDF Downloads 426
17354 Optimizing of the Micro EDM Parameters in Drilling of Titanium Ti-6Al-4V Alloy for Higher Machining Accuracy-Fuzzy Modelling

Authors: Ahmed A. D. Sarhan, Mum Wai Yip, M. Sayuti, Lim Siew Fen

Abstract:

Ti-6Al-4V alloy is widely used in the automotive and aerospace industries due to its good machining characteristics. Micro EDM drilling is commonly used to drill micro holes in extremely hard materials with very high depth-to-diameter ratios. In this study, the parameters of micro electrical discharge machining (EDM) in drilling of Ti-6Al-4V alloy are optimized for higher machining accuracy, i.e., less hole dilation and a lower hole taper ratio. The micro-EDM machining parameters include peak current and pulse on time. A fuzzy analysis was developed to evaluate the machining accuracy. The analysis shows that hole dilation and hole taper ratio increase with increasing peak current and pulse on time. However, the surface quality deteriorates as the peak current and pulse on time increase. The combination that gives the optimum result for hole dilation is medium peak current and short pulse on time, while the optimum result for hole taper ratio is obtained with low peak current and short pulse on time.

Keywords: Micro EDM, Ti-6Al-4V alloy, fuzzy logic based analysis, optimization, machining accuracy

Procedia PDF Downloads 482
17353 Stereo Camera Based Speed-Hump Detection Process for Real Time Driving Assistance System in the Daytime

Authors: Hyun-Koo Kim, Yong-Hun Kim, Soo-Young Suk, Ju H. Park, Ho-Youl Jung

Abstract:

This paper presents an effective speed hump detection process for the daytime. We focus only on round-type speed humps in the daytime dynamic road environment. The proposed speed hump detection scheme consists mainly of two processes: stereo matching and speed hump detection. Our work focuses on the speed hump detection process, which consists of a noise reduction step, a data fusion step, and a speed hump detection step. The proposed system was tested on an Intel Core CPU with 2.80 GHz and 4 GB RAM in urban road environments. The frame rate of the test videos is 30 frames per second, and the size of each frame of the grabbed image sequences is 1280 by 670 pixels. Using object-marked sequences acquired with an on-vehicle camera, we recorded speed hump and non-speed-hump samples. As a result of the tests, our proposed method can be applied in real-time systems, with a computation time of 13 ms, and reaches 96.1% in the detection tests.

Keywords: data fusion, round types speed hump, speed hump detection, surface filter

Procedia PDF Downloads 497
17352 Hybrid Subspace Approach for Time Delay Estimation in MIMO Systems

Authors: Mojtaba Saeedinezhad, Sarah Yousefi

Abstract:

In this paper, we present a hybrid subspace approach for Time Delay Estimation (TDE) in multivariable systems. While several methods have been proposed for time delay estimation in SISO systems, delay estimation in MIMO systems has always been a big challenge. In these systems, the existing TDE methods have significant limitations because most procedures are based only on system response estimation or correlation analysis. We introduce a new hybrid method for TDE in MIMO systems based on subspace identification and the explicit output error method, and compare its performance with previously introduced procedures at different noise levels in a statistical manner. The best method is then selected with a multi-objective decision making technique. It is shown that the performance of the new approach is much better than that of the existing methods, even in low signal-to-noise conditions.
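
For contrast with the proposed subspace approach, the Python sketch below shows the classical cross-correlation delay estimator for a single input/output pair, the kind of correlation-based method whose limitations in MIMO settings motivate this work; it is a baseline illustration, not the proposed method.

    import numpy as np

    # Baseline sketch: integer-sample time delay estimation by cross-correlation.
    def xcorr_delay(u, y):
        """Estimate the delay of y relative to u (in samples) via cross-correlation."""
        u = np.asarray(u, float) - np.mean(u)
        y = np.asarray(y, float) - np.mean(y)
        corr = np.correlate(y, u, mode="full")      # lags from -(N-1) .. (N-1)
        lags = np.arange(-len(u) + 1, len(u))
        return lags[np.argmax(corr)]

    rng = np.random.default_rng(0)
    u = rng.standard_normal(500)
    true_delay = 7
    y = np.roll(u, true_delay) + 0.1 * rng.standard_normal(500)
    print(xcorr_delay(u, y))  # expected: 7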

Keywords: system identification, time delay estimation, ARX, OE, merit ratio, multi variable decision making

Procedia PDF Downloads 330
17351 Improvements and Implementation Solutions to Reduce the Computational Load for Traffic Situational Awareness with Alerts (TSAA)

Authors: Salvatore Luongo, Carlo Luongo

Abstract:

This paper discusses implementation solutions to reduce the computational load for the Traffic Situational Awareness with Alerts (TSAA) application, based on Automatic Dependent Surveillance-Broadcast (ADS-B) technology. In 2008, there were 23 mid-air collisions involving general aviation fixed-wing aircraft, 6 of which were fatal, leading to 21 fatalities. These collisions occurred during visual meteorological conditions, indicating the limitations of the see-and-avoid concept for mid-air collision avoidance as defined by the Federal Aviation Administration (FAA). Commercial aviation aircraft are already equipped with a collision avoidance system called TCAS, which is based on classic transponder technology. This system dramatically reduced the number of mid-air collisions involving air transport aircraft. In general aviation, the same reduction in mid-air collisions has not occurred, so this reduction is the main objective of the TSAA application. The major difference between the original conflict detection application and the TSAA application is that conflict detection is focused on preventing loss of separation in en-route environments, whereas TSAA is devoted to reducing the probability of mid-air collision in all phases of flight. The TSAA application increases flight crew traffic situational awareness by providing alerts for traffic detected in conflict with ownship, in support of the see-and-avoid responsibility. Significant effort has been spent on the design process and code generation in order to maximize efficiency and performance in terms of computational load and memory consumption. The TSAA architecture is divided into two high-level systems: the "Threats database" and the "Conflict detector". The first one receives the traffic data from the ADS-B device and stores each target's data history. The conflict detector module estimates ownship and target trajectories in order to detect possible future losses of separation between ownship and each target. Finally, the alerts are verified by additional conflict verification logic, in order to prevent possible undesirable behaviors of the alert flag. In order to reduce the computational load, a pre-check evaluation module is used. This pre-check is only a computational optimization, so the performance of the conflict detector system is not modified in terms of the number of alerts detected. The pre-check module uses analytical trajectory propagation for both the target and ownship. This allows greater accuracy and avoids step-by-step propagation, which requires a greater computational load. Furthermore, the pre-check makes it possible to exclude targets that are certainly not a threat, using an efficient analytical geometrical approach, in order to decrease the computational load for the following modules. This software improvement is not suggested by FAA documents, and so it is the main innovation of this work. The efficiency and efficacy of this enhancement are verified using fast-time and real-time simulations and by execution on a real device in several FAA scenarios. The final implementation also permits FAA software certification in compliance with the DO-178B standard. The computational load reduction allows the installation of the TSAA application also on devices with multiple applications and/or low capacity in terms of available memory and computational capability.
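
For illustration, a typical analytical pre-check of the kind described above is a constant-velocity closest-point-of-approach test. The Python sketch below shows such a geometric filter; the function, thresholds, and units are hypothetical and are not taken from the certified TSAA implementation.

    import numpy as np

    # Hypothetical sketch of an analytical pre-check: constant-velocity
    # closest point of approach (CPA) used to exclude non-threat targets.
    def cpa_filter(rel_pos, rel_vel, horizon_s, sep_m):
        """Return True if the target can be excluded as a threat.

        rel_pos: target position minus ownship position (m)
        rel_vel: target velocity minus ownship velocity (m/s)
        horizon_s: look-ahead time (s); sep_m: protected separation (m)
        """
        p = np.asarray(rel_pos, float)
        v = np.asarray(rel_vel, float)
        vv = np.dot(v, v)
        # time of closest approach, clamped to [0, horizon]
        t_cpa = 0.0 if vv < 1e-9 else np.clip(-np.dot(p, v) / vv, 0.0, horizon_s)
        miss = np.linalg.norm(p + t_cpa * v)
        return miss > sep_m   # True -> certainly not a threat, skip full detector

    # Target 5 km ahead, closing at 50 m/s but offset laterally by 2 km
    print(cpa_filter([5000.0, 2000.0], [-50.0, 0.0], horizon_s=120.0, sep_m=500.0))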

Keywords: traffic situation awareness, general aviation, aircraft conflict detection, computational load reduction, implementation solutions, software certification

Procedia PDF Downloads 263
17350 Design and Implementation of Partial Denoising Boundary Image Matching Using Indexing Techniques

Authors: Bum-Soo Kim, Jin-Uk Kim

Abstract:

In this paper, we design and implement a partial denoising boundary image matching system using indexing techniques. Converting boundary images to time-series makes it feasible to perform fast search using indexes, even on a very large image database. Using this conversion method, we develop a client-server system based on previous partial denoising research in a GUI (graphical user interface) environment. The client first converts a query image given by a user to a time-series and sends the denoising parameters and the tolerance together with this time-series to the server. The server identifies similar images from the index by evaluating a range query, which is constructed using the inputs given by the client, and sends the resulting images to the client. Experimental results show that our system provides highly intuitive and accurate matching results.
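
For context, one common way of converting an object boundary into a time-series is the centroid-to-boundary distance representation, sketched below in Python; the paper's exact conversion and resampling length may differ.

    import numpy as np

    # Illustrative sketch: convert an ordered object boundary into a fixed-length
    # time-series of centroid-to-boundary distances.
    def boundary_to_series(boundary_xy, n_samples=128):
        pts = np.asarray(boundary_xy, float)
        centroid = pts.mean(axis=0)
        dists = np.linalg.norm(pts - centroid, axis=1)
        # resample to a fixed length so all images yield comparable series
        src = np.linspace(0.0, 1.0, len(dists))
        dst = np.linspace(0.0, 1.0, n_samples)
        return np.interp(dst, src, dists)

    # Example: a circle produces a nearly constant series
    theta = np.linspace(0, 2 * np.pi, 400, endpoint=False)
    circle = np.c_[np.cos(theta), np.sin(theta)] * 10.0
    print(boundary_to_series(circle, 8))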

Keywords: boundary image matching, indexing, partial denoising, time-series matching

Procedia PDF Downloads 125
17349 Determination of Surface Deformations with Global Navigation Satellite System Time Series

Authors: Ibrahim Tiryakioglu, Mehmet Ali Ugur, Caglar Ozkaymak

Abstract:

The development of GNSS technology has led to increasingly widespread and successful applications of GNSS surveys for monitoring crustal movements. However, multi-period GPS survey solutions have not been applied to monitoring vertical surface deformation. This study uses the long-term GNSS time series that are required to determine vertical deformations. In recent years, surface deformations that are parallel and semi-parallel to the Bolvadin fault have occurred in Western Anatolia. These surface deformations have continued to occur in the Bolvadin settlement area, which is located mostly on alluvial ground. Due to these surface deformations, a number of cracks in buildings located in the residential areas and breaks in underground water and sewage systems have been observed. In order to determine the amount of vertical surface deformation, two continuous GNSS stations have been established in the region. The stations have been operating since 2015 and 2017, respectively. In this study, GNSS observations from these two stations were processed with the GAMIT/GLOBK (GNSS Analysis Massachusetts Institute of Technology/GLOBal Kalman) program package to create coordinate time series. With the time series analyses, the GNSS stations' behavior models (linear, periodical, etc.), the causes of these behaviors, and mathematical models were determined. The results of the time series analysis of these two GNSS stations show approximately 50-80 mm/yr of vertical movement.
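
For illustration, the linear-plus-periodic behavior model mentioned above can be estimated by ordinary least squares. The Python sketch below fits an offset, a velocity, and an annual sinusoid to a synthetic height series; it is a simplified stand-in for the GAMIT/GLOBK processing, not a replica of it.

    import numpy as np

    # Illustrative sketch: least-squares fit of
    # y(t) = a + b*t + c*sin(2*pi*t) + d*cos(2*pi*t), with t in years.
    def fit_gnss_series(t_years, height_mm):
        t = np.asarray(t_years, float)
        y = np.asarray(height_mm, float)
        A = np.column_stack([np.ones_like(t), t,
                             np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
        coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
        return coeffs   # [offset, velocity mm/yr, annual sin, annual cos]

    # Synthetic 4-year daily series with a -60 mm/yr subsidence trend
    t = np.arange(0, 4, 1 / 365.25)
    rng = np.random.default_rng(2)
    y = 5.0 - 60.0 * t + 3.0 * np.sin(2 * np.pi * t) + rng.normal(0, 2.0, t.size)
    print(fit_gnss_series(t, y))  # velocity estimate close to -60 mm/yr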

Keywords: Bolvadin fault, GAMIT, GNSS time series, surface deformations

Procedia PDF Downloads 150
17348 A Real-Time Simulation Environment for Avionics Software Development and Qualification

Authors: Ferdinando Montemari, Antonio Vitale, Nicola Genito, Luca Garbarino, Urbano Tancredi, Domenico Accardo, Michele Grassi, Giancarmine Fasano, Anna Elena Tirri

Abstract:

The development of guidance, navigation and control algorithms and avionic procedures requires the availability of suitable analysis and verification tools, such as simulation environments, which support the design process and allow detecting potential problems prior to the flight test, in order to make new technologies available at reduced cost, time and risk. This paper presents a simulation environment for avionic software development and qualification, especially aimed at equipment for general aviation aircraft and unmanned aerial systems. The simulation environment includes models for short and medium-range radio-navigation aids, flight assistance systems, and ground control stations. All the software modules are able to simulate the modeled systems in both fast-time and real-time tests, and were implemented following component-oriented modeling techniques and a requirement-based approach. The paper describes the specific model features, the architectures of the implemented software systems and their validation process. The performed validation tests highlighted the capability of the simulation environment to guarantee in real time the required functionalities and performance of the simulated avionics systems, as well as to reproduce the interaction between these systems, thus permitting a realistic and reliable simulation of a complete mission scenario.

Keywords: ADS-B, avionics, NAVAIDs, real-time simulation, TCAS, UAS ground control station

Procedia PDF Downloads 210
17347 Minimizing Total Completion Time in No-Wait Flowshops with Setup Times

Authors: Ali Allahverdi

Abstract:

The m-machine no-wait flowshop scheduling problem is addressed in this paper. The objective is to minimize total completion time subject to the constraint that the makespan value is not greater than a certain value. Setup times are treated as separate from processing times. Several recent algorithms are adapted and proposed for the problem. An extensive computational analysis has been conducted for the evaluation of the proposed algorithms. The computational analysis indicates that the best proposed algorithm performs significantly better than the earlier existing best algorithm.
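
For background, evaluating a given job sequence in a no-wait flowshop reduces to computing the minimum start-to-start delay between consecutive jobs. The Python sketch below computes job completion times from processing times only; the separate setup times studied in the paper are omitted for brevity, and the data are made up.

    import numpy as np

    # Illustrative sketch: completion times of a given sequence in a no-wait
    # flowshop (processing times only, setup times omitted).
    def no_wait_completion_times(p, sequence):
        """p[j][m] = processing time of job j on machine m; sequence = job order."""
        p = np.asarray(p, float)
        n_machines = p.shape[1]
        starts = [0.0]
        for prev, cur in zip(sequence, sequence[1:]):
            # smallest start offset keeping both jobs' operations non-overlapping
            delay = max(p[prev, :m + 1].sum() - p[cur, :m].sum()
                        for m in range(n_machines))
            starts.append(starts[-1] + max(delay, 0.0))
        return [starts[i] + p[j].sum() for i, j in enumerate(sequence)]

    # 3 jobs x 2 machines
    p = [[3, 2], [2, 4], [4, 1]]
    c = no_wait_completion_times(p, [0, 1, 2])
    print(c, "total completion time:", sum(c))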

Keywords: scheduling, no-wait flowshop, algorithm, setup times, total completion time, makespan

Procedia PDF Downloads 329
17346 Automated Prepaid Billing Subscription System

Authors: Adekunle K. O, Adeniyi A. E, Kolawole E

Abstract:

One of the most dramatic trends in the communications market in recent years has been the growth of prepaid services. Today, prepaid no longer constitutes the low-revenue, basic-service segment. It is driven by high-margin, value-added service customers who view it as a convenient way of retaining control over their usage and communication spending while expecting high service levels. To service providers, prepaid services offer the advantage of reducing bad accounts while allowing them to predict usage and plan network resources. Yet, the real-time demands of prepaid services require a scalable, real-time platform to manage customers through their entire life cycle. The proposed system delivers integrated real-time rating, voucher management, recharge management, customer care and service provisioning for the generation of new prepaid services. It offers high scalability and can handle millions of prepaid customers in real time through their entire life cycle.

Keywords: prepaid billing, voucher management, customers, automated, security

Procedia PDF Downloads 93
17345 In vivo Determination of Anticoagulant Property of the Tentacle Extract of Aurelia aurita (Moon Jellyfish) Using Sprague-Dawley Rats

Authors: Bea Carmel H. Casiding, Charmaine A. Guy, Funny Jovis P. Malasan, Katrina Chelsea B. Manlutac, Danielle Ann N. Novilla, Marianne R. Oliveros, Magnolia C. Sibulo

Abstract:

The moon jellyfish, Aurelia aurita, has become a popular research organism for diverse studies. Recent studies have verified the anticoagulant properties of moon jellyfish tentacle extract through in vitro methods. The purpose of this study was to validate the anticoagulant ability of A. aurita tentacle extract using an in vivo method of experimentation. The tentacles of A. aurita were excised, filtered, and centrifuged at 3000xg for 10 minutes. The crude nematocyst extract was suspended in a 1:6 ratio with phosphate buffer solution and sonicated for three periods of 20 seconds each at 50 Hz. The protein concentration of the extract was determined using the Bradford assay. Bovine serum albumin was the standard solution, used at the following concentrations: 35.0, 70.0, 105.0, 140.0, 175.0, 210.0, 245.0, and 280.0 µg/mL. The absorbance was read at 595 nm. Toxicity testing was adapted from the OECD guidelines. The extract suspended in phosphate-buffered saline solution was arbitrarily set into three doses (0.1 mg/kg, 0.3 mg/kg, 0.5 mg/kg), which were administered daily for five days to the experimental groups of five male Sprague-Dawley rats (one dose per group). Before and after the administration period, bleeding time and clotting time tests were performed. One-way Analysis of Variance (ANOVA) was used to analyze the differences in bleeding time and clotting time, before and after administration, among the three treatment groups and the positive and negative control groups. The average protein concentration of the sonicated crude tentacle extract was 206.5 µg/mL. The highest dose administered (0.5 mg/kg) produced a significant increase in time for both the bleeding and clotting tests, whereas the preceding lower dose (0.3 mg/kg) was significantly effective only for the clotting time test. The protein contained in the tentacle extract, with a concentration of 206.5 µg/mL and doses of 0.3 mg/kg and 0.5 mg/kg of A. aurita, elicited anticoagulant activity.

Keywords: anticoagulant, bleeding time test, clotting time test, moon jellyfish

Procedia PDF Downloads 379
17344 Estimation of Time Loss and Costs of Traffic Congestion: The Contingent Valuation Method

Authors: Amira Mabrouk, Chokri Abdennadher

Abstract:

The reduction of road congestion, which is inherent to the use of vehicles, is an obvious priority for public authorities. Therefore, assessing an individual's willingness to pay in order to save trip time is akin to estimating the change in price that would result from setting up a new transport policy to increase network fluidity and improve the level of social welfare. This study takes an innovative perspective: it initiates an economic calculation with the objective of estimating the monetized value of time for trips made in Sfax. The research is founded on a multi-objective approach. The aims of this study are to (i) estimate the monetized value of an hour dedicated to trips, (ii) determine whether or not consumers consider the environmental variables to be significant, and (iii) analyze the impact of a public management of congestion, via the taxation of city tolls, on urban dwellers. This article is built upon a rich field survey conducted in the city of Sfax. Using the contingent valuation method, we analyze the "declared time preferences" of 450 drivers during rush hours. Based on thorough consideration of the biases attributed to the applied method, we highlight the delicacy of this approach with regard to the revelation mode and the interrogation techniques, following the NOAA panel recommendations, with the exception of the valuation point, and other similar studies on the estimation of transportation externalities.

Keywords: willingness to pay, contingent valuation, time value, city toll

Procedia PDF Downloads 407
17343 Technology in the Calculation of People Health Level: Design of a Computational Tool

Authors: Sara Herrero Jaén, José María Santamaría García, María Lourdes Jiménez Rodríguez, Jorge Luis Gómez González, Adriana Cercas Duque, Alexandra González Aguna

Abstract:

Background: The concept of health has evolved throughout history. The health level is determined by the individual's own perception. It is a dynamic process over time, so variations can be seen from one moment to the next. In this way, knowing the health of the patients you care for facilitates decision making in the treatment of care. Objective: To design a technological tool that calculates people's health level sequentially over time. Material and Methods: Deductive methodology through text analysis, extraction and logical knowledge formalization, and education with an expert group. Study period: September 2015 to the present. Results: A computational tool for use by health personnel has been designed. It has 11 variables. Each variable can be given a value from 1 to 5, with 1 being the minimum value and 5 being the maximum value. By adding the results of the 11 variables, we obtain a magnitude at a certain time: the person's health level. The health calculator makes it possible to represent a person's health level at a given time, establishing temporal cuts that are useful to determine the evolution of the individual over time. Conclusion: Information and Communication Technologies (ICT) allow training and help in various disciplinary areas. It is important to highlight their relevance in the field of health. Based on the formalization of health, care acts can be directed towards some of the propositional elements of the concept above. Care acts will modify people's health level. The health calculator allows the prioritization and prediction of different health care strategies in hospital units.
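
A minimal sketch of the computation described in the Results is given below in Python: eleven variables, each scored from 1 to 5, summed into a single health-level magnitude at a given time point. The variable scores and date are placeholders, since the paper does not list the variables here.

    from datetime import date

    # Minimal sketch: eleven variables scored 1-5, summed into one magnitude.
    def health_level(scores):
        if len(scores) != 11 or not all(1 <= s <= 5 for s in scores):
            raise ValueError("expected 11 scores, each between 1 and 5")
        return sum(scores)   # ranges from 11 (minimum) to 55 (maximum)

    # One temporal cut for a patient; repeating this over time gives the evolution
    record = (date(2016, 3, 1), health_level([3, 4, 2, 5, 3, 3, 4, 2, 3, 4, 3]))
    print(record)   # (datetime.date(2016, 3, 1), 36)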

Keywords: calculator, care, eHealth, health

Procedia PDF Downloads 246
17342 A Unique Exact Approach to Handle a Time-Delayed State-Space System: The Extraction of Juice Process

Authors: Mohamed T. Faheem Saidahmed, Ahmed M. Attiya Ibrahim, Basma GH. Elkilany

Abstract:

This paper discusses the application of a Time Delay Control (TDC) compensation technique to the juice extraction process in a sugar mill. The objective is to improve the control performance of the process and increase extraction efficiency. The paper presents the mathematical model of the juice extraction process and the design of the TDC compensation controller. Simulation results show that the TDC compensation technique can effectively suppress the time delay effect in the process and improve control performance. The extraction efficiency is also significantly increased with the application of the TDC compensation technique. The proposed approach, implemented and simulated in MATLAB, provides a practical solution for improving the juice extraction process in sugar mills.
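
For illustration, the best-known delay-compensation structure of this kind is the Smith predictor. The Python sketch below simulates a PI controller with a Smith predictor on a first-order-plus-dead-time plant; the gain, time constant, delay, and tuning are hypothetical values, not the mill's identified model.

    # Hypothetical Smith-predictor sketch for a first-order plus dead-time plant.
    K, TAU, DELAY_S, DT = 2.0, 30.0, 10.0, 0.5     # plant parameters, seconds
    D = int(DELAY_S / DT)                          # dead time in samples

    def first_order_step(y, u, dt=DT, k=K, tau=TAU):
        """One Euler step of tau*dy/dt + y = k*u."""
        return y + dt * (k * u - y) / tau

    setpoint, kp, ki = 1.0, 0.6, 0.05
    y = ym = integ = 0.0                  # plant output, delay-free model, integrator
    u_buf, ym_buf = [0.0] * D, [0.0] * D  # delay lines for plant input and model output
    for _ in range(600):                  # 300 s of simulated time
        # Smith predictor: controller sees measured output corrected by the model
        feedback = y + (ym - ym_buf[0])
        e = setpoint - feedback
        integ += ki * e * DT
        u = kp * e + integ                # PI control action
        u_buf.append(u)                   # plant input passes through the dead time
        y = first_order_step(y, u_buf.pop(0))
        ym = first_order_step(ym, u)      # delay-free internal model
        ym_buf.append(ym); ym_buf.pop(0)
    print(round(y, 3))                    # settles close to the setpoint of 1.0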

Keywords: time delay control (TDC), exact and unique state space model, delay compensation, Smith predictor

Procedia PDF Downloads 67