Search results for: time extension
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18901

18061 Microwave Imaging by Application of Information Theory Criteria in MUSIC Algorithm

Authors: Majid Pourahmadi

Abstract:

The performance of the time-reversal MUSIC algorithm degrades dramatically in the presence of strong noise and multiple scattering (i.e., when scatterers are close to each other), owing to errors in determining the number of scatterers. This paper provides a new approach to alleviate this problem using an information-theoretic criterion referred to as minimum description length (MDL). The merits of the novel approach are confirmed by numerical examples. The results indicate that time-reversal MUSIC yields accurate estimates of the target locations even with considerable noise and multiple scattering in the received signals.
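As a sketch of how such a criterion works, the following is an illustrative implementation of the classical MDL source-number estimator (not necessarily the paper's exact formulation): it selects the number of scatterers from the eigenvalues of the sample covariance matrix.

```python
import numpy as np

def mdl_num_scatterers(eigvals, n_snapshots):
    """Estimate the number of scatterers (signal-subspace dimension)
    with the minimum description length criterion.

    eigvals: eigenvalues of the sample covariance matrix, sorted in
    descending order; n_snapshots: number of snapshots N."""
    p = len(eigvals)
    mdl = []
    for k in range(p):
        noise = eigvals[k:]                   # the p - k smallest eigenvalues
        m = p - k
        geo = np.exp(np.mean(np.log(noise)))  # geometric mean
        arith = np.mean(noise)                # arithmetic mean
        ll = -n_snapshots * m * np.log(geo / arith)
        penalty = 0.5 * k * (2 * p - k) * np.log(n_snapshots)
        mdl.append(ll + penalty)
    return int(np.argmin(mdl))
```

The criterion needs no subjective threshold, which is what makes it attractive when noise blurs the gap between signal and noise eigenvalues.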

Keywords: microwave imaging, time reversal, MUSIC algorithm, minimum description length (MDL)

Procedia PDF Downloads 339
18060 Discrete Tracking Control of Nonholonomic Mobile Robots: Backstepping Design Approach

Authors: Alexander S. Andreev, Olga A. Peregudova

Abstract:

In this paper, we propose a discrete tracking control for nonholonomic mobile robots with two degrees of freedom. We consider the electro-mechanical model of a mobile robot moving on a horizontal surface without slipping, with two rear wheels driven by two independent DC motors and one front roller wheel. We present a backstepping design based on the Euler approximate discrete-time model of the continuous-time plant. Theoretical considerations are verified by numerical simulation. The work was supported by RFFI (15-01-08482).

Keywords: actuator dynamics, back stepping, discrete-time controller, Lyapunov function, wheeled mobile robot

Procedia PDF Downloads 416
18059 Use of Hierarchical Temporal Memory Algorithm in Heart Attack Detection

Authors: Tesnim Charrad, Kaouther Nouira, Ahmed Ferchichi

Abstract:

In order to reduce the number of deaths due to heart problems, we propose the use of the Hierarchical Temporal Memory (HTM) algorithm, a real-time anomaly detection algorithm. HTM is a cortical learning algorithm modeled on the neocortex; in other words, it is based on a conceptual theory of how the human brain may work. It is powerful in predicting unusual patterns, anomaly detection, and classification. In this paper, HTM has been implemented and tested on ECG datasets in order to detect cardiac anomalies. Experiments showed good performance in terms of specificity, sensitivity, and execution time.
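HTM itself is available in open-source implementations (e.g. Numenta's codebases); as a self-contained stand-in, the sketch below illustrates only the streaming, real-time detection pattern, using simple rolling statistics rather than a cortical model.

```python
from collections import deque
import math

class StreamingAnomalyDetector:
    """Minimal streaming anomaly scorer (an illustrative stand-in for
    HTM): flags a sample whose deviation from the rolling mean exceeds
    k standard deviations. Real HTM learns temporal patterns instead
    of plain statistics."""

    def __init__(self, window=50, k=3.0):
        self.window = deque(maxlen=window)
        self.k = k

    def score(self, x):
        if len(self.window) < 10:          # warm-up period: never flag
            self.window.append(x)
            return False
        mu = sum(self.window) / len(self.window)
        var = sum((v - mu) ** 2 for v in self.window) / len(self.window)
        sigma = math.sqrt(var) or 1e-9     # guard against zero variance
        anomalous = abs(x - mu) > self.k * sigma
        self.window.append(x)
        return anomalous
```

A real ECG pipeline would feed RR intervals or raw samples through `score` one value at a time, which is what keeps the detection real-time.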

Keywords: cardiac anomalies, ECG, HTM, real time anomaly detection

Procedia PDF Downloads 231
18058 Long Short-Term Memory Stream Cruise Control Method for Automated Drift Detection and Adaptation

Authors: Mohammad Abu-Shaira, Weishi Shi

Abstract:

Adaptive learning, a commonly employed response to drift, involves updating predictive models online during their operation to react to concept drifts, making it a critical component and natural extension of online learning systems that learn incrementally from each example. This paper introduces LSTM-SCCM (Long Short-Term Memory Stream Cruise Control Method), a drift-adaptation-as-a-service framework for online learning. LSTM-SCCM automates drift adaptation through prompt detection, drift magnitude quantification, dynamic hyperparameter tuning, short-term optimization and model recalibration for immediate adjustments, and, when necessary, long-term model recalibration to ensure deeper enhancements in model performance. LSTM-SCCM is incorporated into a suite of cutting-edge online regression models, and their performance is assessed across various types of concept drift using diverse datasets with varying characteristics. The findings demonstrate that LSTM-SCCM represents a notable advancement in both model performance and efficacy in handling concept drift occurrences. LSTM-SCCM stands out as the sole framework adept at effectively tackling concept drifts within regression scenarios. Its proactive approach to drift adaptation distinguishes it from conventional reactive methods, which typically rely on retraining after drifts have significantly degraded model performance. Additionally, LSTM-SCCM employs an in-memory approach combined with the Self-Adjusting Memory (SAM) architecture to enhance real-time processing and adaptability. The framework incorporates variable thresholding techniques and does not assume any particular data distribution, making it an ideal choice for managing high-dimensional datasets and efficiently handling large-scale data.
Our experiments, which include abrupt, incremental, and gradual drifts across both low- and high-dimensional datasets with varying noise levels, applied to four state-of-the-art online regression models, demonstrate that LSTM-SCCM is versatile and effective, rendering it a valuable solution for online regression models addressing concept drift.
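The detect-quantify-adapt loop described above can be caricatured in a few lines. The thresholds, window size, and the two adaptation tiers below are hypothetical placeholders, not the authors' actual procedure.

```python
import statistics

def monitor_drift(errors, window=20, minor=2.0, major=4.0):
    """Illustrative drift monitor in the spirit of LSTM-SCCM (names and
    thresholds are hypothetical): compare the latest prediction error
    against a baseline window and decide which adaptation to trigger."""
    baseline, recent = errors[:-1][-window:], errors[-1]
    mu = statistics.mean(baseline)
    sigma = statistics.pstdev(baseline) or 1e-9
    z = abs(recent - mu) / sigma          # drift magnitude quantification
    if z > major:
        return "long-term recalibration"  # deep model update
    if z > minor:
        return "short-term optimization"  # quick hyperparameter tweak
    return "no drift"
```

A variable-thresholding scheme, as the abstract describes, would adapt `minor` and `major` over time instead of fixing them.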

Keywords: automated drift detection and adaptation, concept drift, hyperparameters optimization, online and adaptive learning, regression

Procedia PDF Downloads 17
18057 Stochastic Model Predictive Control for Linear Discrete-Time Systems with Random Dither Quantization

Authors: Tomoaki Hashimoto

Abstract:

Recently, feedback control systems using random dither quantizers have been proposed for linear discrete-time systems. However, the constraints imposed on state and control variables have not yet been taken into account in the design of feedback control systems with random dither quantization. Model predictive control is a kind of optimal feedback control in which control performance over a finite future is optimized with a performance index that has a moving initial and terminal time. An important advantage of model predictive control is its ability to handle constraints imposed on state and control variables. Based on the model predictive control approach, the objective of this paper is to present a control method that satisfies probabilistic state constraints for linear discrete-time feedback control systems with random dither quantization; in other words, it provides a method for solving the optimal control problems subject to probabilistic state constraints for such systems.
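A common way to make a probabilistic state constraint tractable is to tighten the deterministic bound by a quantile of the state uncertainty. The sketch below assumes Gaussian uncertainty purely for illustration; the paper's exact reformulation may differ.

```python
from statistics import NormalDist

def tightened_bound(x_max, sigma, eps):
    """Deterministic tightening of the chance constraint
    Pr(x <= x_max) >= 1 - eps under a Gaussian assumption:
    the mean state must satisfy x_bar <= x_max - z_{1-eps} * sigma,
    where z_{1-eps} is the standard normal quantile."""
    z = NormalDist().inv_cdf(1 - eps)
    return x_max - z * sigma
```

The MPC problem then imposes the tightened bound on the predicted mean trajectory at every step of the horizon, and a smaller violation probability eps yields a tighter bound.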

Keywords: optimal control, stochastic systems, random dither, quantization

Procedia PDF Downloads 446
18056 Optimizing of the Micro EDM Parameters in Drilling of Titanium Ti-6Al-4V Alloy for Higher Machining Accuracy-Fuzzy Modelling

Authors: Ahmed A. D. Sarhan, Mum Wai Yip, M. Sayuti, Lim Siew Fen

Abstract:

Ti-6Al-4V alloy is widely used in the automotive and aerospace industries due to its good machining characteristics. Micro-EDM drilling is commonly used to drill micro holes in extremely hard materials with a very high depth-to-diameter ratio. In this study, the parameters of micro electrical discharge machining (EDM) in drilling of Ti-6Al-4V alloy are optimized for higher machining accuracy with less hole dilation and a lower hole taper ratio. The micro-EDM machining parameters include peak current and pulse-on time. A fuzzy analysis was developed to evaluate the machining accuracy. The analysis shows that hole dilation and hole taper ratio increase with increasing peak current and pulse-on time. However, the surface quality deteriorates as the peak current and pulse-on time increase. The combination that gives the optimum result for hole dilation is medium peak current and short pulse-on time, whereas the optimum result for hole taper ratio is obtained with low peak current and short pulse-on time.

Keywords: Micro EDM, Ti-6Al-4V alloy, fuzzy logic based analysis, optimization, machining accuracy

Procedia PDF Downloads 496
18055 Geographic Information System for District Level Energy Performance Simulations

Authors: Avichal Malhotra, Jerome Frisch, Christoph van Treeck

Abstract:

The utilization of semantic, cadastral and topological data from geographic information systems (GIS) has increased exponentially for building- and urban-scale energy performance simulations. Urban planners, simulation scientists, and researchers use virtual 3D city models for energy analyses, algorithms and simulation tools. For dynamic energy simulations at the city and district level, this paper provides an overview of the available GIS data models and their levels of detail. Adhering to different norms and standards, these models also intend to describe building and construction industry data. For further investigations, CityGML data models are considered for simulations. Though geographical information modelling has many different implementations, virtual city data can also be extended for domain-specific applications. Highlighting the use of extended CityGML models for energy research, a brief introduction to the Energy Application Domain Extension (ADE) along with its significance is given. Finally, addressing specific simulation input data, this paper presents a workflow using Modelica that underlines the usage of GIS information and quantifies its significance for annual heating energy demand.

Keywords: CityGML, EnergyADE, energy performance simulation, GIS

Procedia PDF Downloads 172
18054 Mother-Child Attachment and Anxiety Symptoms in Middle Childhood: Differences in Levels of Attachment Security

Authors: Simran Sharda

Abstract:

There is increasing evidence leading psychologists today to believe that the attachment formed between a mother and child plays a much more profound role in later-life outcomes than previously expected. In particular, the idea that a link may exist between maternal attachment and both the development and the severity of social anxiety in middle childhood seems to be gaining ground. This research examines and addresses a myriad of major issues related to the impact of mother-child attachment: behaviors of children with different levels of attachment security, various aspects of anxiety in relation to attachment security, as well as other styles of mother-child attachment, especially avoidant attachment and over-attachment. This analysis compiles previous literature on the subject and sheds light on a logical extension of the research. Moreover, researchers have identified links between attachment and the externalization of problem behaviors: these behaviors may later manifest as social anxiety as well as increased severity and likelihood of a PTSD diagnosis (an anxiety disorder). Furthermore, secure attachment has been linked to increased health benefits, cognitive skills, emotive socialization, and developmental psychopathology.

Keywords: child development, anxiety, cognition, developmental psychopathology, mother-child relationships, maternal, cognitive development

Procedia PDF Downloads 160
18053 Stereo Camera Based Speed-Hump Detection Process for Real Time Driving Assistance System in the Daytime

Authors: Hyun-Koo Kim, Yong-Hun Kim, Soo-Young Suk, Ju H. Park, Ho-Youl Jung

Abstract:

This paper presents an effective speed hump detection process for the daytime. We focus only on round-type speed humps in the dynamic daytime road environment. The proposed speed hump detection scheme consists mainly of two processes: stereo matching and speed hump detection. This paper focuses on the speed hump detection process, which consists of a noise reduction step, a data fusion step, and a speed hump detection step. The proposed system was tested on an Intel Core CPU at 2.80 GHz with 4 GB RAM in urban road environments. The frame rate of the test videos is 30 frames per second, and the size of each frame of the grabbed image sequences is 1280 by 670 pixels. Using object-marked sequences acquired with an on-vehicle camera, we recorded speed hump and non-speed-hump samples. The test results show that our proposed method can be applied in real-time systems, with a computation time of 13 ms and an accuracy of 96.1%.

Keywords: data fusion, round types speed hump, speed hump detection, surface filter

Procedia PDF Downloads 513
18052 Hybrid Subspace Approach for Time Delay Estimation in MIMO Systems

Authors: Mojtaba Saeedinezhad, Sarah Yousefi

Abstract:

In this paper, we present a hybrid subspace approach for Time Delay Estimation (TDE) in multivariable systems. While several methods have been proposed for time delay estimation in SISO systems, delay estimation in MIMO systems has always been a big challenge. In these systems, the existing TDE methods have significant limitations because most procedures are based only on system response estimation or correlation analysis. We introduce a new hybrid method for TDE in MIMO systems based on subspace identification and the explicit output error method, and compare its performance with previously introduced procedures at different noise levels and in a statistical manner. The best method is then selected with a multi-objective decision-making technique. It is shown that the performance of the new approach is much better than that of the existing methods, even in low signal-to-noise conditions.
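For contrast with the subspace methods, the classical correlation-based estimator that works in the SISO case can be sketched as follows (a simplified illustration of the baseline the abstract mentions, not the paper's method):

```python
def xcorr_delay(x, y):
    """Correlation-based time delay estimate: the non-negative lag
    that maximizes the cross-correlation between input x and the
    delayed output y (both plain lists of samples)."""
    best_lag, best_val = 0, float("-inf")
    for lag in range(len(y)):
        v = sum(a * b for a, b in zip(x, y[lag:]))
        if v > best_val:
            best_lag, best_val = lag, v
    return best_lag
```

In MIMO systems a single scalar correlation like this conflates the delays of the different channels, which is exactly the limitation the hybrid subspace approach targets.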

Keywords: system identification, time delay estimation, ARX, OE, merit ratio, multi variable decision making

Procedia PDF Downloads 347
18051 Design and Implementation of Partial Denoising Boundary Image Matching Using Indexing Techniques

Authors: Bum-Soo Kim, Jin-Uk Kim

Abstract:

In this paper, we design and implement a partial denoising boundary image matching system using indexing techniques. Converting boundary images to time-series makes it feasible to perform fast searches using indexes even on a very large image database. Using this conversion method, we develop a client-server system based on the previous partial denoising research in a GUI (graphical user interface) environment. The client first converts a query image given by a user to a time-series and sends the denoising parameters and the tolerance with this time-series to the server. The server identifies similar images from the index by evaluating a range query, which is constructed using the inputs given by the client, and sends the resulting images back to the client. Experimental results show that our system provides intuitive and accurate matching results.
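The boundary-to-time-series conversion at the heart of this approach is commonly done with centroid-contour distances; a minimal sketch follows (the actual system's conversion may differ in detail):

```python
import math

def boundary_to_time_series(contour, n_samples=128):
    """Convert a closed boundary (list of (x, y) points) to a
    fixed-length time-series of centroid-to-boundary distances,
    a common conversion in boundary image matching."""
    cx = sum(x for x, _ in contour) / len(contour)
    cy = sum(y for _, y in contour) / len(contour)
    series = []
    for i in range(n_samples):
        p = contour[(i * len(contour)) // n_samples]  # uniform resampling
        series.append(math.hypot(p[0] - cx, p[1] - cy))
    return series
```

Once every image is a fixed-length series, standard time-series indexes (e.g. R-tree variants over dimensionality-reduced features) support the range queries described above.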

Keywords: boundary image matching, indexing, partial denoising, time-series matching

Procedia PDF Downloads 141
18050 Determination of Surface Deformations with Global Navigation Satellite System Time Series

Authors: Ibrahim Tiryakioglu, Mehmet Ali Ugur, Caglar Ozkaymak

Abstract:

The development of GNSS technology has led to increasingly widespread and successful applications of GNSS surveys for monitoring crustal movements. However, multi-period GPS survey solutions have not been applied in monitoring vertical surface deformation. This study uses the long-term GNSS time series that are required to determine vertical deformations. In recent years, surface deformations parallel and semi-parallel to the Bolvadin fault have occurred in Western Anatolia. These surface deformations have continued to occur in the Bolvadin settlement area, which is located mostly on alluvium ground. Due to these surface deformations, a number of cracks in buildings located in the residential areas and breaks in underground water and sewage systems have been observed. In order to determine the amount of vertical surface deformation, two continuous GNSS stations were established in the region. The stations have been operating since 2015 and 2017, respectively. In this study, GNSS observations from these two stations were processed with the GAMIT/GLOBK (GNSS Analysis Massachusetts Institute of Technology/GLOBal Kalman) program package to create coordinate time series. With the time series analyses, the GNSS stations' behavior models (linear, periodic, etc.), the causes of these behaviors, and mathematical models were determined. The results of the time series analysis of these two GNSS stations show approximately 50-80 mm/yr of vertical movement.
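The vertical velocity reported from such coordinate time series is essentially the slope of a fitted trend. A minimal least-squares sketch follows; GAMIT/GLOBK additionally models periodic terms, offsets, and correlated noise, which are omitted here.

```python
import numpy as np

def vertical_velocity(t_years, up_mm):
    """Least-squares linear trend of a GNSS Up-component time series:
    returns the vertical velocity in mm/yr (annual/semi-annual
    periodic terms are deliberately omitted in this sketch)."""
    slope, _intercept = np.polyfit(t_years, up_mm, 1)
    return slope
```

Subtracting the fitted trend and inspecting the residuals is also how the periodic behavior models mentioned above are identified.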

Keywords: Bolvadin fault, GAMIT, GNSS time series, surface deformations

Procedia PDF Downloads 165
18049 A Real-Time Simulation Environment for Avionics Software Development and Qualification

Authors: Ferdinando Montemari, Antonio Vitale, Nicola Genito, Luca Garbarino, Urbano Tancredi, Domenico Accardo, Michele Grassi, Giancarmine Fasano, Anna Elena Tirri

Abstract:

The development of guidance, navigation and control algorithms and avionic procedures requires the availability of suitable analysis and verification tools, such as simulation environments, which support the design process and allow detecting potential problems prior to flight tests, in order to make new technologies available at reduced cost, time and risk. This paper presents a simulation environment for avionic software development and qualification, especially aimed at equipment for general aviation aircraft and unmanned aerial systems. The simulation environment includes models for short- and medium-range radio-navigation aids, flight assistance systems, and ground control stations. All the software modules are able to simulate the modeled systems in both fast-time and real-time tests, and were implemented following component-oriented modeling techniques and a requirement-based approach. The paper describes the specific model features, the architectures of the implemented software systems and their validation process. The performed validation tests highlighted the capability of the simulation environment to guarantee in real time the required functionalities and performance of the simulated avionics systems, as well as to reproduce the interaction between these systems, thus permitting a realistic and reliable simulation of a complete mission scenario.

Keywords: ADS-B, avionics, NAVAIDs, real-time simulation, TCAS, UAS ground control station

Procedia PDF Downloads 229
18048 Minimizing Total Completion Time in No-Wait Flowshops with Setup Times

Authors: Ali Allahverdi

Abstract:

The m-machine no-wait flowshop scheduling problem is addressed in this paper. The objective is to minimize total completion time subject to the constraint that the makespan value is not greater than a certain value. Setup times are treated as separate from processing times. Several recent algorithms are adapted and proposed for the problem. An extensive computational analysis has been conducted to evaluate the proposed algorithms. The computational analysis indicates that the best proposed algorithm performs significantly better than the best existing algorithm.
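The no-wait constraint means each job's start time is fixed entirely by its predecessor in the sequence. For a given sequence, completion times can be computed as below (setup times, which the paper treats separately, are omitted from this sketch):

```python
def no_wait_completion_times(seq, p):
    """Completion times of jobs in a no-wait flowshop for a given
    sequence. p[j][k] is the processing time of job j on machine k;
    each job, once started, flows through all machines without waiting."""
    m = len(p[0])
    start = 0.0
    completions = []
    prev = None
    for j in seq:
        if prev is not None:
            # minimum start-to-start delay that keeps job j waiting-free
            delay = max(sum(p[prev][:k + 1]) - sum(p[j][:k])
                        for k in range(m))
            start += delay
        completions.append(start + sum(p[j]))
        prev = j
    return completions
```

Summing the returned list gives the total completion time objective, while the last entry is the makespan that the paper's constraint bounds.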

Keywords: scheduling, no-wait flowshop, algorithm, setup times, total completion time, makespan

Procedia PDF Downloads 341
18047 Automated Prepaid Billing Subscription System

Authors: Adekunle K. O, Adeniyi A. E, Kolawole E

Abstract:

One of the most dramatic trends in the communications market in recent years has been the growth of prepaid services. Today, prepaid no longer constitutes the low-revenue, basic-service segment. It is driven by high-margin, value-added services for customers who view prepaid as a convenient way of retaining control over their usage and communication spending while expecting high service levels. To service providers, prepaid services offer the advantage of reducing bad accounts while allowing them to predict usage and plan network resources. Yet, the real-time demands of prepaid services require a scalable, real-time platform to manage customers through their entire life cycle. Such a platform delivers integrated real-time rating, voucher management, recharge management, customer care and service provisioning for the generation of new prepaid services, and offers the high scalability needed to handle millions of prepaid customers in real time through their entire life cycle.
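The real-time rating requirement means credit is checked and decremented as usage accrues, so that service stops the moment credit runs out; the toy sketch below (class and method names are illustrative, not the system's API) shows the idea:

```python
class PrepaidAccount:
    """Toy real-time rating sketch: usage is granted only up to the
    remaining balance, and vouchers recharge the account."""

    def __init__(self, balance):
        self.balance = balance

    def rate_usage(self, minutes, rate_per_min):
        # grant only as many minutes as the current balance can cover
        allowed = min(minutes, self.balance / rate_per_min)
        self.balance -= allowed * rate_per_min
        return allowed

    def recharge(self, voucher_amount):
        self.balance += voucher_amount
```

A production platform performs this check per rating interval across millions of concurrent sessions, which is where the scalability demands come from.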

Keywords: prepaid billing, voucher management, customers, automated, security

Procedia PDF Downloads 115
18046 In vivo Determination of Anticoagulant Property of the Tentacle Extract of Aurelia aurita (Moon Jellyfish) Using Sprague-Dawley Rats

Authors: Bea Carmel H. Casiding, Charmaine A. Guy, Funny Jovis P. Malasan, Katrina Chelsea B. Manlutac, Danielle Ann N. Novilla, Marianne R. Oliveros, Magnolia C. Sibulo

Abstract:

The moon jellyfish, Aurelia aurita, has become a popular research organism for diverse studies. Recent studies have verified the anticoagulant properties of moon jellyfish tentacle extract through in vitro methods. The purpose of this study was to validate the anticoagulant ability of A. aurita tentacle extract using an in vivo method of experimentation. The tentacles of A. aurita were excised, filtered, and then centrifuged at 3000 x g for 10 minutes. The crude nematocyst extract was suspended in phosphate buffer solution at a 1:6 ratio and sonicated for three periods of 20 seconds each at 50 Hz. The protein concentration of the extract was determined using the Bradford assay, with bovine serum albumin as the standard solution at the following concentrations: 35.0, 70.0, 105.0, 140.0, 175.0, 210.0, 245.0, and 280.0 µg/mL. The absorbance was read at 595 nm. Toxicity testing was adapted from the OECD guidelines. The extract suspended in phosphate-buffered saline was arbitrarily set into three doses (0.1 mg/kg, 0.3 mg/kg, 0.5 mg/kg), administered daily for five days to experimental groups of five male Sprague-Dawley rats (one dose per group). Before and after the administration period, bleeding time and clotting time tests were performed. One-way analysis of variance (ANOVA) was used to analyze the differences in bleeding and clotting times, before and after administration, among the three treatment groups and the positive and negative control groups. The average protein concentration of the sonicated crude tentacle extract was 206.5 µg/mL. The highest dose administered (0.5 mg/kg) produced a significant increase in time for both the bleeding and clotting tests, whereas the next lower dose (0.3 mg/kg) was significantly effective only in the clotting time test. In conclusion, the tentacle extract of A. aurita, with a protein concentration of 206.5 µg/mL at doses of 0.3 mg/kg and 0.5 mg/kg, elicited anticoagulant activity.
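The Bradford assay step maps 595 nm absorbance to protein concentration through the BSA standard curve; a minimal sketch of that calculation (assuming a linear standard curve, which real assays only approximate over a limited range):

```python
import numpy as np

def protein_concentration(std_conc, std_abs, sample_abs):
    """Fit a linear standard curve of absorbance (595 nm) vs. BSA
    concentration, then invert it to obtain the sample concentration
    in the same units as std_conc (here, ug/mL)."""
    slope, intercept = np.polyfit(std_conc, std_abs, 1)
    return (sample_abs - intercept) / slope
```

With the eight BSA standards listed above, the sample's absorbance is simply interpolated on the fitted line.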

Keywords: anticoagulant, bleeding time test, clotting time test, moon jellyfish

Procedia PDF Downloads 398
18045 Estimation of Time Loss and Costs of Traffic Congestion: The Contingent Valuation Method

Authors: Amira Mabrouk, Chokri Abdennadher

Abstract:

The reduction of road congestion, which is inherent to the use of vehicles, is an obvious priority for public authorities. Assessing an individual's willingness to pay to save trip time is therefore akin to estimating the price change that would result from a new transport policy aimed at increasing network fluidity and improving social welfare. This study takes an innovative perspective: it initiates an economic calculation with the objective of estimating the monetized value of time for trips made in Sfax. The aims of this study are to i) estimate the monetized value of an hour dedicated to trips, ii) determine whether or not consumers consider the environmental variables to be significant, and iii) analyze the impact of public congestion management through the taxation of city tolls on urban dwellers. This article is built upon a rich field survey conducted in the city of Sfax. Using the contingent valuation method, we analyze the stated time preferences of 450 drivers during rush hours. Taking careful account of the biases attributed to the applied method with regard to the revelation mode and the interrogation techniques, we follow the NOAA panel recommendations, with the exception of the valuation point, as well as other similar studies on the estimation of transportation externalities.

Keywords: willingness to pay, contingent valuation, time value, city toll

Procedia PDF Downloads 438
18044 Technology in the Calculation of People Health Level: Design of a Computational Tool

Authors: Sara Herrero Jaén, José María Santamaría García, María Lourdes Jiménez Rodríguez, Jorge Luis Gómez González, Adriana Cercas Duque, Alexandra González Aguna

Abstract:

Background: The concept of health has evolved throughout history. The health level is determined by the individual's own perception. It is a dynamic process over time, so variations can be seen from one moment to the next. In this way, knowing the health of the patients you care for facilitates decision making in the treatment of care. Objective: To design a technological tool that calculates a person's health level sequentially over time. Material and Methods: Deductive methodology through text analysis, extraction and logical knowledge formalization, and education with an expert group. Study period: September 2015 to the present. Results: A computational tool for the use of health personnel has been designed. It has 11 variables. Each variable can be given a value from 1 to 5, with 1 being the minimum value and 5 being the maximum value. By adding the results of the 11 variables, we obtain a magnitude at a certain time: the person's health level. The health calculator makes it possible to represent a person's health level at a given time and to establish temporal cuts, which is useful for determining the evolution of the individual over time. Conclusion: Information and Communication Technologies (ICT) allow training and help in various disciplinary areas, and their relevance in the field of health is worth highlighting. Based on the formalization of health, care acts can be directed towards some of the propositional elements of the concept above. The care acts will modify the person's health level. The health calculator allows the prioritization and prediction of different health care strategies in hospital units.
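The scoring rule stated above is simple enough to express directly; a sketch (the tool's eleven actual variables are not named in the abstract, so only their count and range are used):

```python
def health_level(scores):
    """Health level as described: the sum of 11 variables, each rated
    from 1 (minimum) to 5 (maximum), so the result ranges from 11 to 55."""
    if len(scores) != 11:
        raise ValueError("exactly 11 variables are expected")
    if any(not 1 <= s <= 5 for s in scores):
        raise ValueError("each variable must be rated from 1 to 5")
    return sum(scores)
```

Computing this sum at successive points in time yields the temporal cuts the abstract describes for tracking an individual's evolution.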

Keywords: calculator, care, eHealth, health

Procedia PDF Downloads 265
18043 Counteracting Disruptions during the COVID-19 Pandemic in the Supply Chains of the Automotive Industry: The Example of Polish Enterprises

Authors: Tomasz Rokicki, Piotr Bórawski, Aneta Bełdycka-Bórawska, András Szeberényi

Abstract:

The aim of the article was to present ways of counteracting disruptions that occurred in the supply chains of automotive-industry enterprises during the COVID-19 pandemic. The specific objectives are to determine changes in the automotive industry during the pandemic, to show the types of disruptions in supply chains, and to show how to counteract these unfavorable situations. Enterprises from the automotive industry operating in Poland were deliberately selected for research. Using the purposive sampling method, ten companies from the automotive industry were selected for qualitative research. In-depth research was carried out in the selected enterprises using personal interviews. At the beginning of the pandemic, lockdowns and unpredictability were the main problems, and the key was to protect employees and introduce appropriate procedures. In the later stages of the pandemic, deliveries became less timely and delivery times lengthened. There were problems with shortages of materials, and the costs of products and transport increased. In automotive companies, counteracting the effects of the pandemic consisted of ensuring the safety of employees and maintaining constant contact and communication with branches and headquarters, as well as with suppliers and contractors. Therefore, appropriate communication, cooperation, and flexibility were important.

Keywords: disruptions, automotive industry, supply chain disruption, cooperation in supply chain

Procedia PDF Downloads 69
18042 A Unique Exact Approach to Handle a Time-Delayed State-Space System: The Extraction of Juice Process

Authors: Mohamed T. Faheem Saidahmed, Ahmed M. Attiya Ibrahim, Basma GH. Elkilany

Abstract:

This paper discusses the application of a Time Delay Control (TDC) compensation technique to the juice extraction process in a sugar mill. The objective is to improve the control performance of the process and increase extraction efficiency. The paper presents the mathematical model of the juice extraction process and the design of the TDC compensation controller. Simulation results show that the TDC compensation technique can effectively suppress the time delay effect in the process and improve control performance. The extraction efficiency is also significantly increased with the application of the TDC compensation technique. The proposed approach, implemented in MATLAB, provides a practical solution for improving the juice extraction process in sugar mills.
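The keywords mention a Smith predictor, the classical structure for time-delay compensation. The sketch below simulates one on a hypothetical first-order plant; all parameters are illustrative and are not the paper's sugar-mill model.

```python
from collections import deque

def simulate_smith_predictor(r=1.0, steps=300, a=0.95, b=0.05,
                             d=20, kp=1.0, ki=0.05):
    """Discrete-time Smith predictor on a first-order plant
    x[k+1] = a*x[k] + b*u[k-d] with an input delay of d steps.
    A PI controller acts on the delay-free model output plus the
    model-plant mismatch, removing the delay from the feedback loop."""
    u_hist = deque([0.0] * d)          # delay line for the control input
    y = ym_fast = ym_delayed = 0.0
    integ = 0.0
    for _ in range(steps):
        feedback = ym_fast + (y - ym_delayed)  # Smith predictor signal
        e = r - feedback
        integ += e
        u = kp * e + ki * integ                # PI control law
        ym_fast = a * ym_fast + b * u          # delay-free internal model
        u_delayed = u_hist.popleft()           # u[k-d]
        u_hist.append(u)
        ym_delayed = a * ym_delayed + b * u_delayed  # delayed model
        y = a * y + b * u_delayed              # simulated plant output
    return y
```

Because the controller sees the delay-free model output, it can be tuned as if the delay were absent, which is the essence of delay compensation.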

Keywords: time delay control (TDC), exact and unique state space model, delay compensation, Smith predictor

Procedia PDF Downloads 92
18041 Optimal ECG Sampling Frequency for Multiscale Entropy-Based HRV

Authors: Manjit Singh

Abstract:

Multiscale entropy (MSE) is an extensively used index that provides a general understanding of the multiscale complexity of the physiologic mechanisms behind heart rate variability (HRV), which operate over a wide range of time scales. Accurate selection of the electrocardiogram (ECG) sampling frequency is an essential concern for clinically significant HRV quantification; a high ECG sampling rate increases memory requirements and processing time, whereas a low sampling rate degrades signal quality and results in clinically misinterpreted HRV. In this work, the impact of ECG sampling frequency on MSE-based HRV has been quantified. MSE measures are found to be sensitive to the ECG sampling frequency, and the effect of sampling frequency is a function of time scale.
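Multiscale entropy is computed by coarse-graining the series at successive scales and taking the sample entropy of each coarse-grained series. A compact sketch follows; note it uses a simplified convention in which the tolerance r is absolute, whereas in practice r is usually set to a fraction (e.g. 0.2) of the series' standard deviation.

```python
import math

def coarse_grain(x, scale):
    """Average consecutive non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return [sum(x[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: -ln(A/B), where B counts template matches of
    length m and A counts matches of length m+1, within tolerance r."""
    def count(mm):
        templates = [x[i:i + mm] for i in range(len(x) - mm + 1)]
        c = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    c += 1
        return c
    b, a = count(m), count(m + 1)
    return -math.log(a / b) if a and b else float("inf")

def multiscale_entropy(x, scales=(1, 2, 3), m=2, r=0.2):
    return [sample_entropy(coarse_grain(x, s), m, r) for s in scales]
```

Since low ECG sampling rates distort the RR intervals that form `x`, the scale-dependent sensitivity the abstract reports shows up directly in curves produced this way.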

Keywords: ECG (electrocardiogram), heart rate variability (HRV), multiscale entropy, sampling frequency

Procedia PDF Downloads 271
18040 Impact Evaluation and Technical Efficiency in Ethiopia: Correcting for Selectivity Bias in Stochastic Frontier Analysis

Authors: Tefera Kebede Leyu

Abstract:

The purpose of this study was to estimate the impact of LIVES project participation on the level of technical efficiency of farm households in three regions of Ethiopia. We used household-level data gathered by IRLI between February and April 2014 for the year 2013 (retrospective). Data on 1,905 sample households (754 intervention and 1,151 control) were analyzed using the STATA software package, version 14. Efforts were made to combine stochastic frontier modeling with impact evaluation methodology using the Heckman (1979) two-stage model to deal with possible selectivity bias arising from unobservable characteristics in the stochastic frontier model. Results indicate that farmers in the two groups are not efficient and operate below their potential frontiers, i.e., there is potential to increase crop productivity through efficiency improvements in both groups. In addition, the empirical results revealed selection bias in both groups of farmers, confirming the justification for the use of the selection-bias-corrected stochastic frontier model. It was also found that intervention farmers achieved higher technical efficiency scores than the control group of farmers. Furthermore, the selectivity-bias-corrected model showed a different technical efficiency score for the intervention farmers, while it remained more or less the same for the control group farmers. However, the control group of farmers shows a higher dispersion, as measured by the coefficient of variation, compared to their intervention counterparts. Among the explanatory variables, the study found that the farmer's age (a proxy for farm experience), land certification, frequency of visits to the improved seed center, the farmer's education, and row planting are important contributing factors for participation decisions and hence the technical efficiency of farmers in the study areas.
We recommend that policies targeting the design of development intervention programs in the agricultural sector focus more on providing farmers with on-farm visits by extension workers, provision of credit services, establishment of farmers’ training centers and adoption of modern farm technologies. Finally, we recommend further research to deal with this kind of methodological framework using a panel data set to test whether technical efficiency starts to increase or decrease with the length of time that farmers participate in development programs.
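The Heckman (1979) two-step correction mentioned above can be sketched as follows: compute the inverse Mills ratio from the first-stage participation index and include it as an extra regressor in the second stage. This is a minimal illustration only; it assumes a known first-stage index (in practice this is estimated by probit) and uses OLS as a stand-in for the stochastic frontier estimation, with all data simulated.

```python
import math
import numpy as np

def inverse_mills_ratio(z):
    """lambda(z) = phi(z) / Phi(z): the selection-correction term for participants."""
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return pdf / cdf

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)                  # a farm characteristic
z_index = 0.5 * x + rng.normal(size=n)  # first-stage participation index
participate = z_index > 0               # observed participation decision

# Stage 2 (participants only): append the inverse Mills ratio as a regressor
# so the selection term E[u | participation] does not bias the estimates.
imr = np.array([inverse_mills_ratio(z) for z in z_index[participate]])
y = 1.0 + 0.8 * x[participate] + 0.3 * imr + rng.normal(scale=0.05, size=imr.size)
X = np.column_stack([np.ones(imr.size), x[participate], imr])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # roughly [1.0, 0.8, 0.3] on this simulated data
```

Omitting the `imr` column from `X` would reproduce the selectivity bias the paper corrects for.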

Keywords: impact evaluation, efficiency analysis and selection bias, stochastic frontier model, Heckman-two step

Procedia PDF Downloads 77
18039 A 15 Minute-Based Approach for Berth Allocation and Quay Crane Assignment

Authors: Hoi-Lam Ma, Sai-Ho Chung

Abstract:

In traditional integrated berth allocation and quay crane assignment models, the time dimension is usually assumed to be hourly based. Nowadays, however, transshipment has become the main business of many container terminals, especially in Southeast Asia (e.g., Hong Kong and Singapore). In these terminals, vessel arrivals are very frequent, with small handling volumes and very short staying times. The traditional hourly-based modeling approach may therefore cause significant berth and quay crane idling and consequently cannot meet practical needs. In this connection, a 15-minute-based modeling approach has been requested by industrial practitioners. Accordingly, a Three-level Genetic Algorithm (3LGA) with Quay Crane (QC) shifting heuristics is designed to fill this research gap. The objective function is to minimize the total service time. Preliminary numerical results show that the proposed 15-minute-based approach can reduce berth and QC idling significantly.
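The gain from a finer planning granularity can be sketched with a toy calculation; the slot sizes and the 100-minute handling time below are illustrative assumptions, not values from the paper's 3LGA model.

```python
import math

def to_slots(duration_min, slot_min):
    """Round a handling duration up to a whole number of planning slots."""
    return math.ceil(duration_min / slot_min)

# A short transshipment call of 100 minutes:
handling = 100
hourly_reserved = to_slots(handling, 60) * 60    # 120 min reserved at the berth
quarter_reserved = to_slots(handling, 15) * 15   # 105 min reserved at the berth
print(hourly_reserved - handling, quarter_reserved - handling)  # idle: 20 vs 5 minutes
```

For frequent short calls, the 15-minute granularity cuts the per-call berth idle time from 20 minutes to 5 in this example, which is the effect the abstract reports at scale.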

Keywords: transshipment, integrated berth allocation, variable-in-time quay crane assignment, quay crane assignment

Procedia PDF Downloads 169
18038 Implementation of the Canadian Emergency Department Triage and Acuity Scale (CTAS) in an Urgent Care Center in Saudi Arabia

Authors: Abdullah Arafat, Ali Al-Farhan, Amir Omair

Abstract:

Objectives: To review and assess the effectiveness of the modified five-level triage and acuity scale implemented in the Al-Yarmook Urgent Care Center (UCC), King Abdulaziz Residential City, Riyadh, Saudi Arabia. Method: The study used an observational cross-sectional design. A data collection sheet was designed and distributed to triage nurses; data collection took place during the triage process and was directly observed by the co-investigator. The triage system was reviewed by measuring three time intervals as quality indicators: time before triage (TBT), time before being seen by a physician (TBP), and total length of stay (TLS), taking into consideration the timing of presentation and the triage level. Results: During the study period, a total of 187 patients were included in the study; 118 visits occurred on weekdays and 68 on weekends. Overall, 173 patients (92.5%) were seen by the physician in a timely manner according to the triage guidelines, while 14 patients (7.5%) were not seen within the appropriate time. The mean time before being seen by the triage nurse (TBT) was 5.36 minutes, the mean time before being seen by a physician (TBP) was 22.6 minutes, and the mean length of stay (TLS) was 59 minutes. The data did not show a significant increase in TBT, TBP, the number of patients not seen within the proper time, the referral rate, or the admission rate during weekends. Conclusion: The CTAS is adaptable to countries beyond Canada and worked properly. The CTAS triage system applied in the Al-Yarmook UCC is considered effective and well applied. Overall, urgent cases were seen by a physician in a timely manner according to the triage system, and there was no delay in the management of urgent cases.
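The three quality indicators are simple differences between recorded timestamps; a minimal sketch follows, with hypothetical times chosen to resemble the reported means.

```python
from datetime import datetime

def minutes_between(t0, t1):
    """Elapsed minutes from t0 to t1."""
    return (t1 - t0).total_seconds() / 60.0

fmt = "%H:%M"
arrival   = datetime.strptime("10:00", fmt)
triage    = datetime.strptime("10:05", fmt)  # seen by triage nurse
physician = datetime.strptime("10:23", fmt)  # seen by physician
discharge = datetime.strptime("10:59", fmt)

tbt = minutes_between(arrival, triage)      # time before triage
tbp = minutes_between(arrival, physician)   # time before physician
tls = minutes_between(arrival, discharge)   # total length of stay
print(tbt, tbp, tls)
```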

Keywords: CTAS, emergency, Saudi Arabia, triage, urgent care

Procedia PDF Downloads 323
18037 The Experimental Study on Reducing and Carbonizing Titanium-Containing Slag by Iron-Containing Coke

Authors: Yadong Liu

Abstract:

Reduction and carbonization experiments were carried out on synthetic titanium-bearing slag from the smelting reduction of sea sand ore, using iron-containing coke with particle sizes of <0.3 mm, 0.3-0.6 mm, and 0.6-0.9 mm, held for up to 6 h at 1500 ℃. The effects of the coke particle size and the holding time on the formation of TiC and the size of TiC crystals were studied by XRD, SEM, and EDS. The results show that particle sizes that are too small or too large are unfavorable for the formation, concentration, and growth of TiC crystals; the suitable particle size is 0.3-0.6 mm. A holding time of 2 h basically ensures that all the TiO2 in the slag is reduced, carbonized, and converted to TiC. The size of the TiC crystals increases as the holding time is prolonged, and the thickness of the TiC layer can reach 20 μm at a holding time of 6 h.

Keywords: coke containing iron, formation and concentration and growth of TiC, reduction and carbonization, titanium-bearing slag

Procedia PDF Downloads 149
18036 Tracing the Evolution of English and Urdu Languages: A Linguistic and Cultural Analysis

Authors: Aamna Zafar

Abstract:

Through linguistic and cultural analysis, this study seeks to trace the development of the English and Urdu languages. It examines how the vocabulary and syntax of English and Urdu have evolved over time, the linguistic trends visible in these changes, and the historical and cultural influences that have shaped the two languages. The study also looks at how the use of English and Urdu has changed, both within each other's cultures and globally, and investigates how these changes affect social relations and cultural identity, as well as what they might mean for the future of these languages.

Keywords: linguistic and cultural analysis, historical factors, cultural factors, vocabulary, syntax, significance

Procedia PDF Downloads 75
18035 Camera Model Identification for Mi Pad 4, Oppo A37f, Samsung M20, and Oppo f9

Authors: Ulrich Wake, Eniman Syamsuddin

Abstract:

The camera model identification model is trained using the pretrained models ResNet34 and ResNet50. The dataset consists of 500 photos of each phone, divided into 1,280 photos for training, 320 for validation, and 400 for testing. The model is trained using the One Cycle Policy method and tested using Test-Time Augmentation. Furthermore, the model is trained for 50 epochs using regularization techniques such as dropout and early stopping. The result is 90% accuracy on the validation set and above 85% with Test-Time Augmentation using ResNet50. Each model is also trained by slightly updating the pretrained model's weights.
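Test-Time Augmentation averages the model's class probabilities over several augmented views of the same image. The sketch below shows only this averaging step; the corner-based stand-in "model" and the flip augmentations are illustrative assumptions, not the authors' ResNet pipeline.

```python
import numpy as np

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

def tta_predict(model, image, augment_fns):
    """Average predicted class probabilities over augmented views of one image."""
    return np.mean([model(fn(image)) for fn in augment_fns], axis=0)

def toy_model(img):
    # Stand-in 4-class classifier: logits taken from the four image corners.
    logits = np.array([img[0, 0], img[0, -1], img[-1, 0], img[-1, -1]])
    return softmax(logits)

augments = [lambda im: im, np.fliplr, np.flipud]
img = np.arange(16.0).reshape(4, 4)
avg = tta_predict(toy_model, img, augments)
print(avg)  # a valid probability vector, averaged over three views
```

Because each view's probabilities sum to one, so does their average, and the final class is taken as the argmax of `avg`.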

Keywords: One Cycle Policy, ResNet34, ResNet50, Test-Time Augmentation

Procedia PDF Downloads 209
18034 Travel Time Estimation of Public Transport Networks Based on Commercial Incidence Areas in Quito Historic Center

Authors: M. Fernanda Salgado, Alfonso Tierra, David S. Sandoval, Wilbert G. Aguilar

Abstract:

Public transport buses usually vary their speed depending on location and passenger load, so efficient travel planning is needed to help choose the fastest route. As a first step, an estimation tool is necessary to determine the travel time of each route, clearly establishing the possibilities. In this work, we give a practical solution that makes use of a concept we define as commercial incidence areas. These areas are based on the hypothesis that commercial places have a greater flow of people, and buses therefore remain longer at their stops. Each area comprises one or more route segments, each with an incidence factor that allows travel times to be estimated. In addition, initial results are presented that support the hypothesis and yield adequate travel-time estimates. In future work, we will extend this approach to an efficient travel planning system.
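The incidence-factor idea can be sketched as a weighted sum over route segments; the segment names, base times, and factor values below are hypothetical, chosen only to illustrate the estimation.

```python
# Travel time as a sum over route segments, each scaled by the incidence
# factor of the commercial area it crosses (factor 1.0 = no commercial effect).
segments = [
    {"name": "plaza",  "base_min": 4.0, "incidence": 1.6},  # busy commercial zone
    {"name": "avenue", "base_min": 6.0, "incidence": 1.0},  # no commercial area
    {"name": "market", "base_min": 3.0, "incidence": 1.8},  # busiest zone
]

def route_travel_time(segments):
    """Estimated travel time in minutes for a route made of weighted segments."""
    return sum(s["base_min"] * s["incidence"] for s in segments)

print(route_travel_time(segments))  # 4*1.6 + 6*1.0 + 3*1.8 = 17.8 minutes
```

Comparing this weighted total across candidate routes is what lets a planner pick the fastest one.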

Keywords: commercial incidence, planning, public transport, speed travel, travel time

Procedia PDF Downloads 254
18033 Donoho-Stark’s and Hardy’s Uncertainty Principles for the Short-Time Quaternion Offset Linear Canonical Transform

Authors: Mohammad Younus Bhat

Abstract:

The quaternion offset linear canonical transform (QOLCT), which is a time-shifted and frequency-modulated version of the quaternion linear canonical transform (QLCT), provides a more general framework for most existing signal processing tools. For the generalized QOLCT, the classical Heisenberg and Lieb uncertainty principles have been studied recently. In this paper, we first define the short-time quaternion offset linear canonical transform (ST-QOLCT) and derive its relationship with the quaternion Fourier transform (QFT). The crux of the paper lies in the generalization of several well-known uncertainty principles to the ST-QOLCT, including Donoho-Stark's uncertainty principle, Hardy's uncertainty principle, Beurling's uncertainty principle, and the logarithmic uncertainty principle.
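For orientation, the classical Donoho-Stark principle for the ordinary Fourier transform, which the paper generalizes to the ST-QOLCT setting, states that a signal cannot be highly concentrated in both domains at once:

```latex
% Classical Donoho-Stark uncertainty principle (Fourier case):
% if f is \epsilon_T-concentrated on a set T and its Fourier transform
% \hat{f} is \epsilon_\Omega-concentrated on a set \Omega, then
|T|\,|\Omega| \;\ge\; \bigl(1 - \epsilon_T - \epsilon_\Omega\bigr)^{2}.
```

The ST-QOLCT versions in the paper replace the Fourier pair with the windowed quaternionic transform pair, with constants depending on the transform parameters.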

Keywords: quaternion Fourier transform, quaternion offset linear canonical transform, short-time quaternion offset linear canonical transform, uncertainty principle

Procedia PDF Downloads 212
18032 Education and Learning in Indonesia to Refer to the Democratic and Humanistic Learning System in Finland

Authors: Nur Sofi Hidayah, Ratih Tri Purwatiningsih

Abstract:

Learning is a process by which a person acquires new behavior as a whole, as a result of his or her own experience in interaction with the environment. Learning involves the brain, and brain performance differs from student to student. To obtain optimal learning results, study time should therefore be scheduled so that the brain's workload is not too heavy. Referring to the learning system in Finland, which applies 45 minutes of learning followed by a 15-minute break, the brain is expected to work better: with rest, it becomes more focused and lessons can be absorbed well. It can be concluded that, learning in this way, students study with a fresh brain and make the best possible use of their time, without becoming saturated during a lesson.

Keywords: learning, brain working hours, time-efficient learning, brain stimulus reception

Procedia PDF Downloads 399