Search results for: error bound
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2301

1251 Augmenting Navigational Aids: The Development of an Assistive Maritime Navigation Application

Authors: A. Mihoc, K. Cater

Abstract:

On the bridge of a ship, officers look for visual aids to guide navigation in order to reconcile the outside world with the position communicated by the digital navigation system. Aids to navigation include lighthouses, lightships, sector lights, beacons, buoys, and others. They are designed to help navigators calculate their position, establish their course or avoid dangers. In poor visibility and dense traffic areas, it can be very difficult to identify these critical aids to navigation. The paper presents the usage of Augmented Reality (AR) as a means to present digital information about these aids to support navigation. To date, nautical-navigation-related mobile AR applications have been limited to the leisure industry. If proven viable, this prototype can facilitate the creation of other similar applications that could help commercial officers with navigation. While adopting a user-centered design approach, the team has developed the prototype based on insights from initial research carried out on board several ships. The prototype, built on a Nexus 9 tablet with Wikitude, features a head-up display of the navigational aids (lights) in the area, presented in AR, and a bird’s-eye view mode presented on a simplified map. The application employs the aids-to-navigation data managed by Hydrographic Offices and the tablet’s sensors: GPS, gyroscope, accelerometer, compass and camera. Sea trials on board a Navy ship and a commercial ship revealed the end-users’ interest in using the application and the possibility of presenting further data in AR. The application calculates the GPS position of the ship and the bearing and distance to the navigational aids, all with a high level of accuracy. However, during testing several issues were highlighted that need to be resolved as the prototype is developed further. The prototype stretched the capabilities of Wikitude, loading over 500 objects during tests in a major port. This overloaded the display and required over 45 seconds to load the data. Therefore, extra filters for the navigational aids are being considered in order to declutter the screen. At night, the camera is not powerful enough to distinguish all the lights in the area. Also, magnetic interference from the bridge of the ship generated a continuous compass error in the AR display that varied between 5 and 12 degrees. The deviation of the compass was consistent over the whole testing duration, so the team is now looking at the possibility of allowing users to manually calibrate the compass. It is expected that for the usage of AR in professional maritime contexts, further development of existing AR tools and hardware is needed. Designers will also need to implement a user-centered design approach in order to create better interfaces and display technologies for enhanced solutions to aid navigation.
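
As an illustration of the kind of computation the abstract describes (the bearing and distance from the ship’s GPS position to a navigational aid), the following minimal Python sketch uses the standard haversine and initial-bearing formulas. It is not the prototype’s code; the coordinates and Earth radius are illustrative assumptions.

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2, radius_m=6371000.0):
    """Great-circle distance (m) and initial bearing (deg) from point 1 to point 2."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    # haversine distance
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    dist = 2 * radius_m * math.asin(math.sqrt(a))
    # initial bearing, degrees clockwise from true north
    y = math.sin(dlmb) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlmb)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing

# ship position and a hypothetical lighthouse position
d, b = distance_and_bearing(51.21, 2.90, 51.24, 2.93)
print(f"distance {d:.0f} m, bearing {b:.1f} deg")
```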

Keywords: compass error, GPS, maritime navigation, mobile augmented reality

Procedia PDF Downloads 321
1250 Estimating Estimators: An Empirical Comparison of Non-Invasive Analysis Methods

Authors: Yan Torres, Fernanda Simoes, Francisco Petrucci-Fonseca, Freddie-Jeanne Richard

Abstract:

Non-invasive samples are an alternative to collecting genetic samples directly. Non-invasive samples are collected without manipulation of the animal (e.g., scats, feathers and hairs). Nevertheless, the use of non-invasive samples has some limitations. The main issue is degraded DNA, leading to poorer extraction efficiency and genotyping. Those errors delayed the widespread use of non-invasive genetic information for some years. Genotyping errors can be limited by using analysis methods that can accommodate the errors and singularities of non-invasive samples. Genotype matching and population estimation algorithms can be highlighted as important analysis tools that have been adapted to deal with those errors. Despite this recent development of analysis methods, there is still a lack of empirical performance comparisons between them. A comparison of methods with datasets differing in size and structure can be useful for future studies, since non-invasive samples are a powerful tool for getting information, especially for endangered and rare populations. To compare the analysis methods, four different datasets obtained from the Dryad digital repository were used. Three different matching algorithms (Cervus, Colony and Error Tolerant Likelihood Matching - ETLM) were used for matching genotypes and two different ones for population estimation (Capwire and BayesN). The three matching algorithms showed different patterns of results. The ETLM produced a smaller number of unique individuals and recaptures. A similarity in the matched genotypes between Colony and Cervus was observed. That is not a surprise, given the similarity between those methods in their pairwise likelihood and clustering algorithms. The matching of ETLM showed almost no similarity with the genotypes that were matched with the other methods. The different clustering algorithm and error model of ETLM seem to lead to a more stringent selection, although the processing time and interface friendliness of ETLM were the worst among the compared methods. The population estimators performed differently depending on the dataset. There was a consensus between the different estimators for only one dataset. BayesN showed both higher and lower estimations when compared with Capwire. BayesN does not consider the total number of recaptures like Capwire, only the recapture events. This makes the estimator sensitive to data heterogeneity, heterogeneity here meaning different capture rates between individuals. In those examples, the tolerance for homogeneity seems to be crucial for BayesN to work properly. Both methods are user-friendly and have reasonable processing times. An extended analysis with simulated genotype data could clarify the sensitivity of the algorithms. The present comparison of the matching methods indicates that Colony seems to be the most appropriate for general use, considering a time/interface/robustness balance. The heterogeneity of the recaptures strongly affected the BayesN estimations, leading to over- and underestimation of population numbers. Capwire is therefore advisable for general use since it performs better in a wide range of situations.

Keywords: algorithms, genetics, matching, population

Procedia PDF Downloads 133
1249 Cancellation of Transducer Effects from Frequency Response Functions: Experimental Case Study on the Steel Plate

Authors: P. Zamani, A. Taleshi Anbouhi, M. R. Ashory, S. Mohajerzadeh, M. M. Khatibi

Abstract:

Modal analysis is a developing science in the experimental evaluation of the dynamic properties of structures. Mechanical devices such as accelerometers are one source of reduced quality in measuring modal testing parameters. In this paper, eliminating the accelerometer’s mass effect from the frequency response of the structure is studied. A strategy based on sensitivity analysis is used to eliminate the mass effect. In this method, the amount of mass change and the location at which to measure the structure’s response are chosen so that the error in the frequency correction is smallest. Experimental modal testing is carried out on a steel plate, and the effect of the accelerometer’s mass is removed using this strategy. Finally, a good agreement is achieved between numerical and experimental results.

Keywords: accelerometer mass, frequency response function, modal analysis, sensitivity analysis

Procedia PDF Downloads 436
1248 Application of Artificial Immune Systems Combined with Collaborative Filtering in Movie Recommendation System

Authors: Pei-Chann Chang, Jhen-Fu Liao, Chin-Hung Teng, Meng-Hui Chen

Abstract:

This research combines an artificial immune system with user- and item-based collaborative filtering to create an efficient and accurate recommendation system. By applying the characteristics of antibodies and antigens in the artificial immune system and using the Pearson correlation coefficient as the affinity threshold to cluster the data, our collaborative filtering can effectively find useful users and items for rating prediction. This research uses the MovieLens dataset as our testing target to evaluate the effectiveness of the algorithm developed in this study. The experimental results show that the algorithm can effectively and accurately predict the movie ratings. Compared to some state-of-the-art collaborative filtering systems, our system outperforms them in terms of the mean absolute error on the MovieLens dataset.
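
The following minimal Python sketch illustrates the collaborative-filtering side of the approach: user-based rating prediction with the Pearson correlation coefficient used as an affinity threshold, as the abstract describes. It is not the authors’ implementation; the ratings matrix, threshold value, and prediction rule are illustrative assumptions.

```python
import numpy as np

def pearson(u, v):
    """Pearson correlation over the items rated by both users (NaN = unrated)."""
    mask = ~np.isnan(u) & ~np.isnan(v)
    if mask.sum() < 2:
        return 0.0
    a, b = u[mask] - u[mask].mean(), v[mask] - v[mask].mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def predict(ratings, user, item, affinity_threshold=0.5):
    """Predict ratings[user, item] from users whose affinity exceeds the threshold."""
    base = np.nanmean(ratings[user])
    num, den = 0.0, 0.0
    for other in range(ratings.shape[0]):
        if other == user or np.isnan(ratings[other, item]):
            continue
        w = pearson(ratings[user], ratings[other])
        if w > affinity_threshold:            # affinity-based neighbour selection
            num += w * (ratings[other, item] - np.nanmean(ratings[other]))
            den += abs(w)
    return base if den == 0 else base + num / den

R = np.array([[5, 3, np.nan, 1],
              [4, np.nan, np.nan, 1],
              [1, 1, np.nan, 5],
              [np.nan, 1, 5, 4]], dtype=float)
print(predict(R, user=0, item=2))
```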

Keywords: artificial immune system, collaborative filtering, recommendation system, similarity

Procedia PDF Downloads 522
1247 Equity Risk Premiums and Risk Free Rates in Modelling and Prediction of Financial Markets

Authors: Mohammad Ghavami, Reza S. Dilmaghani

Abstract:

This paper presents an adaptive framework for modelling financial markets using equity risk premiums, risk-free rates and volatilities. The recorded economic factors are initially used to train four adaptive filters for a certain limited period of time in the past. Once the systems are trained, the adjusted coefficients are used for modelling and prediction of an important financial market index. Two different approaches based on least mean squares (LMS) and recursive least squares (RLS) algorithms are investigated. Performance analysis of each method in terms of the mean squared error (MSE) is presented and the results are discussed. Computer simulations carried out using recorded data show MSEs of 4% and 3.4% for the next-month prediction using the LMS and RLS adaptive algorithms, respectively. In terms of twelve-month prediction, the RLS method shows better trend estimation than the LMS algorithm.
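
A minimal sketch of the LMS half of the approach is shown below: a least-mean-squares adaptive filter whose coefficients are updated on past samples and used for one-step-ahead prediction, with the MSE as the performance measure. It is not the authors’ code; the synthetic index series, filter order, and step size are illustrative assumptions.

```python
import numpy as np

def lms_predict(series, order=4, mu=0.05):
    w = np.zeros(order)                     # adaptive filter coefficients
    predictions, errors = [], []
    for n in range(order, len(series)):
        x = series[n - order:n][::-1]       # most recent samples first
        y_hat = w @ x                       # one-step-ahead prediction
        e = series[n] - y_hat               # prediction error
        w += 2 * mu * e * x                 # LMS coefficient update
        predictions.append(y_hat)
        errors.append(e)
    return np.array(predictions), w, float(np.mean(np.square(errors)))

rng = np.random.default_rng(0)
index = 1.0 + np.cumsum(rng.normal(0.001, 0.01, 300))   # synthetic market index
_, coefficients, mse = lms_predict(index)
print(coefficients, mse)
```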

Keywords: adaptive methods, LSE, MSE, prediction of financial markets

Procedia PDF Downloads 327
1246 A Novel Image Steganography Scheme Based on Mandelbrot Fractal

Authors: Adnan H. M. Al-Helali, Hamza A. Ali

Abstract:

With the growth of censorship and pervasive monitoring on the Internet, steganography arises as a new means of achieving secret communication. Steganography is the art and science of embedding information within electronic media used by common applications and systems. Generally, hiding multimedia information within images will change some of their properties, which may introduce some degradation or unusual characteristics. This paper presents a new image steganography approach for hiding multimedia information (images, text, and audio) using a generated Mandelbrot fractal image as a cover. The proposed technique has been extensively tested with different images. The results show that the method is a very secure means of hiding and retrieving steganographic information. Experimental results demonstrate an effective improvement in the values of the Peak Signal to Noise Ratio (PSNR), Mean Square Error (MSE), Normalized Cross Correlation (NCC) and Image Fidelity (IF) over previous techniques.
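
The quality metrics named in the abstract can be computed as in the minimal Python sketch below, which compares a cover image with a stego image using MSE, PSNR, and NCC. It is not the authors’ code; the 8-bit peak value and the random test images are illustrative assumptions.

```python
import numpy as np

def mse(cover, stego):
    return float(np.mean((cover.astype(float) - stego.astype(float)) ** 2))

def psnr(cover, stego, peak=255.0):
    m = mse(cover, stego)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)

def ncc(cover, stego):
    c, s = cover.astype(float), stego.astype(float)
    return float(np.sum(c * s) / np.sqrt(np.sum(c ** 2) * np.sum(s ** 2)))

rng = np.random.default_rng(1)
cover = rng.integers(0, 256, (64, 64), dtype=np.uint8)
stego = cover ^ rng.integers(0, 2, (64, 64), dtype=np.uint8)   # flip some LSBs
print(mse(cover, stego), psnr(cover, stego), ncc(cover, stego))
```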

Keywords: fractal image, information hiding, Mandelbrot set fractal, steganography

Procedia PDF Downloads 530
1245 The Influence of Active Breaks on the Attention/Concentration Performance in Eighth-Graders

Authors: Christian Andrä, Luisa Zimmermann, Christina Müller

Abstract:

Introduction: The positive relation between physical activity and cognition is commonly known. Relevant studies show that in everyday school life active breaks can lead to improvement in certain abilities (e.g. attention and concentration). A beneficial effect is in particular attributed to moderate activity. It is still unclear whether active breaks are beneficial after relatively short phases of cognitive load and whether the postulated effects of activity really have an immediate impact. The objective of this study was to verify whether an active break after 18 minutes of cognitive load leads to enhanced attention/concentration performance, compared to inactive breaks with voluntary mobile phone activity. Methodology: For this quasi-experimental study, 36 students [age: 14.0 (mean value) ± 0.3 (standard deviation); male/female: 21/15] of a secondary school were tested. In week 1, every student’s maximum heart rate (Hfmax) was determined through maximum effort tests conducted during physical education classes. The task was to run 3 laps of 300 m with increasing subjective effort (lap 1: 60%, lap 2: 80%, lap 3: 100% of the maximum performance capacity). Furthermore, first attention/concentration tests (D2-R) took place (pretest). The groups were matched on the basis of the pretest results. During weeks 2 and 3, crossover testing was conducted, comprising 18 minutes of cognitive preload (test for concentration performance, KLT-R), a break, and an attention/concentration test after a 2-minute transition. Different 10-minute breaks (active break: moderate physical activity at 65% Hfmax, or inactive break: mobile phone activity) took place between preloading and transition. Major findings: In general, there was no impact of the different break interventions on the concentration test results (symbols processed after physical activity: 185.2 ± 31.3 / after inactive break: 184.4 ± 31.6; errors after physical activity: 5.7 ± 6.3 / after inactive break: 7.0 ± 7.2). There was, however, a noticeable development of the values over the testing periods. Although no difference in the number of processed symbols was detected (active/inactive break: period 1: 49.3 ± 8.8/46.9 ± 9.0; period 2: 47.0 ± 7.7/47.3 ± 8.4; period 3: 45.1 ± 8.3/45.6 ± 8.0; period 4: 43.8 ± 7.8/44.6 ± 8.0), error rates decreased successively after physical activity and increased gradually after an inactive break (active/inactive break: period 1: 1.9 ± 2.4/1.2 ± 1.4; period 2: 1.7 ± 1.8/1.5 ± 2.0; period 3: 1.2 ± 1.6/1.8 ± 2.1; period 4: 0.9 ± 1.5/2.5 ± 2.6; p = .012). Conclusion: Taking into consideration only the study’s overall results, the hypothesis must be dismissed. However, a more differentiated evaluation shows that the error rates decreased after active breaks and increased after inactive breaks. Evidently, the effects of the active intervention occur with a delay. The 2-minute transition (regeneration time) used for this study seems to be insufficient due to the longer adaptation time of the cardiovascular system in untrained individuals, which might initially affect the concentration capacity. To use the positive effects of physical activity for teaching and learning processes, physiological characteristics must also be considered. Only this will ensure optimum ability to perform.

Keywords: active breaks, attention/concentration test, cognitive performance capacity, heart rate, physical activity

Procedia PDF Downloads 305
1244 Particle Filter Implementation of a Non-Linear Dynamic Fall Model

Authors: T. Kobayashi, K. Shiba, T. Kaburagi, Y. Kurihara

Abstract:

For the elderly living alone, falls can be a serious problem encountered in daily life. Some elderly people are unable to stand up without the assistance of a caregiver. They may become unconscious after a fall, which can lead to serious aftereffects such as hypothermia, dehydration, and sometimes even death. We treat the subject as an inverted pendulum and model its angle from the equilibrium position and its angular velocity. As the model is non-linear, we implement the filtering method with a particle filter, which can estimate the true states of the non-linear model. In order to evaluate the accuracy of the particle filter estimation results, we calculate the root mean square error (RMSE) between the estimated angle/angular velocity and the true values generated by the simulation. The experimental results give, at best, an RMSE of 0.0141 rad for the angle and 0.1311 rad/s for the angular velocity.
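
A minimal bootstrap particle filter in the spirit of the abstract is sketched below: the state is the pendulum angle and angular velocity, noisy angle measurements are assimilated, and the RMSE against the simulated truth is reported. It is not the authors’ implementation; the model constants, noise levels, and particle count are illustrative assumptions.

```python
import numpy as np

g, L, dt, N = 9.81, 1.0, 0.01, 1000
rng = np.random.default_rng(0)

def step(theta, omega, noise=0.0):
    """One Euler step of the inverted-pendulum dynamics."""
    return theta + omega * dt, omega + (g / L) * np.sin(theta) * dt + noise

# simulate the "true" fall and noisy measurements of the angle
theta_t, omega_t, truth, meas = 0.05, 0.0, [], []
for _ in range(200):
    theta_t, omega_t = step(theta_t, omega_t)
    truth.append((theta_t, omega_t))
    meas.append(theta_t + rng.normal(0, 0.02))

# bootstrap particle filter: propagate, weight by likelihood, resample
particles = np.column_stack([rng.normal(0.05, 0.1, N), rng.normal(0, 0.1, N)])
estimates = []
for z in meas:
    particles[:, 0], particles[:, 1] = step(particles[:, 0], particles[:, 1],
                                            noise=rng.normal(0, 0.05, N))
    w = np.exp(-0.5 * ((z - particles[:, 0]) / 0.02) ** 2)   # measurement likelihood
    w /= w.sum()
    particles = particles[rng.choice(N, N, p=w)]             # resampling
    estimates.append(particles.mean(axis=0))

rmse = np.sqrt(np.mean((np.array(estimates) - np.array(truth)) ** 2, axis=0))
print("RMSE (angle, angular velocity):", rmse)
```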

Keywords: fall, microwave Doppler sensor, non-linear dynamics model, particle filter

Procedia PDF Downloads 202
1243 Challenges of Cryogenic Fluid Metering by Coriolis Flowmeter

Authors: Evgeniia Shavrina, Yan Zeng, Boo Cheong Khoo, Vinh-Tan Nguyen

Abstract:

The present paper is aimed at providing a review of error sources in cryogenic metering by Coriolis flowmeters (CFMs). Whereas these flowmeters allow accurate water metering, high uncertainty and low repeatability are commonly observed in cryogenic fluid metering, which is often necessary for effective renewable energy production and storage. The sources of these issues might be classified as general and cryogenic-specific challenges. An analysis of experimental and theoretical studies shows that material behaviour at cryogenic temperatures, composition variety, and multiphase presence are the most significant cryogenic challenges. At the same time, pipeline diameter limitation, ambient vibration impact, and drawbacks of the installation may be highlighted as the most important general challenges of cryogenic metering by CFM. Finally, the techniques that mitigate the impact of these challenges are reviewed, and future development directions are indicated.

Keywords: Coriolis flowmeter, cryogenic, multicomponent flow, multiphase flow

Procedia PDF Downloads 138
1242 3D Object Model Reconstruction Based on Polywogs Wavelet Network Parametrization

Authors: Mohamed Othmani, Yassine Khlifi

Abstract:

This paper presents a technique for compact three-dimensional (3D) object model reconstruction using wavelet networks. It consists of transforming input surface vertices into signals and using wavelet network parameters for signal approximation. To achieve this, we use a wavelet network architecture founded on several mother wavelet families. POLYnomials WindOwed with Gaussians (POLYWOG) wavelet families are used to maximize the probability of selecting the best wavelets, which ensures good generalization of the network. To achieve a better reconstruction, the network is trained for several iterations to optimize the wavelet network parameters until the error criterion is small enough. Experimental results show that our proposed technique can effectively reconstruct irregular 3D object models when using the optimized wavelet network parameters. We also show that an accurate reconstruction depends on the best choice of the mother wavelets.

Keywords: 3d object, optimization, parametrization, polywog wavelets, reconstruction, wavelet networks

Procedia PDF Downloads 274
1241 A Kolmogorov-Smirnov Type Goodness-Of-Fit Test of Multinomial Logistic Regression Model in Case-Control Studies

Authors: Chen Li-Ching

Abstract:

The multinomial logistic regression model is popularly used for inferring the relationship of risk factors and disease with multiple categories. Based on the discrepancy between the nonparametric maximum likelihood estimator and the semiparametric maximum likelihood estimator of the cumulative distribution function, this study proposes a Kolmogorov-Smirnov type test statistic to assess the adequacy of the multinomial logistic regression model for case-control data. A bootstrap procedure is presented to calculate the critical value of the proposed test statistic. Empirical type I error rates and powers of the test are evaluated by simulation studies. Some examples are used to illustrate the implementation of the test.
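
The overall shape of such a test can be sketched in a few lines of Python: compute a Kolmogorov-Smirnov type distance between the empirical CDF and a model-based CDF, then bootstrap the statistic under the fitted model to obtain a critical value. This is not the paper’s semiparametric procedure; a fitted normal model stands in for the semiparametric estimator, and the data and bootstrap size are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def ks_statistic(sample, cdf):
    """Maximum distance between the empirical CDF and a model-based CDF."""
    x = np.sort(sample)
    ecdf = np.arange(1, len(x) + 1) / len(x)
    return float(np.max(np.abs(ecdf - cdf(x))))

rng = np.random.default_rng(0)
data = rng.normal(0, 1, 200)

mu, sigma = data.mean(), data.std(ddof=1)
observed = ks_statistic(data, lambda x: stats.norm.cdf(x, mu, sigma))

# bootstrap the null distribution of the statistic under the fitted model
boot = []
for _ in range(500):
    resample = rng.normal(mu, sigma, len(data))
    m, s = resample.mean(), resample.std(ddof=1)
    boot.append(ks_statistic(resample, lambda x: stats.norm.cdf(x, m, s)))
critical = float(np.quantile(boot, 0.95))

print("observed:", observed, "5% critical value:", critical,
      "reject" if observed > critical else "do not reject")
```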

Keywords: case-control studies, goodness-of-fit test, Kolmogorov-Smirnov test, multinomial logistic regression

Procedia PDF Downloads 446
1240 Multi-Agent Coverage Control with Bounded Gain Forgetting Composite Adaptive Controller

Authors: Mert Turanli, Hakan Temeltas

Abstract:

In this paper, we present an adaptive controller for the decentralized coordination problem of multiple non-holonomic agents. The performance of the presented Multi-Agent Bounded Gain Forgetting (BGF) Composite Adaptive controller is compared, in terms of the tracking error criterion, with a Feedback Linearization controller. By using the method, the sensor nodes move and reconfigure themselves in a coordinated way in response to a sensed environment. The multi-agent coordination is achieved through Centroidal Voronoi Tessellations and coverage control. Also, a consensus protocol is used for synchronization of the parameter vectors. The two controllers are given with their Lyapunov stability analysis, and their stability is verified with simulation results. The simulations are carried out in MATLAB and ROS environments. Better performance is obtained with the BGF Adaptive Controller.
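
The coverage idea behind Centroidal Voronoi Tessellations can be illustrated with a few Lloyd iterations, as in the minimal Python sketch below: each agent claims the grid points nearest to it and moves toward the centroid of its cell. This is not the paper’s adaptive controller (no non-holonomic dynamics or parameter adaptation); the region, agent count, and step size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
agents = rng.uniform(0, 1, (5, 2))                       # initial agent positions

# discretize the unit-square region to be covered
gx, gy = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
points = np.column_stack([gx.ravel(), gy.ravel()])

for _ in range(30):
    # Voronoi partition: each grid point belongs to its nearest agent
    d = np.linalg.norm(points[:, None, :] - agents[None, :, :], axis=2)
    owner = d.argmin(axis=1)
    # move each agent toward the centroid of its Voronoi cell
    for i in range(len(agents)):
        cell = points[owner == i]
        if len(cell):
            agents[i] += 0.5 * (cell.mean(axis=0) - agents[i])

print("final agent positions:\n", agents.round(3))
```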

Keywords: adaptive control, centroidal voronoi tessellations, composite adaptation, coordination, multi robots

Procedia PDF Downloads 339
1239 Application of Adaptive Neural Network Algorithms for Determination of Salt Composition of Waters Using Laser Spectroscopy

Authors: Tatiana A. Dolenko, Sergey A. Burikov, Alexander O. Efitorov, Sergey A. Dolenko

Abstract:

In this study, a comparative analysis is performed of the approaches associated with the use of neural network algorithms for the effective solution of a complex inverse problem: the problem of identifying and determining the individual concentrations of inorganic salts in multicomponent aqueous solutions from the spectra of Raman scattering of light. It is shown that the application of artificial neural networks provides an average accuracy of determination of the concentration of each salt no worse than 0.025 M. The results of a comparative analysis of input data compression methods are presented. It is demonstrated that the use of uniform aggregation of input features allows decreasing the error of determination of individual concentrations of components by 16-18% on average.
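
A toy version of such an inverse problem is sketched below: a multilayer perceptron is trained to map simulated "spectra" (linear mixtures of component bands plus noise) back to the individual component concentrations, and the mean absolute error in concentration units is reported. This is not the authors’ network or data; all values here are simulated illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_channels, n_salts = 2000, 100, 3

bands = rng.random((n_salts, n_channels))          # one "spectral signature" per salt
conc = rng.uniform(0, 1, (n_samples, n_salts))     # concentrations, in M
spectra = conc @ bands + rng.normal(0, 0.01, (n_samples, n_channels))

X_train, X_test, y_train, y_test = train_test_split(spectra, conc, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

mae = np.mean(np.abs(model.predict(X_test) - y_test), axis=0)
print("mean absolute error per component (M):", mae)
```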

Keywords: inverse problems, multi-component solutions, neural networks, Raman spectroscopy

Procedia PDF Downloads 519
1238 Efficient Relay Selection Scheme Utilizing OVSF Code in Cooperative Communication System

Authors: Yeong-Seop Ahn, Myoung-Jin Kim, Young-Min Ko, Hyoung-Kyu Song

Abstract:

This paper proposes a relay selection scheme utilizing an orthogonal variable spreading factor (OVSF) code in cooperative communication. The relay selection scheme influences the communication performance in cooperative communication. Conventional relay selection schemes, such as the best harmonic mean relay selection scheme or the threshold-based relay selection scheme, require information such as channel state information (CSI) in advance. The proposed relay selection scheme does not require such information in advance because it uses a reference signal based on the OVSF code. The simulation results show that the bit error rate (BER) performance of the proposed relay selection scheme is similar to that of the best harmonic mean relay selection scheme, which is known as one of the optimal relay selection schemes.
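
OVSF codes themselves are easy to generate: each code of length 2N is built from a parent code C of length N as [C, C] and [C, -C], which keeps all codes at a given spreading factor mutually orthogonal. The minimal Python sketch below generates such a code set and checks orthogonality; it is not the authors’ relay-selection scheme, and the spreading factor is an illustrative assumption.

```python
import numpy as np

def ovsf_codes(spreading_factor):
    codes = [np.array([1])]
    while len(codes[0]) < spreading_factor:
        nxt = []
        for c in codes:
            nxt.append(np.concatenate([c, c]))    # child 1: [C,  C]
            nxt.append(np.concatenate([c, -c]))   # child 2: [C, -C]
        codes = nxt
    return np.array(codes)

codes = ovsf_codes(8)
print(codes)
print("cross-correlations:\n", codes @ codes.T)   # 8 * identity: codes are orthogonal
```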

Keywords: cooperative communication, relay selection, OFDM, OVSF code

Procedia PDF Downloads 627
1237 An AK-Chart for the Non-Normal Data

Authors: Chia-Hau Liu, Tai-Yue Wang

Abstract:

Traditional multivariate control charts assume that measurements from manufacturing processes follow a multivariate normal distribution. However, this assumption may not hold or may be difficult to verify because not all measurements from manufacturing processes are normally distributed in practice. This study develops a new multivariate control chart for monitoring processes with non-normal data. We propose a mechanism based on integrating a one-class classification method and an adaptive technique. The adaptive technique is used to improve the sensitivity of one-class classification to small shifts in statistical process control. In addition, this design provides an easy way to allocate the value of the type I error, so it is easier to implement. Finally, a simulation study and real data from industry are used to demonstrate the effectiveness of the proposed control charts.
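
The monitoring idea can be sketched with an off-the-shelf one-class classifier, where the nu parameter is used to allocate the in-control false-alarm (type I error) rate. This is not the paper’s AK-chart or its adaptive mechanism; the training data, shift size, and nu value are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
in_control = np.exp(rng.normal(0, 0.3, (500, 3)))           # skewed, non-normal process data

chart = OneClassSVM(kernel="rbf", gamma="scale", nu=0.01)    # ~1% type I error target
chart.fit(in_control)

new_in_control = np.exp(rng.normal(0.0, 0.3, (200, 3)))
shifted = np.exp(rng.normal(0.5, 0.3, (200, 3)))             # mean shift on all variables

false_alarm_rate = np.mean(chart.predict(new_in_control) == -1)
detection_rate = np.mean(chart.predict(shifted) == -1)
print("false alarms:", false_alarm_rate, "detections:", detection_rate)
```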

Keywords: multivariate control chart, statistical process control, one-class classification method, non-normal data

Procedia PDF Downloads 415
1236 Spelling Errors of EFL Students: An Insight into Curriculum Development

Authors: Sheikha Ali Salim Al-Breiki

Abstract:

The purpose of this study was to explore the types of spelling errors that grade ten students make and to find out whether there were any significant differences between males and females with respect to the types of spelling errors made. The sample of the study included 90 grade ten students from four different schools in North Batinah. The researcher administered a test that consisted of two parts: an oral dictation test of 70 words, each with a contextualizing sentence, and a free writing task. The misspellings were classified into nine different types. The findings revealed that the most common spelling error among Omani grade ten students was vowel substitution, followed by vowel omission in second place and consonant substitution in third place. Male students omitted more vowels than female students, while females made more true-word errors than their male counterparts. In light of the findings, the study presents some recommendations and suggestions for further studies.

Keywords: types of spelling errors, errors, ESL/EFL, error analysis

Procedia PDF Downloads 367
1235 Fuzzy Logic and Control Strategies on a Sump

Authors: Nasser Mohamed Ramli, Nurul Izzati Zulkifli

Abstract:

A sump can be defined as a reservoir that contains slurry, a mixture of solids and liquid or water. The sump system is an unsteady process owing to its level response. The sump level must be monitored carefully using a good controller to avoid overflow. Current conventional controllers are not able to solve problems with large time delays and nonlinearities, so a Fuzzy Logic controller is tested to prove its ability to solve the listed problems of the slurry sump. Therefore, in order to assess the effectiveness and reliability of these controllers, a simulation of the sump system was created using MATLAB and the results were compared. According to the results obtained, compared with Proportional-Integral (PI) and Proportional-Integral-Derivative (PID) control, the Fuzzy Logic controller showed the best result, offering a quick response of 0.32 s for a step input and 5 s for a pulse generator, and producing small Integral Absolute Error (IAE) values of 0.66 and 0.36, respectively.
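
The Integral Absolute Error used for the comparison is simply the integral of the absolute control error over time. The minimal Python sketch below computes it for a simulated first-order level response under PI control; it is not the authors’ MATLAB sump model, and the plant time constant and controller gains are illustrative assumptions.

```python
import numpy as np

dt, T = 0.01, 20.0
t = np.arange(0, T, dt)
setpoint = np.ones_like(t)                 # unit step in the desired sump level

def simulate_pi(kp, ki, tau=2.0):
    """First-order level process under PI control; returns the error signal."""
    level, integral, errors = 0.0, 0.0, []
    for sp in setpoint:
        e = sp - level
        integral += e * dt
        u = kp * e + ki * integral
        level += dt * (-level + u) / tau   # first-order lag dynamics
        errors.append(e)
    return np.array(errors)

for name, gains in {"PI (slow)": (0.8, 0.2), "PI (tuned)": (3.0, 1.5)}.items():
    e = simulate_pi(*gains)
    iae = float(np.sum(np.abs(e)) * dt)    # Integral Absolute Error
    print(name, "IAE =", round(iae, 3))
```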

Keywords: fuzzy, sump, level, controller

Procedia PDF Downloads 234
1234 Automatic Battery Charging for Rotor Wings Type Unmanned Aerial Vehicle

Authors: Jeyeon Kim

Abstract:

This paper describes the development of an automatic battery charging device for a rotor-wing type unmanned aerial vehicle (UAV) and a positioning method that allows the UAV to land accurately on the charging device. The developed automatic battery charging device is designed with consideration of simple maintenance, durability, cost and positioning error when landing. In order for the UAV to land accurately on the charging device, two kinds of markers (a color marker and a light marker) installed on the charging device are detected by the camera mounted on the UAV. The UAV is then controlled so that the detected marker stays at the center of the image, and it lands on the device. We evaluate the performance of the proposed positioning method in outdoor experiments by day and night and show the effectiveness of the system.

Keywords: unmanned aerial vehicle, automatic battery charging, positioning

Procedia PDF Downloads 353
1233 A Novel Image Steganography Method Based on Mandelbrot Fractal

Authors: Adnan H. M. Al-Helali, Hamza A. Ali

Abstract:

With the growth of censorship and pervasive monitoring on the Internet, steganography arises as a new means of achieving secret communication. Steganography is the art and science of embedding information within electronic media used by common applications and systems. Generally, hiding multimedia information within images will change some of their properties, which may introduce some degradation or unusual characteristics. This paper presents a new image steganography approach for hiding multimedia information (images, text, and audio) using a generated Mandelbrot Fractal image as a cover. The proposed technique has been extensively tested with different images. The results show that the method is a very secure means of hiding and retrieving steganographic information. Experimental results demonstrate an effective improvement in the values of the Peak Signal to Noise Ratio (PSNR), Mean Square Error (MSE), Normalized Cross Correlation (NCC), and Image Fidelity (IF) over previous techniques.

Keywords: fractal image, information hiding, Mandelbrot set fractal, steganography

Procedia PDF Downloads 611
1232 Nanomaterial Based Electrochemical Sensors for Endocrine Disrupting Compounds

Authors: Gaurav Bhanjana, Ganga Ram Chaudhary, Sandeep Kumar, Neeraj Dilbaghi

Abstract:

The main sources of endocrine disrupting compounds in the ecosystem are hormones, pesticides, phthalates, flame retardants, dioxins, personal-care products, coplanar polychlorinated biphenyls (PCBs), bisphenol A, and parabens. These endocrine disrupting compounds are responsible for learning disabilities, brain development problems, deformations of the body, cancer, reproductive abnormalities in females and decreased sperm count in human males. Although the discharge of these chemical compounds into the environment cannot be stopped, their amount can be reduced through proper evaluation and detection techniques. The available techniques for the determination of these endocrine disrupting compounds mainly include high performance liquid chromatography (HPLC), mass spectroscopy (MS) and gas chromatography-mass spectrometry (GC–MS). These techniques are accurate and reliable but have certain limitations, such as the need for skilled personnel, long analysis times, interference and the requirement of pretreatment steps. Moreover, these techniques are laboratory-bound, and samples are required in large amounts for analysis. In view of the above facts, new methods for the detection of endocrine disrupting compounds should be devised that promise high specificity, ultra-sensitivity, cost-effectiveness, efficiency and easy-to-operate procedures. Nowadays, electrochemical sensors/biosensors modified with nanomaterials are gaining high attention among researchers. The biological element present in these systems makes the developed sensors selective towards the analyte of interest. Nanomaterials provide a large surface area, a high electron communication feature, enhanced catalytic activity and possibilities for chemical modification. In most cases, nanomaterials also serve as an electron mediator or electrocatalyst for some analytes.

Keywords: electrochemical, endocrine disruptors, microscopy, nanoparticles, sensors

Procedia PDF Downloads 269
1231 Reliability Analysis of Geometric Performance of Onboard Satellite Sensors: A Study on Location Accuracy

Authors: Ch. Sridevi, A. Chalapathi Rao, P. Srinivasulu

Abstract:

The location accuracy of data products is a critical parameter in assessing the geometric performance of satellite sensors. This study focuses on reliability analysis of onboard sensors to evaluate their performance in terms of location accuracy over time. The analysis utilizes field failure data and employs the Weibull distribution to determine the reliability and, in turn, to understand the improvements or degradations over a period of time. The analysis begins by scrutinizing the location accuracy error, which is the root mean square (RMS) error of the differences between ground control point coordinates observed on the product and on the map, and identifying the failure data with reference to time. A significant challenge in this study is to thoroughly analyze the possibility of an infant mortality phase in the data. To address this, the Weibull distribution is utilized to determine if the data exhibits an infant stage or if it has transitioned into the operational phase. The shape parameter beta plays a crucial role in identifying this stage. Additionally, determining the exact start of the operational phase and the end of the infant stage poses another challenge, as it is crucial to eliminate residual infant mortality or wear-out from the model, since it can significantly increase the total failure rate. To address this, an approach utilizing the well-established statistical Laplace test is applied to infer the behavior of sensors and to accurately ascertain the duration of the different phases in the lifetime and the time required for stabilization. This approach also helps in understanding whether the bathtub curve model, which accounts for the different phases in the lifetime of a product, is appropriate for the data and whether the thresholds for the infant period and wear-out phase are accurately estimated, by validating the data in individual phases with Weibull distribution curve-fitting analysis. Once the operational phase is determined, reliability is assessed using Weibull analysis. This analysis not only provides insights into the reliability of individual sensors with regard to location accuracy over the required period of time, but also establishes a model that can be applied to automate similar analyses for various sensors and parameters using field failure data. Furthermore, the identification of the best-performing sensor through this analysis serves as a benchmark for future missions and designs, ensuring continuous improvement in sensor performance and reliability. Overall, this study provides a methodology to accurately determine the duration of the different phases in the life data of individual sensors. It enables an assessment of the time required for stabilization and provides insights into the reliability during the operational phase and the commencement of the wear-out phase. By employing this methodology, designers can make informed decisions regarding sensor performance with regard to location accuracy, contributing to enhanced accuracy in satellite-based applications.
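
The two statistical steps the abstract relies on, a Laplace trend test to identify the phase and a two-parameter Weibull fit to estimate reliability, can be sketched as follows. This is not the study’s workflow or data; the simulated failure times, the time-truncated form of the Laplace statistic, and the reliability horizon are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
failure_times = np.sort(rng.weibull(1.5, 30) * 1000.0)      # hours, simulated
observation_end = failure_times[-1]

# Laplace test: U is approximately N(0,1) under a constant failure rate;
# U < 0 suggests reliability growth, U > 0 suggests a wear-out trend.
n = len(failure_times)
U = (failure_times.mean() - observation_end / 2) / (observation_end * np.sqrt(1.0 / (12 * n)))
p_value = 2 * (1 - stats.norm.cdf(abs(U)))
print("Laplace statistic:", round(float(U), 2), "p =", round(float(p_value), 3))

# Two-parameter Weibull fit (location fixed at 0); shape beta < 1 indicates
# infant mortality, beta > 1 indicates wear-out.
beta, _, eta = stats.weibull_min.fit(failure_times, floc=0)
t = 500.0
reliability = np.exp(-(t / eta) ** beta)
print(f"beta = {beta:.2f}, eta = {eta:.0f} h, R({t:.0f} h) = {reliability:.3f}")
```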

Keywords: bathtub curve, geometric performance, Laplace test, location accuracy, reliability analysis, Weibull analysis

Procedia PDF Downloads 62
1230 Understanding and Improving Neural Network Weight Initialization

Authors: Diego Aguirre, Olac Fuentes

Abstract:

In this paper, we present a taxonomy of weight initialization schemes used in deep learning. We survey the most representative techniques in each class and compare them in terms of overhead cost, convergence rate, and applicability. We also introduce a new weight initialization scheme. In this technique, we perform an initial feedforward pass through the network using an initialization mini-batch. Using statistics obtained from this pass, we initialize the weights of the network so that the following properties are met: 1) weight matrices are orthogonal; 2) ReLU layers produce a predetermined number of non-zero activations; 3) the output produced by each internal layer has unit variance; 4) weights in the last layer are chosen to minimize the error on the initial mini-batch. We evaluate our method on three popular architectures, and faster convergence rates are achieved on the MNIST, CIFAR-10/100, and ImageNet datasets when compared to state-of-the-art initialization techniques.
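
A minimal sketch of a data-dependent initialization in this spirit is shown below for a single dense ReLU layer: start from a weight matrix with orthonormal columns, then rescale each unit using statistics from an initialization mini-batch so that the pre-activations have unit variance and zero mean. This is not the paper’s exact procedure (in particular, the last-layer least-squares step is omitted); the layer sizes and the mini-batch are illustrative assumptions.

```python
import numpy as np

def init_dense_relu(x_batch, n_out, rng):
    n_in = x_batch.shape[1]
    a = rng.normal(size=(n_in, n_out))
    u, _, vt = np.linalg.svd(a, full_matrices=False)
    w = u @ vt                                        # orthonormal columns (n_out <= n_in)
    pre = x_batch @ w
    w = w / (pre.std(axis=0, keepdims=True) + 1e-8)   # unit-variance pre-activations
    b = -(x_batch @ w).mean(axis=0)                   # center the pre-activations
    return w, b

rng = np.random.default_rng(0)
x = rng.normal(0, 3.0, (256, 64))                     # initialization mini-batch
w, b = init_dense_relu(x, n_out=32, rng=rng)
h = np.maximum(0, x @ w + b)                          # ReLU activations
print("pre-activation std (should be ~1):", (x @ w + b).std(axis=0).mean())
print("fraction of non-zero activations:", (h > 0).mean())
```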

Keywords: deep learning, image classification, supervised learning, weight initialization

Procedia PDF Downloads 124
1229 Re-Evaluation of Field X Located in Northern Lake Albert Basin to Refine the Structural Interpretation

Authors: Calorine Twebaze, Jesca Balinga

Abstract:

Field X is located on the eastern shores of Lake Albert, Uganda, on the rift flank where the gross sedimentary fill is typically less than 2,000 m. The field was discovered in 2006 and encountered about 20.4 m of net pay across three (3) stratigraphic intervals within the discovery well. The field covers an area of 3 km2, with the structural configuration comprising a 3-way dip-closed hanging-wall anticline that seals against the basement to the southeast along the bounding fault. Field X had been mapped on reprocessed 3D seismic data, which was originally acquired in 2007 and reprocessed in 2013. The seismic data quality is good across the field, and the reprocessing work reduced the uncertainty in the location of the bounding fault and enhanced the lateral continuity of reservoir reflectors. The current study was a re-evaluation of Field X to refine the fault interpretation and understand the structural uncertainties associated with the field. The seismic data and three (3) well datasets were used during the study. The evaluation followed standard workflows using Petrel software and structural attribute analysis. The process spanned seismic-well tie, structural interpretation, and structural uncertainty analysis. Analysis of the three (3) well ties generated for the wells provided a geophysical interpretation that was consistent with geological picks. The generated time-depth curves showed a general increase in velocity with burial depth. However, the separation in curve trends observed below 1100 m was mainly attributed to minimal lateral variation in velocity between the wells. In addition to attribute analysis, three velocity modeling approaches were evaluated: the time-depth curve, V0 + kZ, and average velocity methods. The generated models were calibrated at well locations using well tops to obtain the best velocity model for Field X. The time-depth method resulted in more reliable depth surfaces, with good structural coherence between the TWT and depth maps and a minimal error of 2 to 5 m at well locations. Both the NNE-SSW rift border fault and minor faults in the existing interpretation were re-evaluated. However, the new interpretation delineated an E-W trending fault in the northern part of the field that had not been interpreted before. The fault was interpreted at all stratigraphic levels and thus propagates from the basement to the surface and is an active fault today. It was also noted that the field is only lightly faulted, with more faults in the deeper part of the field. The major structural uncertainties defined included: 1) the time horizons, due to reduced data quality, especially in the deeper parts of the structure, where an error equal to one-third of the reflection time thickness was assumed; 2) check-shot analysis, which showed varying velocities within the wells and thus varying depth values for each well; and 3) the very few average velocity points, due to the limited number of wells, which produced a pessimistic average velocity model.
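
One of the three approaches compared, the V0 + kZ velocity function, can be illustrated with the minimal Python sketch below: fit V(Z) = V0 + kZ to check-shot pairs and use the analytic relation between two-way time and depth for conversion. This is not the study’s Petrel workflow, and the check-shot values are hypothetical illustrative assumptions.

```python
import numpy as np

# hypothetical check-shot data: depth (m) and instantaneous velocity (m/s)
depth = np.array([200.0, 500.0, 800.0, 1100.0, 1400.0])
velocity = np.array([1850.0, 2050.0, 2230.0, 2400.0, 2620.0])

k, v0 = np.polyfit(depth, velocity, 1)           # V(Z) = V0 + k*Z
print(f"V0 = {v0:.0f} m/s, k = {k:.3f} 1/s")

def twt_to_depth(twt_s):
    """Depth from two-way time for V = V0 + kZ: Z = (V0/k) * (exp(k*TWT/2) - 1)."""
    return (v0 / k) * (np.exp(k * twt_s / 2.0) - 1.0)

for twt in (0.5, 1.0, 1.5):                      # two-way times in seconds
    print(f"TWT {twt:.1f} s -> depth {twt_to_depth(twt):.0f} m")
```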

Keywords: 3D seismic data interpretation, structural uncertainties, attribute analysis, velocity modelling approaches

Procedia PDF Downloads 44
1228 An Approach for Modeling CMOS Gates

Authors: Spyridon Nikolaidis

Abstract:

A modeling approach for CMOS gates is presented based on the use of the equivalent inverter. A new model for the inverter has been developed using a simplified transistor current model which incorporates the nanoscale effects of planar technology. Parametric expressions for the output voltage are provided, as well as the values of the output and supply current, to be compatible with the CCS technology. The model is parametric with respect to the input signal slew, output load, transistor widths, supply voltage, temperature and process. The transistor widths of the equivalent inverter are determined by HSPICE simulations, and parametric expressions are developed for them using a fitting procedure. Results for the NAND gate show that the proposed approach offers sufficient accuracy, with an average error in propagation delay of about 5%.

Keywords: CMOS gate modeling, inverter modeling, transistor current mode, timing model

Procedia PDF Downloads 416
1227 Sentiment Analysis of Consumers’ Perceptions on Social Media about the Main Mobile Providers in Jamaica

Authors: Sherrene Bogle, Verlia Bogle, Tyrone Anderson

Abstract:

In recent years, organizations have become increasingly interested in the possibility of analyzing social media as a means of gaining meaningful feedback about their products and services. An aspect-based sentiment analysis approach is used to predict the sentiment of Twitter datasets for Digicel and Lime, the main mobile companies in Jamaica, using supervised learning classification techniques. The results indicate an average of 82.2 percent accuracy in classifying tweets across three separate classification algorithms, against a purported baseline of 70 percent, and an average root mean squared error of 0.31. These results indicate that the analysis of sentiment on social media in order to gain customer feedback can be a viable solution for mobile companies looking to improve business performance.
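
A minimal supervised text-classification pipeline of the kind the abstract evaluates is sketched below, using TF-IDF features and logistic regression and reporting accuracy. It is not the authors’ pipeline or data; the toy tweets and labels are illustrative assumptions, and a real study would use labelled Twitter data for each provider.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.metrics import accuracy_score

tweets = [
    "great coverage and fast data today", "no signal again, terrible service",
    "customer support was quick and helpful", "still waiting hours for a top up",
    "love the new plan, good value", "dropped calls all morning, so frustrating",
    "billing issue fixed in minutes", "worst network speed this week",
] * 10                                            # repeated to give the model enough samples
labels = ["pos", "neg", "pos", "neg", "pos", "neg", "pos", "neg"] * 10

X_train, X_test, y_train, y_test = train_test_split(
    tweets, labels, test_size=0.25, random_state=0, stratify=labels)

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```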

Keywords: machine learning, sentiment analysis, social media, supervised learning

Procedia PDF Downloads 427
1226 Upside Down Words as Initial Clinical Presentation of an Underlying Acute Ischemic Stroke

Authors: Ramuel Spirituel Mattathiah A. San Juan, Neil Ambasing

Abstract:

Background: Reversal of vision metamorphopsia is a transient form of metamorphopsia described as an upside-down alteration of the visual field in the coronal plane. Patients describe objects, such as cups, as upside down, yet the tea does not spill, and people appear to walk on their heads. It is extremely rare as a stable finding lasting days or weeks. We report a case in which this type of metamorphopsia occurred only in written words and lasted for six months. Objective: To the best of our knowledge, we report the first rare occurrence of reversal of vision metamorphopsia, described as inverted words, as the sole initial presentation of an underlying stroke. Case Presentation: We report a 59-year-old male with poorly controlled hypertension and diabetes mellitus who presented with a 3-day history of difficulty reading, described as the words being turned upside down as if inverted horizontally, followed by the progression of deficits such as right homonymous hemianopia, achromatopsia, and prosopagnosia. Cranial magnetic resonance imaging (MRI) revealed an acute infarct in the left posterior cerebral artery territory. Follow-up after six months revealed improvement of the visual field cut but persistence of the higher cortical function deficits. Conclusion: We report the first rare occurrence of metamorphopsia described as purely inverted words as the sole initial presentation of an underlying stroke. The differential diagnoses for a patient presenting with text-reversal metamorphopsia should include stroke in the occipitotemporal areas. The case further expands the landscape of metamorphopsias due to its exclusivity to written words and prolonged duration. Knowing these clinical features will help identify the lesion locus and improve subsequent stroke care, especially in time-bound management such as intravenous thrombolysis.

Keywords: rare presentation, text reversal metamorphopsia, ischemic stroke, stroke

Procedia PDF Downloads 52
1225 Evaluation of Automated Analyzers of Polycyclic Aromatic Hydrocarbons and Black Carbon in a Coke Oven Plant by Comparison with Analytical Methods

Authors: L. Angiuli, L. Trizio, R. Giua, A. Digilio, M. Tutino, P. Dambruoso, F. Mazzone, C. M. Placentino

Abstract:

In the winter of 2014, a series of measurements were performed to evaluate the behavior of real-time PAH and black carbon analyzers in a coke oven plant located in Taranto, a city of Southern Italy. Data were collected both inside and outside the plant, at air quality monitoring sites. Simultaneous measurements of PM2.5 and PM1 were performed. Particle-bound PAHs were measured by two methods: (1) aerosol photoionization using an Ecochem PAS 2000 analyzer, (2) PM2.5 and PM1 quartz filter collection and analysis by gas chromatography/mass spectrometry (GC/MS). Black carbon was determined both in real time by a Magee Aethalometer AE22 analyzer and by a semi-continuous Sunset Lab EC/OC instrument. Detected PM2.5 and PM1 levels were higher inside than outside the plant, while real-time PAH values were higher outside than inside. As regards PAHs, inside the plant the Ecochem PAS 2000 revealed concentrations not significantly different from those determined on the filter during low-pollution days, but at increasing concentrations the automated instrument underestimated PAH levels. At the external site, Ecochem PAS 2000 real-time concentrations were steadily higher than those on the filter. In the same way, real-time black carbon values were constantly lower than the EC concentrations obtained by the Sunset EC/OC at the inner site, while outside the plant real-time values were comparable to the Sunset EC values. Results showed that in a coke plant, real-time analyzers of PAHs and black carbon in the factory configuration provide only qualitative information, with poor accuracy, leading to underestimation of the concentrations. A site-specific calibration is needed for these instruments before their installation at highly polluted sites.

Keywords: black carbon, coke oven plant, PAH, PAS, aethalometer

Procedia PDF Downloads 337
1224 Enzyme Immobilization on Functionalized Polystyrene Nanofibers for Bioprocessing Applications

Authors: Mailin Misson, Bo Jin, Sheng Dai, Hu Zhang

Abstract:

Advances in biotechnology have witnessed a growing interest in enzyme applications for the development of green and sustainable bioprocesses. While known as powerful biocatalysts, enzymes are no longer of economic value when extended to large-scale commercialization. Alternatively, immobilization technology allows enzyme recovery and continuous reuse, which subsequently compensates for high operating costs. Employment of enzymes on nanostructured materials has been recognized as a promising approach to enhance enzyme catalytic performance. High porosity, interconnectivity and self-assembling behavior make nanofibers exciting candidates as enzyme carriers in bioreactor systems. In this study, nanofibers were successfully fabricated via an electrospinning system by optimizing the polymer concentration (10-30 %, w/v), applied voltage (10-30 kV) and discharge distance (11-26 cm). Microscopic images confirmed homogeneous quality and good fiber alignment. The nanofiber surface was modified using a strong oxidizing agent to facilitate biomolecule binding. Bovine serum albumin and β-galactosidase enzyme were employed as model biocatalysts and immobilized onto the oxidized surfaces through covalent binding. The maximum enzyme adsorption capacity of the modified nanofibers was 3000 mg/g, 3-fold higher than that of the unmodified counterpart (1000 mg/g). The highest immobilization yield was 80% and reached the saturation point at an enzyme concentration of 2 mg/ml. The results indicate a significant increase in activity retention by the enzyme-bound modified nanofibers (80%) as compared to the nascent ones (60%), signifying excellent enzyme-nanocarrier biocompatibility. The immobilized enzyme was further used for the bioconversion of dairy wastes into value-added products. This study demonstrates the great potential of acid-modified electrospun polystyrene nanofibers as enzyme carriers.

Keywords: immobilization, enzyme, nanocarrier, nanofibers

Procedia PDF Downloads 286
1223 Measuring How Brightness Mediates Auditory Salience

Authors: Baptiste Bouvier

Abstract:

While we are constantly flooded with stimuli in daily life, attention allows us to select the ones we specifically process and ignore the others. Some salient stimuli may sometimes pass this filter independently of our will, in a "bottom-up" way. The role of the acoustic properties of the timbre of a sound on its salience, i.e., its ability to capture the attention of a listener, is still not well understood. We implemented a paradigm called the "additional singleton paradigm", in which participants have to discriminate targets according to their duration. This task is perturbed (higher error rates and longer response times) by the presence of an irrelevant additional sound, of which we can manipulate a feature of our choice at equal loudness. This allows us to highlight the influence of the timbre features of a sound stimulus on its salience at equal loudness. We have shown that a stimulus that is brighter than the others but not louder leads to an attentional capture phenomenon in this framework. This work opens the door to the study of the influence of any timbre feature on salience.

Keywords: attention, audition, bottom-up attention, psychoacoustics, salience, timbre

Procedia PDF Downloads 159
1222 IRIS: An Interactive Video Game for Children with Long-Term Illness in Hospitals

Authors: Ganetsou Evanthia, Koutsikos Emmanouil, Austin Anna Maria

Abstract:

Information technology has long served the needs of individuals for learning and entertainment, but much less so for children in sickness. The aim of the proposed online video game is to provide immersive learning opportunities as well as essential social and emotional scenarios for hospital-bound children with long-term illness. Online self-paced courses on chosen school subjects, including specialised software and multisensory assessments, aim at enhancing children’s academic achievement and sense of inclusion, while doctor minigames familiarise and educate young patients on their medical conditions. Online ethical dilemmas will offer children opportunities to contemplate the importance of medical procedures and of following assigned medication, which is often challenging for young patients; they will therefore reflect on their condition, reevaluate their perceptions about hospitalisation, and assume greater personal responsibility for their progress. Children’s emotional and psychosocial needs are addressed by engaging in social conventions, such as interactive, daily, collaborative minigames with other hospitalised peers, like virtual competitive sports games, weekly group psychodrama sessions, and online birthday parties or sleepovers. Social bonding is also fostered by having a virtual pet to interact with and take care of, as well as a virtual nurse to discuss and reflect on the mood of the day, engage in constructive dialogue and perspective taking, and offer reminders. Access to the platform will be available throughout the day depending on the patient’s health status. The program is designed to minimise escapism and feelings of exclusion, and can flexibly be adapted to offer post-treatment care and an online support system at home.

Keywords: long-term illness, children, hospital, interactive games, cognitive, socioemotional development

Procedia PDF Downloads 70