Search results for: measurement model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18457

14107 Hybrid Reliability-Similarity-Based Approach for Supervised Machine Learning

Authors: Walid Cherif

Abstract:

Data mining has seen major advances in recent years, driven by the spread of the internet, which generates a tremendous volume of data every day, and by advances in the technologies that facilitate the analysis of these data. In particular, classification techniques form a subdomain of data mining that determines to which group each instance of a given dataset belongs; they are used to classify data into different classes according to desired criteria. Generally, a classification technique is either statistical or machine-learning based, and each type has its own limits. Current data are becoming increasingly heterogeneous; consequently, existing classification techniques encounter many difficulties. This paper defines new measure functions to quantify the resemblance between instances and then combines them in a new approach that differs from existing algorithms in its reliability computations. The proposed approach outperformed most common classification techniques, with an F-measure exceeding 97% on the Iris dataset.
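
The abstract does not give the exact measure functions; as an illustration only, the following minimal sketch (the similarity function, the reliability weighting, and the neighbourhood size are assumptions, not the paper's definitions) shows the general idea of a similarity-based classifier whose votes are weighted by a per-class reliability score, evaluated on the Iris dataset mentioned above.

```python
# Illustrative sketch only: a similarity-weighted classifier with a simple
# "reliability" factor. The paper's actual measure functions are not reproduced.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

def similarity(a, b):
    # Inverse-distance resemblance between two instances.
    return 1.0 / (1.0 + np.linalg.norm(a - b))

def predict(x, X_train, y_train, k=7):
    sims = np.array([similarity(x, xi) for xi in X_train])
    idx = np.argsort(sims)[-k:]                 # k most similar instances
    scores = {}
    for c in np.unique(y_train[idx]):
        members = idx[y_train[idx] == c]
        reliability = len(members) / k          # agreement within the neighbourhood
        scores[c] = reliability * sims[members].sum()
    return max(scores, key=scores.get)

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
y_pred = [predict(x, X_tr, y_tr) for x in X_te]
print("macro F-measure:", f1_score(y_te, y_pred, average="macro"))
```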

Keywords: data mining, knowledge discovery, machine learning, similarity measurement, supervised classification

Procedia PDF Downloads 454
14106 Confidence Envelopes for Parametric Model Selection Inference and Post-Model Selection Inference

Authors: I. M. L. Nadeesha Jayaweera, Adao Alex Trindade

Abstract:

In choosing a candidate model in likelihood-based modeling via an information criterion, the practitioner is often faced with the difficult task of deciding just how far up the ranked list to look. Motivated by this pragmatic necessity, we construct an uncertainty band for a generalized (model selection) information criterion (GIC), defined as a criterion whose limit in probability is identical to that of the normalized log-likelihood; this includes common special cases such as AIC and BIC. The method starts from the asymptotic normality of the GIC for the joint distribution of the candidate models in an independent and identically distributed (IID) data framework and proceeds by deriving the (asymptotically) exact distribution of the minimum. The calculation of an upper quantile for its distribution then involves the computation of multivariate Gaussian integrals, which is amenable to efficient implementation via the R package "mvtnorm". The performance of the methodology is tested on simulated data by checking the coverage probability of nominal upper quantiles and compared to the bootstrap. Both methods give coverages close to nominal for large samples, but the bootstrap is two orders of magnitude slower. The methodology is subsequently extended to two other commonly used model structures: regression and time series. In the regression case, we derive the corresponding asymptotically exact distribution of the minimum GIC by invoking Lindeberg-Feller type conditions for triangular arrays and are thus able to similarly calculate upper quantiles for its distribution via multivariate Gaussian integration. The bootstrap once again provides a default competing procedure, and we find that similar comparative performance holds as in the IID case. The time series case is complicated by a far more intricate asymptotic regime for the joint distribution of the model GIC statistics. Under a Gaussian likelihood, the default in most packages, one needs to derive the limiting distribution of a normalized quadratic form for a realization from a stationary series. Under conditions on the process satisfied by ARMA models, a multivariate normal limit is once again achieved. The bootstrap can, however, be employed for its computation, whence we are once again in the multivariate Gaussian integration paradigm for upper quantile evaluation. Comparisons of this bootstrap-aided semi-exact method with the full-blown bootstrap once again reveal similar performance but faster computation speeds. One of the most difficult problems in contemporary statistical methodological research is accounting for the extra variability introduced by model selection uncertainty, the so-called post-model selection inference (PMSI). We explore ways in which the GIC uncertainty band can be inverted to make inferences on the parameters. In the IID case, this is attempted by pivoting the CDF of the asymptotically exact distribution of the minimum GIC. For inference on one parameter at a time and a small number of candidate models, this works well, and the attained PMSI confidence intervals are wider than the MLE-based Wald intervals, as expected.
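
As a minimal numerical sketch of the upper-quantile step described above (the means, covariance, and quantile level below are hypothetical; the paper uses the R package "mvtnorm", whereas this sketch uses SciPy), the CDF of the minimum of jointly Gaussian GIC statistics is evaluated through a multivariate Gaussian orthant probability and inverted by root finding.

```python
# Sketch only: upper quantile of the minimum of jointly Gaussian GIC values.
# P(min_i X_i <= q) = 1 - P(X_i > q for all i); the joint survival probability
# is a multivariate normal orthant integral.
import numpy as np
from scipy.stats import multivariate_normal
from scipy.optimize import brentq

mu = np.array([100.0, 101.5, 103.0])        # hypothetical asymptotic GIC means
Sigma = np.array([[4.0, 2.0, 1.0],          # hypothetical covariance matrix
                  [2.0, 4.0, 2.0],
                  [1.0, 2.0, 4.0]])

def cdf_min(q):
    # P(all X_i > q) = P(all -X_i < -q), with -X ~ N(-mu, Sigma)
    survival = multivariate_normal(mean=-mu, cov=Sigma).cdf(np.full(len(mu), -q))
    return 1.0 - survival

alpha = 0.95                                # nominal level (hypothetical)
q_upper = brentq(lambda q: cdf_min(q) - alpha, mu.min() - 20, mu.max() + 20)
print("upper quantile of the minimum GIC:", q_upper)
```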

Keywords: model selection inference, generalized information criteria, post model selection, Asymptotic Theory

Procedia PDF Downloads 71
14105 Studying the Impact of Agricultural Producers Support Policy in Export Market

Authors: Yazdani Saeed, Rafiei Hamed, Nekoofar Farahnaz

Abstract:

Government policies play a major role in national and international markets. Pistachio is one of the most important non-oil export commodities of Iran. Therefore, in this study the relationship between producer support policies and pistachio exports was examined. A vector autoregression (VAR) econometric model was applied to test the study hypothesis. According to the estimated coefficients of the VAR model, the lagged producer support index has a significant negative effect on the variation of pistachio exports in the short term. In other words, in the short term, the export advantage index depends on the amount of producer support in the previous period.
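
For illustration only, the sketch below shows how a lag-selected VAR of this kind can be estimated with statsmodels; the two series are random placeholders, not the actual pistachio export and producer support data, and the variable names are assumptions.

```python
# Illustrative sketch of the VAR estimation step; data are synthetic placeholders.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 40
data = pd.DataFrame({
    "export_advantage": rng.normal(size=n).cumsum(),
    "producer_support": rng.normal(size=n).cumsum(),
})

model = VAR(data)
results = model.fit(maxlags=2, ic="aic")   # lag order chosen by AIC
# The coefficient on lagged producer_support in the export_advantage equation
# corresponds to the short-term effect discussed in the abstract.
print(results.params["export_advantage"])
```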

Keywords: producer support, export advantage, pistachio, Iran

Procedia PDF Downloads 28
14104 Theology and Music in the 21st Century: An Exploratory Study of Current Interrelation

Authors: Andrzej Kesiak

Abstract:

Contemporary theology is often accused of answering questions that nobody is asking and of employing hermetic language that has lost its capacity to communicate. There is also a question theology asks itself: how can theological discourse still influence other disciplines, and how can the separation between theology and belief be overcome? Undoubtedly, in the wider spectrum, theological discourse has been and will be needed. The difficulty is finding the right model for it, one that would help theology enter into dialogue with culture, art, science, and politics. Presumably there is no single such model; theology constantly needs to seek such models, and this is probably a never-ending journey. In other words, theology should adopt the profile of ‘a restless being’ if it wants to remain influential. Music, on the other hand, has always been very close to theology; in fact, a huge part of classical music is either sacred or religious, and many composers sought inspiration in religion, liturgy, religious painting, and sacred texts. This paper will argue that, despite all this, a proper and substantive dialogue is still in its starting phase. A reciprocal relationship between theology and music certainly exists, but it has not yet been theoretically developed enough. The correlation between the musical and theological disciplines constitutes a very broad and complex discourse; therefore this study narrows the subject and places it in a specific context: theology and music in the 21st century. This is a text-based study and therefore relies on textual analysis with elements of textual hermeneutics.

Keywords: music, theology, reciprocal relationship between theology and music, 21st century

Procedia PDF Downloads 145
14103 Spontaneous and Posed Smile Detection: Deep Learning, Traditional Machine Learning, and Human Performance

Authors: Liang Wang, Beste F. Yuksel, David Guy Brizan

Abstract:

A computational model of affect that can distinguish between spontaneous and posed smiles with no errors on a large, popular dataset using deep learning techniques is presented in this paper. A Long Short-Term Memory (LSTM) classifier, a type of recurrent neural network, is utilized and compared to human classification. Results showed that while human classification (mean of 0.7133) was above chance, the LSTM model was more accurate than both human classification and other comparable state-of-the-art systems. Additionally, a high accuracy rate was maintained with small amounts of training videos (70 instances). Important features were derived and analyzed to further understand the success of the computational model, and it was inferred that thousands of pairs of points within the eyes and mouth are important throughout all time segments of a smile. This suggests that distinguishing between a posed and a spontaneous smile is a complex task, which may account for the difficulty and lower accuracy of human classification compared to machine learning models.
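
A minimal sketch of an LSTM smile classifier is given below purely for orientation; the input dimensionality, sequence length, and hyperparameters are assumptions, not the paper's configuration, and the arrays are placeholders standing in for features extracted from smile videos.

```python
# Minimal LSTM sketch for binary posed-vs-spontaneous classification (assumed setup).
import numpy as np
import tensorflow as tf

n_frames, n_features = 60, 136        # e.g. 68 facial landmarks (x, y) per frame (assumed)
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(n_frames, n_features)),
    tf.keras.layers.LSTM(64),                          # temporal modelling of the smile
    tf.keras.layers.Dense(1, activation="sigmoid"),    # posed (0) vs spontaneous (1)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Placeholder data standing in for landmark sequences extracted from 70 videos.
X = np.random.rand(70, n_frames, n_features).astype("float32")
y = np.random.randint(0, 2, size=(70, 1))
model.fit(X, y, epochs=3, batch_size=8, verbose=0)
print(model.evaluate(X, y, verbose=0))
```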

Keywords: affective computing, affect detection, computer vision, deep learning, human-computer interaction, machine learning, posed smile detection, spontaneous smile detection

Procedia PDF Downloads 112
14102 The Model Establishment and Analysis of TRACE/FRAPTRAN for Chinshan Nuclear Power Plant Spent Fuel Pool

Authors: J. R. Wang, H. T. Lin, Y. S. Tseng, W. Y. Li, H. C. Chen, S. W. Chen, C. Shih

Abstract:

TRACE is a code developed by the U.S. NRC for nuclear power plant (NPP) safety analysis. This research focuses on the establishment and application of TRACE/FRAPTRAN/SNAP models for the Chinshan NPP (BWR/4) spent fuel pool, whose geometry is 12.17 m × 7.87 m × 11.61 m. Three TRACE/SNAP models were built: a one-channel, a two-channel, and a multi-channel model. A failure of the spent fuel pool cooling system was then simulated and analyzed with these models. According to the analysis results, the peak cladding temperature response was more accurate in the multi-channel TRACE/SNAP model. The results indicate that fuel uncovery occurred 2.7 days after the cooling system failed. In order to estimate the detailed fuel rod performance, the FRAPTRAN code was used. According to the FRAPTRAN results, the highest cladding temperature was located at node 21 of the fuel rod (the topmost node being node 23), and the cladding burst roughly after 3.7 days.

Keywords: TRACE, FRAPTRAN, BWR, spent fuel pool

Procedia PDF Downloads 339
14101 Audio-Visual Recognition Based on Effective Model and Distillation

Authors: Heng Yang, Tao Luo, Yakun Zhang, Kai Wang, Wei Qin, Liang Xie, Ye Yan, Erwei Yin

Abstract:

Recent years have shown that audio-visual recognition has great potential in strong-noise environments. Existing audio-visual recognition methods have explored ResNet backbones and feature fusion. However, on the one hand, ResNet occupies a large amount of memory, restricting engineering applications; on the other hand, feature merging introduces interference in high-noise environments. To solve these problems, we propose an effective framework with bidirectional distillation. First, considering its good feature-extraction performance, we chose the lightweight EfficientNet as the spatial feature extractor. Second, self-distillation was applied to learn more information from the raw data. Finally, we propose bidirectional distillation at the decision-level fusion stage. Our experimental results are based on a multi-modal dataset from 24 volunteers. The lipreading accuracy of our framework increased by 2.3% compared with existing systems, and the framework improved audio-visual fusion in a high-noise environment compared with audio-only recognition.
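
The following is a sketch of a symmetric (bidirectional) distillation loss at the decision level, included only to illustrate the idea; the temperature, weighting, class count, and the use of cross-entropy alongside the KL terms are assumptions rather than the paper's exact formulation.

```python
# Sketch of a bidirectional (two-way) decision-level distillation loss.
import torch
import torch.nn.functional as F

def distill(student_logits, teacher_logits, T=2.0):
    # KL divergence between softened class distributions (teacher detached).
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits.detach() / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

def bidirectional_distillation_loss(audio_logits, visual_logits, labels, alpha=0.5):
    ce = F.cross_entropy(audio_logits, labels) + F.cross_entropy(visual_logits, labels)
    # Each branch also learns from the other branch's decisions.
    kd = distill(audio_logits, visual_logits) + distill(visual_logits, audio_logits)
    return ce + alpha * kd

audio = torch.randn(8, 10)     # 8 samples, 10 classes (placeholder logits)
visual = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
print(bidirectional_distillation_loss(audio, visual, labels))
```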

Keywords: lipreading, audio-visual, Efficientnet, distillation

Procedia PDF Downloads 117
14100 Assessment of Exhaust Emissions and Fuel Consumption from Means of Transport in Agriculture

Authors: Jerzy Merkisz, Piotr Lijewski, Pawel Fuc, Maciej Siedlecki, Andrzej Ziolkowski, Sylwester Weymann

Abstract:

The paper discusses the problem of load transport using farm tractors and road tractor units, as this type of carriage of goods is often done with farm vehicles. The tests were performed with PEMS equipment (Portable Emission Measurement System) under actual traffic conditions, with the vehicles carrying a load of 20,000 kg. This research method is among the most desirable because it provides reliable information on actual vehicle emissions and fuel consumption (via the carbon balance method). For the tests, a route was selected that simulated a trip from a small town to a food-processing facility located in a city. The analysis of the obtained results gave a clear answer as to which vehicles should be used for the carriage of this type of cargo in terms of exhaust emissions and fuel consumption.
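
For readers unfamiliar with the carbon balance method mentioned above, the sketch below applies a commonly used form of the relation for diesel fuel (the coefficients are approximate carbon mass fractions of HC, CO, and CO2); the exact constants, fuel density, and emission values used in the study are not given in the abstract, so the numbers here are assumptions.

```python
# Carbon-balance estimate of fuel consumption from measured gaseous emissions.
# Coefficients and fuel density are assumptions, not the study's values.
def fuel_consumption_l_per_100km(hc_g_km, co_g_km, co2_g_km, fuel_density_kg_l=0.832):
    carbon_mass = 0.866 * hc_g_km + 0.429 * co_g_km + 0.273 * co2_g_km  # g of carbon per km
    return (0.1155 / fuel_density_kg_l) * carbon_mass  # widely used diesel formula

# Example with placeholder PEMS results (g/km), not measured values from the paper.
print(fuel_consumption_l_per_100km(hc_g_km=0.05, co_g_km=0.8, co2_g_km=850.0))
```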

Keywords: emission, transport, fuel consumption, PEMS

Procedia PDF Downloads 505
14099 Impact Load Response of Light Rail Train Rail Guard

Authors: Eyob Hundessa Gose

Abstract:

The construction of infrastructure such as buildings, bridges, roads, and railways is one measure of a country's development. In Addis Ababa, the capital city of Ethiopia, the Light Rail Train (LRT) was built four years ago to satisfy the city's demand for transportation. The separation between the train lane and the vehicle lanes was built with a curb and rail guard system to mark the right-of-way and to protect against vehicles entering the train lane. However, this rail guard fails easily when impacted by vehicles; its impact load response is weak, and it cannot withstand impact loads. This study investigates the effect of varying parameters such as vehicle speed and mass, and assesses the failure modes, deflection, and damage states of FRP and steel reinforcement bar rail guards.

Keywords: impact load, fiber reinforced polymer, rail guard, LS-DYNA

Procedia PDF Downloads 46
14098 SQL Generator Based on MVC Pattern

Authors: Chanchai Supaartagorn

Abstract:

Structured Query Language (SQL) is the de facto standard language for accessing and manipulating data in a relational database. Although SQL is simple and powerful, most novice users have trouble with its syntax. Thus, we present an SQL generator tool capable of translating user actions and displaying the SQL commands and resulting data sets simultaneously. The tool was developed based on the Model-View-Controller (MVC) pattern, a widely used software design pattern that enforces the separation between the input, processing, and output of an application. Developers take full advantage of it to reduce the complexity of the architectural design and to increase the flexibility and reuse of code. In addition, white-box testing is used for code verification in the Model module.
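
To make the MVC separation concrete, here is a minimal sketch of an SQL generator split into Model, View, and Controller; the class and method names are illustrative only and do not correspond to the actual tool.

```python
# Minimal MVC-style SQL generator sketch: the Model builds the SQL string,
# the View displays it, and the Controller translates user actions into Model calls.
class QueryModel:
    def __init__(self):
        self.table, self.columns, self.conditions = None, ["*"], []

    def select(self, table, columns=None):
        self.table, self.columns = table, columns or ["*"]

    def where(self, condition):
        self.conditions.append(condition)

    def to_sql(self):
        sql = f"SELECT {', '.join(self.columns)} FROM {self.table}"
        if self.conditions:
            sql += " WHERE " + " AND ".join(self.conditions)
        return sql + ";"

class QueryView:
    def render(self, sql):
        print("Generated SQL:", sql)

class QueryController:
    def __init__(self, model, view):
        self.model, self.view = model, view

    def handle(self, action, **kwargs):
        getattr(self.model, action)(**kwargs)   # dispatch the user action to the Model
        self.view.render(self.model.to_sql())   # the View shows the resulting command

controller = QueryController(QueryModel(), QueryView())
controller.handle("select", table="students", columns=["name", "grade"])
controller.handle("where", condition="grade >= 80")
```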

Keywords: MVC, relational database, SQL, White-Box testing

Procedia PDF Downloads 409
14097 Discrete Tracking Control of Nonholonomic Mobile Robots: Backstepping Design Approach

Authors: Alexander S. Andreev, Olga A. Peregudova

Abstract:

In this paper, we propose a discrete tracking control for nonholonomic mobile robots with two degrees of freedom. The electro-mechanical model of a mobile robot moving on a horizontal surface without slipping, with two rear wheels driven by two independent DC electric motors and one front roller wheel, is considered. We present a backstepping design based on the Euler approximate discrete-time model of the continuous-time plant. Theoretical considerations are verified by numerical simulation. The work was supported by RFFI (15-01-08482).
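
The sketch below shows only the Euler approximate discrete-time kinematics of a unicycle-type robot, driven by the classical Kanayama-style tracking law for illustration; the paper's backstepping controller and actuator dynamics are not reproduced, and the gains, sampling time, and reference trajectory are assumptions.

```python
# Euler-discretised unicycle kinematics with a simple tracking law (illustrative only).
import numpy as np

T = 0.05                       # sampling period [s] (assumed)
v_r, w_r = 0.2, 0.2            # reference velocities for a unit-radius circle
k1, k2, k3 = 1.0, 5.0, 2.0     # controller gains (assumed)

def euler_step(state, v, w):
    x, y, th = state           # Euler discretisation of the continuous kinematics
    return np.array([x + T * v * np.cos(th),
                     y + T * v * np.sin(th),
                     th + T * w])

state = np.array([1.3, -0.4, 0.0])            # initial pose (x, y, theta)
for k in range(2000):
    t = k * T
    xr, yr, thr = np.cos(w_r * t), np.sin(w_r * t), w_r * t + np.pi / 2
    x, y, th = state
    e1 = np.cos(th) * (xr - x) + np.sin(th) * (yr - y)   # tracking errors in robot frame
    e2 = -np.sin(th) * (xr - x) + np.cos(th) * (yr - y)
    e3 = thr - th
    v = v_r * np.cos(e3) + k1 * e1
    w = w_r + v_r * (k2 * e2 + k3 * np.sin(e3))
    state = euler_step(state, v, w)

print("final position error:",
      np.hypot(np.cos(w_r * 2000 * T) - state[0], np.sin(w_r * 2000 * T) - state[1]))
```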

Keywords: actuator dynamics, back stepping, discrete-time controller, Lyapunov function, wheeled mobile robot

Procedia PDF Downloads 396
14096 Permeodynamic Particulate Matter Filtration for Improved Air Quality

Authors: Hamad M. Alnagran, Mohammed S. Imbabi

Abstract:

Particulate matter (PM) in the air we breathe is detrimental to health. Overcoming this problem has attracted interest and prompted research on the use of PM filtration in commercial buildings and homes. The consensus is that tangible health benefits can result from the use of PM filters in most urban environments, to clean up a building's fresh air supply and thereby reduce residents' exposure to airborne PM. The authors have investigated and are developing a new large-scale Permeodynamic Filtration Technology (PFT) capable of permanently filtering and removing airborne PM from outdoor spaces, thereby also benefiting internal spaces such as building interiors. Theoretical models were developed, and laboratory trials were carried out to determine, and validate through measurement, permeodynamic filtration efficiency and pressure drop as functions of the PM particle size distribution. The conclusion is that PFT offers a potentially viable, cost-effective end-of-pipe solution to the problem of airborne PM.

Keywords: air filtration, particulate matter, particle size distribution, permeodynamic

Procedia PDF Downloads 186
14095 Evaluation of Duncan-Chang Deformation Parameters of Granular Fill Materials Using Non-Invasive Seismic Wave Methods

Authors: Ehsan Pegah, Huabei Liu

Abstract:

Characterizing the deformation properties of fill materials over a wide stress range has always been an important issue in geotechnical engineering. The hyperbolic Duncan-Chang model is a very popular stress-strain relationship that captures the nonlinear deformation of granular geomaterials in a very tractable manner. It involves a particular set of model parameters, which are generally measured through an extensive series of laboratory triaxial tests. This practice is both time-consuming and costly, especially in large projects; in addition, undesired effects caused by soil disturbance during sampling may introduce a large degree of uncertainty into the results. Accordingly, non-invasive geophysical seismic approaches may be utilized as an appropriate alternative for estimating the model parameters from seismic wave velocities. To this end, conventional seismic refraction profiles were carried out at test sites with granular fill materials to collect seismic wave information. The acquired shot gathers were processed to derive the P- and S-wave velocities: the P-wave velocities were extracted with the Seismic Refraction Tomography (SRT) technique, while the S-wave velocities were obtained by the Multichannel Analysis of Surface Waves (MASW) method. The velocity values were then used with the equations resulting from the rigorous theories of elasticity and soil mechanics to evaluate the Duncan-Chang model parameters. The derived parameters were finally compared with those from laboratory tests to validate the reliability of the results. The findings of this study may serve as useful references for determining the nonlinear deformation parameters of granular fill geomaterials; the surface seismic methods are environmentally friendly and economical, and can yield accurate results under actual in-situ conditions.
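
The standard elasticity relations linking seismic velocities to small-strain elastic parameters are sketched below; the density and velocities are placeholders, and the subsequent mapping onto the Duncan-Chang parameters (e.g. the initial tangent modulus) is not reproduced here.

```python
# Small-strain elastic parameters from P- and S-wave velocities (illustrative values).
rho = 1900.0            # bulk density of the fill [kg/m^3] (assumed)
vp, vs = 450.0, 220.0   # P- and S-wave velocities [m/s] from SRT / MASW (placeholders)

G = rho * vs**2                                    # shear modulus [Pa]
nu = (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))   # Poisson's ratio
E = 2 * G * (1 + nu)                               # Young's modulus [Pa]
# Equivalent form: E = rho * vs^2 * (3*vp^2 - 4*vs^2) / (vp^2 - vs^2)
print(f"G = {G/1e6:.1f} MPa, nu = {nu:.3f}, E = {E/1e6:.1f} MPa")
```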

Keywords: Duncan-Chang deformation parameters, granular fill materials, seismic waves velocity, multichannel analysis of surface waves, seismic refraction tomography

Procedia PDF Downloads 171
14094 Mathematics as the Foundation for the STEM Disciplines: Different Pedagogical Strategies Addressed

Authors: Marion G. Ben-Jacob, David Wang

Abstract:

There is a mathematics requirement for entry-level college and university students, especially those who plan to study STEM (Science, Technology, Engineering and Mathematics). Most of them take College Algebra, and to continue their studies they need to succeed in this course. Different pedagogical strategies are employed to promote student success. There is, of course, the traditional method of teaching: lecture, examples, and problems for students to solve. The Emporium Model, another pedagogical approach, replaces traditional lectures with a learning resource center model featuring interactive software and on-demand personalized assistance. This presentation will compare these two methods of pedagogy and report the results of a study on this comparison. Math is the foundation for science, technology, and engineering. It is generally used in STEM to find patterns in data; these patterns can be used to test relationships, draw general conclusions about data, and model the real world. In STEM, solutions to problems are analyzed, reasoned about, and interpreted using math abilities in an assortment of real-world scenarios. This presentation will examine specific examples of how math is used in the different STEM disciplines. Math becomes practical in science when it is used to model natural and artificial experiments to identify a problem and develop a solution for it. As we analyze data, we use math to find the statistical correlation between a cause and its effect. Professionals who use math in this way include data scientists, biologists, and geologists. Without math, most technology would not be possible: math is the basis of binary representation, and without programming you just have the hardware. Addition, subtraction, multiplication, and division are used in almost every program written, and mathematical algorithms are inherent in software as well. Mechanical engineers analyze scientific data to design robots by applying math and using software. Electrical engineers use math to help design and test electrical equipment; they also use math when creating computer simulations and designing new products. Chemical engineers often use mathematics in the lab, where advanced computer software aids their research and production processes by modeling theoretical synthesis techniques and properties of chemical compounds. Mastery of mathematics is crucial for success in the STEM disciplines, and pedagogical research on formative strategies and the necessary topics to be covered is essential.

Keywords: emporium model, mathematics, pedagogy, STEM

Procedia PDF Downloads 56
14093 Scale Effects on the Wake Airflow of a Heavy Truck

Authors: Aude Pérard Lecomte, Georges Fokoua, Amine Mehel, Anne Tanière

Abstract:

Air quality in urban areas is degraded by pollution, mainly due to the constant increase in traffic of different types of ground vehicles. In particular, particulate matter pollution, reaching significant concentrations in urban areas, can cause serious health issues. Characterizing and understanding particle dynamics is therefore essential to establish recommendations for improving air quality in urban areas. To analyze the effects of turbulence on the dispersion of particulate pollutants, the first step is to focus on the single-phase flow structure and turbulence characteristics in the wake of a heavy truck model. To achieve this, Computational Fluid Dynamics (CFD) simulations were conducted with the aim of modeling the wake airflow of a full-scale and a reduced-scale heavy truck. The Reynolds-Averaged Navier-Stokes (RANS) approach with the Reynolds Stress Model (RSM) as the turbulence closure was used. The simulations highlight the appearance of a large vortex originating from under the trailer. This vortex belongs to the recirculation region located in the near wake of the heavy truck. These vortical structures are expected to have a strong influence on the dynamics of the particles emitted by the truck.

Keywords: CFD, heavy truck, recirculation region, reduced scale

Procedia PDF Downloads 202
14092 Electrostatic and Dielectric Measurements for Hair Building Fibers from DC to Microwave Frequencies

Authors: K. Y. You, Y. L. Then

Abstract:

In recent years, hair building fibers have become popular; they are an effective aid for people who suffer from hair loss or sparse hair, since hair building fiber can rapidly create a natural look of simulated hair. On the market there are many hair fiber brands designed to form an intense bond with hair strands and make the hair appear more voluminous instantly. However, these products each have their own set of properties. Thus, in this report, measurement techniques are proposed to characterize these products, and up to five different brands of hair fiber are tested. The electrostatic and dielectric properties of the hair fibers are tested macroscopically using purpose-designed DC and high-frequency microwave techniques. In addition, the hair fibers are analyzed microscopically by magnifying the fiber structures using a scanning electron microscope (SEM). From the SEM photos, the shape uniformity and breakage rate of the hair fibers in the different bulk samples can be compared.

Keywords: hair fiber, electrostatic, dielectric properties, broken rate, microwave techniques

Procedia PDF Downloads 303
14091 Novel Ferroelectric Properties as Studied by Boson Mean Field Laser Radiation Induced from a Beer Bottle

Authors: Tadeus Atraskevic, Asch Dalbajobas, Mazahistas Pukuotukas

Abstract:

The novel ferroelectric properties have appeared over the last ten years. Many scientists consider them as non-statement science; nevertheless, many papers have been published. Mean field theory takes an important place in the theory of ferroelectric materials and can be applied to boson-induced laser systems for ‘Star Track’ soldiers. The novel laser, which was produced at The Vilnius Bambalio University, is a ‘know-how’ among other laser systems. The laser can produce 30 kW of power for 15 seconds. Its size and compatibility distinguish it among other devices and safety gadgets. Scientists of Bambalio University have already patented the device. The most interesting aspect of this innovation is the process of operation: it may be operated through a bottle of beer, which makes the measurement so convenient that an ordinary scientist can carry out the whole procedure without significant effort, simply while enjoying a bottle of beer. Here we would like to report on the laser system and present our unique developments.

Keywords: laser, boson, ferroelectrics, mean field theory

Procedia PDF Downloads 163
14090 Pantograph-Catenary Contact Force: Features Evaluation for Catenary Diagnostics

Authors: Mehdi Brahimi, Kamal Medjaher, Noureddine Zerhouni, Mohammed Leouatni

Abstract:

Prognostics and Health Management (PHM) is a systems engineering discipline that provides solutions and models for the implementation of predictive maintenance. The approach is based on extracting useful information from monitoring data to assess the "health" state of industrial equipment or assets. In this paper, we examine multiple features extracted from the pantograph-catenary contact force in order to select the most relevant ones for a diagnostics function. The feature extraction methodology is based on simulation data generated with a pantograph-catenary simulation software called INPAC, together with measurement data. Feature extraction relies on both statistical and signal processing analyses, while feature selection is based on statistical criteria.
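
To make the statistical / signal-processing feature extraction concrete, the sketch below computes a few typical features from a contact-force record; the signal is synthetic and the feature list is illustrative, not the paper's selected set.

```python
# Illustrative feature extraction from a (synthetic) pantograph-catenary contact force.
import numpy as np
from scipy import stats, signal

fs = 200.0                                     # sampling frequency [Hz] (assumed)
t = np.arange(0, 10, 1 / fs)
force = 120 + 20 * np.sin(2 * np.pi * 1.5 * t) + 5 * np.random.randn(t.size)

features = {
    "mean": np.mean(force),
    "std": np.std(force),
    "rms": np.sqrt(np.mean(force**2)),
    "skewness": stats.skew(force),
    "kurtosis": stats.kurtosis(force),
    "peak_to_peak": np.ptp(force),
}
# Spectral feature: frequency of the dominant oscillation of the contact force.
freqs, psd = signal.welch(force, fs=fs, nperseg=512)
features["dominant_frequency"] = freqs[np.argmax(psd)]
print(features)
```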

Keywords: catenary/pantograph interaction, diagnostics, Prognostics and Health Management (PHM), quality of current collection

Procedia PDF Downloads 279
14089 Artificial Neural Network and Statistical Method

Authors: Tomas Berhanu Bekele

Abstract:

Traffic congestion is one of the main transportation problems in developed as well as developing countries. Traffic control systems are based on the idea of avoiding traffic instabilities and homogenizing traffic flow in such a way that the risk of accidents is minimized and traffic flow is maximized. Lately, Intelligent Transport Systems (ITS) have become an important research area for solving such road traffic issues and making smart decisions; they link people, roads, and vehicles together using communication technologies to increase safety and mobility. Moreover, accurate prediction of road traffic is important for managing congestion. The aim of this study is to develop an ANN model for the prediction of traffic flow and to compare it with a linear regression model of traffic flow prediction. Data extraction was carried out at 15-minute intervals from the video player: video of mixed traffic flow was recorded and then counted during office hours in order to determine the traffic volume. Vehicles were classified into six categories, namely car, motorcycle, minibus, mid-bus, bus, and truck. The average time taken by each vehicle type to travel the trap length was measured from the time displayed on the video screen.
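
A minimal sketch of the ANN-versus-linear-regression comparison is shown below; the data are synthetic placeholders for the 15-minute traffic counts described above, and the chosen features and network size are assumptions.

```python
# Illustrative comparison of an ANN and a linear regression for flow prediction.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n = 300
X = rng.uniform(0, 1, size=(n, 3))   # e.g. time of day and previous counts (assumed)
y = 200 + 150 * np.sin(2 * np.pi * X[:, 0]) + 80 * X[:, 1] + 10 * rng.normal(size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
lin = LinearRegression().fit(X_tr, y_tr)
ann = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000,
                   random_state=0).fit(X_tr, y_tr)
print("linear regression R2:", r2_score(y_te, lin.predict(X_te)))
print("ANN R2:", r2_score(y_te, ann.predict(X_te)))
```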

Keywords: intelligent transport system (ITS), traffic flow prediction, artificial neural network (ANN), linear regression

Procedia PDF Downloads 47
14088 Integrated Formulation of Project Scheduling and Material Procurement Considering Different Discount Options

Authors: Babak H. Tabrizi, Seyed Farid Ghaderi

Abstract:

On-time availability of materials at construction sites plays an outstanding role in the successful achievement of a project's deliverables. This paper therefore investigates the joint formulation of project scheduling and material procurement through a mixed-integer programming model, aiming to minimize the penalty (or maximize the reward) associated with project delivery and to minimize material holding, ordering, and procurement costs. Both all-units and incremental discount options are taken into consideration to provide more flexibility on the procurement side with regard to real-world conditions. Finally, the applicability and efficiency of the mathematical model are tested on different numerical examples.
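
The difference between the two discount structures mentioned above is illustrated below; the price breaks and rates are placeholders and the scheduling side of the mixed-integer model is not reproduced here.

```python
# All-units vs. incremental quantity discounts (illustrative price breaks).
def all_units_cost(q, breaks=((0, 10.0), (100, 9.0), (500, 8.0))):
    # The discounted unit price applies to *all* units once a break is reached.
    _, price = max((b, p) for b, p in breaks if q >= b)
    return q * price

def incremental_cost(q, breaks=((0, 10.0), (100, 9.0), (500, 8.0))):
    # Each unit price applies only to the units purchased within its own interval.
    levels = sorted(breaks)
    total = 0.0
    for i, (lower, price) in enumerate(levels):
        upper = levels[i + 1][0] if i + 1 < len(levels) else float("inf")
        units = max(0, min(q, upper) - lower)
        total += units * price
    return total

for q in (50, 150, 600):
    print(q, all_units_cost(q), incremental_cost(q))
```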

Keywords: discount strategies, material purchasing, project planning, project scheduling

Procedia PDF Downloads 246
14087 Six Steps of Entrepreneurial Finance and Development, from Idea to Corporation Case of Kuwait

Authors: Andri Ottesen, Sam Toglaw, Mirna Safa

Abstract:

Entrepreneurial companies, on their development path from an idea to a corporation, go through a similar six-step process, and each of these six development steps is supported by a distinctive financing path. This paper explores the Kuwait model of entrepreneurial finance and development through in-depth interviews with ten successful Kuwaiti entrepreneurs. It offers insight into the development and financing of entrepreneurial companies in this oil-rich, predominantly Islamic country, which differ in many ways from the steps Western entrepreneurial companies go through. The model could be used to understand the commonalities and differences in entrepreneurial development and financing and to help bridge the gap.

Keywords: entrepreneurial-financing, entrepreneurial-developing, Kuwait, Vancouver school

Procedia PDF Downloads 197
14086 Neural Networks-based Acoustic Annoyance Model for Laptop Hard Disk Drive

Authors: Yichao Ma, Chengsiong Chin, Wailok Woo

Abstract:

Over the last decade there has been rapid growth in digital multimedia, such as high-resolution media files and three-dimensional movies; hence, there is a need for large digital storage such as hard disk drives (HDDs), and users expect a quieter HDD in their laptops. In this paper, a jury test was conducted on a group of 34 people, 17 of whom were students representing potential consumers, with the remainder being engineers familiar with HDDs. A total of 13 HDD sound samples were selected from over a hundred HDD noise recordings; these samples were selected based on an agreed subjective impression. The samples were played to the participants using a head acoustics playback system, which enabled them to experience an environment as close as possible to the one in which the recordings were made. The analysis of the results indicated that different groups have different perceptions of the noises. Two neural network-based acoustic annoyance models were established based on backpropagation neural networks. Four psychoacoustic metrics, loudness, sharpness, roughness, and fluctuation strength, are used as the inputs of the model, and the subjective evaluation results are taken as the output. The developed models are reasonably accurate in simulating both training and test samples.

Keywords: hdd noise, jury test, neural network model, psychoacoustic annoyance

Procedia PDF Downloads 411
14085 Measurement of Coal Fineness, Air Fuel Ratio, and Fuel Weight Distribution in a Vertical Spindle Mill’s Pulverized Fuel Pipes at Classifier Vane 40%

Authors: Jayasiler Kunasagaram

Abstract:

In power generation, coal fineness is crucial to maintaining flame stability, ensuring combustion efficiency, and lowering emissions to the environment. In order for the pulverized coal to react effectively in the boiler furnace, at least 70% of the coal particles need to be finer than 74 μm. This paper presents the experimental results of coal fineness, air-fuel ratio, and fuel weight distribution in the pulverized fuel pipes at a classifier vane setting of 40%. The aim of this experiment is to extract the pulverized coal isokinetically and investigate the data accordingly. Dirty air velocity, coal sample extraction, and coal sieving experiments were performed to measure coal fineness. The results show that the required coal fineness can be achieved at the 40% classifier vane setting, although it does not surpass the desired value by a great margin.
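
A simple sketch of the fineness and air-fuel ratio calculations is given below; the retained sieve masses and flow rates are placeholders, not the measured values of the experiment.

```python
# Coal fineness from sieve masses and a per-pipe air/fuel ratio (placeholder values).
sieve_retained_g = {300: 1.2, 150: 4.5, 106: 8.3, 74: 14.0, "pan": 72.0}  # aperture [um] -> mass [g]

total = sum(sieve_retained_g.values())
finer_than_74um = sieve_retained_g["pan"] / total * 100     # % passing the 74 um sieve
print(f"fineness: {finer_than_74um:.1f}% finer than 74 um (target >= 70%)")

# Air-fuel ratio of a pulverized fuel pipe from the measured air and coal flows.
air_flow_kg_s, coal_flow_kg_s = 4.2, 2.1                    # placeholders
print("air/fuel ratio:", air_flow_kg_s / coal_flow_kg_s)
```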

Keywords: coal power, emissions, isokinetic sampling, power generation

Procedia PDF Downloads 598
14084 Analysis of Ferroresonant Overvoltages in Cable-fed Transformers

Authors: George Eduful, Ebenezer A. Jackson, Kingsford A. Atanga

Abstract:

This paper investigates the impact of cable length and transformer capacity on ferroresonant overvoltages in cable-fed transformers. The study was conducted by simulation using EMTP-RV. Results show that ferroresonance can cause dangerous overvoltages ranging from 2 to 5 per unit. These overvoltages impose stress on the insulation of transformers and cables and can subsequently result in system failures. By applying basic multiple regression analysis to the results obtained, a statistical model was derived in terms of cable length and transformer capacity. The model is useful for ferroresonance prediction and control in cable-fed transformers.

Keywords: ferroresonance, cable-fed transformers, EMTP RV, regression analysis

Procedia PDF Downloads 516
14083 Robust Batch Process Scheduling in Pharmaceutical Industries: A Case Study

Authors: Tommaso Adamo, Gianpaolo Ghiani, Antonio Domenico Grieco, Emanuela Guerriero

Abstract:

Batch production plants give rise to a wide range of scheduling problems. In pharmaceutical industries a batch process is usually described by a recipe, consisting of an ordering of tasks to produce the desired product. In this research work we focused on pharmaceutical production processes requiring the culture of a microorganism population (e.g. bacteria or yeasts, as in antibiotics production). Several sources of uncertainty may influence the yield of the culture processes, including (i) low performance and quality of the cultured microorganism population or (ii) microbial contamination. For these reasons, robustness is a valuable property in the considered application context. In particular, a robust schedule will not collapse immediately when a culture of microorganisms has to be thrown away due to microbial contamination. Indeed, a robust schedule should change locally and in small proportions, and the overall performance measure (e.g. makespan, lateness) should change little if at all. In this research work we formulated a constraint programming optimization (COP) model for the robust planning of antibiotics production. We developed a discrete-time model with a multi-criteria objective, ordering the different criteria and performing a lexicographic optimization. A feasible solution of the proposed COP model is a schedule of a given set of tasks onto the available resources. The schedule has to satisfy task precedence constraints, resource capacity constraints, and time constraints; in particular, the time constraints model task due dates and resource availability time windows. To improve schedule robustness, we modeled the concept of (a, b) super-solutions, where (a, b) are input parameters of the COP model. An (a, b) super-solution is one in which, if a variables (i.e. the completion times of a culture tasks) lose their values (i.e. the cultures are contaminated), the solution can be repaired by assigning new values to these variables (i.e. the completion times of backup culture tasks) and to at most b other variables (i.e. delaying the completion of at most b other tasks). The efficiency and applicability of the proposed model are demonstrated by solving instances taken from Sanofi Aventis, a French pharmaceutical company. Computational results showed that the determined super-solutions are near-optimal.

Keywords: constraint programming, super-solutions, robust scheduling, batch process, pharmaceutical industries

Procedia PDF Downloads 599
14082 Elasticity Model for Easing Peak Hour Demand for Metrorail Transport System

Authors: P. K. Sarkar, Amit Kumar Jain

Abstract:

The demand for urban transportation is characterized by large temporal and spatial variations, which cause heavy congestion inside metro trains during peak hours near the Central Business District (CBD) of the city. The conventional approach to addressing peak-hour congestion in metro trains has been to increase supply by introducing more trains, increasing train length, and optimizing the timetable to increase system capacity. However, supply-side measures are limited by the design capacity of the system, beyond which any capacity addition requires huge capital investment. Demand-side interventions are therefore required to actually spread the demand across time and space. In this study, an attempt has been made to identify potential transport demand management tools applicable to urban rail transportation systems, with a special focus on differential pricing. A conceptual price elasticity model has been developed to analyse the effect of various combinations of peak and non-peak hour fares on demand. The elasticity values for peak-hour, non-peak-hour, and cross elasticity have been assumed from the relevant literature in the field. The conceptual price elasticity model so developed is based on assumptions that need to be validated with actual elasticity values for different passenger segments. Once validated, the model can be used to determine peak and non-peak hour fares with the objective of increasing overall ridership and revenue, levelling demand, and optimally utilising assets.
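
One simple way such an elasticity model can be exercised is sketched below; the linear functional form and all elasticity values are assumptions for illustration, not the study's assumed values.

```python
# Demand response of peak and non-peak ridership to a differential fare change,
# using own- and cross-elasticities (all values are hypothetical).
def demand_change(peak_fare_change, offpeak_fare_change,
                  e_peak=-0.25, e_offpeak=-0.40, e_cross=0.10):
    # Percentage change in demand for each period, including cross effects.
    d_peak = e_peak * peak_fare_change + e_cross * offpeak_fare_change
    d_offpeak = e_offpeak * offpeak_fare_change + e_cross * peak_fare_change
    return d_peak, d_offpeak

# Example: raise peak fares by 10% and cut non-peak fares by 10%.
dp, do = demand_change(peak_fare_change=10.0, offpeak_fare_change=-10.0)
print(f"peak demand change: {dp:+.1f}%, non-peak demand change: {do:+.1f}%")
```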

Keywords: urban transport, differential fares, congestion, transport demand management, elasticity

Procedia PDF Downloads 295
14081 A Principal-Agent Model for Sharing Mechanism in Integrated Project Delivery Context

Authors: Shan Li, Qiuwen Ma

Abstract:

Integrated project delivery (IPD) is a project delivery method distinguished by a shared risk/reward mechanism and a multiparty agreement. IPD has drawn increasing attention from the construction industry because of its efficiency in solving adversarial problems and its reliability in delivering high-performing buildings. However, some evidence has shown that certain project participants obtain less profit from IPD projects than from typical projects, and they attribute this to an unfair IPD sharing mechanism, which results in additional time and cost spent negotiating the sharing fractions among project participants. This study investigates the reward distribution by constructing a principal-agent model. Based on cooperative game theory, it examines how to distribute the shared project rewards between the client and non-client parties and how to identify the sharing fractions among the non-client parties. It is found that at least half of the project savings should be allocated to the non-client parties to motivate them to create more project value. Second, the client should raise its sharing fraction when the integration among project participants is efficient. In addition, the client should allocate higher sharing fractions to the more capable non-client parties. This study can help IPD project participants design fair and motivating sharing mechanisms.

Keywords: cooperative game theory, IPD, principal agent model, sharing mechanism

Procedia PDF Downloads 276
14080 Numerical Investigation of the Jacketing Method of Reinforced Concrete Column

Authors: S. Boukais, A. Nekmouche, N. Khelil, A. Kezmane

Abstract:

The first aim of this study is to develop a finite element model that can correctly predict the behavior of a reinforced concrete column. The second aim is to use the finite element model to investigate and evaluate the effect of the strengthening method by jacketing of the reinforced concrete column, considering different interface contact conditions between the old and the new concrete. Four models were evaluated: one assuming perfect contact and the other three using friction coefficients of 0.1, 0.3, and 0.5. The simulation was carried out using the Abaqus software. The obtained results show that the jacketing reinforcement led to a significant increase in the global performance of the simulated reinforced concrete column.

Keywords: strengthening, jacketing, reinforced concrete column, Abaqus, simulation

Procedia PDF Downloads 132
14079 Bayesian Structural Identification with Systematic Uncertainty Using Multiple Responses

Authors: André Jesus, Yanjie Zhu, Irwanda Laory

Abstract:

Structural health monitoring is one of the most promising technologies for averting structural risk and achieving economic savings. Analysts often have to deal with a considerable variety of uncertainties that arise during a monitoring process. Namely, the widespread application of numerical models (model-based approaches) is accompanied by a widespread concern about quantifying the uncertainties prevailing in their use. Some of these uncertainties are related to the deterministic nature of the model (code uncertainty), others to the variability of its inputs (parameter uncertainty), and others to the discrepancy between model and experiment (systematic uncertainty). The actual process always exhibits random behaviour (observation error), even when conditions are set identically (residual variation). Bayesian inference assumes that the parameters of a model are random variables with an associated PDF, which can be inferred from experimental data. However, in many Bayesian methods the determination of systematic uncertainty can be problematic. In this work, systematic uncertainty is associated with a discrepancy function, and the numerical model and the discrepancy function are approximated by Gaussian processes (surrogate model). Finally, to avoid the computational burden of a fully Bayesian approach, the parameters that characterise the Gaussian processes were estimated in a four-stage process (modular Bayesian approach). This methodology has been successfully applied in fields such as geoscience, biomedicine, and particle physics, but never in the SHM context. The approach considerably reduces the computational burden, although the extent of the considered uncertainties is lower (second-order effects are neglected). To successfully identify the considered uncertainties, the formulation was extended to consider multiple responses. The efficiency of the algorithm has been tested on a small-scale aluminium bridge structure subjected to thermal expansion due to infrared heaters. A comparison of its performance with responses measured at different points of the structure, and the associated degrees of identifiability, is also carried out. A numerical FEM model of the structure was developed, and the stiffness of its supports is considered as a parameter to calibrate. Results show that the modular Bayesian approach performed best when responses of the same type had the lowest spatial correlation. Based on previous literature, using different types of responses (strain, acceleration, and displacement) should also improve identifiability. Uncertainties due to parametric variability, observation error, residual variability, code variability, and systematic uncertainty were all recovered. For this example, the algorithm's performance was stable and considerably quicker than Bayesian methods that account for the full extent of the uncertainties. Future research with real-life examples is required to fully assess the advantages and limitations of the proposed methodology.
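
The sketch below is a much-simplified two-stage illustration of the surrogate-plus-discrepancy idea only (it does not reproduce the four-stage modular Bayesian estimation or the multiple-response extension); the data, kernels, and discrepancy shape are placeholders.

```python
# Simplified sketch: a Gaussian-process emulator of the numerical model plus a
# second Gaussian process for the model/experiment discrepancy, fitted sequentially.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Stage 1: emulate the numerical (e.g. FEM) model output over its input x.
x_sim = rng.uniform(0.0, 1.0, size=(30, 1))
y_sim = np.sin(6 * x_sim).ravel()                     # stand-in for model runs
emulator = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(x_sim, y_sim)

# Stage 2: fit the systematic discrepancy to residuals between measurements and
# the emulator (observation error is absorbed by the WhiteKernel).
x_obs = rng.uniform(0.0, 1.0, size=(15, 1))
y_obs = np.sin(6 * x_obs).ravel() + 0.3 * x_obs.ravel() + 0.05 * rng.normal(size=15)
residual = y_obs - emulator.predict(x_obs)
discrepancy = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(x_obs, residual)

x_new = np.array([[0.5]])
corrected = emulator.predict(x_new) + discrepancy.predict(x_new)
print("prediction corrected for systematic uncertainty:", corrected[0])
```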

Keywords: bayesian, calibration, numerical model, system identification, systematic uncertainty, Gaussian process

Procedia PDF Downloads 316
14078 Analysis of Effect of Microfinance on the Profit Level of Small and Medium Scale Enterprises in Lagos State, Nigeria

Authors: Saheed Olakunle Sanusi, Israel Ajibade Adedeji

Abstract:

The study analysed the effect of microfinance on the profit level of small and medium scale enterprises in Lagos. The data for the study were obtained by simple random sampling, and a total of one hundred and fifty (150) small and medium scale enterprises (SMEs) were sampled, seventy-five (75) each of microfinance users and non-users. Data were analysed using descriptive statistics, a logit model, a t-test, and ordinary least squares (OLS) regression. The mean profit of the enterprises using microfinance is ₦16.8m, while that of the non-users is ₦5.9m, and the difference is statistically significant. The logit model specified for the determinants of access to microfinance showed that three of the specified variables, educational status of the enterprise head, credit utilisation, and volume of business investment, are significant at P < 0.01. Enterprises with many years of experience, highly educated enterprise heads, and a high volume of business investment have greater potential access to microfinance. The OLS regression model indicated that three parameters, namely number of school years, volume of business investment, and (dummy) participation in microfinance, were significant at P < 0.05; these variables are therefore significant determinants of the impact of microfinance on profit level in the study area. The study therefore concludes and recommends that, to improve the status of small and medium scale enterprises and increase their profit, the full benefit of access to microfinance should be enhanced through investment in social infrastructure and human capital development. Also, concerted efforts should be made to encourage non-users of microfinance among SMEs to use it in order to boost their profit.
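
The two model specifications described above are sketched below on synthetic placeholder data (the actual survey data are not available here, and the variable names and coefficients are assumptions).

```python
# Sketch of the logit (access to microfinance) and OLS (profit) specifications.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 150
df = pd.DataFrame({
    "school_years": rng.integers(0, 16, n),
    "investment_m": rng.lognormal(mean=1.0, sigma=0.8, size=n),  # investment, ₦ millions
    "credit_use": rng.integers(0, 2, n),
})
# Placeholder outcomes generated for illustration only.
df["microfinance"] = (rng.random(n) < 0.5).astype(int)
df["profit_m"] = (2.0 + 0.8 * df["investment_m"] + 0.2 * df["school_years"]
                  + 8.0 * df["microfinance"] + rng.normal(0, 2.0, n))

# Logit: determinants of access to microfinance.
X_logit = sm.add_constant(df[["school_years", "credit_use", "investment_m"]])
logit_res = sm.Logit(df["microfinance"], X_logit).fit(disp=0)
print(logit_res.params)

# OLS: effect of microfinance participation (dummy) on profit.
X_ols = sm.add_constant(df[["school_years", "investment_m", "microfinance"]])
ols_res = sm.OLS(df["profit_m"], X_ols).fit()
print(ols_res.summary().tables[1])
```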

Keywords: credit utilisation, logit model, microfinance, small and medium enterprises

Procedia PDF Downloads 189