Search results for: Similarity measurement
1402 Characterization of a Hypoeutectic Al Alloy Obtained by Selective Laser Melting
Authors: Jairo A. Muñoz, Alexander Komissarov, Alexander Gromov
Abstract:
In this investigation, a hypoeutectic AlSi11Cu alloy was printed. The alloy was obtained in powder form with an average particle size of 40 µm. Bars 20 mm in diameter and 100 mm in length were printed with the building direction parallel to the bars' longitudinal direction. The microstructural characterization revealed an Al matrix surrounded by a Si network forming a coral-like pattern. The microstructure of the alloy was heterogeneous, with a mixture of columnar and equiaxed grains. Likewise, the texture indicated that the columnar grains were preferentially oriented towards the building direction, while the equiaxed grains followed a texture dominated by the cube component. On the other hand, the as-printed material showed higher strength than the same alloy obtained by conventional processes such as casting. In addition, strength and ductility differences were found in the printed material depending on the measurement direction. The highest values were obtained in the radial direction (565 MPa maximum strength and 4.8% elongation to failure), and the lowest in the transverse direction (508 MPa maximum strength and 3.2% elongation to failure), which corroborates the material's anisotropy.
Keywords: additive manufacturing, aluminium alloy, melting pools, tensile test
Procedia PDF Downloads 153
1401 Enhancing Dents through Lean Six Sigma
Authors: Prateek Guleria, Shubham Sharma, Rakesh Kumar Shukla, Harshit Sharma
Abstract:
Performance measurement of small and medium-sized businesses is a primary need for all companies that want to survive and thrive in a dynamic global economy. A structured, systematic, and integrated organization increases employee reliability, sustainability, and loyalty. This paper is a case study of a gear manufacturing industry that was facing rejections due to dents and damage in gears. The DMAIC cycle was applied, and the tools used in the research work include SIPOC (Suppliers, Inputs, Process, Outputs, Customers), Pareto analysis, root cause analysis, and FMEA (Failure Mode and Effects Analysis). The six sigma level was improved from 3.46 to 4.06, and the rejection rate was reduced from 7.44% to 1.56%. These findings highlight the influence of a Lean Six Sigma module in the gear manufacturing unit, which has already increased operational quality and continuity to increase market success and meet customer expectations. According to the findings, applying Lean Six Sigma tools results in increased productivity. The results could assist businesses in deciding which quality tools are likely to improve efficiency and competitiveness and reduce expense.
Keywords: six sigma, DMAIC, SIPOC, failure mode, effect analysis
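The sigma levels and rejection rates quoted above are linked by a standard conversion: apply the inverse normal CDF to the process yield and add the conventional 1.5-sigma shift. A minimal sketch of that conversion, using only the rejection rates from the abstract (the shift value and the formula are standard Six Sigma conventions, not taken from this paper):

```python
from statistics import NormalDist

def sigma_level(defect_rate: float, shift: float = 1.5) -> float:
    """Convert a long-term defect rate to a short-term sigma level,
    applying the conventional 1.5-sigma shift."""
    return NormalDist().inv_cdf(1.0 - defect_rate) + shift

before = sigma_level(0.0744)  # 7.44% rejection before the DMAIC cycle
after = sigma_level(0.0156)   # 1.56% rejection after
print(f"before: {before:.2f} sigma, after: {after:.2f} sigma")
```

The improvement in sigma level follows directly from the drop in rejection rate; the exact values depend on the shift convention used.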
Procedia PDF Downloads 112
1400 Heteroscedastic Parametric and Semiparametric Smooth Coefficient Stochastic Frontier Application to Technical Efficiency Measurement
Authors: Rebecca Owusu Coffie, Atakelty Hailu
Abstract:
Variants of production frontier models have emerged; however, only a limited number of them are applied in empirical research, so the effects of these alternative frontier models are not well understood, particularly within sub-Saharan Africa. In this paper, we apply recent advances in production frontier modeling to examine levels of technical efficiency and efficiency drivers. Specifically, we compare the heteroscedastic parametric and the semiparametric stochastic smooth coefficient (SPSC) models. Using rice production data from Ghana, our empirical estimates reveal that alternative specifications of efficiency estimators result in either downward or upward bias in the technical efficiency estimates. Methodologically, we find that the SPSC model is more suitable and generates higher efficiency estimates. Within the parametric framework, we find that parameterization of both the mean and variance of the pre-truncated function gives the best model. For the drivers of technical efficiency, we observed that longer farm distances increase inefficiency through a reduction in labor productivity. High soil quality, however, increases productivity through increased land productivity.
Keywords: pre-truncated, rice production, smooth coefficient, technical efficiency
Procedia PDF Downloads 443
1399 Pseudo Modal Operating Deflection Shape Based Estimation Technique of Mode Shape Using Time History Modal Assurance Criterion
Authors: Doyoung Kim, Hyo Seon Park
Abstract:
Studies of System Identification (SI) based on Structural Health Monitoring (SHM) have been actively conducted for structural safety. Recently, SI techniques have been rapidly developed within the output-only SI paradigm for estimating modal parameters. Output-only SI methods such as Frequency Domain Decomposition (FDD) and Stochastic Subspace Identification (SSI) use algorithms based on orthogonal decomposition, such as singular value decomposition (SVD). However, the SVD leads to a high level of computational complexity when estimating modal parameters. This paper proposes a technique to estimate mode shapes at lower computational cost. The technique obtains pseudo modal Operating Deflection Shapes (ODS) through a bandpass filter and introduces a time history Modal Assurance Criterion (MAC). Finally, mode shapes are estimated from the pseudo modal ODS and the time history MAC. Analytical simulations of vibration measurement were performed, and the resulting mode shapes and computation times were compared between a representative SI method and the proposed method.
Keywords: modal assurance criterion, mode shape, operating deflection shape, system identification
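The modal assurance criterion at the core of the proposed method is a normalized squared inner product between two mode shape vectors. A minimal sketch of the classical MAC formula (the example vectors are hypothetical, not from this study):

```python
def mac(phi1, phi2):
    """Modal Assurance Criterion between two real mode shape vectors:
    MAC = |phi1 . phi2|^2 / ((phi1 . phi1) * (phi2 . phi2))."""
    dot = sum(a * b for a, b in zip(phi1, phi2))
    return dot * dot / (sum(a * a for a in phi1) * sum(b * b for b in phi2))

mode = [0.0, 0.38, 0.71, 0.92, 1.0]   # reference mode shape
scaled = [2.0 * x for x in mode]      # same shape, different scaling
print(mac(mode, scaled))              # 1.0: identical shapes up to scaling
```

A MAC of 1 indicates the same shape up to scaling, and values near 0 indicate orthogonal shapes, which is why it serves as a similarity measure between estimated and reference modes.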
Procedia PDF Downloads 407
1398 Analysis of Vocal Pathologies Through Subglottic Pressure Measurement
Authors: Perla Elizabeth Jimarez Rocha, Carolina Daniela Tejeda Franco, Arturo Minor Martínez, Annel Gomez Coello
Abstract:
One of the biggest problems in developing new therapies for the management and treatment of voice disorders is the difficulty of objectively evaluating the results of each treatment. A system was proposed that captures and records voice signals and analyzes vocal quality (fundamental frequency, zero crossings, energy, and amplitude spectrum) as well as subglottic pressure (cm H2O) during sustained phonation of the vowel /a/; a recording system is implemented, together with an interactive system that records subglottic pressure information. In Mexico City, a group of 31 patients with phoniatric pathology was studied; non-invasive tests were performed for the most common vocal pathologies (nodules, polyps, irritative laryngitis, ventricular dysphonia, laryngeal cancer, dysphonia, and dysphagia). The most common pathology was irritative laryngitis (32%), followed by vocal fold paralysis (unilateral and bilateral, 19.4%). Men and women were considered separately in the pathological groups because of physiological differences: they were separated by gender owing to differences in the morphology of the respiratory tract.
Keywords: amplitude spectrum, energy, fundamental frequency, subglottic pressure, zero crossings
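Two of the vocal-quality features named above, zero crossings and energy, can be computed directly from a sampled frame. A minimal sketch using a synthetic 220 Hz tone in place of a recorded /a/ (the tone, sampling rate, and one-second framing are illustrative assumptions, not the system's actual parameters):

```python
import math

def frame_features(signal):
    """Zero-crossing count and short-term energy for one frame of samples."""
    zero_crossings = sum(
        1 for a, b in zip(signal, signal[1:]) if (a >= 0) != (b >= 0)
    )
    energy = sum(s * s for s in signal)
    return zero_crossings, energy

# Hypothetical sustained /a/ approximated by a 220 Hz sine at 8 kHz sampling.
sr = 8000
tone = [math.sin(2 * math.pi * 220 * n / sr) for n in range(sr)]  # 1 second
zc, e = frame_features(tone)
print(zc, e)  # roughly 440 crossings per second for a 220 Hz tone
```

For a pure tone, the zero-crossing rate is about twice the fundamental frequency, which is why it is a cheap proxy for pitch-related behavior.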
Procedia PDF Downloads 119
1397 Expert Supporting System for Diagnosing Lymphoid Neoplasms Using Probabilistic Decision Tree Algorithm and Immunohistochemistry Profile Database
Authors: Yosep Chong, Yejin Kim, Jingyun Choi, Hwanjo Yu, Eun Jung Lee, Chang Suk Kang
Abstract:
For the past decades, immunohistochemistry (IHC) has played an important role in the diagnosis of human neoplasms by helping pathologists make clearer decisions on differential diagnosis, subtyping, personalized treatment planning, and finally prognosis prediction. However, the IHC performed on the various tumors of daily practice often shows conflicting results that are very challenging to interpret. Even a comprehensive diagnosis synthesizing clinical, histologic, and immunohistochemical findings can be helpless in some twisted cases. Another important issue is that IHC data are increasing exponentially, and more and more information has to be taken into account. For these reasons, we set out to develop an expert supporting system to help pathologists make better decisions when diagnosing human neoplasms with IHC results. We devised a probabilistic decision tree algorithm and tested it with real case data on lymphoid neoplasms, for which the IHC profile is more important for a proper diagnosis than in other human neoplasms. We designed the probabilistic decision tree based on Bayes' theorem, programmed the computational process in MATLAB (The MathWorks, Inc., USA), and prepared an IHC profile database (about 104 disease categories and 88 IHC antibodies) based on the WHO classification by reviewing the literature. The initial probability of each neoplasm was set using epidemiologic data on lymphoid neoplasms in Korea. With the IHC results of 131 sequentially selected patients, the top three presumptive diagnoses for each case were generated and compared with the original diagnoses. After review of the data, 124 of the 131 cases were used for the final analysis. As a result, the presumptive diagnoses were concordant with the original diagnoses in 118 cases (93.7%). The major reason for the discordant cases was the similarity of the IHC profiles between two or three different neoplasms.
The expert supporting system algorithm presented in this study is in its elementary stage and needs further optimization using more advanced technology, such as deep learning with real case data, especially for differentiating T-cell lymphomas. Although it needs more refinement, it may be used to aid pathological decision making in the future. A further application to determine IHC antibodies for a certain subset of differential diagnoses might be possible in the near future.
Keywords: database, expert supporting system, immunohistochemistry, probabilistic decision tree
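The Bayes-theorem update behind such a decision tree can be sketched as follows: each diagnosis's prior is multiplied by the likelihood of the observed IHC results, then normalized, and the top-ranked diagnoses are reported. All priors, marker names, and positivity rates below are invented placeholders, not values from the study's database:

```python
# Hypothetical priors and marker positivity rates; real values would come
# from the epidemiologic data and the WHO-based IHC profile database.
priors = {"DLBCL": 0.40, "Follicular lymphoma": 0.25, "MALT lymphoma": 0.35}
positivity = {  # P(marker positive | diagnosis)
    "CD20": {"DLBCL": 0.95, "Follicular lymphoma": 0.95, "MALT lymphoma": 0.90},
    "BCL2": {"DLBCL": 0.50, "Follicular lymphoma": 0.90, "MALT lymphoma": 0.60},
    "CD10": {"DLBCL": 0.40, "Follicular lymphoma": 0.85, "MALT lymphoma": 0.05},
}

def posterior(ihc_results, priors, positivity):
    """Bayes update: P(D | markers) is proportional to
    P(D) * product of P(marker result | D), assuming marker independence."""
    scores = {}
    for d, p in priors.items():
        like = p
        for marker, is_pos in ihc_results.items():
            rate = positivity[marker][d]
            like *= rate if is_pos else (1.0 - rate)
        scores[d] = like
    total = sum(scores.values())
    return {d: s / total for d, s in scores.items()}

post = posterior({"CD20": True, "BCL2": True, "CD10": True}, priors, positivity)
top3 = sorted(post, key=post.get, reverse=True)[:3]
print(top3)
```

The conditional-independence assumption between markers is a simplification; the study's tree may condition markers on one another.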
Procedia PDF Downloads 223
1396 Geometric Contrast of a 3D Model Obtained by Means of Digital Photogrammetry with a Quasimetric Camera on UAV vs. Classical Methods
Authors: Julio Manuel de Luis Ruiz, Javier Sedano Cibrián, Rubén Pérez Álvarez, Raúl Pereda García, Cristina Diego Soroa
Abstract:
Nowadays, the use of drones has been extended to practically every human activity. One of the main applications is in the surveying field. In this regard, software programs that process the images captured by the drone's sensor almost automatically have been developed and commercialized, but they only allow contrasting the results through control points. This work proposes the contrast of a 3D model obtained from a flight with a drone and a non-metric camera (chosen for its low cost) against a second model obtained by means of the historically endorsed classical methods. In addition, the contrast is developed over terrain with significant unevenness, so as to test the model generated with photogrammetry, considering that photogrammetry with drones encounters greater accuracy difficulties in this kind of situation. Distances, heights, surfaces, and volumes are measured on the basis of the 3D models generated, and the results are contrasted. The differences are about 0.2% for the measurement of distances and heights, 0.3% for surfaces, and 0.6% for volumes. Although these differences are small, they do not reach the order of magnitude of accuracy claimed by the software vendors.
Keywords: accuracy, classical topography, three-dimensional model, photogrammetry, UAV
Procedia PDF Downloads 132
1395 Losing Benefits from Social Network Sites Usage: An Approach to Estimate the Relationship between Social Network Sites Usage and Social Capital
Authors: Maoxin Ye
Abstract:
This study examines the relationship between social network sites (SNS) usage and social capital. Because SNS usage can expand users' networks, and people connected in these networks may become resources to SNS users and give them an advantage in some situations, it is important to estimate the relationship between SNS usage and 'who' is connected, or what resources SNS users can access. Additionally, 'who' can be divided into two aspects: people who hold high positions and people who are different from the user. Hence, it is important to estimate the relationship between SNS usage and connections to both high-position people and different people. This study adopts Lin's definition of social capital and the position generator measurement, which tells us who is connected and can be divided into the same two aspects. A national dataset from the United States (N = 2,255) collected by the Pew Research Center is used for a general regression analysis of SNS usage and social capital. The results indicate that SNS usage is negatively associated with each factor of social capital, suggesting that, compared with non-users, although SNS users have more connections, the variety and resources of these connections are fewer. For this reason, users could lose benefits through SNS usage.
Keywords: social network sites, social capital, position generator, general regression
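Lin's position generator yields simple access measures, commonly extensity (how many listed positions the respondent reaches), upper reachability (the highest prestige reached), and range. A minimal sketch assuming a binary "knows someone in this occupation" response; the occupations and prestige scores below are illustrative, not the instrument used in the study:

```python
# Hypothetical position-generator responses: occupation -> (prestige score,
# respondent knows someone in it). Scores are illustrative placeholders.
contacts = {
    "physician": (86, True),
    "lawyer": (75, False),
    "teacher": (66, True),
    "police officer": (55, True),
    "hairdresser": (36, False),
    "janitor": (22, True),
}

def position_generator_measures(contacts):
    """Three common access measures: extensity (number of positions
    reached), upper reachability (highest prestige reached), and range
    (highest minus lowest prestige reached)."""
    reached = [score for score, known in contacts.values() if known]
    extensity = len(reached)
    upper = max(reached)
    rng = upper - min(reached)
    return extensity, upper, rng

print(position_generator_measures(contacts))  # (4, 86, 64)
```

These three measures would then serve as the dependent variables in the regression against SNS usage.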
Procedia PDF Downloads 262
1394 Turbulence Measurement Over Rough and Smooth Bed in Open Channel Flow
Authors: Kirti Singh, Kesheo Prasad
Abstract:
A 3D acoustic Doppler velocimeter (ADV) was used in the current investigation to quantify the mean and turbulence characteristics in non-uniform open-channel flows. Results were obtained from laboratory studies analysing the behavior of sand particles under turbulent open-channel flow over rough, porous beds. Data obtained from the ADV are used to calculate turbulent flow characteristics, Reynolds stresses, and turbulent kinetic energy. Theoretical formulations for the distributions of Reynolds stress and vertical velocity have been constructed using the Reynolds equation and the continuity equation of 2D open-channel flow. The measured Reynolds stress profile and vertical velocity are comparable with the derived expressions. This study uses the Navier-Stokes equations to analyse the behavior of the vertical velocity profile in the dominant region of fully developed turbulent flows in open channels, and it gives a new derivation of the profile. For both wide and narrow open channels, this derivation can estimate the time-averaged primary velocity in the outer region of the turbulent boundary layer.
Keywords: turbulence, bed roughness, logarithmic law, shear stress correlations, ADV, Reynolds shear stress
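The logarithmic law mentioned in the keywords gives the time-averaged streamwise velocity in terms of the shear velocity and a roughness length. A minimal sketch of that standard law of the wall (the shear velocity and roughness values are illustrative, not measurements from this study):

```python
import math

KAPPA = 0.41  # von Karman constant

def log_law_velocity(z, u_star, z0):
    """Time-averaged streamwise velocity from the logarithmic law of the
    wall: u(z) = (u*/kappa) * ln(z / z0), valid for z > z0."""
    return (u_star / KAPPA) * math.log(z / z0)

# Illustrative values: shear velocity 0.05 m/s, roughness length 0.5 mm.
u_star, z0 = 0.05, 0.0005
profile = [(z, log_law_velocity(z, u_star, z0)) for z in (0.01, 0.05, 0.10)]
for z, u in profile:
    print(f"z = {z:.2f} m  u = {u:.3f} m/s")
```

A rougher bed (larger z0) lowers the whole profile at a given shear velocity, which is the behavior the measured rough-bed profiles are compared against.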
Procedia PDF Downloads 106
1393 Factors Affecting the Results of in vitro Gas Production Technique
Authors: O. Kahraman, M. S. Alatas, O. B. Citil
Abstract:
In determining the values of feeds used in ruminant nutrition, different methods are employed, such as in vivo, in vitro, in situ, or in sacco. Generally, the most reliable results are obtained from in vivo studies. However, because of disadvantages such as being difficult, laborious, expensive, and time-consuming, requiring many samples, and making it hard to keep the experimental conditions under control, in vitro techniques are preferred. The most widely used in vitro techniques are the two-stage digestion technique and the gas production technique. The in vitro gas production technique is based on the measurement of the CO2 released as a result of microbial fermentation of the feeds. In this review, the factors affecting the results obtained from the in vitro gas production technique (Hohenheim Feed Test) are discussed. These factors must be taken into consideration when interpreting the findings obtained in such studies and when comparing the findings reported by different researchers for the same feeds. They are discussed in three groups: factors related to the animal, factors related to the feeds, and factors related to differences in the application of the method, and their effects on the results are explained. It can be concluded that the routine use of the in vitro gas production technique in feed evaluation can contribute to comprehensive feed evaluation, but standardization is needed to attain more reliable results.
Keywords: in vitro, gas production technique, Hohenheim feed test, standardization
Procedia PDF Downloads 596
1392 Structuring Paraphrases: The Impact Sentence Complexity Has on Key Leader Engagements
Authors: Meaghan Bowman
Abstract:
Soldiers are taught about the importance of effective communication with repetition of the phrase, "Communication is key." They receive training in preparing for, and carrying out, interactions with foreign and domestic leaders to gain crucial information about a mission. These interactions are known as Key Leader Engagements (KLEs). For the training of KLEs, doctrine mandates the skills needed to conduct these engagements, such as how to behave appropriately, identify key leaders, and employ effective strategies. Army officers in training learn how to approach leaders, what information to gain, and how to ask questions respectfully. Unfortunately, soldiers rarely learn how to formulate questions optimally. Since less complex questions are easier to understand, we hypothesize that semantic complexity affects content understanding, and that age and education level may affect one's ability to form paraphrases and judge their quality. In this study, we looked at paraphrases of queries as well as judgments of both the paraphrases' naturalness and their semantic similarity to the query. Queries were divided into three complexity categories based on the number of relations and the number of knowledge graph edges. Two crowd-sourced tasks were completed by Amazon volunteer participants, also known as turkers, to answer the research questions: (i) are more complex queries harder to paraphrase and judge, and (ii) do age and education level affect the ability to understand complex queries? We ran statistical tests as follows: a MANOVA for query understanding and a two-way ANOVA to examine the relationship between query complexity and education and age. A probe of the number of given-level queries selected for paraphrasing by crowd-sourced workers in seven age ranges yielded promising results. We found significant evidence that age plays a role and marginally significant evidence that education level plays a role.
These preliminary tests, with p-values of 0.0002 and 0.068, respectively, suggest the importance of content understanding in a communication skill set. This basic ability to communicate, which may differ by age and education, permits reproduction and quality assessment and is crucial in training soldiers for effective participation in KLEs.
Keywords: engagement, key leader, paraphrasing, query complexity, understanding
Procedia PDF Downloads 160
1391 Case-Based Reasoning Application to Predict Geological Features at Site C Dam Construction Project
Authors: Shahnam Behnam Malekzadeh, Ian Kerr, Tyson Kaempffer, Teague Harper, Andrew Watson
Abstract:
The Site C hydroelectric dam is currently being constructed in north-eastern British Columbia on sub-horizontal sedimentary strata that dip approximately 15 meters from one bank of the Peace River to the other. More than 615 pressure sensors (vibrating wire piezometers) have been installed on bedding planes (BPs) since construction began, with over 80 more planned before project completion. These pressure measurements are essential for monitoring the stability of the rock foundation during and after construction and for dam safety purposes. BPs are identified by their clay gouge infilling, which varies in thickness from less than 1 mm to 20 mm and can be challenging to identify, as the core drilling process often disturbs or washes away the gouge material. Without depth predictions from nearby boreholes, stratigraphic markers, and downhole geophysical data, it is difficult to confidently identify BP targets for the sensors. In this paper, a case-based reasoning (CBR) method was used to develop an empirical model called the Bedding Plane Elevation Prediction (BPEP) to help geologists and geotechnical engineers predict geological features and bedding planes at new locations quickly and accurately. To develop the CBR model, a database was built from 64 pressure sensors already installed on key bedding planes BP25, BP28, and BP31 on the right bank, including bedding plane elevations and coordinates. Thirteen (20%) of the most recent cases were selected to validate and evaluate the accuracy of the developed model, with similarity defined as the distance between previous cases and new cases when predicting the depth of significant BPs. The average difference between actual and predicted BP elevations for the above BPs was ±55 cm; 69% of predicted elevations were within ±79 cm of actual BP elevations, and 100% of predicted elevations for new cases were within ±99 cm.
Eventually, the actual results will be used to extend the database and improve BPEP so that it performs as a learning machine to predict more accurate BP elevations for future sensor installations.
Keywords: case-based reasoning, geological feature, geology, piezometer, pressure sensor, core logging, dam construction
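The core of the CBR prediction, similarity as distance to previous cases, can be sketched as an inverse-distance-weighted average over the nearest cases. The coordinates, elevations, and choice of k below are illustrative placeholders, not project data or the exact BPEP formulation:

```python
import math

# Hypothetical case base: (easting, northing, BP elevation) for one bedding
# plane; all numbers are illustrative, not Site C data.
cases = [
    (1000.0, 2000.0, 412.3),
    (1050.0, 2010.0, 411.8),
    (1100.0, 1990.0, 410.9),
    (1020.0, 2060.0, 412.0),
]

def predict_elevation(x, y, cases, k=3):
    """Case-based estimate: similarity is inverse horizontal distance to
    previous cases; the prediction is the distance-weighted mean of the
    k most similar cases."""
    ranked = sorted(cases, key=lambda c: math.hypot(c[0] - x, c[1] - y))
    nearest = ranked[:k]
    weights = [1.0 / (math.hypot(cx - x, cy - y) + 1e-9) for cx, cy, _ in nearest]
    return sum(w * z for w, (_, _, z) in zip(weights, nearest)) / sum(weights)

print(round(predict_elevation(1040.0, 2015.0, cases), 2))
```

A query that coincides with an existing case returns (essentially) that case's elevation, which matches the CBR intuition that identical past cases dominate the prediction.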
Procedia PDF Downloads 78
1390 Use of Computer and Machine Learning in Facial Recognition
Authors: Neha Singh, Ananya Arora
Abstract:
Facial expression measurement plays a crucial role in the identification of emotion. Facial expression is key in psychophysiology, studies of neural bases, and emotional disorders, to name a few areas. The Facial Action Coding System (FACS) has proven to be the most efficient and widely used of the various systems for describing facial expressions. Coders can manually code facial expressions with FACS and, by viewing video-recorded facial behaviour at a specified frame rate and in slow motion, decompose them into action units (AUs). Action units are the smallest visually discriminable facial movements. FACS explicitly differentiates between facial actions and inferences about what the actions mean. Action units are the fundamental unit of the FACS methodology. It is regarded as the standard measure for facial behaviour and finds application in various fields of study beyond emotion science, including facial neuromuscular disorders, neuroscience, computer vision, computer graphics and animation, and face encoding for digital processing. This paper discusses the conceptual basis for FACS, a numerical listing of the discrete facial movements identified by the system, the system's psychometric evaluation, and the software's recommended training requirements.
Keywords: facial action, action units, coding, machine learning
Procedia PDF Downloads 104
1389 Quantum Decision Making with Small Sample for Network Monitoring and Control
Authors: Tatsuya Otoshi, Masayuki Murata
Abstract:
With the development and diversification of applications on the Internet, applications that require high responsiveness, such as video streaming, are becoming mainstream. Application responsiveness is not only a matter of communication delay but also of the time required to grasp changes in network conditions. The tradeoff between accuracy and measurement time is a challenge in network control. People make countless decisions all the time, and our decisions seem to resolve tradeoffs between time and accuracy: when making decisions, people are known to make appropriate choices based on relatively small samples. Although there have been various studies on models of human decision-making, a model that integrates various cognitive biases, called "quantum decision making," has recently attracted much attention. However, modeling with small samples has not been examined much so far. In this paper, we extend the quantum decision-making model to decision-making with a small sample. In the proposed model, the state is updated by value-based probability amplitude amplification. By analytically obtaining a lower bound on the number of samples required for decision-making, we show that decision-making with a small number of samples is feasible.
Keywords: quantum decision making, small sample, MPEG-DASH, Grover's algorithm
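Grover-style amplitude amplification, listed in the keywords, boosts the probability of a target outcome to sin²((2k+1)θ) after k rounds, with sin θ = 1/√N over N equally weighted states. A minimal numeric sketch of that textbook formula (the state count is arbitrary, and this is the plain Grover case, not the paper's value-based variant):

```python
import math

def success_probability(n_states, n_iterations):
    """Probability of measuring the target state after k rounds of Grover
    amplitude amplification over N equally weighted states:
    P = sin^2((2k + 1) * theta), with sin(theta) = 1/sqrt(N)."""
    theta = math.asin(1.0 / math.sqrt(n_states))
    return math.sin((2 * n_iterations + 1) * theta) ** 2

N = 64
# Optimal iteration count is roughly pi / (4 * theta) - 1/2.
best_k = round(math.pi / (4 * math.asin(1 / math.sqrt(N))) - 0.5)
print(best_k, success_probability(N, best_k))
```

With no iterations the success probability is just 1/N; after the optimal number of rounds (about √N of them) it approaches 1, which is the quadratic speedup the keyword refers to.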
Procedia PDF Downloads 78
1388 From Data Processing to Experimental Design and Back Again: A Parameter Identification Problem Based on FRAP Images
Authors: Stepan Papacek, Jiri Jablonsky, Radek Kana, Ctirad Matonoha, Stefan Kindermann
Abstract:
FRAP (Fluorescence Recovery After Photobleaching) is a widely used measurement technique for determining the mobility of fluorescent molecules within living cells. While the experimental setup and protocol for FRAP experiments are usually fixed, the data processing part is still under development. In this paper, we formulate and solve the problem of data selection, which enhances the processing of FRAP images. We introduce the concept of the irrelevant data set, i.e., data which barely reduce the confidence interval of the estimated parameters and thus can be neglected. Based on sensitivity analysis, we both solve the problem of optimal data space selection and find specific conditions for optimizing an important experimental design factor, e.g., the radius of the bleach spot. Finally, a theorem is proven showing that the integrated data approach is less precise than the full data case; i.e., we claim that the data set represented by the FRAP recovery curve leads to a larger confidence interval than the spatio-temporal (full) data.
Keywords: FRAP, inverse problem, parameter identification, sensitivity analysis, optimal experimental design
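The irrelevant-data idea can be sketched on a toy model: where the sensitivity of the recovery curve to a parameter is near zero, those data points barely tighten that parameter's confidence interval. The single-exponential recovery model, parameter values, and threshold below are illustrative assumptions, not the paper's setup:

```python
import math

A, tau = 1.0, 2.0  # illustrative recovery amplitude and time constant

def recovery(t):
    """Toy single-exponential FRAP recovery model (an assumption here)."""
    return A * (1.0 - math.exp(-t / tau))

def sensitivity(t):
    """d recovery / d tau: points where this is near zero barely reduce
    the confidence interval of tau -- the 'irrelevant data set'."""
    return -A * t / tau**2 * math.exp(-t / tau)

times = [0.5 * i for i in range(1, 41)]  # sample times from 0.5 to 20
threshold = 0.01
relevant = [t for t in times if abs(sensitivity(t)) >= threshold]
half_time = min(t for t in times if recovery(t) >= 0.5 * A)
print(len(times), len(relevant), half_time)
```

Late-time samples, where the curve has plateaued, fall below the sensitivity threshold and drop out, mirroring the paper's data-selection argument.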
Procedia PDF Downloads 276
1387 Ceratocystis manginecans, Causal Agent of a Destructive Mango Disease in Pakistan
Authors: Asma Rashid, Shazia Iram, Iftikhar Ahmad
Abstract:
Mango sudden death is an emerging problem in Pakistan; its prevalence is observed in almost all mango growing areas, with severity of 2-5% in Punjab and 5-10% in Sindh. Symptoms on affected trees include bark splitting, discoloration of the vascular tissue, wilting, gummosis, and ultimately rapid death. A total of n = 45 isolates were obtained from different mango growing areas of Punjab and Sindh. The pathogenicity of these fungal isolates was tested through an artificial inoculation method on different hosts (potato tubers, detached mango leaves, detached mango twigs, and mango plants) under controlled conditions, and all proved pathogenic, with varying degrees of aggressiveness relative to the control. The findings of the present study showed that, of these four methods, potato tuber inoculation was the most suitable, as it fixes the inoculum at the target site. Increased fungal growth and spore numbers may be due to the soft tissue of potato tubers, through which Ceratocystis isolates can easily pass. The lesion area on potato tubers ranged from 0.14 to 7.09 cm², followed by detached mango twigs (0.09 to 0.48 cm²). All pathological results proved highly significant at P < 0.05 through ANOVA; isolate-to-isolate differences were non-significant, although the isolates had a positive effect on lesion area. Re-isolation of the respective fungi was achieved with 100 percent success, verifying Koch's postulates. DNA of the fungal pathogens was successfully extracted by the phenol-chloroform method. Amplification was performed with ITS, β-tubulin gene, and Transcription Elongation Factor (EF1-α) gene primers; the amplified amplicons were sequenced and compared against NCBI, showing 99-100% similarity with Ceratocystis manginecans. The fungus Ceratocystis manginecans formed one strongly supported sub-clade in the phylogenetic tree.
Results obtained through this work will support establishing the relation of isolates with their regions and will provide information about the pathogenicity level of isolates, which would be useful for developing management policies to reduce losses in orchards caused by mango sudden death.
Keywords: artificial inoculation, mango, Ceratocystis manginecans, phylogenetic, screening
Procedia PDF Downloads 244
1386 Intensification of Heat Transfer in Magnetically Assisted Reactor
Authors: Dawid Sołoducha, Tomasz Borowski, Marian Kordas, Rafał Rakoczy
Abstract:
The magnetic field has become an important part of many studies in the past few years. A magnetic field (MF) may be used to affect a process in many ways; for example, it can be used as a factor to stabilize the system. We can use an MF to steer an operation, to activate or inhibit a process, or even to affect the vital activity of microorganisms. Using any type of magnetic field generator is always connected with the delivery of some heat to the system. Heat transfer is a very important phenomenon; it can influence a process positively or negatively, so it is necessary to measure the heat stream transferred from the place of generation and prevent negative influence on the operation. The aim of the presented work was to apply various types of magnetic fields and to measure heat transfer phenomena. The results were obtained by continuous measurement at several measuring points with temperature probes and compiled in the form of temperature profiles. The study investigated undetermined heat transfer in a custom system equipped with a magnetic field generator. Experimental investigations are provided to explain the influence of various types of magnetic fields on the heat transfer process. The tested processes are described by means of criteria which define heat transfer intensification under the action of a magnetic field.
Keywords: heat transfer, magnetic field, undetermined heat transfer, temperature profile
Procedia PDF Downloads 194
1385 On Estimating the Low Income Proportion with Several Auxiliary Variables
Authors: Juan F. Muñoz-Rosas, Rosa M. García-Fernández, Encarnación Álvarez-Verdejo, Pablo J. Moya-Fernández
Abstract:
Poverty measurement is a very important topic in many studies in the social sciences. One of the most important indicators in measuring poverty is the low income proportion, which gives the proportion of people in a population classified as poor. This indicator is generally unknown and is therefore estimated using survey data obtained from official surveys carried out by statistical agencies such as Eurostat. The main feature of such survey data is that they contain several variables. The variable used to estimate the low income proportion is called the variable of interest. The survey data may also contain additional variables, known as auxiliary variables, related to the variable of interest; when available, they can be used to improve the estimation of the low income proportion. In this paper, we use Monte Carlo simulation studies to analyze numerically the performance of estimators based on several auxiliary variables. The simulation study considered real data sets obtained from the 2011 European Union Statistics on Income and Living Conditions. Results derived from this study indicate that the estimators based on auxiliary variables are more accurate than the naive estimator.
Keywords: inclusion probability, poverty, poverty line, survey sampling
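A naive estimator of the low income proportion from weighted survey data can be sketched as the weighted share of individuals below a relative poverty line, here set at 60% of the weighted median income, a common convention in EU poverty statistics. The income vector and design weights below are illustrative, and the weighted median uses a simple lower-value convention:

```python
def weighted_median(values, weights):
    """Weighted median: smallest value whose cumulative weight reaches
    half of the total weight (lower-value convention)."""
    pairs = sorted(zip(values, weights))
    half, cum = sum(weights) / 2.0, 0.0
    for v, w in pairs:
        cum += w
        if cum >= half:
            return v

def low_income_proportion(incomes, weights, fraction=0.6):
    """Head-count ratio: weighted share of people below the poverty line,
    set at a fraction of the median equivalized income."""
    line = fraction * weighted_median(incomes, weights)
    poor = sum(w for y, w in zip(incomes, weights) if y < line)
    return poor / sum(weights)

incomes = [5000, 8000, 12000, 15000, 21000, 26000, 30000, 41000]
weights = [1.0] * len(incomes)  # equal design weights for illustration
print(low_income_proportion(incomes, weights))
```

Estimators using auxiliary variables would replace the plain weighted share with, for example, ratio or regression adjustments; the naive version above is the baseline they are compared against.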
Procedia PDF Downloads 456
1384 Statistical Pattern Recognition for Biotechnological Process Characterization Based on High Resolution Mass Spectrometry
Authors: S. Fröhlich, M. Herold, M. Allmer
Abstract:
Early-stage quantitative analysis of host cell protein (HCP) variation is challenging yet necessary for comprehensive bioprocess development. High resolution mass spectrometry (HRMS) provides a high-end technology for accurate identification along with quantitative information. Here we describe a flexible HRMS assay platform to quantify HCPs relevant in microbial expression systems such as E. coli in both upstream and downstream development by means of MVDA tools. Cell pellets were lysed and proteins extracted; purified samples were not further treated before applying the SMART tryptic digest kit. Peptide separation was optimized using an RP-UHPLC separation platform. HRMS-MS/MS analysis was conducted on an Orbitrap Velos Elite applying CID. Quantification was performed label-free, taking into account ionization properties and physicochemical peptide similarities. Results were analyzed using SIEVE 2.0 (Thermo Fisher Scientific) and SIMCA (Umetrics AG). The developed HRMS platform was applied to an E. coli expression set with varying productivity and the corresponding downstream process. Selected HCPs were successfully quantified within the fmol range. Analysing HCP networks based on pattern analysis facilitated low-level quantification and enhanced validity. This approach is highly relevant for high-throughput screening experiments during upstream development, e.g., for titer determination, dynamic HCP network analysis, or product characterization. For the downstream purification process, physicochemical clustering of identified HCPs is relevant for adjusting buffer conditions accordingly. The technology thus provides an innovative approach to label-free MS-based quantification relying on statistical pattern analysis and comparison.
Absolute quantification based on physicochemical properties and peptide similarity scores provides a technological approach without the need for sophisticated sample preparation strategies and has therefore proven straightforward, sensitive, and highly reproducible for product characterization.
Keywords: process analytical technology, mass spectrometry, process characterization, MVDA, pattern recognition
Procedia PDF Downloads 2471383 Assessment of Exploitation Vulnerability of Quantum Communication Systems with Phase Encryption
Authors: Vladimir V. Nikulin, Bekmurza H. Aitchanov, Olimzhon A. Baimuratov
Abstract:
Quantum communication technology takes advantage of the intrinsic properties of laser carriers, such as very high data rates and low power requirements, to offer unprecedented data security. Quantum processes at the physical layer of encryption are used for signal encryption with very competitive performance characteristics. The ultimate range of applications for QC systems spans from fiber-based to free-space links, and from secure banking operations to mobile airborne and space-borne networking, where they are subjected to channel distortions. Under practical conditions, the channel can alter the optical wave front characteristics, including its phase. In addition, phase noise of the communication source and photo-detection noise alter the signal, bringing additional ambiguity into the measurement process. If quantized values of photons are used to encrypt the signal, exploitation of quantum communication links becomes extremely difficult. In this paper, we present the results of analysis and simulation studies of the effects of noise on phase estimation for quantum systems with different numbers of encryption bases, operating at different power levels. Keywords: encryption, phase distortion, quantum communication, quantum noise
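The kind of simulation described, in which phase noise degrades discrimination between encryption bases, can be sketched as follows. The Gaussian noise model, the specific phase-noise level, and the equally spaced bases are illustrative assumptions rather than the authors' model.

```python
import numpy as np

rng = np.random.default_rng(7)

def basis_error_rate(n_bases, phase_noise_std, n_trials=20_000):
    """Fraction of trials in which additive phase noise pushes the received
    phase out of the decision region of the transmitted basis state."""
    spacing = 2 * np.pi / n_bases
    sent = rng.integers(0, n_bases, n_trials)
    received_phase = sent * spacing + rng.normal(0.0, phase_noise_std, n_trials)
    # Decide on the nearest basis state, wrapping around the full circle
    decided = np.round(received_phase / spacing).astype(int) % n_bases
    return float(np.mean(decided != sent))

# More encryption bases -> tighter decision regions -> more ambiguity
err_4_bases = basis_error_rate(4, phase_noise_std=0.3)
err_16_bases = basis_error_rate(16, phase_noise_std=0.3)
```

For a fixed phase-noise level, the error rate grows sharply with the number of bases, which illustrates why exploitation of densely encoded quantum links is difficult under noise.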
Procedia PDF Downloads 5511382 Improving the Frequency Response of a Circular Dual-Mode Resonator with a Reconfigurable Bandwidth
Authors: Muhammad Haitham Albahnassi, Adnan Malki, Shokri Almekdad
Abstract:
In this paper, a method for reconfiguring the bandwidth of a circular dual-mode resonator is presented. The method concerns the optimized geometry of a structure that may be used to host the tuning elements, which are typically RF (radio frequency) switches. The tuning elements themselves, and their performance during tuning, are not the focus of this paper. The designed resonator can reconfigure its fractional bandwidth by adjusting the inter-coupling level between the degenerate modes, while at the same time improving its response by adjusting the external-coupling level and keeping the center frequency fixed. The inter-coupling level is adjusted by changing the dimensions of the perturbation element, while the external-coupling level is adjusted by changing one of the feeder dimensions. The design was arrived at via optimization. Simulation and measurement results of the designed and implemented filters agree and show good improvements in return loss values and in the stability of the center frequency. Keywords: dual-mode resonators, perturbation theory, reconfigurable filters, software defined radio, cognitive radio
Procedia PDF Downloads 1661381 Experimental Investigation of the Aeroacoustics Field for a Rectangular Jet Impinging on a Slotted Plate: Stereoscopic Particle Image Velocimetry Measurement before and after the Plate
Authors: Nour Eldin Afyouni, Hassan Assoum, Kamel Abed-Meraim, Anas Sakout
Abstract:
The acoustics of an impinging jet hold significant importance in the engineering field. In HVAC systems, jet impingement can generate noise that degrades acoustic comfort. This paper presents an experimental study of a rectangular air jet impinging on a slotted plate, investigating the correlation between sound emission and turbulence dynamics. The experiment was conducted with an impact ratio L/H = 4 and a Reynolds number Re = 4700. The survey shows that coherent structures within the impinging jet are responsible for self-sustaining tone production. To track these structures, a specific experimental setup consisting of two simultaneous stereoscopic particle image velocimetry (S-PIV) measurements was developed, capturing vortical structures both before and after the plate, in addition to acoustic measurements. The results reveal a significant correlation between acoustic waves and the passage of coherent structures. Variations in the arrangement of vortical structures between the upstream and downstream sides of the plate were observed. This analysis of the flow dynamics can enhance our understanding of slot noise. Keywords: impinging jet, coherent structures, SPIV, aeroacoustics
Procedia PDF Downloads 811380 An Experimental Investigation of the Effect of Control Algorithm on the Energy Consumption and Temperature Distribution of a Household Refrigerator
Authors: G. Peker, Tolga N. Aynur, E. Tinar
Abstract:
In order to determine the energy consumption level and cooling characteristics of a domestic refrigerator controlled by various cooling system algorithms, a side-by-side (SBS) refrigerator was tested in a temperature- and humidity-controlled chamber. Two different control algorithms, the so-called drop-in and frequency-controlled variable-capacity compressor algorithms, were tested on the same refrigerator. Refrigerator cooling characteristics were investigated for both cases and the results compared. The main comparison parameters between the two algorithms were temperature distribution, energy consumption, evaporation and condensation temperatures, and refrigerator run times. Standard energy consumption tests carried out on the same appliance resulted in almost the same energy consumption levels, with a difference of 1.5%. With these two control algorithms, the power consumption profile of the refrigerator was found to be similar. Following the associated energy measurement standard, the temperature values of the test packages were measured to be slightly higher for the frequency-controlled algorithm than for the drop-in algorithm. This paper contains the details of this experimental study conducted with different cooling control algorithms and compares the findings under the same standard conditions. Keywords: control algorithm, cooling, energy consumption, refrigerator
Procedia PDF Downloads 3701379 Multivariate Analysis of the Relationship between Professional Burnout, Emotional Intelligence and Health Level in Teachers University of Guayaquil
Authors: Viloria Marin Hermes, Paredes Santiago Maritza, Viloria Paredes Jonathan
Abstract:
The aim of this study is to assess the prevalence of burnout syndrome in a sample of 600 professors at the University of Guayaquil (Ecuador) using the Maslach Burnout Inventory (M.B.I.). In addition, the effects of professional burnout on health were assessed using the General Health Questionnaire (G.H.Q.-28), and the influence of emotional intelligence on the prevention of its symptoms was assessed using the Spanish version of the Trait Meta-Mood Scale (T.M.M.S.-24). After confirmation of the underlying factor structure, the three measurement tools showed high levels of internal consistency, and specific cut-off points were proposed for this group of Latin American academics on the M.B.I. Statistical analysis showed that the syndrome is present extensively, particularly at medium levels, with notably low scores for professional self-esteem. Canonical correspondence analysis revealed that low levels of self-esteem are related to depression, and a lack of personal resources to anxiety and insomnia, whereas the ability to perceive and control emotions and feelings improves perceptions of professional effectiveness and performance. Keywords: burnout, academics, emotional intelligence, general health, canonical correspondence analysis
Procedia PDF Downloads 3691378 Indoor Environment Quality and Occupant Resilience Toward Climate Change: A Case Study from Gold Coast, Australia
Authors: Soheil Roumi, Fan Zhang, Rodney Stewart
Abstract:
Indoor environmental quality (IEQ) indices represent the suitability of a place to study, work, and live. Many indices have been introduced based on physical measurements or occupant surveys in commercial buildings. Earlier studies did not elaborate on the relationship between energy consumption and IEQ in office buildings, yet such a relationship can provide a comprehensive overview of a building's performance and reveal the potential of existing buildings under the upcoming climate change. A commercial building in southeast Queensland, Australia, is evaluated in this study. Physical measurements of IEQ and energy are conducted, and their relationship is determined using statistical analysis. The case study building is modelled in TRNSYS software and validated using the actual building's BMS data. The modelled building is then simulated with predicted weather data developed by the Commonwealth Scientific and Industrial Research Organisation (CSIRO) of Australia to investigate occupant resilience and energy consumption. Finally, recommendations are presented for consuming less energy while providing a proper indoor environment for office occupants. Keywords: IEQ, office buildings, thermal comfort, occupant resilience
Procedia PDF Downloads 1101377 A New Approach of Preprocessing with SVM Optimization Based on PSO for Bearing Fault Diagnosis
Authors: Tawfik Thelaidjia, Salah Chenikher
Abstract:
Bearing fault diagnosis has attracted significant attention over the past few decades. It consists of two major parts: vibration signal feature extraction and condition classification based on the extracted features. In this paper, feature extraction from faulty bearing vibration signals is performed by combining the signal's kurtosis with features obtained by preprocessing the vibration signal samples with the Db2 discrete wavelet transform at the fifth level of decomposition. In this way, a 7-dimensional feature vector of the vibration signal is obtained. After feature extraction, a support vector machine (SVM) is applied to automate the fault diagnosis procedure. To improve the classification accuracy of bearing fault prediction, particle swarm optimization (PSO) is employed to simultaneously optimize the SVM kernel function parameter and the penalty parameter. The results show the feasibility and effectiveness of the proposed approach. Keywords: condition monitoring, discrete wavelet transform, fault diagnosis, kurtosis, machine learning, particle swarm optimization, roller bearing, rotating machines, support vector machine, vibration measurement
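The PSO loop that tunes the SVM penalty parameter C and kernel parameter γ can be sketched as below. To keep the example self-contained, a smooth surrogate function stands in for the SVM cross-validation error, and the swarm settings (particle count, inertia, acceleration coefficients) are generic assumptions, not the authors' values.

```python
import numpy as np

rng = np.random.default_rng(42)

def pso_minimize(objective, bounds, n_particles=20, n_iter=80,
                 w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer over a box-constrained search space."""
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(bounds)
    pos = rng.uniform(lo, hi, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()                                   # personal bests
    pbest_val = np.array([objective(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()             # global best
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Surrogate for the SVM cross-validation error as a function of
# (log C, log gamma); in the paper this would be an actual CV loop.
def surrogate_cv_error(params):
    log_c, log_gamma = params
    return (log_c - 2.0) ** 2 + (log_gamma + 3.0) ** 2 + 0.05

best, best_val = pso_minimize(surrogate_cv_error, bounds=[(-5, 5), (-8, 2)])
```

Replacing `surrogate_cv_error` with a k-fold cross-validation error of an RBF-kernel SVM gives the joint (C, γ) tuning described in the abstract.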
Procedia PDF Downloads 4361376 Multiple Linear Regression for Rapid Estimation of Subsurface Resistivity from Apparent Resistivity Measurements
Authors: Sabiu Bala Muhammad, Rosli Saad
Abstract:
Multiple linear regression (MLR) models for fast estimation of true subsurface resistivity from apparent resistivity field measurements are developed and assessed in this study. The parameters investigated were apparent resistivity (ρₐ), horizontal location (X), and depth (Z) of measurement as the independent variables, and true resistivity (ρₜ) as the dependent variable. To achieve linearity in both resistivity variables, the datasets were first transformed into the logarithmic domain, following diagnostic checks of the normality of the dependent variable and of heteroscedasticity to ensure accurate models. Four MLR models were developed based on hierarchical combinations of the independent variables. The generated MLR coefficients were applied to another data set to estimate ρₜ values for validation. Contours of the estimated ρₜ values were plotted and compared to plots of the observed data, using the same colour scale and blanking, for visual assessment. The accuracy of the models was assessed using the coefficient of determination (R²), standard error (SE), and weighted mean absolute percentage error (wMAPE). It is concluded that the MLR models can estimate ρₜ with a high level of accuracy. Keywords: apparent resistivity, depth, horizontal location, multiple linear regression, true resistivity
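The log-domain MLR fit can be sketched with ordinary least squares. The synthetic data and the coefficient values below are illustrative assumptions, since the study's field datasets are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic survey (illustrative): apparent resistivity, location, depth
n = 400
rho_a = rng.uniform(10, 1000, n)      # apparent resistivity (ohm-m)
x_loc = rng.uniform(0, 100, n)        # horizontal location (m)
depth = rng.uniform(1, 50, n)         # depth (m)
# Assume a log-linear relation with noise, as in the log-transformed model
log_rho_t = (0.3 + 0.9 * np.log10(rho_a) + 0.002 * x_loc + 0.01 * depth
             + rng.normal(0, 0.05, n))

# Design matrix for the full hierarchical model: intercept, log(rho_a), X, Z
A = np.column_stack([np.ones(n), np.log10(rho_a), x_loc, depth])
coef, *_ = np.linalg.lstsq(A, log_rho_t, rcond=None)

# Accuracy measures used in the study: R^2 and wMAPE
pred = A @ coef
ss_res = np.sum((log_rho_t - pred) ** 2)
ss_tot = np.sum((log_rho_t - log_rho_t.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
wmape = np.sum(np.abs(log_rho_t - pred)) / np.sum(np.abs(log_rho_t))
```

The hierarchical models of the study would correspond to fitting this regression with progressively more columns of the design matrix included.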
Procedia PDF Downloads 2731375 The Effect of Information Technology on the Quality of Accounting Information
Authors: Mohammad Hadi Khorashadi Zadeh, Amin Karkon, Hamid Golnari
Abstract:
This study, carried out in 2014, investigated the impact of information technology on the quality of accounting information. From a population of 425 executives of companies listed on the Tehran Stock Exchange, a sample of 84 managers was selected using the Cochran formula and simple random sampling. Data were collected through questionnaires: the information technology questions were based on standardized questionnaires, and the remaining questions were designed according to the existing components. After the distribution and collection of the questionnaires, data analysis and hypothesis testing were carried out using structural equation modeling with the SmartPLS 2 software, in two parts covering the measurement model and the structural model. In the first part, the technical characteristics of the questionnaire, including reliability and convergent and divergent validity under PLS, were checked; in the second part, significance coefficients were used to examine the research hypotheses. The results showed that information technology and its dimensions (timeliness, relevance, accuracy, adequacy, and actual transfer rate) affect the quality of the accounting information of companies listed on the Tehran Stock Exchange. Keywords: information technology, information quality, accounting, transfer speed
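The Cochran sample-size calculation can be sketched as follows. The confidence level, margin of error, and p = 0.5 below are conventional defaults, since the study's exact parameters are not stated, so the result only approximates the reported sample of 84.

```python
import math

def cochran_sample_size(population, z=1.96, p=0.5, e=0.1):
    """Cochran's formula with the finite population correction.

    z: z-score for the confidence level (1.96 -> 95%)
    p: estimated proportion (0.5 maximizes the required sample)
    e: desired margin of error
    """
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)     # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)       # finite population correction
    return math.ceil(n)

# Population of 425 executives, conventional defaults (illustrative)
n_required = cochran_sample_size(425)
```

With these defaults the formula gives 79; a slightly tighter margin of error brings the result close to the study's reported sample of 84.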
Procedia PDF Downloads 2761374 Developing Fault Tolerance Metrics of Web and Mobile Applications
Authors: Ahmad Mohsin, Irfan Raza Naqvi, Syda Fatima Usamn
Abstract:
Applications with a higher fault tolerance index are considered more reliable and trustworthy, which drives quality. In recent years, application development has shifted from traditional desktop and web software to native and hybrid applications for the web and mobile platforms. With the emergence of the Internet of Things (IoT), cloud, and big data trends, the need to measure the fault tolerance of these complex applications has increased in order to evaluate their performance. There is a phenomenal gap between fault tolerance metrics development and measurement: classic quality metric models focused on metrics for traditional systems, ignoring the software, hardware, and deployment characteristics of today's applications. In this paper, we propose simple metrics to measure fault tolerance, considering general requirements for web and mobile applications. We align factors and sub-factors using GQM for metrics development, considering the nature of web and mobile apps. A systematic mathematical formulation is given to measure the metrics quantitatively. Three web and mobile applications are selected, and their fault tolerance factors are measured using the formulated metrics. The applications are then analysed on the basis of observations in a controlled environment on different mobile devices. Quantitative results are presented depicting the fault tolerance of the respective applications. Keywords: web and mobile applications, reliability, fault tolerance metric, quality metrics, GQM based metrics
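A GQM-style aggregation of sub-factor scores into a single fault tolerance index could look like the following sketch. The sub-factor names, weights, and scores are hypothetical, not the metrics or values formulated in the paper.

```python
# Hypothetical GQM-style aggregation: each sub-factor score in [0, 1] is
# weighted and combined into a single fault tolerance index.
def fault_tolerance_index(scores, weights):
    """Weighted mean of sub-factor scores; weights need not be normalized."""
    if set(scores) != set(weights):
        raise ValueError("scores and weights must cover the same sub-factors")
    total_w = sum(weights.values())
    return sum(scores[k] * weights[k] for k in scores) / total_w

# Example: three hypothetical sub-factors observed for a mobile app
scores = {"error_recovery": 0.8, "graceful_degradation": 0.6, "data_integrity": 0.9}
weights = {"error_recovery": 0.5, "graceful_degradation": 0.3, "data_integrity": 0.2}
fti = fault_tolerance_index(scores, weights)
```

Computing the index for several applications on several devices then yields the kind of quantitative comparison the abstract describes.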
Procedia PDF Downloads 3431373 Analyzing Current Transformers Saturation Characteristics for Different Connected Burden Using LabVIEW Data Acquisition Tool
Authors: D. Subedi, S. Pradhan
Abstract:
Current transformers (CTs) are an integral part of the power system because they provide a proportional, safe amount of current for protection and measurement applications. However, when the power system experiences an abnormal situation leading to a huge current flow, this current is proportionally injected into the protection and metering circuit. Since protection and metering equipment is designed to withstand only a certain amount of current with respect to time, these high currents pose a risk to both personnel and equipment. During such instances, the CT saturation characteristics therefore have a huge influence on the safety of personnel and equipment, and on the reliability of the protection and metering system. This paper shows the effect of burden on the accuracy limiting factor (ALF) / instrument security factor (ISF) of current transformers, and the resulting change in the saturation characteristics of the CTs. The response of the CT to varying levels of overcurrent at different connected burdens is captured using the data acquisition software LabVIEW, and analysis is performed on the real-time data gathered. The variation of current transformer saturation characteristics with changes in burden is discussed. Keywords: accuracy limiting factor, burden, current transformer, instrument security factor, saturation characteristics
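The influence of burden on saturation can be illustrated with a simplified knee-point check. The CT ratio, winding resistance, knee-point voltage, and burden values are assumed for illustration, and the sketch ignores the non-linear excitation curve measured in the study.

```python
# Simplified saturation check for a hypothetical 100/5 A CT: the secondary
# EMF needed to drive a given overcurrent through the total burden is
# compared against the CT knee-point voltage.
def required_secondary_emf(primary_current, ct_ratio, burden_ohms,
                           secondary_resistance):
    """Secondary EMF (V) needed to push the reflected current through the burden."""
    i_sec = primary_current / ct_ratio
    return i_sec * (burden_ohms + secondary_resistance)

def is_saturated(primary_current, ct_ratio, burden_ohms,
                 secondary_resistance, knee_voltage):
    return required_secondary_emf(primary_current, ct_ratio, burden_ohms,
                                  secondary_resistance) > knee_voltage

ct_ratio = 100 / 5     # 100/5 A CT (assumed)
r_secondary = 0.2      # ohm, secondary winding resistance (assumed)
knee_voltage = 60.0    # V, knee point from an excitation test (assumed)

# Same fault current, two burdens: the higher burden drives the CT
# into saturation while the lighter one does not
light = is_saturated(800, ct_ratio, burden_ohms=1.0,
                     secondary_resistance=r_secondary, knee_voltage=knee_voltage)
heavy = is_saturated(800, ct_ratio, burden_ohms=2.5,
                     secondary_resistance=r_secondary, knee_voltage=knee_voltage)
```

This is the mechanism behind the paper's observation: increasing the connected burden lowers the overcurrent level at which the CT saturates, degrading the ALF/ISF behaviour.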
Procedia PDF Downloads 414