Search results for: Signal Processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5008

1618 Statistical Comparison of Machine and Manual Translation: A Corpus-Based Study of Gone with the Wind

Authors: Yanmeng Liu

Abstract:

This article analyzes and compares the linguistic differences between machine translation and manual translation through a case study of the novel Gone with the Wind. As an important carrier of human feeling and thinking, literary translation poses great difficulty for machine translation and is therefore expected to exhibit translation features distinct from those of manual translation. To characterize these linguistic features objectively, the study applies computerized, statistical, quantitative methods to the systematic investigation of a large-scale translation corpus. A bilingual corpus was compiled from four Chinese translations of Gone with the Wind: Piao by Chunhai Fan, Piao by Huairen Huang, and the outputs of Google Translate and Baidu Translate. After processing the corpus with tools such as Stanford Segmenter, Stanford POS Tagger, and AntConc, the study analyzes the linguistic data to answer two questions: 1. How does machine translation differ linguistically from manual translation? 2. Why do these deviances occur? The paper combines translation studies with corpus linguistics and makes concrete the divergent linguistic dimensions of translated-text analysis in order to present the linguistic deviances between manual and machine translation. The study thereby provides a more accurate, fine-grained understanding of machine translation products and proposes several suggestions for the future development of machine translation.
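A minimal sketch of the kind of corpus statistic such a comparison rests on, e.g. lexical diversity via the type-token ratio. The token lists below are invented stand-ins, not text from the study's corpus or its tool chain (Stanford Segmenter, AntConc):

```python
from collections import Counter

def type_token_ratio(tokens):
    """Lexical diversity: distinct word types divided by total tokens."""
    return len(set(tokens)) / len(tokens)

def top_frequencies(tokens, n=3):
    """Most frequent word types, a common corpus-comparison statistic."""
    return Counter(tokens).most_common(n)

# Invented token lists standing in for segmented translation texts
manual = "the wind swept the red earth of tara".split()
machine = "the wind the wind blew over the land".split()

print(type_token_ratio(manual))   # 0.875: richer vocabulary
print(type_token_ratio(machine))  # 0.625: more repetition
```

A lower type-token ratio in the machine output would be one quantitative signal of the repetitiveness the study looks for.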

Keywords: corpus-based analysis, linguistic deviances, machine translation, statistical evidence

Procedia PDF Downloads 144
1617 A Neural Network Classifier for Estimation of the Degree of Infestation by Late Blight on Tomato Leaves

Authors: Gizelle K. Vianna, Gabriel V. Cunha, Gustavo S. Oliveira

Abstract:

Foliage diseases can reduce both the quality and the quantity of agricultural production. Intelligent detection of plant diseases is an essential research topic, as it may help monitor large crop fields by automatically detecting the symptoms of foliage diseases. This work investigates ways to recognize late blight disease from the analysis of digital images of tomato plants collected directly in the field. A pair of multilayer perceptron neural networks analyzes the digital images, using data from both the RGB and HSL color models, and classifies each image pixel. One network identifies healthy regions of the tomato leaf, while the other identifies injured regions. The outputs of both networks are combined to produce the final classification of each pixel, and the pixel classes are used to repaint the original images with a color representation that highlights the injuries on the plant. The new images contain only green, red, or black pixels, corresponding to healthy portions of the leaf, injured portions, or the image background, respectively. The system achieved an accuracy of 97% in detecting and estimating the level of damage caused by late blight on tomato leaves.
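The per-pixel combination step can be sketched as follows; the scores, the threshold, and the rule that injury overrides the healthy class are illustrative assumptions, not the authors' exact network outputs:

```python
import numpy as np

# Hypothetical per-pixel outputs of the two MLPs (probabilities in [0, 1])
healthy_score = np.array([[0.9, 0.2], [0.1, 0.05]])
injured_score = np.array([[0.1, 0.8], [0.2, 0.02]])

GREEN, RED, BLACK = (0, 255, 0), (255, 0, 0), (0, 0, 0)

def repaint(healthy, injured, threshold=0.5):
    """Combine the two network outputs into the repainted image:
    green = healthy leaf, red = injury, black = background.
    Injury overrides healthy where both networks fire (an assumption)."""
    out = np.zeros(healthy.shape + (3,), dtype=np.uint8)  # starts black
    out[healthy >= threshold] = GREEN
    out[injured >= threshold] = RED
    return out

img = repaint(healthy_score, injured_score)
print(img[0, 0], img[0, 1], img[1, 0])  # green, red, black
```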

Keywords: artificial neural networks, digital image processing, pattern recognition, phytosanitary

Procedia PDF Downloads 327
1616 Collective Intelligence-Based Early Warning Management for Agriculture

Authors: Jarbas Lopes Cardoso Jr., Frederic Andres, Alexandre Guitton, Asanee Kawtrakul, Silvio E. Barbin

Abstract:

The main objective of the CyberBrain Mass Agriculture Alarm Acquisition and Analysis (CBMa4) project is to minimize the impact of diseases and disasters on rice cultivation. For example, early detection of insects through the CBMa4 platform will reduce the volume of insecticides applied to rice fields. To reach this goal, two major factors need to be considered: (1) the social network of smart farmers, and (2) the warning-data alarm acquisition and analysis component. This paper outlines the process for collecting warnings and improving the decision-making that results from them. It involves two sub-processes: warning collection and understanding enrichment. Human sensors are combined with basic data-processing techniques to extract warning-related semantics through collective intelligence. Each warning is identified by a unit of semantic content called a 'warncon', with multimedia metaphors and metadata related to those metaphors. A metric for measuring the relations among warncons is also described. With this knowledge, a collective-intelligence-based decision-making approach determines the action(s) to be launched in response to one warncon or a set of warncons.

Keywords: agricultural engineering, warning systems, social network services, context awareness

Procedia PDF Downloads 382
1615 Limestone Briquette Production and Characterization

Authors: André C. Silva, Mariana R. Barros, Elenice M. S. Silva, Douglas. Y. Marinho, Diego F. Lopes, Débora N. Sousa, Raphael S. Tomáz

Abstract:

Modern agriculture requires productivity, efficiency, and quality. There is therefore a need to apply agricultural limestone, which provides adequate amounts of calcium and magnesium carbonates to correct soil acidity. During limestone processing, fine particles (average size below 400#) are generated. Because of their size, these particles have no economic value in the agricultural or metallurgical sectors, and when limestone is used for agricultural purposes they are easily transported by wind, generating air pollution. Briquetting, a mineral-processing technique that agglomerates fine particles under compressive pressure, was therefore used to mitigate this problem, resulting in an agglomerated product suitable for agricultural use. Briquetting can be aided by agglutination agents, allowing adjustments to the shape, size, and mechanical parameters of the mass. Briquettes can generate extra profit for the mineral industry as a distinct product for agriculture, and they can reduce the environmental liabilities of storing or disposing of fine particles. The limestone briquettes produced were subjected to shatter tests and water-resistance tests. The results show that after six minutes completely submerged in water the briquettes were fully dissolved, a highly favorable result considering their use for soil acidity correction.

Keywords: agglomeration, briquetting, limestone, soil acidity correction

Procedia PDF Downloads 390
1614 Differentiation between Different Rangeland Sites Using Principal Component Analysis in Semi-Arid Areas of Sudan

Authors: Nancy Ibrahim Abdalla, Abdelaziz Karamalla Gaiballa

Abstract:

Rangelands in semi-arid areas are a good source of feed for large numbers of animals and have environmental, economic, and social importance; they are therefore considered economically very important for the pastoral sector in Sudan. This paper investigates means of differentiating between rangeland sites according to soil type using principal component analysis (PCA), to assist monitoring and assessment. Three rangeland sites were identified in the study area: a flat sandy site, a sand-dune site, and a hard clay site. PCA was used to reduce the number of factors needed to distinguish between the sites and to produce a new data set containing the most useful spectral information for satellite image processing. The analysis was performed on selected variables: two vegetation indices, topographic data, and vegetation surface reflectance within three bands of MODIS data. The PCA indicated a relatively high correspondence between vegetation and soil in the total variance of the data set. The results showed that PCA with the selected variables revealed clear differences, reflected in the variance and eigenvalues, and that it can be used to differentiate between rangeland sites.
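A minimal NumPy sketch of PCA as used here, eigendecomposing the covariance of a stacked variable matrix; the synthetic data below merely stands in for the study's vegetation indices, topography, and MODIS reflectances:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for the stacked variables (two vegetation indices,
# topographic data, three MODIS reflectance bands) at 100 sample points
X = rng.normal(size=(100, 6))
X[:, 1] = 0.9 * X[:, 0] + rng.normal(scale=0.1, size=100)  # correlated pair

# PCA by eigendecomposition of the covariance matrix
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
order = np.argsort(eigvals)[::-1]           # largest variance first
explained = eigvals[order] / eigvals.sum()  # variance share per component
scores = Xc @ eigvecs[:, order]             # projected data (PC scores)
print(explained)  # leading components carry most of the variance
```

The eigenvalue shares in `explained` are the quantities the abstract refers to when it speaks of differences "reflected in the variance and eigenvalues".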

Keywords: principal component analysis, PCA, rangeland sites, semi-arid areas, soil types

Procedia PDF Downloads 186
1613 Qualitative and Quantitative Traits of Processed Farmed Fish in N. W. Greece

Authors: Cosmas Nathanailides, Fotini Kakali, Kostas Karipoglou

Abstract:

The filleting yield and chemical composition of farmed sea bass (Dicentrarchus labrax), rainbow trout (Oncorhynchus mykiss), and meagre (Argyrosomus regius) were investigated for fish farmed in NW Greece. The results provide an estimate of the quantity of fish required to produce one kilogram of fillet, an estimate required for the operational management of fish-processing companies. Furthermore, this work presents for the first time the ratio of feed input required to produce one kilogram of fish fillet (the fillet-yield feed conversion ratio, FYFCR) as a useful indicator of the ecological footprint of consuming farmed fish. The lowest lipid content appeared in meagre (1.7%) and the highest in trout (4.91%). Meagre had the lowest fillet yield and the poorest fillet-yield feed conversion ratio (FY = 42.17%, FYFCR = 2.48), while farmed rainbow trout exhibited the best fillet yield (FY = 53.8%) and FYFCR (2.10). This research has been co-financed by the European Union (European Social Fund, ESF) and Greek national funds through the Operational Program 'Education and Lifelong Learning' of the National Strategic Reference Framework (NSRF) Research Funding Program ARCHIMEDES III: Investing in knowledge society through the European Social Fund.
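The two reported ratios reduce to simple arithmetic. This sketch assumes FYFCR is the whole-fish feed conversion ratio divided by the fillet-yield fraction; the FCR input of 1.13 is a hypothetical value chosen only because it reproduces the reported trout FYFCR of 2.10:

```python
def fillet_yield(fillet_kg, whole_kg):
    """Fillet yield (FY): fillet mass as a percentage of whole-fish mass."""
    return 100.0 * fillet_kg / whole_kg

def fyfcr(whole_fish_fcr, fy_percent):
    """Feed per kg of fillet: whole-fish feed conversion ratio divided
    by the fillet-yield fraction (assumed definition)."""
    return whole_fish_fcr / (fy_percent / 100.0)

fy = fillet_yield(0.538, 1.0)     # trout: FY = 53.8 %
print(round(fyfcr(1.13, fy), 2))  # 2.1 kg feed per kg fillet
```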

Keywords: farmed fish, flesh quality, filleting yield, lipid

Procedia PDF Downloads 309
1612 Preparation and Cutting Performance of Boron-Doped Diamond Coating on Cemented Carbide Cutting Tools with High Cobalt Content

Authors: Zhaozhi Liu, Feng Xu, Junhua Xu, Xiaolong Tang, Ying Liu, Dunwen Zuo

Abstract:

Chemical vapor deposition (CVD) diamond-coated cutting tools have excellent cutting performance and are the most suitable tools for machining nonferrous metals and alloys, composites, nonmetallic materials, and other difficult-to-machine materials efficiently and accurately. Depositing a CVD diamond coating on cemented carbide with high cobalt content can improve the tool's toughness and strength; it is therefore important to study the preparation technology and cutting properties of CVD diamond-coated cemented carbide cutting tools with high cobalt content. The preparation technology of boron-doped diamond (BDD) coatings was studied and coated drills were prepared. BDD coatings were deposited on the drills using the optimized parameters, and SEM results show no cracks or collapses in the coating. Cutting tests with the prepared drills against silumin and aluminum-based printed circuit board (PCB) were carried out. The results show that the wear of the coated drill is small and the machined surface has good precision. The coating did not come off during the tests, demonstrating the good adhesion and cutting performance of the drill.

Keywords: cemented carbide with high cobalt content, CVD boron-doped diamond, cutting test, drill

Procedia PDF Downloads 420
1611 Analysis of Hard Turning Process of AISI D3-Thermal Aspects

Authors: B. Varaprasad, C. Srinivasa Rao

Abstract:

In the manufacturing sector, hard turning has emerged as a vital machining process for cutting hardened steels. Beyond its many advantages, hard turning must achieve close tolerances in terms of surface finish, high product quality, reduced machining time, low operating cost, and environmentally friendly characteristics. In the present study, a three-dimensional CAE (Computer Aided Engineering) simulation of hard turning using the commercial software DEFORM 3D is compared with experimental results for stresses, temperatures, and tool forces in the machining of AISI D3 steel with mixed ceramic inserts (CC6050). Orthogonal cutting models are proposed, considering several processing parameters such as cutting speed, feed, and depth of cut. Exhaustive friction modeling at the tool-work interfaces is carried out, and work-material flow around the cutting edge is carefully modeled with an adaptive re-meshing simulation capability. In the process simulations, the feed rate and cutting speed are held constant (0.075 mm/rev and 155 m/min), and the analysis focuses on stresses, forces, and temperatures during machining. Close agreement is observed between the CAE simulations and the experimental values.

Keywords: hard turning, computer aided engineering, computational machining, finite element method

Procedia PDF Downloads 454
1610 Wildfires Assessed by Remotely Sensed Images and Burned Land Monitoring

Authors: Maria da Conceição Proença

Abstract:

This case study evaluates the burned areas that suffered successive wildfires in mainland Portugal during the summer of 2017, which killed more than 60 people. It is intended to show that this evaluation can be done with free remote sensing data on a simple laptop with open-source software, describing the not-so-simple methodology step by step so that it is available to county workers in the city halls of the affected areas, where information is essential for the immediate planning of mitigation measures such as restoring road access, allocating funds for the recovery of dwellings, and assessing further restoration of the ecological system. Wildfires also devastate forest ecosystems, directly affecting vegetation cover and killing or driving away animal populations. Economic interests are affected as well: burned pinewood becomes useless for its noblest applications, so its value decreases, and resin extraction ends for several years. The tools described in this paper enable the location of the areas where natural habitats were annihilated and establish a baseline for tracking major changes during forest ecosystem recovery. Moreover, the results allow the surface fuel loading to be followed up, enabling the targeting and evaluation of restoration measures on a planned time basis.
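The abstract does not name a specific index, but one common open-source approach to delimiting burned areas from Sentinel-2 imagery is the differenced Normalized Burn Ratio (dNBR); the reflectances and threshold below are illustrative values, not data from this study:

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio from NIR and SWIR reflectance
    (for Sentinel-2: bands B8 and B12)."""
    nir, swir = np.asarray(nir, float), np.asarray(swir, float)
    return (nir - swir) / (nir + swir)

# Illustrative reflectances: healthy vegetation before the fire,
# burn scar afterwards
pre_fire = nbr([0.45, 0.50], [0.15, 0.18])
post_fire = nbr([0.20, 0.22], [0.30, 0.28])
dnbr = pre_fire - post_fire  # large positive dNBR flags burned pixels
burned = dnbr > 0.27         # one commonly used severity threshold
print(burned)
```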

Keywords: image processing, remote sensing, wildfires, burned areas evaluation, sentinel-2

Procedia PDF Downloads 211
1609 Advanced Technologies for Detector Readout in Particle Physics

Authors: Y. Venturini, C. Tintori

Abstract:

Given the continuous demand for improved readout performance in particle and dark matter physics, CAEN SpA is pushing the development of advanced technologies for detector readout. We present Digitizers 2.0, which builds on the success of the previous generation of digitizers with expanded capabilities and a renovated user experience, introducing an open FPGA. The first product of the family is the VX2740 (64 channels, 125 MS/s, 16 bit) for advanced waveform recording and digital pulse processing, matching the special requirements of dark matter and neutrino experiments. In parallel, CAEN is developing the FERS-5200 platform, a front-end readout system designed to read out large multi-detector arrays such as SiPMs, multi-anode PMTs, silicon strip detectors, wire chambers, GEMs, and gas tubes. This is a highly scalable distributed platform, based on small front-end cards synchronized and read out by a concentrator board, which allows extremely large experimental setups to be built. We plan to develop a complete family of cost-effective front-end cards tailored to specific detectors and applications. The first one available is the A5202, a 64-channel unit for SiPM readout based on the CITIROC ASIC by Weeroc.

Keywords: dark matter, digitizers, front-end electronics, open FPGA, SiPM

Procedia PDF Downloads 128
1608 Study of Natural Patterns on Digital Image Correlation Using Simulation Method

Authors: Gang Li, Ghulam Mubashar Hassan, Arcady Dyskin, Cara MacNish

Abstract:

Digital image correlation (DIC) is a contactless, full-field displacement and strain reconstruction technique commonly used in experimental mechanics. Compared with physical measuring devices such as strain gauges, which provide very restricted coverage and are expensive to deploy widely, DIC provides full-field coverage with relatively high accuracy using an inexpensive and simple experimental setup. Studying the effect of natural patterns on DIC is important because preparing artificial speckle patterns is a time-consuming process. The objective of this research is to study the effect of using images with natural patterns on the performance of DIC. A systematic simulation method is used to build the simulated deformed images used in DIC. The subset size used in DIC affects the processing and accuracy of the method and can even cause it to fail. Regarding the picture parameters, a high correlation coefficient between different subsets (i.e., high similarity between them) can cause the DIC process to fail or make the result inaccurate. Pictures of good and bad quality for DIC are presented, and, more importantly, a systematic way is provided to evaluate the quality of pictures with natural patterns before the measurement devices are installed.
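The subset-matching criterion behind DIC can be sketched with a zero-normalized cross-correlation; the random arrays below are stand-ins for image subsets, and the correlation function is a generic formulation rather than the authors' specific implementation:

```python
import numpy as np

def zncc(a, b):
    """Zero-normalized cross-correlation between two subsets.
    1.0 means a perfect match; values near 0 mean no correspondence."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float((a * b).mean())

rng = np.random.default_rng(1)
subset = 21                           # subset size, a key DIC parameter
ref = rng.random((subset, subset))    # reference subset from the image
match = ref.copy()                    # identical pattern: perfect match
other = rng.random((subset, subset))  # a different, unrelated subset

print(zncc(ref, match))  # 1.0
print(zncc(ref, other))  # near 0
```

When a natural pattern makes many distinct subsets score similarly high, the correlation search becomes ambiguous, which is the failure mode the abstract describes.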

Keywords: Digital Image Correlation (DIC), deformation simulation, natural pattern, subset size

Procedia PDF Downloads 419
1607 Probing Language Models for Multiple Linguistic Information

Authors: Bowen Ding, Yihao Kuang

Abstract:

In recent years, large-scale pre-trained language models have achieved state-of-the-art performance on a variety of natural language processing tasks. The word vectors produced by these language models can be viewed as dense encoded representations of natural language in text form. However, it is unknown how much linguistic information is encoded, and how. In this paper, we construct several probing tasks for multiple kinds of linguistic information to clarify the encoding capabilities of different language models, and we provide a visual display of the results. We first obtain word representations in vector form from different language models, including BERT, ELMo, RoBERTa, and GPT. Classifiers with a small number of parameters, together with unsupervised tasks, are then applied to these word vectors to assess their capacity to encode the corresponding linguistic information. The constructed probing tasks cover both semantic and syntactic aspects. The semantic aspect includes the model's ability to understand semantic entities such as numbers, times, and characters; the syntactic aspect includes the language model's ability to understand grammatical structures such as dependency relations and reference relations. We also compare the encoding capabilities of different layers within the same language model to infer how linguistic information is encoded in the model.
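A linear probe of the kind described can be sketched as follows; the 'word vectors' are synthetic (one dimension artificially encodes the probed property), since actual BERT/ELMo/RoBERTa/GPT embeddings are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)
n, dim = 200, 16
# Synthetic stand-ins for word vectors: one dimension weakly encodes the
# probed property (e.g. 'this word is a number')
labels = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, dim))
X[:, 3] += 3.0 * labels  # inject the linguistic signal to be probed

# A probe with a small parameter count: logistic regression trained by
# plain gradient descent on the frozen vectors
w, b = np.zeros(dim), 0.0
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    grad = p - labels
    w -= 0.1 * X.T @ grad / n
    b -= 0.1 * grad.mean()

accuracy = (((X @ w + b) > 0) == labels).mean()
print(accuracy)  # high accuracy => the property is linearly decodable
```

High probe accuracy on real embeddings is read as evidence that the model encodes that linguistic property, which is the logic the probing tasks rely on.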

Keywords: language models, probing task, text presentation, linguistic information

Procedia PDF Downloads 110
1606 Heart Rate Variability Analysis for Early Stage Prediction of Sudden Cardiac Death

Authors: Reeta Devi, Hitender Kumar Tyagi, Dinesh Kumar

Abstract:

Cardiovascular problems are a growing challenge for researchers and physiologists. Because heart disease has no geographic, gender, or socioeconomic boundaries, detecting cardiac irregularities at an early stage, followed by quick and correct treatment, is very important. The electrocardiogram (ECG) is the finest tool for continuous monitoring of heart activity. Heart rate variability (HRV) measures the naturally occurring oscillations between consecutive cardiac cycles and is analyzed using time-domain, frequency-domain, and non-linear parameters. This paper presents an HRV analysis of online datasets for normal sinus rhythm (taken as healthy subjects) and sudden cardiac death (SCD subjects) using all three methods, computing the standard deviation of NN intervals (SDNN), the root mean square of successive differences between adjacent RR intervals (RMSSD), and the mean RR interval (mean RR) in the time domain; the very-low-frequency (VLF), low-frequency (LF), and high-frequency (HF) power and the LF/HF ratio in the frequency domain; and the Poincaré plot for the non-linear analysis. To differentiate the HRV of healthy subjects from that of subjects who died of SCD, a k-nearest neighbor (k-NN) classifier was used because of its high accuracy. The results show markedly reduced values of all the stated parameters in SCD subjects compared with healthy ones. As the SCD dataset consists of ECG recordings taken one hour before death, it is verified, with an accuracy of 95%, that the proposed algorithm can identify a patient's mortality risk one hour in advance. Identifying mortality risk at such an early stage may prevent sudden death if timely and appropriate treatment is given by the doctor.
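The time-domain parameters have standard closed forms that can be sketched directly; the RR series below is a toy example, not data from the SCD or normal sinus rhythm records:

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Time-domain HRV measures from a series of RR intervals (ms)."""
    rr = np.asarray(rr_ms, float)
    diffs = np.diff(rr)
    return {
        "mean_rr": rr.mean(),                  # mean RR interval
        "sdnn": rr.std(ddof=1),                # SD of NN intervals
        "rmssd": np.sqrt((diffs ** 2).mean()), # RMS of successive differences
    }

# Toy RR series (ms); real input would come from an annotated ECG record
stats = hrv_time_domain([800, 810, 790, 805, 795])
print(stats)
```

Markedly reduced SDNN and RMSSD relative to healthy baselines are the kind of depressed values the study reports for SCD subjects.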

Keywords: early stage prediction, heart rate variability, linear and non-linear analysis, sudden cardiac death

Procedia PDF Downloads 342
1605 Molecular Characterization of Grain Storage Proteins in Some Hordeum Species

Authors: Manar Makhoul, Buthainah Alsalamah, Salam Lawand, Hassan Azzam

Abstract:

The major storage proteins in the endosperm of 33 cultivated and wild barley genotypes (H. vulgare, H. spontaneum, H. bulbosum, H. murinum, H. marinum) were analyzed to demonstrate the variation in the hordein polypeptides encoded by multigene families in the grain. SDS-PAGE revealed 13 and 17 alleles at the Hor1 and Hor2 loci, with frequencies from 0.83 to 14% and from 0.56 to 13.41%, respectively, while seven alleles with frequencies from 3.63 to 30.91% were recognized at the Hor3 locus. The phylogenetic analysis indicated the relevance of the polymorphism in hordein patterns as a successful tool for identifying individual genotypes and discriminating the species according to genome type. We also report the complete nucleotide sequences of the B-hordein genes of seven wild and cultivated barley genotypes. A 152 bp upstream sequence of the B-hordein promoter contained a TATA box, a CATC box, an AAAG motif, an N-motif, and an E-motif. In silico analysis of the B-hordein sequences demonstrated that the coding regions are not interrupted by any intron and include a complete ORF varying between 882 and 906 bp, encoding mature proteins of 293-301 residues characterized by high contents of glutamine (29%) and proline (18%). Comparison of the predicted polypeptide sequences with published ones suggests that all S-rich prolamin genes descend from a common ancestor. Each sequence starts at the N-terminus with a signal peptide, followed directly by two domains: a repetitive domain based on the repeat unit PQQPFPQQ, and a C-terminal domain. The positions of the eight cysteine residues were highly conserved in all the B-hordein sequences, but Hordeum bulbosum had an additional unpaired one. The phylogenetic tree of the B-hordein polypeptides separated the genotypes into seven distinct subgroups. In general, the high homology between B-hordeins and LMW glutenin subunits suggests that these B-hordeins have similar influences on bread-making.
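The reported residue counts follow from simple codon arithmetic on the ORF lengths; this sketch assumes the stated protein length equals the number of codons minus the stop codon, which matches the figures in the abstract:

```python
def protein_length(orf_bp):
    """Residues encoded by a complete ORF: codons minus the stop codon."""
    assert orf_bp % 3 == 0, "a complete ORF is a whole number of codons"
    return orf_bp // 3 - 1  # subtract the stop codon

print(protein_length(882))  # 293 residues
print(protein_length(906))  # 301 residues
```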

Keywords: hordeum, phylogenetic tree, sequencing, storage protein

Procedia PDF Downloads 267
1604 An Investigation of the Effects of Emotional Experience Induction on Mirror Neurons System Activity with Regard to Spectrum of Depressive Symptoms

Authors: Elyas Akbari, Jafar Hasani, Newsha Dehestani, Mohammad Khaleghi, Alireza Moradi

Abstract:

The aim of the present study was to assess the effect of emotional experience induction on mirror neuron system (MNS) activity with regard to the spectrum of depressive symptoms. In the first stage, 449 students of Kharazmi University of Tehran were selected randomly and completed the second version of the Beck Depression Inventory (BDI-II). Then, 36 students with standard Z-scores equal to or above +1.5 or equal to or below -1.5 were selected to construct two groups with high and low spectra of depressive symptoms. In the next stage, the baseline activity of the MNS (mu wave) was recorded by electroencephalography (EEG) before positive and negative emotional video clips were presented. The findings related to emotion induction (neutral, negative, and positive) demonstrated a significant difference in the activity of the recorded mirror neuron areas between the depressive and non-depressive groups. These findings suggest that the processing of negative emotions in depressive individuals may arise because mirror neurons in the motor cortex match the activity of cognitive regions to the person's schema. Considering these results, the MNS may provide a substrate in which emotional disorders can be studied and evaluated.

Keywords: emotional experiences, mirror neurons, depressive symptoms, negative and positive emotion

Procedia PDF Downloads 358
1603 Effective Parameter Selection for Audio-Based Music Mood Classification for Christian Kokborok Song: A Regression-Based Approach

Authors: Sanchali Das, Swapan Debbarma

Abstract:

Music mood classification is developing in both music information retrieval (MIR) and natural language processing (NLP). Some Indian languages, such as Hindi and English, have considerable exposure in MIR, but research on mood classification in regional languages is scarce. In this paper, powerful audio-based features for Christian Kokborok songs are identified and a mood classification task is performed. Kokborok is an Indo-Burman language spoken mainly in the northeastern part of India and also in other countries such as Bangladesh and Myanmar. For the audio-based classification task, useful audio features are extracted with the jMIR software. Standard audio parameters exist for audio-based tasks, but every language has its own unique characteristics, so the most significant features that best fit the database of Kokborok songs are analysed here. A regression-based model is used to find the independent parameters that act as predictors, to estimate the dependencies among parameters, and to show their impact on the overall classification result. WEKA 3.5 is used for classification, and the selected parameters form one classification model; another model is built from all the standard audio features used by most researchers. The experiment analyses the essential parameters responsible for effective audio-based mood classification, as well as the parameters that do not change significantly across the Christian Kokborok songs, and a comparison between the two models is also presented.

Keywords: Christian Kokborok song, mood classification, music information retrieval, regression

Procedia PDF Downloads 222
1602 GAILoc: Improving Fingerprinting-Based Localization System Using Generative Artificial Intelligence

Authors: Getaneh Berie Tarekegn

Abstract:

A precise localization system is crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities, including traffic monitoring, emergency alarming, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. The most common method of providing continuous positioning services outdoors is a global navigation satellite system (GNSS). Due to non-line-of-sight propagation, multipath, and weather conditions, GNSS does not perform well in dense urban, urban, and suburban areas. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. We present a novel semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization. We also employ a reliable signal fingerprint feature extraction method based on t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduced the site-surveying workload required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of GAILoc is less than 39 cm, and more than 90% of the errors are less than 82 cm. These numerical results prove that, in comparison with traditional methods, the proposed GAILoc method can significantly improve positioning performance and reduce radio map construction costs.
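The reported accuracy figures are summary statistics over per-device position errors; this is a sketch of how such statistics are computed (the coordinates below are invented, not the paper's WLAN/LTE results):

```python
import numpy as np

def error_stats(estimated, truth):
    """Euclidean positioning errors and the summary statistics usually
    reported for fingerprinting systems (mean error, 90th percentile)."""
    err = np.linalg.norm(np.asarray(estimated) - np.asarray(truth), axis=1)
    return err.mean(), np.percentile(err, 90)

# Invented 2-D positions in metres; real inputs would be the model's
# predicted versus surveyed locations
est = [[1.0, 1.3], [2.2, 0.9], [0.4, 0.45]]
true_pos = [[1.0, 1.0], [2.0, 1.0], [0.4, 0.4]]
mean_err, p90 = error_stats(est, true_pos)
print(mean_err, p90)
```

The paper's "average error below 39 cm" and "90% of errors below 82 cm" correspond to `mean_err` and `p90` computed over the full test set.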

Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine

Procedia PDF Downloads 75
1601 Catchment Yield Prediction in an Ungauged Basin Using PyTOPKAPI

Authors: B. S. Fatoyinbo, D. Stretch, O. T. Amoo, D. Allopi

Abstract:

This study extends the use of the Drainage Area Regionalization (DAR) method to generate synthetic data and calibrate PyTOPKAPI stream yield for an ungauged basin at a daily time scale. The generation of runoff in determining a river's yield depends on various topographic and spatial meteorological variables, which together form the Catchment Characteristics Model (CCM). Many of the conventional CCM models adopted in Africa have been challenged by a paucity of adequate, relevant, and accurate data with which to parameterize and validate them. The purpose of generating synthetic flow is to test a hydrological model in a way that does not suffer from the impact of very low or very high flows, allowing a check of whether the model structure is sound. The physically based, watershed-scale hydrologic model PyTOPKAPI was parameterized with GIS pre-processing parameters and remote-sensing hydro-meteorological variables. Validation against the mean annual runoff ratio shows decent graphical agreement between the observed and simulated discharge. The Nash-Sutcliffe efficiency and coefficient of determination (R²) values of 0.704 and 0.739 indicate strong model efficiency. Given the impact of current climate variability, water planners now have a tool for flow quantification and sustainable planning purposes.
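The Nash-Sutcliffe efficiency quoted above has a standard closed form, sketched here with toy discharge values rather than the basin's actual observed and simulated series:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2);
    1.0 is a perfect fit, 0 means no better than the observed mean."""
    obs = np.asarray(observed, float)
    sim = np.asarray(simulated, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Toy daily discharges; real inputs would be gauged versus PyTOPKAPI flows
obs = [2.0, 3.5, 5.0, 4.0, 2.5]
sim = [2.2, 3.2, 4.6, 4.1, 2.4]
print(round(nash_sutcliffe(obs, sim), 3))
```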

Keywords: catchment characteristics model, GIS, synthetic data, ungauged basin

Procedia PDF Downloads 327
1600 Aviation versus Aerospace: A Differential Analysis of Workforce Jobs via Text Mining

Authors: Sarah Werner, Michael J. Pritchard

Abstract:

From pilots to engineers, the range of skills within the aerospace industry is exceptionally broad. Employers often struggle to find the right mixture of qualified skills to fill their organizational demands, and this effort is further complicated by the industrial delineation between two key areas: aviation and aerospace. In a broad sense, the aerospace industry overlaps with the aviation industry; in turn, the aviation industry is a smaller sector segment within the broader definition of the aerospace industry. It could even be argued that, in practice, there is little distinction between the two sectors. However, through an unstructured text analysis of over 6,000 captured job listings, our team found a clear delineation between aviation-related and aerospace-related jobs. Using natural language processing techniques, this research identifies an integrated workforce skill pattern that clearly divides the two sectors. While the aviation sector has largely maintained its need for pilots, mechanics, and associated support personnel, the staffing needs of the aerospace industry are increasingly driven by integrative engineering needs, leading many aerospace-based organizations toward 'system-level' staffing requirements. This research helps to better align higher educational institutions with the current industrial staffing complexities within the broader aerospace sector.
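A minimal sketch of the delineation idea: compare term frequencies across the two sectors' listings. The snippets below are invented examples, not entries from the 6,000-listing corpus, and the study's actual analysis used richer NLP techniques:

```python
from collections import Counter

# Invented job-listing snippets standing in for the scraped corpus
aviation = ["airline pilot certified flight hours",
            "aircraft mechanic airframe inspection"]
aerospace = ["systems engineer satellite integration",
             "propulsion engineer systems test"]

def term_counts(docs):
    """Word frequencies over a set of listing texts."""
    return Counter(word for doc in docs for word in doc.split())

av, ae = term_counts(aviation), term_counts(aerospace)
# Terms that dominate one sector but not the other mark the delineation
distinctive = {t for t in av if av[t] > ae.get(t, 0)}
print(sorted(distinctive))
```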

Keywords: aerospace industry, job demand, text mining, workforce development

Procedia PDF Downloads 272
1599 The Systems Biology Verification Endeavor: Harness the Power of the Crowd to Address Computational and Biological Challenges

Authors: Stephanie Boue, Nicolas Sierro, Julia Hoeng, Manuel C. Peitsch

Abstract:

Systems biology relies on large numbers of data points and sophisticated methods to extract biologically meaningful signal and mechanistic understanding. For example, analyses of transcriptomics and proteomics data make it possible to gain insights into the molecular differences in tissues exposed to diverse stimuli or test items. Whereas the interpretation of endpoints that specifically measure a mechanism is relatively straightforward, the interpretation of big data is more complex and benefits from comparing results obtained with diverse analysis methods. The sbv IMPROVER project was created to implement solutions to verify systems biology data, methods, and conclusions. Computational challenges leveraging the wisdom of the crowd allow the benchmarking of methods for specific tasks, such as signature extraction and/or sample classification. Four challenges have already been conducted successfully, confirming that the aggregation of predictions often leads to better results than individual predictions and that methods perform best in specific contexts. Whenever the scientific question of interest has no gold standard but may greatly benefit from the scientific community coming together to discuss approaches and results, datathons are set up. The inaugural sbv IMPROVER datathon was held in Singapore on 23-24 September 2016. It allowed bioinformaticians and data scientists to consolidate their ideas and work on the most promising methods as teams, after having initially reflected on the problem on their own. The outcome is a set of visualization and analysis methods that will be shared with the scientific community via the Garuda platform, an open connectivity platform that provides a framework to navigate through different applications, databases, and services in biology and medicine. We will present the results we obtained when analyzing data with our network-based method, and introduce a datathon that will take place in Japan to encourage the analysis of the same datasets with other methods to allow for the consolidation of conclusions.
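The finding that aggregated predictions often beat individual ones can be illustrated with a toy majority-vote sketch (the voting scheme here is an assumption for illustration; the abstract does not specify the aggregation method used in the challenges):

```python
from collections import Counter

def aggregate(predictions):
    """Majority vote across participants, one label per sample."""
    return [Counter(col).most_common(1)[0][0] for col in zip(*predictions)]

truth = [1, 0, 1, 1, 0]
p1 = [1, 0, 0, 1, 0]          # each participant is 4/5 correct,
p2 = [1, 1, 1, 1, 0]          # but they err on different samples
p3 = [0, 0, 1, 1, 0]
print(aggregate([p1, p2, p3]) == truth)   # True: the vote recovers the truth
```

Because the participants' errors fall on different samples, the vote cancels them out, which is the intuition behind wisdom-of-the-crowd benchmarking.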

Keywords: big data interpretation, datathon, systems toxicology, verification

Procedia PDF Downloads 278
1598 Results Concerning the University-Industry Partnership for a Research Project Implementation (MUROS) in the Romanian Program STAR

Authors: Loretta Ichim, Dan Popescu, Grigore Stamatescu

Abstract:

The paper reports the collaboration between a top university in Romania and three companies on the implementation of a research project in a multidisciplinary domain, focusing on the impact and benefits for both education and industry. The joint activities were developed under the Space Technology and Advanced Research Program (STAR), funded by the Romanian Space Agency (ROSA) for a university-industry partnership. The context was defined by linking the European Space Agency's optional programs, and the development and promotion of national research, with the educational and industrial capabilities in aeronautics, security, and related areas, by increasing the collaboration between academic and industrial entities and by realizing high-level scientific production. The project, Multisensory Robotic System for Aerial Monitoring of Critical Infrastructure Systems (MUROS), was carried out during 2013-2016. It included the University POLITEHNICA of Bucharest (coordinator) and three companies that manufacture and market unmanned aerial systems. The main objective was the development of an integrated system combining ground wireless sensor networks and UAV monitoring in various application scenarios for critical infrastructure surveillance. This included specific activities related to fundamental and applied research, technology transfer, prototype implementation, and result dissemination. The core contributions lay in distributed data processing and communication mechanisms, advanced image processing, and embedded system development. Special focus is given to analyzing the impact of the project implementation on the educational process, directly or indirectly, through the faculty members (professors and students) involved in the research team. Three main directions are discussed: a) enabling students to carry out internships at the partner companies, b) handling advanced topics and industry requirements at the master's level, and c) experiments and concept validation for doctoral theses. The impact of the research work developed by the faculty members (as the educational component) on the increasing performance of the companies' products is highlighted. The collaboration between the university and the companies was well balanced in both contributions and results. The paper also presents the outcomes of the project, which reveal an efficient collaboration between higher education and industry: master's theses, doctoral theses, conference papers, journal papers, technical documentation for technology transfer, a prototype, and a patent. The experience can provide useful practices for blending research and education within an academia-industry cooperation framework, while the lessons learned represent a starting point in debating the new role of companies performing advanced research and development in association with higher education. This partnership, promoted at EU level, has a broad impact beyond the constrained scope of a single project and can develop into long-lasting collaboration benefiting all stakeholders: students, universities, and the surrounding knowledge-based economic and industrial ecosystem. Owing to the exchange of experiences between the university (UPB) and the manufacturing company (AFT Design), a new project, SIMUL, under the Bridge Grant Program (Romanian executive agency UEFISCDI), was started (2016-2017). This project will continue the educational research for innovation in master's and doctoral studies on the MUROS theme (a collaborative multi-UAV application for flood detection).

Keywords: education process, multisensory robotic system, research and innovation project, technology transfer, university-industry partnership

Procedia PDF Downloads 239
1597 Mechanical Properties of Spark Plasma Sintered 2024 AA Reinforced with TiB₂ and Nano Yttrium

Authors: Suresh Vidyasagar Chevuri, D. B. Karunakar Chevuri

Abstract:

The main advantages of Metal Matrix Nano Composites (MMNCs) include excellent mechanical performance, good wear resistance, a low creep rate, etc. The fabrication of MMNCs is quite a challenge and includes processing techniques such as Spark Plasma Sintering (SPS). The objective of the present work is to fabricate aluminum-based MMNCs with the addition of small amounts of yttrium using Spark Plasma Sintering and to evaluate their mechanical and microstructure properties. Samples of 2024 AA with yttrium ranging from 0.1 to 0.5 wt%, keeping TiB₂ constant at 1 wt%, are fabricated by SPS. The mechanical property of hardness is determined using a Vickers hardness testing machine. The metallurgical characterization of the samples is evaluated by Optical Microscopy (OM), Field Emission Scanning Electron Microscopy (FE-SEM), and X-Ray Diffraction (XRD). An unreinforced 2024 AA sample is also fabricated as a benchmark to compare its properties with those of the developed composite. It is found that the yttrium addition increases the above-mentioned properties to some extent and then decreases them gradually when the yttrium content increases beyond a point between 0.3 and 0.4 wt%. Higher density is achieved in the samples fabricated by spark plasma sintering than by any other fabrication route, and uniform distribution of yttrium is observed.

Keywords: spark plasma sintering, 2024 AA, yttrium addition, microstructure characterization, mechanical properties

Procedia PDF Downloads 224
1596 Toward Indoor and Outdoor Surveillance Using an Improved Fast Background Subtraction Algorithm

Authors: El Harraj Abdeslam, Raissouni Naoufal

Abstract:

The detection of moving objects from video image sequences is very important for object tracking, activity recognition, and behavior understanding in video surveillance. The most widely used approach for moving object detection and tracking is background subtraction. Many approaches have been suggested for background subtraction, but they are sensitive to illumination changes, and the solutions proposed to bypass this problem are time consuming. In this paper, we propose a robust yet computationally efficient background subtraction approach and mainly focus on the ability to detect moving objects in dynamic scenes, for possible applications in monitoring complex and restricted-access areas, where moving and motionless persons must be reliably detected. It consists of three main phases: compensating for illumination changes, background/foreground modeling, and morphological analysis for noise removal. We handle illumination changes using Contrast Limited Adaptive Histogram Equalization (CLAHE), which limits the intensity of each pixel to a user-determined maximum. Thus, it mitigates the degradation due to scene illumination changes and improves the visibility of the video signal. Initially, the background and foreground images are extracted from the video sequence. Then, the background and foreground images are separately enhanced by applying CLAHE. In order to form multi-modal backgrounds, we model each channel of a pixel as a mixture of K Gaussians (K=5) using a Gaussian Mixture Model (GMM). Finally, we post-process the resulting binary foreground mask using morphological erosion and dilation transformations to remove possible noise. For experimental testing, we used a standard dataset to challenge the efficiency and accuracy of the proposed method on a diverse set of dynamic scenes.
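The paper's pipeline relies on CLAHE and a per-pixel GMM; as a dependency-free illustration of the background-subtraction core, here is a numpy-only sketch that substitutes a running-average background for the GMM (a deliberate simplification, not the authors' method):

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    """Running-average background model (a simplified stand-in for the GMM)."""
    return (1 - alpha) * bg + alpha * frame

def foreground_mask(bg, frame, thresh=25):
    """Pixels deviating from the background beyond thresh are foreground."""
    return (np.abs(frame - bg) > thresh).astype(np.uint8)

# Toy sequence: an empty 8x8 scene is learned, then a bright object enters
empty = np.zeros((8, 8))
bg = empty.copy()
for _ in range(20):                  # learn the empty scene
    bg = update_background(bg, empty)
frame = empty.copy()
frame[2:4, 2:4] = 200.0              # the moving object
mask = foreground_mask(bg, frame)
print(int(mask.sum()))               # 4 foreground pixels
```

A full implementation would replace the running average with a K=5 GMM per channel and clean `mask` with morphological erosion and dilation, as described in the abstract.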

Keywords: video surveillance, background subtraction, contrast limited histogram equalization, illumination invariance, object tracking, object detection, behavior understanding, dynamic scenes

Procedia PDF Downloads 256
1595 Transfer Function Model-Based Predictive Control for Nuclear Core Power Control in PUSPATI TRIGA Reactor

Authors: Mohd Sabri Minhat, Nurul Adilla Mohd Subha

Abstract:

The 1 MWth PUSPATI TRIGA Reactor (RTP) at the Malaysia Nuclear Agency has been operating for more than 35 years. The existing core power control uses a conventional controller known as the Feedback Control Algorithm (FCA). It is technically challenging to keep the core power output stable and operating within acceptable error bands to meet the safety demands of the RTP. Currently, the system's power tracking performance could be considered unsatisfactory, and there is still significant room for improvement. Hence, a new core power control design is very important for improving the current performance in tracking and regulating reactor power by controlling the movement of the control rods, suiting the demands of highly sensitive nuclear reactor power control. In this paper, the proposed Model Predictive Control (MPC) law was applied to control the core power. The model for core power control was based on mathematical models of the reactor core, MPC, and a control rod selection algorithm. The mathematical models of the reactor core were based on a point kinetics model, thermal hydraulic models, and reactivity models. The proposed MPC was presented as a transfer function model of the reactor core according to perturbation theory. The transfer function model-based predictive control (TFMPC) was developed to design the core power control, with predictions based on a T-filter, towards the real-time implementation of MPC on hardware. This paper introduces the sensitivity functions for the TFMPC feedback loop to reduce the impact on the input actuation signal, and demonstrates the behaviour of TFMPC in terms of disturbance and noise rejection. Both the tracking and regulating performance of the conventional controller and of TFMPC were compared and analysed using MATLAB. In conclusion, the proposed TFMPC shows satisfactory performance in tracking and regulating core power for controlling a nuclear reactor with high reliability and safety.
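The point kinetics model underlying the core power dynamics can be sketched with a one-delayed-group simulation (a simplification of the six-group model usually applied to TRIGA cores; the constants below are illustrative, not RTP parameters):

```python
# One-delayed-group point kinetics: a simplified stand-in for the
# six-group model. Constants are illustrative only, not RTP values.
BETA = 0.007        # delayed neutron fraction
LAMBDA_GEN = 1e-4   # neutron generation time (s)
LAM = 0.08          # delayed precursor decay constant (1/s)

def step(n, c, rho, dt=1e-4):
    """One forward-Euler step of the point kinetics equations."""
    dn = ((rho - BETA) / LAMBDA_GEN) * n + LAM * c
    dc = (BETA / LAMBDA_GEN) * n - LAM * c
    return n + dt * dn, c + dt * dc

n, c = 1.0, BETA / (LAMBDA_GEN * LAM)   # start at equilibrium
for _ in range(10000):                  # 1 s at zero reactivity: power holds
    n, c = step(n, c, rho=0.0)
print(round(n, 6))                      # 1.0
for _ in range(10000):                  # 1 s after a +0.1% reactivity step
    n, c = step(n, c, rho=0.001)
print(n > 1.1)                          # True: the prompt jump raises power
```

A predictive controller such as TFMPC would forecast this power response over a horizon and move the control rods (i.e., adjust rho) to keep the output inside the error band.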

Keywords: core power control, model predictive control, PUSPATI TRIGA reactor, TFMPC

Procedia PDF Downloads 241
1594 Detection of Micro-Unmanned Aerial Vehicles Using a Multiple-Input Multiple-Output Digital Array Radar

Authors: Tareq AlNuaim, Mubashir Alam, Abdulrazaq Aldowesh

Abstract:

The usage of micro-Unmanned Aerial Vehicles (UAVs) has witnessed an enormous increase recently. The detection of such drones has become a necessity to prevent harmful activities. Typically, such targets have low velocity and a low Radar Cross Section (RCS), making them indistinguishable from clutter and phase noise. Multiple-Input Multiple-Output (MIMO) radars have much potential: they increase the degrees of freedom at both the transmit and receive ends. Such an architecture allows for flexibility in operation by utilizing direct access to every element in the transmit/receive array. MIMO systems allow for several array processing techniques, permitting the system to stare at targets for longer times, which improves the Doppler resolution. In this paper, a 2×2 MIMO radar prototype is developed using Software Defined Radio (SDR) technology, and its performance is evaluated against a slow-moving, low radar cross section micro-UAV used by hobbyists. Radar cross section simulations were carried out using the FEKO simulator, yielding an average of -14.42 dBsm at S-band. The developed prototype was experimentally evaluated, achieving more than 300 meters of detection range for a DJI Mavic Pro drone.
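The link between dwell time and Doppler resolution can be illustrated with a slow-time FFT sketch: with N pulses at a given PRF, the Doppler bin width is PRF/N, so staring longer (larger N) sharpens the velocity estimate. The carrier, PRF, and target speed below are assumptions for illustration, not the prototype's actual settings:

```python
import numpy as np

c0, fc = 3e8, 3e9                    # assumed S-band carrier
wavelength = c0 / fc                 # 0.1 m
v = 5.0                              # assumed micro-UAV radial speed (m/s)
fd = 2 * v / wavelength              # Doppler shift: 100 Hz
prf, n_pulses = 1000.0, 128          # assumed PRF and dwell length
t = np.arange(n_pulses) / prf        # slow-time sampling instants
echoes = np.exp(2j * np.pi * fd * t) # phase rotation across pulses
spectrum = np.abs(np.fft.fft(echoes))
bin_hz = prf / n_pulses              # Doppler resolution: ~7.8 Hz
est_fd = np.argmax(spectrum) * bin_hz
print(abs(est_fd - fd) < bin_hz)     # True: target recovered to within a bin
```

Doubling `n_pulses` halves `bin_hz`, which is why a MIMO system's ability to stare longer helps separate slow drones from clutter near zero Doppler.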

Keywords: digital beamforming, drone detection, micro-UAV, MIMO, phased array

Procedia PDF Downloads 139
1593 Aqueous Two Phase Extraction of Jonesia denitrificans Xylanase 6 in PEG 1000/Phosphate System

Authors: Nawel Boucherba, Azzedine Bettache, Abdelaziz Messis, Francis Duchiron, Said Benallaoua

Abstract:

The impetus for research in the field of bioseparation has been sparked by the difficulty and complexity of the downstream processing of biological products. Indeed, 50% to 90% of the production cost for a typical biological product resides in the purification strategy. There is a need for efficient and economical large-scale bioseparation techniques that achieve high purity and high recovery while maintaining the biological activity of the molecule. One such purification technique that meets these criteria involves the partitioning of biomolecules between two immiscible phases in an aqueous two-phase system (ATPS). The production of xylanases is carried out in 500 ml of a liquid medium based on birchwood xylan. In each ATPS, PEG 1000 is added to a mixture consisting of dipotassium phosphate, sodium chloride, and the culture medium inoculated with the strain Jonesia denitrificans, and the mixture is adjusted to different pH values. The concentration of PEG 1000 was varied from 8 to 16% and the NaCl percentage from 2 to 4%, while the other parameters were kept constant. The results showed that the best ATPS for the purification of xylanases is composed of 8.33% PEG 1000, 13.14% K2HPO4, and 1.62% NaCl at pH 7. We obtained a yield of 96.62%, a partition coefficient of 86.66, and a purification factor of 2.9. The zymogram showed that the activity is mainly detected in the top phase.
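The reported partition coefficient and yield are related by a standard mass-balance expression for two-phase extraction; the sketch below illustrates it (the phase volume ratio is an assumed value chosen so the numbers line up with those reported, not one taken from the paper):

```python
def partition_coefficient(c_top, c_bottom):
    """K: solute concentration (or activity) ratio, top over bottom phase."""
    return c_top / c_bottom

def top_phase_yield(k, volume_ratio):
    """Percent of product recovered in the top phase,
    where volume_ratio = V_top / V_bottom."""
    return 100.0 / (1.0 + 1.0 / (k * volume_ratio))

# With the reported K = 86.66 and an assumed phase volume ratio of 0.33,
# the top-phase yield comes out close to the reported 96.62%.
print(round(top_phase_yield(86.66, volume_ratio=0.33), 2))   # 96.62
```

A large K concentrates the enzyme in the top (PEG-rich) phase, consistent with the zymogram observation that activity is detected mainly there.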

Keywords: Jonesia denitrificans BN13, xylanase, aqueous two phases system, zymogram

Procedia PDF Downloads 399
1592 Investigation of the Effects of Processing Parameters on PLA-Based 3D-Printed Tensile Samples

Authors: Saifullah Karimullah

Abstract:

Additive manufacturing techniques are becoming more common with the latest technological advancements. They are poised to bring a revolution in the way products are designed, planned, manufactured, and distributed to end users. Fused deposition modeling (FDM)-based 3D printing is one of those promising aspects that have revolutionized prototyping processes. The purpose of this design and study project is to design a customized laboratory-scale FDM-based 3D printer from locally available sources. The primary goal is to design and fabricate the FDM-based 3D printer. After the fabrication, a tensile test specimen is designed in SolidWorks or Creo computer-aided design (CAD) software. An .stl file of the tensile test specimen is generated through slicing software, and the G-codes are sent via a computer for the test specimen to be printed. Different parameters were studied, namely the printing speed, layer thickness, and infill density of the printed object. Other parameters were kept constant, such as temperature, extrusion rate, raster orientation, etc. Tensile test specimens were printed for different sets of parameters of the FDM-based 3D printer and were subjected to tensile tests using a universal testing machine (UTM). Design Expert software was used for the analyses, and different results were obtained from the different tensile test specimens. The best, average, and worst specimens were also observed under a compound microscope to investigate the layer bonding between them.

Keywords: additive manufacturing techniques, 3D printing, CAD software, UTM machine

Procedia PDF Downloads 103
1591 Bioinformatics Approach to Identify Physicochemical and Structural Properties Associated with Successful Cell-free Protein Synthesis

Authors: Alexander A. Tokmakov

Abstract:

Cell-free protein synthesis is widely used to synthesize recombinant proteins. It allows genome-scale expression of various polypeptides under strictly controlled, uniform conditions. However, only a minor fraction of all proteins can be successfully expressed in the protein synthesis systems that are currently used, and the factors determining expression success are poorly understood. At present, a vast volume of data has accumulated in cell-free expression databases, making possible comprehensive bioinformatics analysis and the identification of multiple features associated with successful cell-free expression. Here, we describe an approach aimed at identifying multiple physicochemical and structural properties of amino acid sequences associated with protein solubility and aggregation, and we highlight the major correlations obtained using this approach. The developed method includes: categorical assessment of the protein expression data; calculation and prediction of multiple properties of the expressed amino acid sequences; correlation of the individual properties with the expression scores; and evaluation of the statistical significance of the observed correlations. Using this approach, we revealed a number of statistically significant correlations between calculated and predicted features of protein sequences and their amenability to cell-free expression. It was found that some of the features, such as protein pI, hydrophobicity, the presence of signal sequences, etc., are mostly related to protein solubility, whereas others, such as protein length, the number of disulfide bonds, the content of secondary structure, etc., affect mainly the expression propensity. We also demonstrated that the amenability of polypeptide sequences to cell-free expression correlates with the presence of multiple sites of post-translational modification. The correlations revealed in this study provide a plethora of important insights into protein folding and the rationalization of protein production. The developed bioinformatics approach can be of practical use for predicting expression success and optimizing cell-free protein synthesis.
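One of the sequence features named above, hydrophobicity, is commonly scored as the grand average of hydropathy (GRAVY) over the Kyte-Doolittle scale; here is a minimal sketch (the authors' exact feature calculations are not specified in the abstract):

```python
# Kyte-Doolittle hydropathy scale (standard published values)
KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5,
      'Q': -3.5, 'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5,
      'L': 3.8, 'K': -3.9, 'M': 1.9, 'F': 2.8, 'P': -1.6,
      'S': -0.8, 'T': -0.7, 'W': -0.9, 'Y': -1.3, 'V': 4.2}

def gravy(seq):
    """Grand average of hydropathy: mean Kyte-Doolittle score per residue.
    Positive values indicate a more hydrophobic sequence."""
    return sum(KD[aa] for aa in seq) / len(seq)

print(round(gravy("MKV"), 2))   # (1.9 - 3.9 + 4.2) / 3 = 0.73
```

In a pipeline like the one described, such per-sequence scores would then be correlated against the categorical expression outcomes and tested for statistical significance.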

Keywords: bioinformatics analysis, cell-free protein synthesis, expression success, optimization, recombinant proteins

Procedia PDF Downloads 419
1590 Taguchi-Based Surface Roughness Optimization for Slotted and Tapered Cylindrical Products in Milling and Turning Operations

Authors: Vineeth G. Kuriakose, Joseph C. Chen, Ye Li

Abstract:

The research follows a systematic approach to optimize the parameters for parts machined by turning and milling processes. The quality characteristic chosen is surface roughness, since surface finish plays an important role for parts that require surface contact. A tapered cylindrical surface is designed as a test specimen for the research. The material chosen for machining is aluminum alloy 6061, due to its wide variety of industrial and engineering applications. A HAAS VF-2 TR computer numerical control (CNC) vertical machining center is used for milling, and a HAAS ST-20 CNC machine is used for turning in this research. Taguchi analysis is used to optimize the surface roughness of the machined parts. The L9 orthogonal array is designed for four controllable factors with three different levels each, resulting in 18 experimental runs. The Signal-to-Noise (S/N) ratio is calculated for achieving the specific target value of 75 ± 15 µin. The controllable parameters chosen for the turning process are feed rate, depth of cut, coolant flow, and finish cut; those for the milling process are feed rate, spindle speed, step over, and coolant flow. The uncontrollable factors are tool geometry for the turning process and tool material for the milling process. Hypothesis testing is conducted to study the significance of the different uncontrollable factors on the surface roughnesses. The optimal parameter settings were identified from the Taguchi analysis, and the process capability Cp and the process capability index Cpk were improved from 1.76 and 0.02 to 3.70 and 2.10, respectively, for the turning process, and from 0.87 and 0.19 to 3.85 and 2.70, respectively, for the milling process. The surface roughnesses were improved from 60.17 µin to 68.50 µin, reducing the defect rate from 52.39% to 0% for the turning process, and from 93.18 µin to 79.49 µin, reducing the defect rate from 71.23% to 0% for the milling process. The purpose of this study is to efficiently utilize Taguchi design analysis to improve the surface roughness.
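For a target-the-value characteristic such as the 75 ± 15 µin specification, the nominal-the-best S/N ratio and the Cp/Cpk indices can be computed as follows (a generic sketch with invented roughness readings, not the study's data):

```python
import math

def sn_nominal_the_best(values):
    """Taguchi nominal-the-best S/N ratio: 10*log10(mean^2 / variance)."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return 10 * math.log10(mean ** 2 / var)

def cp_cpk(values, lsl, usl):
    """Process capability Cp and centering-aware index Cpk."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    cp = (usl - lsl) / (6 * sd)
    cpk = min(usl - mean, mean - lsl) / (3 * sd)
    return cp, cpk

# Hypothetical roughness readings (µin) against the 75 ± 15 µin spec
readings = [73.0, 76.0, 74.5, 75.5, 76.5]
print(round(sn_nominal_the_best(readings), 1))   # 34.7
cp, cpk = cp_cpk(readings, lsl=60, usl=90)
print(cp > 1.33 and cpk > 1.33)                  # True: a capable process
```

A larger S/N ratio means the response is closer to target with less variation, which is exactly what the reported Cp/Cpk improvements (e.g., 0.02 to 2.10 for turning) reflect.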

Keywords: surface roughness, Taguchi parameter design, CNC turning, CNC milling

Procedia PDF Downloads 155
1589 Acoustic Emission for Tool-Chip Interface Monitoring during Orthogonal Cutting

Authors: D. O. Ramadan, R. S. Dwyer-Joyce

Abstract:

The measurement of the interface conditions in a cutting tool contact provides essential information for performance monitoring and control. This interface provides the path for the heat flux into the cutting tool. The resulting rise in cutting tool temperature activates tool wear mechanisms, thus affecting the life of the cutting tool and productivity. This zone is represented by the tool-chip interface; therefore, understanding and monitoring this interface is considered an important issue in machining. In this paper, an acoustic emission (AE) technique was used to find the correlation between AE parameters and the tool-chip interface. For this reason, a response surface design (RSD) was used to analyse and optimize the machining parameters. The experiment design was based on the face-centered central composite design (CCD) in the Minitab environment. According to this design, a series of orthogonal cutting experiments under different cutting conditions were conducted on a Triumph 2500 lathe to study the sensitivity of the AE signal to changes in tool-chip contact length. The cutting parameters investigated were cutting speed, depth of cut, and feed, and the experiments were performed on 6082-T6 aluminium tube. All the orthogonal cutting experiments were conducted unlubricated. The tool-chip contact area was investigated using a scanning electron microscope (SEM). The results obtained in this paper indicate a strong dependence of the root mean square (RMS) on the cutting speed, where the RMS increases with increasing cutting speed. A dependence on the tool-chip contact length has also been observed; however, no effect of changing the cutting depth or feed on the RMS was observed. These dependencies have been explained in terms of the strain and temperature in the primary and secondary shear zones, the tool-chip sticking and sliding phenomena, and the effect of these mechanical variables on dislocation activity at high strain rates. In conclusion, the acoustic emission technique has the potential to monitor the tool-chip interface in situ during turning and could consequently indicate the approaching end of life of a cutting tool.
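The RMS statistic at the heart of the AE analysis is straightforward to compute over a waveform segment; here is a small sketch (the synthetic bursts merely mimic the observed trend of RMS rising with cutting speed, they are not measured data):

```python
import numpy as np

def rms(signal):
    """Root mean square of an AE waveform segment."""
    x = np.asarray(signal, float)
    return float(np.sqrt(np.mean(x ** 2)))

# Synthetic AE bursts: a higher cutting speed is modelled here as a
# larger-amplitude random burst, mimicking the reported trend
rng = np.random.default_rng(0)
low_speed = 0.1 * rng.standard_normal(10000)
high_speed = 0.5 * rng.standard_normal(10000)
print(rms(low_speed) < rms(high_speed))   # True
```

In practice, the RMS would be tracked over successive segments during cutting, and a sustained rise interpreted as a signature of changing tool-chip contact conditions.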

Keywords: acoustic emission, tool-chip interface, orthogonal cutting, monitoring

Procedia PDF Downloads 487