1784 Statistical Shape Analysis of the Human Upper Airway
Authors: Ramkumar Gunasekaran, John Cater, Vinod Suresh, Haribalan Kumar
Abstract:
The main objective of this project is to develop a statistical shape model, using principal component analysis (PCA), that can be used to analyze the shape of the human airway. The ultimate goal is to identify geometric risk factors for the diagnosis and management of Obstructive Sleep Apnoea (OSA). Anonymized CBCT scans of 25 individuals were obtained from the Otago Radiology Group. The airways were segmented between the hard palate and the aryepiglottic fold using snake active contour segmentation. The point cloud of the segmented images was then fitted with a bi-cubic mesh, and pseudo-landmarks were placed to perform PCA on the segmented airway, in order to analyze its shape and to find the relationship between shape and OSA risk factors. From the PCA results, the first four modes of variation were found to be significant. Mode 1 was interpreted as the overall length of the airway, Mode 2 related to the anterior-posterior width of the retroglossal region, Mode 3 to the lateral dimension of the oropharyngeal region, and Mode 4 to the anterior-posterior width of the oropharyngeal region. All of these regions are associated with OSA risk factors.
Keywords: medical imaging, image processing, FEM/BEM, statistical modelling
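As an illustration of the PCA step described in this abstract, the sketch below (our own minimal example, not the authors' pipeline) extracts the first mode of variation from synthetic "shape vectors" via power iteration on the covariance matrix. The shapes, landmark count, and variances are invented for demonstration.

```python
import random
import math

def dominant_mode(shapes, iters=200):
    """Return the mean shape and the first principal mode (unit vector)
    of a list of shape vectors, via power iteration on the covariance."""
    n, d = len(shapes), len(shapes[0])
    mean = [sum(s[j] for s in shapes) / n for j in range(d)]
    centered = [[s[j] - mean[j] for j in range(d)] for s in shapes]
    v = [1.0] * d
    for _ in range(iters):
        # w = C v, with C = X^T X / n applied without forming C explicitly
        proj = [sum(c[j] * v[j] for j in range(d)) for c in centered]
        w = [sum(proj[i] * centered[i][j] for i in range(n)) / n
             for j in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return mean, v

# Synthetic "airway shapes": three pseudo-landmark measurements where the
# first one (overall length) varies far more than the others, so Mode 1
# should align with it.
random.seed(0)
shapes = [[random.gauss(50, 10), random.gauss(20, 1), random.gauss(15, 1)]
          for _ in range(100)]
mean, mode1 = dominant_mode(shapes)
```

With real landmark data each shape vector would concatenate all landmark coordinates, and the remaining modes would come from deflating the covariance or from a full eigendecomposition.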
Procedia PDF Downloads 514
1783 The Implementation of the Multi-Agent Classification System (MACS) in Compliance with FIPA Specifications
Authors: Mohamed R. Mhereeg
Abstract:
The paper discusses the implementation of the Multi-Agent Classification System (MACS) and its use to provide automated and accurate classification of end users developing applications in the spreadsheet domain. Several technologies were brought together to build MACS. The strength of the system is the integration of agent technology and the FIPA specifications with other technologies: .NET Windows service based agents, Windows Communication Foundation (WCF) services, a Service Oriented Architecture (SOA), and Oracle Data Mining (ODM). Microsoft .NET Windows service based agents were used to develop the monitoring agents of MACS, while the WCF services, together with the SOA approach, allowed distribution of and communication between agents over the web. The Monitoring Agents (MAs) were configured to execute automatically to monitor Excel spreadsheet development activities by content. Data gathered by the Monitoring Agents from various resources over a period of time was collected and filtered by a Database Updater Agent (DUA) residing in the .NET client application of the system. This agent then transfers and stores the data in an Oracle server database via Oracle stored procedures for further processing, leading to the classification of the end-user developers.
Keywords: MACS, implementation, multi-agent, SOA, autonomous, WCF
Procedia PDF Downloads 274
1782 Water Detection in Aerial Images Using Fuzzy Sets
Authors: Caio Marcelo Nunes, Anderson da Silva Soares, Gustavo Teodoro Laureano, Clarimar Jose Coelho
Abstract:
This paper presents a methodology for pixel recognition in aerial images using the fuzzy c-means algorithm. This algorithm is an alternative for recognizing areas while accounting for uncertainty and inaccuracy. Traditional clustering techniques are used to recognize multispectral images of the Earth's surface; they handle well-defined borders that can be easily discretized. In the real world, however, there are many areas with uncertainties and inaccuracies that can be mapped by clustering algorithms based on fuzzy sets. The methodology presented in this work is applied to multispectral images obtained from the Landsat-5/TM satellite. The pixels are grouped using the c-means algorithm; a classification process then identifies the surface type according to the patterns obtained from the spectral response of the image surface. The classes considered are exposed soil, moist soil, vegetation, turbid water, and clean water. The results show that fuzzy clustering identifies the true type of the Earth's surface.
Keywords: aerial images, fuzzy clustering, image processing, pattern recognition
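The fuzzy c-means algorithm referenced in this abstract alternates two update steps: memberships from distances, then centers from membership-weighted means. A minimal one-dimensional sketch (our own illustration with invented intensity data, not the paper's multispectral pipeline):

```python
def fuzzy_c_means(data, c=2, m=2.0, iters=50):
    """Minimal 1-D fuzzy c-means with fuzzifier m."""
    # Deterministic init: spread initial centers over the data range.
    centers = [min(data), max(data)]
    u = []
    for _ in range(iters):
        # Membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        u = []
        for x in data:
            dists = [abs(x - ck) + 1e-12 for ck in centers]
            row = [1.0 / sum((dists[i] / dists[j]) ** (2.0 / (m - 1.0))
                             for j in range(c))
                   for i in range(c)]
            u.append(row)
        # Center update: mean weighted by memberships raised to m
        centers = [sum(u[k][i] ** m * data[k] for k in range(len(data))) /
                   sum(u[k][i] ** m for k in range(len(data)))
                   for i in range(c)]
    return centers, u

# Two bands of synthetic pixel intensities, e.g. "clean" vs "turbid" water
data = [1.0, 1.1, 0.9, 1.2, 0.8, 9.0, 9.1, 8.9, 9.2, 8.8]
centers, u = fuzzy_c_means(data)
```

Unlike hard k-means, every pixel keeps a graded membership in every class, which is what lets the method represent uncertain boundaries between surface types.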
Procedia PDF Downloads 483
1781 Feature Extraction of MFCC Based on Fisher-Ratio and Correlated Distance Criterion for Underwater Target Signal
Authors: Han Xue, Zhang Lanyue
Abstract:
In order to seek more effective feature extraction technology, a feature extraction method based on MFCC combined with a vector hydrophone is presented in this paper. The sound pressure signal and particle velocity signal of two kinds of ships are processed with MFCC and its differential forms, and the extracted features are fused using the Fisher ratio and a correlated distance criterion. The features are then identified by a BP neural network. The results showed that MFCC, first-order differential MFCC, and second-order differential MFCC can all be used as effective features for the recognition of underwater targets, and that the fused feature improves the recognition rate. Moreover, the results also showed that the recognition rate for the particle velocity signal is higher than that for the sound pressure signal, reflecting the superiority of vector signal processing.
Keywords: vector information, MFCC, differential MFCC, fusion feature, BP neural network
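The first- and second-order differential MFCC features mentioned here are conventionally computed with a regression over neighboring frames. A sketch of that standard delta formula for a single coefficient track (our own illustration; the paper does not specify its window length, assumed N = 2 here):

```python
def delta(frames, N=2):
    """First-order differential of a per-frame feature sequence using the
    standard regression formula
        d_t = sum_{n=1..N} n * (c_{t+n} - c_{t-n}) / (2 * sum_{n=1..N} n^2),
    with edge frames handled by clamping the indices."""
    T = len(frames)
    denom = 2 * sum(n * n for n in range(1, N + 1))
    out = []
    for t in range(T):
        num = sum(n * (frames[min(t + n, T - 1)] - frames[max(t - n, 0)])
                  for n in range(1, N + 1))
        out.append(num / denom)
    return out

c = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]  # one MFCC coefficient over 6 frames
d1 = delta(c)    # first-order differential MFCC
d2 = delta(d1)   # second-order differential (applied to the deltas)
```

Real MFCC frames are vectors, so the same formula is applied per coefficient; for a linearly ramping coefficient the interior deltas equal the slope.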
Procedia PDF Downloads 530
1780 A Guide to User-Friendly Bash Prompt: Adding Natural Language Processing Plus Bash Explanation to the Command Interface
Authors: Teh Kean Kheng, Low Soon Yee, Burra Venkata Durga Kumar
Abstract:
In 2022, as the world becomes increasingly computer-related, more individuals are attempting to learn coding on their own or in school, because they have discovered the value of learning to code and the benefits it will provide them. But learning to code is difficult for most people; even senior programmers with a decade of experience still need help from online sources while coding. The reason is that coding is not like talking to other people: it has a specific syntax that makes the computer understand what we want it to do, so coding is hard for people who have had no prior contact with the field. If a user wants to learn Bash through the Bash prompt, it is harder still, because the prompt is just an empty box waiting for the user to tell the computer what to do; without referring to the internet, a new user will not know what can be done with it. From this, we conclude that the Bash prompt is not user-friendly for new users learning Bash. Our goal in writing this paper is to propose a user-friendly Bash prompt in Ubuntu OS using artificial intelligence (AI) to lower the threshold of learning Bash, letting users write and learn Bash code with their own words and concepts.
Keywords: user-friendly, bash code, artificial intelligence, threshold, semantic similarity, lexical similarity
Procedia PDF Downloads 142
1779 The Implementation of the Javanese Lettered-Manuscript Image Preprocessing Stage Model on the Batak Lettered-Manuscript Image
Authors: Anastasia Rita Widiarti, Agus Harjoko, Marsono, Sri Hartati
Abstract:
This paper presents the results of a study testing whether the Javanese manuscript image preprocessing model, which has been more widely applied, can also be applied to segment Batak manuscripts. The process begins by converting the input image into a binary image. After the binary image is cleaned of noise, line segmentation using projection profiles is performed. If the projection histogram is unclear, a smoothing step is applied before the line-segment indices are produced. For each line image produced, character segmentation within the line is then applied, taking into account the connectivity between the pixels making up the letters so that no characters are truncated. From testing of the manuscript preprocessing system prototype, the accuracy on pieces of the Pustaka Batak Podani Ma AjiMamisinon manuscript ranged from 65% to 87.68%, with a confidence level of 95%. This indicates that the preprocessing model developed for Javanese manuscript images can also be applied to images of Batak manuscripts.
Keywords: connected component, preprocessing, manuscript image, projection profiles
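The projection-profile line segmentation described in this abstract sums the ink pixels per row and cuts the page at empty rows. A minimal sketch (our own illustration on a toy binary image, not the authors' implementation, and without the smoothing step they apply to unclear histograms):

```python
def segment_lines(binary, min_ink=1):
    """Split a binary image (list of rows, 1 = ink) into text lines using
    the horizontal projection profile: each contiguous run of rows whose
    ink count reaches min_ink becomes one line, returned as (top, bottom)."""
    profile = [sum(row) for row in binary]
    lines, start = [], None
    for y, ink in enumerate(profile):
        if ink >= min_ink and start is None:
            start = y                      # a text line begins
        elif ink < min_ink and start is not None:
            lines.append((start, y - 1))   # the line ended on the row above
            start = None
    if start is not None:                  # line running to the last row
        lines.append((start, len(profile) - 1))
    return lines

img = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],   # text line 1
    [0, 1, 0, 0],
    [0, 0, 0, 0],   # blank gap
    [1, 1, 1, 0],   # text line 2
    [0, 0, 0, 0],
]
# segment_lines(img) -> [(1, 2), (4, 4)]
```

Character segmentation within each line then works the same way on the vertical profile, combined with the connected-component check the abstract mentions so that letters are not truncated.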
Procedia PDF Downloads 400
1778 Hydrodynamic Study of Laminar Flow in Agitated Vessel by a Curved Blade Agitator
Authors: A. Benmoussa, M. Bouanini, M. Rebhi
Abstract:
The mixing and agitation of fluid in a stirred tank is one of the most important unit operations for many industries, such as chemical, biotechnological, pharmaceutical, petrochemical, cosmetic, and food processing. Therefore, determining the level of mixing and the overall behaviour and performance of mixing tanks is crucial from the points of view of product quality and process economics. The most fundamental need for the analysis of these processes, from both a theoretical and an industrial perspective, is knowledge of the hydrodynamic behaviour and the flow structure in such tanks. Depending on the purpose of the operation carried out in the mixer, the best choice of tank geometry and agitator type can vary widely. Initially, a local and global study, namely of the velocity and power number, on a typical agitation system stirred by a straight two-blade agitator (d/D = 0.5) allowed us to test the reliability of the CFD model; the results were compared with experimental data from the literature, and very good agreement was observed. The stream function, velocity profile, velocity fields, and power number are analyzed. It is shown that the hydrodynamics is modified by the curvature of the agitator, which plays a key role.
Keywords: agitated tanks, curved blade agitator, laminar flow, CFD modelling
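For reference, the power number analyzed in this abstract is the standard dimensionless group for agitated vessels (standard definitions, not values from this paper):

```latex
N_p = \frac{P}{\rho\, N^3 d^5},
\qquad
\mathrm{Re} = \frac{\rho\, N d^2}{\mu},
```

where $P$ is the power drawn by the agitator, $\rho$ the fluid density, $N$ the rotational speed, $d$ the agitator diameter, and $\mu$ the dynamic viscosity. In the laminar regime studied here, $N_p \cdot \mathrm{Re}$ is approximately constant, so the power number falls inversely with Reynolds number.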
Procedia PDF Downloads 416
1777 Towards a Large Scale Deep Semantically Analyzed Corpus for Arabic: Annotation and Evaluation
Authors: S. Alansary, M. Nagi
Abstract:
This paper presents an approach to semantic annotation of an Arabic corpus using the Universal Networking Language (UNL) framework. UNL is intended to be a promising strategy for providing a large collection of semantically annotated texts with formal, deep semantics rather than shallow ones. The result constitutes a semantic resource (semantic graphs) that is editable and that integrates various phenomena, including predicate-argument structure, scope, tense, thematic roles, and rhetorical relations, into a single semantic formalism for knowledge representation. The paper also presents the Interactive Analysis tool for automatic semantic annotation (IAN). In addition, the cornerstones of the proposed methodology, the disambiguation and transformation rules, are presented. Semantic annotation using UNL has been applied to a corpus of 20,000 Arabic sentences representing the most frequent structures in the Arabic Wikipedia. The representation was illustrated at different linguistic levels, starting from the morphological level and passing through the syntactic level until the semantic representation is reached. The output has been evaluated using the F-measure and is 90% accurate. This demonstrates how powerful the formal environment is, as it enables intelligent text processing and search.
Keywords: semantic analysis, semantic annotation, Arabic, universal networking language
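The F-measure used for the evaluation above is the harmonic mean of precision and recall. A short reminder of the computation (the counts below are hypothetical, not the paper's actual evaluation data):

```python
def f_measure(tp, fp, fn, beta=1.0):
    """Precision, recall, and F-beta score from raw counts of true
    positives, false positives, and false negatives."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    b2 = beta * beta
    f = (1 + b2) * precision * recall / (b2 * precision + recall)
    return precision, recall, f

# Hypothetical counts for correctly produced semantic relations:
p, r, f1 = f_measure(tp=90, fp=10, fn=10)
# p = 0.9, r = 0.9, f1 = 0.9
```

With beta = 1 (the usual choice), precision and recall are weighted equally, so a reported "90% accurate F-measure" implies both are around 0.9.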
Procedia PDF Downloads 582
1776 Improving Monitoring and Fault Detection of Solar Panels Using Arduino Mega in WSN
Authors: Ali Al-Dahoud, Mohamed Fezari, Thamer Al-Rawashdeh, Ismail Jannoud
Abstract:
Our contribution in this paper is the monitoring and fault detection of a set of solar panels using a wireless sensor network (WSN). This work is part of a project we are working on at Al-Zaytoonah University. The research problem was posed by engineers, technicians, and operators dealing with PV panel maintenance, who need to monitor and detect faults within solar panels that considerably affect the energy produced. The proposed solution is based on installing WSN nodes, with sensors appropriate to the most frequently occurring faults, on the 45 solar panels installed on the roof of the IT faculty. A simulation of node distribution was performed, along with a study of the design of a node with appropriate sensors, taking into account the priorities of the processed faults. Finally, a graphical user interface was designed and adapted for telemonitoring the panels using the WSN. Primary tests of the hardware implementation gave interesting results: the sensor calibration and transmission interference problems were solved. A friendly GUI was developed in the high-level language Visual Basic to carry out the monitoring process and to save data to an Excel file.
Keywords: Arduino Mega microcontroller, solar panels, fault detection, simulation, node design
Procedia PDF Downloads 465
1775 Prevention of Biocompounds and Amino Acid Losses in Vernonia amygdalina during Post-Harvest Treatment Using Hot Oil-Aqueous Mixture
Authors: Nneka Nkechi Uchegbu, Temitope Omolayo Fasuan
Abstract:
This study investigated how to reduce losses of bio-compounds and amino acids in V. amygdalina leaf during its processing as a functional food ingredient. Fresh V. amygdalina leaf was processed using thermal oil-aqueous mixtures (soybean oil:aqueous and palm oil:aqueous) at 1:40 and 1:30 (v/v), respectively. Results indicated that the hot soybean oil-aqueous mixture was the most effective in preserving the bio-compounds and amino acids, retaining 80.95% of the bio-compounds at rates of 90-100%. The hot palm oil-aqueous mixture retained 61.90% of the bio-compounds at rates of 90-100%, while the hot aqueous treatment retained only 9.52% of the bio-compounds at the same rate. During the debittering process, seven new bio-compounds were formed in the leaves treated with the hot soybean oil-aqueous mixture, six with the palm oil-aqueous mixture, and only four in the hot aqueous leaves. The bio-compounds in the treated leaves have potential functions as antitumor, antioxidant, antihistaminic, anti-ovarian cancer, anti-inflammatory, antiarthritic, hepatoprotective, haemolytic, 5-alpha reductase inhibitor, immunostimulant, diuretic, antiandrogenic, and anaemiagenic agents. Alkaloids and polyphenols were retained at rates of 81.34-98.50% using the oil:aqueous mixtures, while the aqueous treatment achieved 33.47-41.46%. Most of the essential amino acids were retained at rates above 90% with the aid of oil. The process is scalable and could be employed for domestic and industrial applications.
Keywords: V. amygdalina leaf, bio-compounds, oil-aqueous mixture, amino acids
Procedia PDF Downloads 146
1774 Logistic Model Tree and Expectation-Maximization for Pollen Recognition and Grouping
Authors: Endrick Barnacin, Jean-Luc Henry, Jack Molinié, Jimmy Nagau, Hélène Delatte, Gérard Lebreton
Abstract:
Palynology is a field of interest to many disciplines. It has multiple applications, such as chronological dating, climatology, allergy treatment, and even honey characterization. Unfortunately, the analysis of a pollen slide is a complicated and time-consuming task that requires the intervention of experts in the field, who are becoming increasingly rare due to economic and social conditions. Automating this task is therefore a necessity. Pollen slide analysis is mainly a visual process, as it is carried out with the naked eye; that is why a primary way to automate palynology is digital image processing, which has the lowest cost and relatively good accuracy in pollen retrieval. In this work, we propose a system combining recognition and grouping of pollen. It uses a Logistic Model Tree to classify pollen species already known to the system while detecting any unknown species; the unknown pollen species are then divided using a cluster-based approach. Good success rates were achieved for the recognition of known species, and the automated clustering appears to be a promising approach.
Keywords: pollen recognition, logistic model tree, expectation-maximization, local binary pattern
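The expectation-maximization step named in the keywords is the classic way to fit a mixture model for cluster-based grouping. A compact one-dimensional, two-component Gaussian sketch (our own illustration with invented feature values, not the authors' feature space; mixing weights are kept fixed and equal for brevity):

```python
import math

def em_1d(data, iters=100):
    """EM for a two-component 1-D Gaussian mixture with equal priors:
    the E-step computes responsibilities, the M-step re-estimates
    per-component means and variances from them."""
    mu = [min(data), max(data)]   # deterministic initialisation
    var = [1.0, 1.0]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            p = [math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) /
                 math.sqrt(2 * math.pi * var[k]) for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: responsibility-weighted means and variances
        for k in range(2):
            w = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / w
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / w, 1e-6)
    return mu, var

# Two hypothetical clusters of an unknown-pollen feature (e.g. grain size):
data = [1.0, 1.2, 0.8, 1.1, 0.9, 7.0, 7.2, 6.8, 7.1, 6.9]
mu, var = em_1d(data)
```

In practice pollen descriptors are multivariate (e.g. local binary pattern histograms), but the alternation of soft assignment and re-estimation is the same.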
Procedia PDF Downloads 182
1773 Effects of Bilingual Education in the Teaching and Learning Practices in the Continuous Improvement and Development of the K-12 Program
Authors: Miriam Sebastian
Abstract:
This research focused on the effects of bilingual education, as a medium of instruction, on the academic performance of selected intermediate students of Miriam's Academy of Valenzuela Inc. An experimental design was used, with language of instruction as the independent variable and the different literacy skills as dependent variables. The sample consisted of 30 students. They were given pretests and were divided into three groups: monolingual Filipino, monolingual English, and bilingual (Filipino and English). They were taught different literacy skills for eight weeks and were then administered the posttests. Data was analyzed and evaluated in the light of the central processing and script-dependent hypotheses. Based on the data, it can be inferred that monolingual instruction in either Filipino or English had a stronger effect on the students' literacy skills than bilingual instruction. Moreover, mother tongue-based instruction had a stronger effect on the students' literacy skills than second-language instruction. Such results have implications not only for mother tongue-based (MTB) instruction but also for English as a second language (ESL) instruction in the country.
Keywords: bilingualism, effects, monolingual, function, multilingual, mother tongue
Procedia PDF Downloads 127
1772 Processing and Characterization of Glass-Epoxy Composites Filled with Linz-Donawitz (LD) Slag
Authors: Pravat Ranjan Pati, Alok Satapathy
Abstract:
Linz-Donawitz (LD) slag is a major solid waste generated in huge quantities during steel making. It comes from slag formers such as burned lime/dolomite and from the oxidation of silica, iron, etc., while refining iron into steel in the LD furnace. Although a number of ways of utilizing it have been suggested, its potential as a filler material in polymeric matrices has not yet been explored. The present work reports the possible use of this waste as a filler material in glass fiber reinforced epoxy composites. Hybrid composites consisting of bi-directional E-glass-fiber reinforced epoxy filled with different LD slag contents (0, 7.5, 15, 22.5 wt%) are prepared by a simple hand lay-up technique. The composites are characterized with regard to their density, porosity, micro-hardness, and strength properties. X-ray diffractography is carried out in order to ascertain the various phases present in the LD slag. This work shows that LD slag, in spite of being a waste, possesses fairly good filler characteristics, as it modifies the strength properties and improves the micro-hardness of the polymeric resin composite.
Keywords: characterization, glass-epoxy composites, LD slag, waste utilization
Procedia PDF Downloads 392
1771 Hyperspectral Band Selection for Oil Spill Detection Using Deep Neural Network
Authors: Asmau Mukhtar Ahmed, Olga Duran
Abstract:
Hydrocarbon (HC) spills constitute a significant problem and cause great environmental concern. With the latest technology (hyperspectral imaging) and state-of-the-art techniques (image processing tools), hydrocarbon spills can be detected at an early stage to mitigate their effects. In this study, a controlled laboratory experiment was used: clay soil was mixed and homogenized with different hydrocarbon types (diesel, bio-diesel, and petrol). The different mixtures were scanned with a HYSPEX hyperspectral camera under constant illumination to generate the hyperspectral datasets used for this experiment. So far, the short-wave infrared region (SWIR) has been exploited for detecting HC spills with excellent accuracy. However, the near-infrared region (NIR) is relatively unexplored with regard to HC contamination and how it affects the spectra of soils. In this study, a deep neural network (DNN) was applied to the controlled datasets to detect and quantify the amount of HC spilled in soils using the near-infrared region. The initial results are extremely encouraging, indicating that the DNN was able to identify features of HC in the near-infrared region with a good level of accuracy.
Keywords: hydrocarbon, deep neural network, short wave infrared region, near-infrared region, hyperspectral image
Procedia PDF Downloads 114
1770 Optical Flow Based System for Cross Traffic Alert
Authors: Giuseppe Spampinato, Salvatore Curti, Ivana Guarneri, Arcangelo Bruna
Abstract:
This document describes an advanced system and methodology for Cross Traffic Alert (CTA), able to detect vehicles that move into the driving path from the left or right side. The camera may be mounted not only on a stationary vehicle, e.g., at a traffic light or an intersection, but also on one moving slowly, e.g., in a car park. In all of the aforementioned conditions, a driver's short loss of concentration or distraction can easily lead to a serious accident, and the proposed system represents valid support for avoiding these kinds of car crashes. It is an extension of our previous work on a clustering system that only works with fixed cameras. Just a vanishing point calculation and simple optical flow filtering, to eliminate motion vectors due to the car's own movement, are performed, letting the system achieve high performance across different scenarios, cameras, and resolutions. The proposed system uses only the optical flow as input, which is implemented in hardware on the proposed platform; and since the whole system is fast and low in power consumption, it is inserted directly into the camera framework, allowing all the processing to be executed in real time.
Keywords: clustering, cross traffic alert, optical flow, real time, vanishing point
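One common way to realize the "eliminate motion vectors due to the car's own movement" step is to compare each flow vector with the radial direction from the vanishing point (focus of expansion): ego-motion flow points along that ray, while a crossing vehicle does not. The sketch below is our own hedged illustration of that idea, not the authors' filter; the coordinates, threshold, and use of |cos| are assumptions.

```python
import math

def crossing_vectors(flows, vp, min_angle_deg=30.0):
    """Keep only motion vectors inconsistent with ego-motion.
    flows: list of ((x, y), (dx, dy)); vp: vanishing point (x, y).
    A vector aligned (or anti-aligned) with the ray from vp to its
    position is attributed to camera motion and discarded."""
    kept = []
    for (x, y), (dx, dy) in flows:
        rx, ry = x - vp[0], y - vp[1]       # radial direction at this pixel
        nr, nf = math.hypot(rx, ry), math.hypot(dx, dy)
        if nr < 1e-9 or nf < 1e-9:
            continue                         # degenerate or static vector
        # |cos| of the angle between the flow and the radial direction
        cosang = abs((rx * dx + ry * dy) / (nr * nf))
        angle = math.degrees(math.acos(min(1.0, cosang)))
        if angle >= min_angle_deg:           # far from radial: crossing motion
            kept.append(((x, y), (dx, dy)))
    return kept

vp = (320, 200)
flows = [((400, 260), (4, 3)),   # exactly radial from vp: ego-motion, dropped
         ((320, 400), (9, 0))]   # horizontal motion, perpendicular: kept
# crossing_vectors(flows, vp) -> [((320, 400), (9, 0))]
```

The surviving vectors would then be passed to the clustering stage mentioned in the abstract to form candidate crossing-vehicle detections.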
Procedia PDF Downloads 203
1769 Microarray Data Visualization and Preprocessing Using R and Bioconductor
Authors: Ruchi Yadav, Shivani Pandey, Prachi Srivastava
Abstract:
Microarrays provide a rich source of data on the molecular workings of cells. Each microarray reports on the abundance of tens of thousands of mRNAs. Virtually every human disease is being studied using microarrays, in the hope of finding the molecular mechanisms of disease. Bioinformatics analysis plays an important part in processing the information embedded in large-scale expression profiling studies and in laying the foundation for biological interpretation. A basic yet challenging task in the analysis of microarray gene expression data is the identification of changes in gene expression that are associated with particular biological conditions. Careful statistical design and analysis are essential to improve the efficiency and reliability of microarray experiments throughout the data acquisition and analysis process. One of the most popular platforms for microarray analysis is Bioconductor, an open-source and open-development software project based on the R programming language. This paper describes specific procedures for conducting quality assessment, visualization, and preprocessing of Affymetrix GeneChip data, details the different Bioconductor packages used to analyze Affymetrix microarray data, and describes the analysis and outcome of each plot.
Keywords: microarray analysis, R language, affymetrix visualization, bioconductor
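A core step in the Affymetrix preprocessing the abstract refers to is quantile normalization, which forces every array onto the same empirical intensity distribution while preserving each array's rank order. Bioconductor performs this in R; the sketch below shows only the underlying idea in Python with tiny invented data (ties are not handled, unlike the production implementations):

```python
def quantile_normalize(columns):
    """Quantile-normalize a list of equal-length arrays (one per chip):
    replace each value by the mean, across arrays, of the values sharing
    its rank, so all arrays end up with identical sorted values."""
    n = len(columns[0])
    sorted_cols = [sorted(col) for col in columns]
    # Reference distribution: mean of the sorted columns, rank by rank
    ref = [sum(col[i] for col in sorted_cols) / len(columns)
           for i in range(n)]
    out = []
    for col in columns:
        ranks = sorted(range(n), key=lambda i: col[i])  # indices by value
        norm = [0.0] * n
        for rank, i in enumerate(ranks):
            norm[i] = ref[rank]
        out.append(norm)
    return out

arrays = [[5.0, 2.0, 3.0], [4.0, 1.0, 6.0]]  # two chips, three probes
result = quantile_normalize(arrays)
# Both arrays now share the sorted values [1.5, 3.5, 5.5]
```

After normalization, intensity differences between chips reflect biology rather than per-chip technical variation, which is the premise of the downstream differential-expression analysis.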
Procedia PDF Downloads 480
1768 Assessment of the Contribution of Geographic Information System Technology in Non Revenue Water: Case Study Dar Es Salaam Water and Sewerage Authority Kawe - Mzimuni Street
Authors: Victor Pesco Kassa
Abstract:
This research deals with assessing the contribution of GIS technology to NRW reduction. It was conducted in Dar es Salaam, at Kawe Mzimuni Street. The data were obtained from an existing source, DAWASA headquarters, and were processed and interpreted using ArcGIS software. The data reveal good coverage of DAWASA's water network at Mzimuni Street; most residents are connected to DAWASA's water service. The data also show that DAWASA's customer geodatabase has been improved by using GIS. Through GIS, we can prepare a customer location map for site surveying; this map can also show the different types of customers connected to DAWASA's water service. This is a clear contribution of GIS technology to addressing and managing the problem of NRW in DAWASA. Finally, the study recommends that the same study be conducted in other DAWASA zones, such as Temeke, Boko, and Bagamoyo, not only at Kawe Mzimuni Street. Through this study, it is observed that ArcGIS software offers powerful tools for managing and processing information geographically in water and sanitation authorities such as DAWASA.
Keywords: DAWASA, NRW, Esri, EURA, ArcGIS
Procedia PDF Downloads 83
1767 Phonological and Syntactic Evidence from Arabic in Favor of Biolinguistics
Authors: Marwan Jarrah
Abstract:
This research paper provides two pieces of evidence from Arabic, one phonological and one syntactic, in favor of the biolinguistics perspective on language processing. The first piece of evidence concerns instances where a singular noun is converted to a plural noun in Arabic. Based on the findings of several research papers, this study shows that a singular word does not lose any of its moras when it is pluralized, either regularly or irregularly. This mora conservation principle parallels the physical law of the conservation of mass, which states that mass is neither created nor destroyed but changed from one form into another. The second piece of evidence concerns the observation that when the object in some Arabic dialects, including Jordanian Arabic and Najdi Arabic, is a topic and positioned in situ (i.e., after the verb), the verb agrees with it, generating an agreeing inflection marker on the verb that matches the in-situ topicalized object in Number, Person, and Gender. This interaction between the verb and the object in such cases is invoked because of the extra feature the object bears, i.e., the TOPIC feature. We suggest that such an interaction parallels the natural observation that elements become active when they, e.g., gain an additional electron, when the mass number is not equal to the atomic number.
Keywords: biolinguistics, Arabic, physics, interaction
Procedia PDF Downloads 230
1766 Improvement of Bone Scintigraphy Image Using Image Texture Analysis
Authors: Yousif Mohamed Y. Abdallah, Eltayeb Wagallah
Abstract:
Image enhancement allows the observer to see details in images that may not be immediately observable in the original image. Image enhancement is the transformation or mapping of one image to another, and enhancing certain features in images can be accompanied by undesirable effects. To achieve maximum image quality after denoising, a new, low-order, locally adaptive Gaussian scale mixture model and a median filter are presented, which account for nonlinearities from scattering, together with a new nonlinear approach for contrast enhancement of bones in bone scan images using both gamma correction and negative transform methods. The usual assumption of gamma- and Poisson-distributed statistics leads to overestimation of the noise variance in regions of low intensity but to underestimation in regions of high intensity, and therefore to non-optimal results. The contrast enhancement results were obtained and evaluated using MATLAB on nuclear medicine images of the bones. The optimal number of bins, in particular the number of gray levels, is chosen automatically using entropy and the average distance between the histogram of the original gray-level distribution and the contrast enhancement function's curve.
Keywords: bone scan, nuclear medicine, MATLAB, image processing technique
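The two point operations named in this abstract, the negative transform and gamma correction, are textbook intensity mappings. A minimal per-pixel sketch (our own illustration of the standard formulas, not the authors' adaptive pipeline; the 8-bit range and the gamma value are assumptions):

```python
def negative(pixel, levels=256):
    """Negative transform: s = (L - 1) - r, for an L-level image."""
    return (levels - 1) - pixel

def gamma_correct(pixel, gamma, levels=256):
    """Gamma correction: s = (L - 1) * (r / (L - 1)) ** gamma.
    gamma < 1 brightens (expands dark regions); gamma > 1 darkens."""
    L = levels - 1
    return round(L * (pixel / L) ** gamma)

row = [0, 64, 128, 255]                      # one row of an 8-bit image
neg = [negative(p) for p in row]             # [255, 191, 127, 0]
brightened = [gamma_correct(p, 0.5) for p in row]
```

In the paper these mappings are applied after denoising, and the gray-level count itself is tuned automatically via the entropy criterion described above.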
Procedia PDF Downloads 510
1765 Emotion Recognition Using Artificial Intelligence
Authors: Rahul Mohite, Lahcen Ouarbya
Abstract:
This paper focuses on the interplay between humans and computer systems and on the ability of these systems to understand and respond to human emotions, including non-verbal communication. Current emotion recognition systems are based solely on either facial or verbal expressions; their limitation is that they require large training data sets. This paper proposes a system for recognizing human emotions that combines both speech and facial emotion recognition. The system utilizes advanced techniques such as deep learning and image recognition to identify facial expressions and comprehend emotions. The results show that the proposed system, based on the combination of facial expression and speech, outperforms existing ones based solely on either facial or verbal expressions. The proposed system detects human emotion with an accuracy of 86%, whereas the existing systems have an accuracy of 70% using verbal expression only and 76% using facial expression only. The increasing significance of and demand for facial recognition technology in emotion recognition are also discussed.
Keywords: facial recognition, expression recognition, deep learning, image recognition, facial technology, signal processing, image classification
Procedia PDF Downloads 121
1764 Reading Knowledge Development and Its Phases with Generation Z
Authors: Onur Özdemir, M. Erhan Orhan
Abstract:
Knowledge development (KD) is just one of the important phases of knowledge management (KM). KD is the phase in which intelligence is used to see the big picture. In order to understand whether information is important or not, we have to use the intelligence cycle, which includes four main steps: aiming, collecting data, processing, and utilizing. KD also needs these steps. To make a precise decision, the decision maker has to be aware of his subordinates' ideas; if he ignores the ideas of his subordinates or of other participants in the organization, it is not possible for him to reach the target. KD is a way of using wisdom to assemble the puzzle. If the decision maker does not bring the puzzle pieces together, he cannot see the big picture, and this shows its effects on the battlefield. In order to understand the battlefield, the decision maker has to use the intelligence cycle, and KD is the main means of converting information to knowledge within that cycle. On the other hand, Generation Z, born after the millennium, are real game changers. They have different attitudes from their elders; their understanding of life is different, and freedom and independence carry different meanings for them than for others. Decision makers have to consider these factors and rethink their decisions accordingly. This article tries to explain the relation between KD and Generation Z. KD is the main method of managing targets, but if leaders neglect their people, the world will see many more movements like the Arab Spring and other insurgencies.
Keywords: knowledge development, knowledge management, generation Z, intelligence cycle
Procedia PDF Downloads 517
1763 Catalytic Decomposition of High Energy Materials Using Nanoparticles of Copper Chromite
Authors: M. Sneha Reddy, M. Arun Kumar, V. Kameswara Rao
Abstract:
Chromites are binary transition metal oxides with the general formula ACr₂O₄, where A = Mn²⁺, Fe²⁺, Co²⁺, Ni²⁺, or Cu²⁺. Chromites have a normal-type spinel structure with interesting applications in applied physics, materials science, and geophysics. They have attracted great attention because of their unique physicochemical properties and their technological applications in nanodevices, sensor elements, and high-temperature ceramics with useful optical properties. Copper chromite is one of the most efficient spinel oxides, with pronounced commercial application as a catalyst in various chemical reactions such as oxidation, hydrogenation, alkylation, dehydrogenation, decomposition of organic compounds, and hydrogen production. Apart from its usage in the chemical industry, CuCr₂O₄ finds its major application as a burn rate modifier in solid propellant processing for space launch vehicles globally. Herein, we synthesized nanoparticles of copper chromite using the co-precipitation method. The synthesized nanoparticles were characterized by XRD, TEM, SEM, BET, and TG-DTA, and were used as a catalyst for the thermal decomposition of various high-energy materials.
Keywords: copper chromite, coprecipitation method, high energy materials, catalytic thermal decomposition
Procedia PDF Downloads 77
1762 A Systematic Review on Experimental, FEM Analysis and Simulation of Metal Spinning Process
Authors: Amol M. Jadhav, Sharad S. Chudhari, S. S. Khedkar
Abstract:
This review presents a thorough survey of published work on the experimental analysis, FEM analysis, and simulation of the metal spinning process. All papers in this survey are drawn from Elsevier publications, most of them from the Journal of Materials Processing Technology. Over the last two decades or so, the metal spinning process has gradually come into use as a chipless forming method for producing engineering components in small to medium batch quantities. The review aims to provide insight into the experimentation, FEM analysis of various components, and simulation of the metal spinning process, and to act as a guide for researchers working on metal spinning. The review of existing work reveals several gaps in current knowledge of metal spinning processes: the experimental evaluation of thickness strain, spinning force, twisting angle, and surface roughness in conventional and shear spinning; the FEM evaluation of metal spinning with tool-path definition and a mesh fine enough to capture workpiece behavior; and the evaluation of simulated roller feed rate, roller direction, and roller type. The metal spinning process is flexible enough to produce a wider range of product shapes and to form more challenging materials.
Keywords: metal spinning, FEM analysis, simulation of metal spinning, mechanical engineering
Procedia PDF Downloads 387
1761 Computational Fluid Dynamics Simulations of Thermal and Flow Fields inside a Desktop Personal Computer Cabin
Authors: Mohammad Salehi, Mohammad Erfan Doraki
Abstract:
In this paper, the airflow inside a desktop computer case is analyzed by computational fluid dynamics (CFD) simulation. The purpose is to investigate the cooling of central processing units (CPUs) with thermal outputs of 80 and 130 watts. The enclosure, a microATX case model, contains the main heat-producing components: CPU, hard disk drive, CD drive, floppy drive, memory card, and power supply unit. According to the thermal power produced by the 80 W and 130 W CPUs, two different heat sink geometries, direct and radial, were used. First, mesh independence and solution validation were established; after confirming the correctness of the numerical solution, the results were analyzed. The simulation results showed that the temperatures of the CPU and the other components increase linearly with increasing CPU heat output. The ambient air temperature also has a significant effect on the maximum processor temperature.
Keywords: computational fluid dynamics, CPU cooling, computer case simulation, heat sink
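The reported linear rise of component temperatures with CPU power is what a lumped thermal-resistance model predicts. A minimal sketch of that relation, with an illustrative resistance value that is not taken from the paper:

```python
def cpu_temperature(p_watts, t_ambient_c, r_th_c_per_w):
    """Steady-state lumped model: T_cpu = T_ambient + P * R_th.

    r_th_c_per_w is the combined heat-sink + interface thermal
    resistance in degrees C per watt (the 0.35 C/W below is an
    illustrative assumption, not a value from the paper).
    """
    return t_ambient_c + p_watts * r_th_c_per_w

# Temperature scales linearly with power, and the ambient
# temperature shifts the whole curve, as the simulations found.
t80 = cpu_temperature(80, 25.0, 0.35)    # 25 + 28.0 = 53.0 C
t130 = cpu_temperature(130, 25.0, 0.35)  # 25 + 45.5 = 70.5 C
```

The model neglects airflow details entirely, which is precisely what the CFD simulation resolves; it only explains why the trend with power is linear.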
Procedia PDF Downloads 122
1760 The Capacity of Mel Frequency Cepstral Coefficients for Speech Recognition
Authors: Fawaz S. Al-Anzi, Dia AbuZeina
Abstract:
Speech recognition makes an important contribution to new technologies in human-computer interaction, and there is a growing need to employ speech technology in daily life and business activities. However, speech recognition is a challenging task that requires several stages before the desired output is obtained. Among the components of automatic speech recognition (ASR) is the feature extraction process, which parameterizes the speech signal to produce the corresponding feature vectors. Feature extraction aims at approximating the linguistic content conveyed by the input speech signal. The speech processing field offers several methods to extract speech features; among them, Mel Frequency Cepstral Coefficients (MFCC) is the most popular technique. It has long been observed that MFCC is dominantly used in well-known recognizers such as the Carnegie Mellon University (CMU) Sphinx and the Hidden Markov Model Toolkit (HTK). Hence, this paper focuses on the MFCC method as the standard choice for identifying the different speech segments in order to obtain the language phonemes for further training and decoding steps. Owing to this good performance, previous studies show that MFCC dominates Arabic ASR research. In this paper, we demonstrate MFCC as well as the intermediate steps performed to obtain these coefficients using the HTK toolkit.
Keywords: speech recognition, acoustic features, mel frequency, cepstral coefficients
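The intermediate steps behind MFCC extraction (framing, windowing, power spectrum, mel filterbank, log compression, DCT) can be sketched as follows. This is a generic NumPy illustration with typical default parameters, not the HTK configuration used in the paper:

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filterbank(n_filters, n_fft, sr):
    # Triangular filters evenly spaced on the mel scale, 0 Hz to Nyquist.
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_filters + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fb = np.zeros((n_filters, n_fft // 2 + 1))
    for i in range(1, n_filters + 1):
        left, center, right = bins[i - 1], bins[i], bins[i + 1]
        for k in range(left, center):
            fb[i - 1, k] = (k - left) / max(center - left, 1)
        for k in range(center, right):
            fb[i - 1, k] = (right - k) / max(right - center, 1)
    return fb

def mfcc(signal, sr, n_fft=512, hop=160, n_filters=26, n_coeffs=13):
    # 1. Frame the signal and apply a Hamming window.
    n_frames = 1 + (len(signal) - n_fft) // hop
    frames = np.stack([signal[i * hop:i * hop + n_fft] for i in range(n_frames)])
    frames = frames * np.hamming(n_fft)
    # 2. Power spectrum of each frame.
    power = np.abs(np.fft.rfft(frames, n_fft)) ** 2 / n_fft
    # 3. Mel filterbank energies, then log compression.
    log_e = np.log(power @ mel_filterbank(n_filters, n_fft, sr).T + 1e-10)
    # 4. DCT-II decorrelates the log energies; keep the first n_coeffs.
    n = n_filters
    basis = np.cos(np.pi * np.arange(n_coeffs)[:, None]
                   * (2 * np.arange(n)[None, :] + 1) / (2 * n))
    return log_e @ basis.T  # shape: (n_frames, n_coeffs)
```

One second of 16 kHz audio with these settings yields 97 frames of 13 coefficients each; HTK additionally applies pre-emphasis, liftering, and delta features, which are omitted here for brevity.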
Procedia PDF Downloads 259
1759 Petra: Simplified, Scalable Verification Using an Object-Oriented, Compositional Process Calculus
Authors: Aran Hakki, Corina Cirstea, Julian Rathke
Abstract:
Formal methods are yet to be adopted in mainstream software development because of scaling issues and implementation costs. This work develops a scalable, simplified, pragmatic formal software development method with strong correctness properties and guarantees that are easy to prove. The method aims to be easy to learn, use, and apply without extensive training and experience in formal methods. Petra is proposed as an object-oriented process calculus with composable data types and sequential/parallel processes. Petra has a simple denotational semantics, which includes a definition of Correct by Construction. The aim is for Petra to become a standard that can be implemented to execute on various mainstream programming platforms such as Java. Work towards an implementation of Petra as a Java EDSL (Embedded Domain Specific Language) is also discussed.
Keywords: compositionality, formal method, software verification, Java, denotational semantics, rewriting systems, rewriting semantics, parallel processing, object-oriented programming, OOP, programming language, correct by construction
Procedia PDF Downloads 145
1758 SnSₓ, Cu₂ZnSnS₄ Nanostructured Thin Layers for Thin-Film Solar Cells
Authors: Elena A. Outkina, Marina V. Meledina, Aliaksandr A. Khodin
Abstract:
Nanostructured thin films of SnSₓ and Cu₂ZnSnS₄ (CZTS) semiconductors were fabricated by chemical processing to produce thin-film photoactive layers for photocells, a prospective lowest-cost and environmentally friendly alternative to Si, Cu(In,Ga)Se₂, and other traditional solar cell materials. To produce the SnSₓ layers, a modified successive ionic layer adsorption and reaction (SILAR) technique was investigated, consisting of successive cyclic dipping into a Na₂S solution and an SnCl₂, NaCl, triethanolamine solution. To fabricate the CZTS layers, cyclic dipping into CuSO₄ with ZnSO₄, SnCl₂, and Na₂S solutions was used, with intermediate rinsing in distilled water. A nano-template aluminum/alumina substrate was used to control the deposition processes. The micromorphology and optical characteristics of the fabricated layers were investigated. An analysis of 2D-like layer deposition features using the nano-template substrate is presented, including the effect of nanotips in the template on surface charge redistribution and transport.
Keywords: kesterite, nanotemplate, SILAR, solar cell, tin sulphide
Procedia PDF Downloads 142
1757 Life Cycle-Based Analysis of Meat Production: Ecosystem Impacts
Authors: Michelle Zeyuan Ma, Hermann Heilmeier
Abstract:
Recently, the ecosystem impacts of meat production have initiated many heated discussions among researchers, and reducing such impacts is difficult given the demand for meat products. This calls for better management and control of ecosystem impacts at every stage of meat production. This article analyzes the ecosystem impacts of meat production based on the life cycle of meat products. The analysis shows that considerable ecosystem impacts arise at each step of meat production: the initial establishment phase, animal raising, slaughterhouse processing, meat consumption, and waste management. Based on this analysis, the impacts are summarized as: a leading factor in biodiversity loss; water waste, land-use waste, and land degradation; greenhouse gas emissions; pollution of air, water, and soil; and related major diseases. The article also discusses a solution, the sustainable food system, which could help reduce these ecosystem impacts. The analysis is conducted at the life cycle level; it gives an overview of the ecosystem impacts of the whole meat industry, and the results could be useful for managing or controlling the ecosystem impacts of meat production from the investor, producer, and consumer sides.
Keywords: eutrophication, life cycle based analysis, sustainable food, waste management
Procedia PDF Downloads 220
1756 Multi-Channel Charge-Coupled Device Sensors Real-Time Cell Growth Monitor System
Authors: Han-Wei Shih, Yao-Nan Wang, Ko-Tung Chang, Lung-Ming Fu
Abstract:
A multi-channel real-time cell growth monitoring and evaluation system using charge-coupled device (CCD) sensors with 40X lenses, integrated with an NI LabVIEW image processing program, is proposed and demonstrated. The LED light source of the monitoring system is controlled by an 8051 microprocessor integrated with the NI LabVIEW software. In this study, the growth rate and morphology of RAW264.7 cells at the same concentration were demonstrated under four different culture conditions (DMEM, LPS, LPS+G1, LPS+G2). A real-time image of the growing cells was captured and analyzed by NI Vision Assistant every 10 minutes in the incubator, and an image binarization technique was applied to calculate the cell doubling time and cell division index. The doubling times of the four groups (DMEM, LPS, LPS+G1, LPS+G2) are 12.3 h, 10.8 h, 14.0 h, and 15.2 h, and the division indices are 74.20%, 78.63%, 69.53%, and 66.49%, respectively. The image magnification of the multi-channel CCD real-time cell monitoring system is about 100X~200X, comparable to that of a traditional microscope.
Keywords: charge-coupled device (CCD), RAW264.7, doubling time, division index
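Given cell counts extracted from the binarized frames, the doubling time follows from the standard exponential-growth formula, and the division index is the percentage of dividing cells. A minimal sketch of both quantities; the function names and sample numbers below are illustrative, not the paper's data:

```python
import math

def doubling_time(n0, nt, elapsed_hours):
    """Exponential-growth doubling time: Td = t * ln 2 / ln(Nt / N0)."""
    return elapsed_hours * math.log(2) / math.log(nt / n0)

def division_index(dividing_cells, total_cells):
    """Percentage of cells in a frame identified as dividing."""
    return 100.0 * dividing_cells / total_cells

# A culture growing from 1000 to 4000 cells in 24 h doubles every 12 h.
td = doubling_time(1000, 4000, 24.0)  # -> 12.0
di = division_index(20, 80)           # -> 25.0
```

In the described system, n0 and nt would come from counting connected regions in binarized images taken 10 minutes apart over the culture period.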
Procedia PDF Downloads 358
1755 Enhancing Spatial Interpolation: A Multi-Layer Inverse Distance Weighting Model for Complex Regression and Classification Tasks in Spatial Data Analysis
Authors: Yakin Hajlaoui, Richard Labib, Jean-François Plante, Michel Gamache
Abstract:
This study introduces the Multi-Layer Inverse Distance Weighting model (ML-IDW), inspired by the mathematical formulations of both multi-layer neural networks (ML-NNs) and the Inverse Distance Weighting model (IDW). ML-IDW leverages the processing capabilities of ML-NNs, characterized by compositions of learnable non-linear functions applied to the input features, and incorporates IDW's ability to learn anisotropic spatial dependencies, presenting a promising solution for nonlinear spatial interpolation and learning from complex spatial data. We employ gradient descent and backpropagation to train ML-IDW, comparing its performance against conventional spatial interpolation models such as Kriging and standard IDW on regression and classification tasks using simulated spatial datasets of varying complexity. The results highlight the efficacy of ML-IDW, particularly in handling complex spatial datasets, exhibiting lower mean square error in regression and a higher F1 score in classification.
Keywords: deep learning, multi-layer neural networks, gradient descent, spatial interpolation, inverse distance weighting
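ML-IDW builds on the classical IDW predictor, which estimates a query point's value as a distance-weighted average of the known samples. A sketch of standard (single-layer, isotropic) IDW, the baseline the study compares against; the function name and parameter choices are our own illustration, not the paper's implementation:

```python
import numpy as np

def idw_predict(x_known, y_known, x_query, power=2.0, eps=1e-12):
    """Classic inverse distance weighting (Shepard's method).

    x_known: (n, d) sample coordinates; y_known: (n,) sample values;
    x_query: (m, d) points to interpolate at.
    """
    # Pairwise distances between every query point and every sample.
    d = np.linalg.norm(x_query[:, None, :] - x_known[None, :, :], axis=-1)
    w = 1.0 / (d ** power + eps)          # closer samples get larger weights
    return (w @ y_known) / w.sum(axis=1)  # weighted average per query point

pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
vals = np.array([0.0, 1.0, 2.0])
pred = idw_predict(pts, vals, np.array([[0.0, 0.0]]))  # ~0.0 (exact at a sample)
```

ML-IDW replaces this fixed, isotropic weighting with learnable, composed non-linear layers trained by backpropagation, which is what lets it capture anisotropic spatial dependencies.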
Procedia PDF Downloads 52