Search results for: information processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13528

12958 A Recommender System for Job Seekers to Show up Companies Based on Their Psychometric Preferences and Company Sentiment Scores

Authors: A. Ashraff

Abstract:

The increasing importance of the web as a medium for electronic and business transactions has been a driving force behind the introduction and implementation of recommender systems. Recommender systems play a major role in processing and analyzing thousands of data rows or reviews, helping people make purchase decisions about products and services. They can also predict whether a particular user would rate a product or service highly, based on the behavioral patterns in the user's profile. At present, recommender systems are used extensively in almost every domain; they are said to be ubiquitous. In the field of recruitment, however, they are not yet widely utilized. Recent statistics show an increase in staff turnover, which negatively impacts both the organization and the employee. The reasons cited include company culture, working flexibility (work-from-home opportunities), lack of learning advancement, and pay scale. Further investigation revealed a lack of guidance or support to help a job seeker find the company that suits him or her best; although information about companies is available, job seekers cannot read all the reviews themselves and arrive at an analytical decision. In this paper, we propose an approach that studies the available review data on IT companies (scoring their reviews based on user review sentiments) and gathers information on job seekers, including their psychometric evaluations. The system then presents the job seeker with an assessment of which company is most suitable. The theoretical approach, the algorithmic approach, and the importance of such a system are discussed in this paper.
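
The scoring-and-matching idea described above can be illustrated with a minimal sketch: rank companies by the similarity between the seeker's psychometric vector and a company "culture" vector, weighted by a review-sentiment score. The company names, culture vectors, and the 0.7/0.3 weighting below are illustrative assumptions, not values from the paper.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def rank_companies(seeker_profile, companies):
    """Rank companies by psychometric fit weighted by review sentiment.

    `companies` maps name -> (culture_vector, sentiment_score in [0, 1]).
    The 0.7/0.3 weighting is a hypothetical choice, not from the paper.
    """
    scored = {
        name: 0.7 * cosine(seeker_profile, vec) + 0.3 * sentiment
        for name, (vec, sentiment) in companies.items()
    }
    return sorted(scored, key=scored.get, reverse=True)

companies = {
    "Acme IT": ([0.9, 0.2, 0.7], 0.80),   # hypothetical culture vectors
    "Globex":  ([0.1, 0.9, 0.3], 0.95),   # and sentiment scores
    "Initech": ([0.8, 0.3, 0.6], 0.40),
}
ranking = rank_companies([0.9, 0.1, 0.8], companies)
```

Here a high sentiment score alone (Globex) does not outrank a strong psychometric fit, which is the behavior the paper's motivation suggests.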

Keywords: psychometric tests, recommender systems, sentiment analysis, hybrid recommender systems

Procedia PDF Downloads 104
12957 Theory and Practice of Wavelets in Signal Processing

Authors: Jalal Karam

Abstract:

The methods of the Fourier, Laplace, and wavelet transforms provide transfer functions and relationships between the input and output signals of linear time-invariant systems. This paper shows the equivalence among these three methods, in each case presenting an application of the appropriate transform (Fourier, Laplace, or wavelet) to the convolution theorem. In addition, it is shown that the same holds for a direct integration method. The biorthogonal wavelets Bior3.5 and Bior3.9 are examined, and the zero distributions of their associated filter polynomials are located. This paper also presents the significance of utilizing wavelets as effective tools for processing speech signals in common multimedia applications in general, and in recognition and compression in particular. Theoretically and practically, wavelets have proved to be effective and competitive. The practical use of the continuous wavelet transform (CWT) in the processing and analysis of speech is then presented, along with an explanation of how the human ear can be thought of as a natural wavelet transformer of speech. This view generates a variety of approaches for applying the CWT to many paradigms of analysing speech, sound, and music. For perception, the flexibility of implementation of this transform allows the construction of numerous scales, and we include two of them. Results for speech recognition and speech compression are then included.
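
For readers unfamiliar with the CWT, the transform can be sketched as a bank of convolutions with scaled copies of a mother wavelet. The Morlet wavelet, scale grid, and test tone below are illustrative choices only, not taken from the paper.

```python
import numpy as np

def morlet(t, w0=5.0):
    """Real-valued Morlet mother wavelet (admissibility constant omitted)."""
    return np.exp(-t ** 2 / 2.0) * np.cos(w0 * t)

def cwt(signal, scales, w0=5.0):
    """Naive continuous wavelet transform by direct convolution.

    Returns an array of shape (len(scales), len(signal)).
    """
    out = np.empty((len(scales), len(signal)))
    for i, s in enumerate(scales):
        t = np.arange(-4 * s, 4 * s + 1)       # support of +/- 4 envelope widths
        psi = morlet(t / s, w0) / np.sqrt(s)   # scaled, normalised wavelet
        out[i] = np.convolve(signal, psi, mode="same")
    return out

# A 25 Hz tone sampled at 1 kHz. The Morlet centre frequency is about
# w0 / (2*pi*s) cycles per sample, i.e. ~796/s Hz here, so scale ~32
# should capture the tone best.
fs = 1000.0
n = np.arange(1000)
sig = np.cos(2 * np.pi * 25.0 * n / fs)
scales = [8, 16, 32, 64]
energy = (cwt(sig, scales) ** 2).sum(axis=1)
```

The energy-per-scale profile peaking at the matched scale is the same mechanism that lets a CWT act as a crude auditory filter bank, as the abstract suggests.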

Keywords: continuous wavelet transform, biorthogonal wavelets, speech perception, recognition and compression

Procedia PDF Downloads 411
12956 Multi-Class Text Classification Using Ensembles of Classifiers

Authors: Syed Basit Ali Shah Bukhari, Yan Qiang, Saad Abdul Rauf, Syed Saqlaina Bukhari

Abstract:

Text classification is the methodology of assigning a given text to the appropriate category from a given set of categories. It is vital to use a proper combination of pre-processing, feature selection, and classification techniques to achieve this purpose. In this paper, we used different ensemble techniques, along with variation in feature selection parameters, to observe the change in the overall accuracy of the result as well as in class-specific measures such as the precision of each individual category. After subjecting our data to pre-processing and feature selection, different individual classifiers were tested first; the classifiers were then combined into ensembles to increase their accuracy. We also studied the impact of decreasing the number of classification categories on overall accuracy. Text classification is widely used in sentiment analysis on social media sites such as Twitter to gauge people's opinions about a cause, and to analyze customers' reviews of products or services. Opinion mining is a vital task in data mining, and text categorization is a backbone of opinion mining.
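
The bagging-and-majority-vote idea can be shown in miniature. The sketch below uses decision stumps on toy one-dimensional data in place of the paper's text classifiers; the data, stump learner, and estimator count are illustrative assumptions.

```python
import random
from collections import Counter

def fit_stump(xs, ys):
    """Best single-threshold classifier ('decision stump') on 1-D data."""
    best_acc, best = -1, (0.0, 1)
    for t in xs:
        for polarity in (1, -1):
            preds = [1 if polarity * (x - t) > 0 else 0 for x in xs]
            acc = sum(p == y for p, y in zip(preds, ys))
            if acc > best_acc:
                best_acc, best = acc, (t, polarity)
    return best

def stump_predict(stump, x):
    t, polarity = stump
    return 1 if polarity * (x - t) > 0 else 0

def bagging_fit(xs, ys, n_estimators=15, seed=0):
    """Train one stump per bootstrap resample of the training set."""
    rng = random.Random(seed)
    n = len(xs)
    stumps = []
    for _ in range(n_estimators):
        idx = [rng.randrange(n) for _ in range(n)]
        stumps.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    return stumps

def ensemble_predict(stumps, x):
    """Majority vote over the individual stump predictions."""
    votes = Counter(stump_predict(s, x) for s in stumps)
    return votes.most_common(1)[0][0]

# Toy 1-D 'document score' data: class 1 for positive scores.
xs = [-3.0, -2.0, -1.0, 1.0, 2.0, 3.0]
ys = [0, 0, 0, 1, 1, 1]
stumps = bagging_fit(xs, ys)
```

In practice one would replace the stumps with text classifiers over TF-IDF features (e.g. via scikit-learn's `BaggingClassifier` or `AdaBoostClassifier`); the voting mechanism is the same.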

Keywords: natural language processing, ensemble classifier, bagging classifier, AdaBoost

Procedia PDF Downloads 227
12955 Improved Super-Resolution Using Deep Denoising Convolutional Neural Network

Authors: Pawan Kumar Mishra, Ganesh Singh Bisht

Abstract:

Super-resolution is a technique used in computer vision to construct a high-resolution image from a single low-resolution image. It is used to increase the frequency content, recover lost details, and remove the downsampling artifacts and noise introduced by the camera during image acquisition. High-resolution images and videos are desired in all image processing tasks and in the analysis of most digital imaging applications. The goal of super-resolution is to combine the non-redundant information within single or multiple low-resolution frames to generate a high-resolution image. Many methods have been proposed in which multiple images of the same scene, differing by some transformation, are used as the low-resolution inputs; this is called multi-image super-resolution. Another family of methods, single-image super-resolution, tries to learn the redundancy present in the image and reconstruct the lost information from a single low-resolution input. The use of deep learning is currently a state-of-the-art approach for reconstructing high-resolution images. In this research, we propose Deep Denoising Super-Resolution (DDSR), a deep neural network for effectively reconstructing a high-resolution image from a low-resolution one.
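
The single-image pipeline (upsample, then refine) can be sketched structurally as follows. A fixed averaging kernel stands in for the learned denoising convolutions of DDSR, whose actual architecture is not specified in this abstract; everything below is an illustrative stand-in.

```python
import numpy as np

def upsample_nn(img, factor=2):
    """Nearest-neighbour upsampling (a simple stand-in for bicubic)."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def conv2d(img, kernel):
    """2-D cross-correlation with edge padding; output same size as input."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

def ddsr_like(lr, factor=2):
    """Upsample, then refine with a smoothing pass standing in for the
    learned denoising convolutions of a network such as the proposed DDSR."""
    hr = upsample_nn(lr.astype(float), factor)
    smooth = conv2d(hr, np.full((3, 3), 1.0 / 9.0))
    return 0.5 * hr + 0.5 * smooth   # blend blocky estimate toward smooth one

lr = np.array([[0.0, 1.0],
               [1.0, 0.0]])
sr = ddsr_like(lr)   # 4x4 reconstruction
```

In the real method the fixed blend would be replaced by convolutional layers trained to predict the residual between the upsampled input and the ground-truth high-resolution image.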

Keywords: resolution, deep-learning, neural network, de-blurring

Procedia PDF Downloads 510
12954 Between AACR2 and RDA: What Changes Occur in Them

Authors: Ibrahim Abdullahi Mohammad

Abstract:

A library catalogue exists not only as an inventory of the collections of a particular library, but also as a retrieval device. It is provided to assist library users in finding whatever information or information resources they may be looking for. The paper proposes that this location objective of the library catalogue can only be fulfilled if the catalogue is constructed with the information needs and searching behavior of the library user in mind. Comparing AACR2 and RDA vis-à-vis the changes RDA has introduced into bibliographic standards, the paper tries to establish the viability of RDA in relation to AACR2.

Keywords: library catalogue, information retrieval, AACR2, RDA

Procedia PDF Downloads 47
12953 A Proposal for U-City (Smart City) Service Method Using Real-Time Digital Map

Authors: SangWon Han, MuWook Pyeon, Sujung Moon, DaeKyo Seo

Abstract:

Recently, technologies based on three-dimensional (3D) space information are being developed and quality of life is improving as a result. Research on real-time digital map (RDM) is being conducted now to provide 3D space information. RDM is a service that creates and supplies 3D space information in real time based on location/shape detection. Research subjects on RDM include the construction of 3D space information with matching image data, complementing the weaknesses of image acquisition using multi-source data, and data collection methods using big data. Using RDM will be effective for space analysis using 3D space information in a U-City and for other space information utilization technologies.

Keywords: RDM, multi-source data, big data, U-City

Procedia PDF Downloads 429
12952 Understanding the Qualitative Nature of Product Reviews by Integrating Text Processing Algorithm and Usability Feature Extraction

Authors: Cherry Yieng Siang Ling, Joong Hee Lee, Myung Hwan Yun

Abstract:

Usability has become a basic requirement for a product from the consumer's perspective; a product that fails this requirement ends up not being used. Identifying usability issues by analyzing quantitative and qualitative data collected from usability testing and evaluation activities aids the product design process, yet the scarcity of studies on analysis methodologies for qualitative text data in the usability field inhibits the potential of these data for more useful applications. The possibility of analyzing such data has grown with the rapid development of data analysis fields such as natural language processing, for understanding human language by computer, and machine learning, which provides predictive models and clustering tools. This research therefore aims to study the applicability of text processing algorithms to the analysis of qualitative text data collected from usability activities. It utilizes datasets collected from an LG neckband headset usability experiment, consisting of headset survey text data, subject data, and product physical data. The analysis procedure, integrated with a text processing algorithm, includes training the comments into a vector space, labeling them with the subject and product physical feature data, and clustering to validate the result of comment vector clustering. The result shows the 'volume and music control button' as the usability feature that matches best with the clusters of comment vectors: centroid comments of one cluster emphasized button positions, while centroid comments of the other cluster emphasized button interface issues. When the volume and music control buttons were designed separately, participants experienced less confusion, and thus the comments mentioned only the buttons' positions. When the volume and music control buttons were designed as a single button, participants experienced interface issues such as the operating methods of functions and confusion between the functions of buttons. The relevance of the cluster centroid comments to the extracted feature demonstrates the capability of text processing algorithms in analyzing qualitative text data from usability testing and evaluations.

Keywords: usability, qualitative data, text-processing algorithm, natural language processing

Procedia PDF Downloads 281
12951 Designing a Method to Control and Determine the Financial Performance of the Real Cost Sub-System in the Information Management System of Construction Projects

Authors: Alireza Ghaffari, Hassan Saghi

Abstract:

Project management is more complex than managing the day-to-day affairs of an organization. When the project dimensions are broad and multiple projects have to be monitored in different locations, integrated management becomes even more complicated. One of the main concerns of project managers is integrated project management, a concern mainly rooted in the lack of accurate, accessible information from different projects in various locations. The collection of dispersed information from various parts of the network, its integration, and finally the selective reporting of this information are among the goals of integrated information systems. This can help resolve the main problem: bridging the information gap between executives and senior managers in the organization. Therefore, the main objective of this study is to design and implement an important subset of a project management information system in order to successfully control the cost of construction projects, so that its results can be used to design raw software forms and propose relationships between different project units for the collection of necessary information.

Keywords: financial performance, cost subsystem, PMIS, project management

Procedia PDF Downloads 103
12950 Simulation of 3-D Direction-of-Arrival Estimation Using MUSIC Algorithm

Authors: Duckyong Kim, Jong Kang Park, Jong Tae Kim

Abstract:

DOA (direction of arrival) estimation is an important method in array signal processing with a wide range of applications, such as direction finding and beamforming. In this paper, we briefly introduce the MUSIC (Multiple Signal Classification) algorithm, a DOA estimation method capable of resolving several targets. We then apply the MUSIC algorithm to a two-dimensional antenna array to analyze DOA estimation in 3D space through MATLAB simulation. We also analyze, through simulation, the design factors that can affect the accuracy of DOA estimation, and discuss how the system might be applied.
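
A minimal NumPy sketch of MUSIC for a one-dimensional uniform linear array illustrates the subspace idea (the paper uses a two-dimensional array and MATLAB; the array size, snapshot count, and noise level below are illustrative assumptions):

```python
import numpy as np

def steering(theta_deg, m, d=0.5):
    """ULA steering vector; element spacing d in wavelengths."""
    theta = np.deg2rad(theta_deg)
    return np.exp(-2j * np.pi * d * np.arange(m) * np.sin(theta))

def music_spectrum(X, n_sources, grid):
    """MUSIC pseudo-spectrum from snapshot matrix X (m antennas x k snapshots)."""
    m = X.shape[0]
    R = X @ X.conj().T / X.shape[1]        # sample covariance
    eigvals, eigvecs = np.linalg.eigh(R)   # eigenvalues in ascending order
    En = eigvecs[:, : m - n_sources]       # noise subspace
    p = []
    for th in grid:
        a = steering(th, m)
        p.append(1.0 / np.real(a.conj() @ En @ En.conj().T @ a))
    return np.array(p)

# Simulate one source at 20 degrees on an 8-element half-wavelength ULA.
rng = np.random.default_rng(0)
m, k, true_doa = 8, 200, 20.0
s = (rng.standard_normal(k) + 1j * rng.standard_normal(k)) / np.sqrt(2)
noise = 0.1 * (rng.standard_normal((m, k)) + 1j * rng.standard_normal((m, k)))
X = np.outer(steering(true_doa, m), s) + noise
grid = np.arange(-90.0, 90.5, 0.5)
est = grid[int(np.argmax(music_spectrum(X, 1, grid)))]
```

The pseudo-spectrum peaks where the steering vector is orthogonal to the noise subspace; extending this to 3D DOA means scanning a 2-D (azimuth, elevation) grid with the corresponding planar-array steering vector.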

Keywords: DOA estimation, MUSIC algorithm, spatial spectrum, array signal processing

Procedia PDF Downloads 369
12949 A Method for Processing Unwanted Target Caused by Reflection in Secondary Surveillance Radar

Authors: Khanh D. Do, Loi V. Nguyen, Thanh N. Nguyen, Thang M. Nguyen, Vu T. Tran

Abstract:

Along with the development of secondary surveillance radar (SSR) in air traffic surveillance systems, multipath phenomena have always been a noticeable problem. This article discusses the geometrical and power aspects of multipath interference caused by reflection in SSR and proposes a method to deal with these unwanted multipath targets (ghosts) by predicting false-target positions and adaptively suppressing them. A field experiment is described at the end of the article to demonstrate the efficiency of this measure.

Keywords: multipath, secondary surveillance radar, digital signal processing, reflection

Procedia PDF Downloads 157
12948 Developing a Model for Information Giving Behavior in Virtual Communities

Authors: Pui-Lai To, Chechen Liao, Tzu-Ling Lin

Abstract:

Virtual communities have created a range of new social spaces in which to meet and interact with one another. Whether as a stand-alone model or as a supplement to sustain competitive advantage for conventional business models, building virtual communities has been hailed as one of the major strategic innovations of the new economy. For a virtual community to evolve, however, the biggest challenge is how to get members to actively give information or provide advice. Even in busy virtual communities, usually only a small fraction of members post information actively. In order to investigate the determinants of the information-giving willingness of those contributors who actively provide their opinions, we propose a model to understand the reasons for contribution in communities. The study will serve as a basis for the future growth of information giving in virtual communities.

Keywords: information giving, social identity, trust, virtual community

Procedia PDF Downloads 318
12947 Sustainability in the Purchase of Airline Tickets: Analysis of Digital Communication from the Perspective of Neuroscience

Authors: Rodríguez Sánchez Carla, Sancho-Esper Franco, Guillen-Davo Marina

Abstract:

Tourism is one of the most important sectors worldwide, serving as a major economic engine for today's society. Owing to this expansion, it is also one of the sectors that most negatively affects the environment in terms of CO₂ emissions. In light of this, airlines are developing Voluntary Carbon Offset (VCO) programs. There is substantial evidence on the features of these VCO programs and their efficacy in reducing CO₂ emissions, but the findings are mixed, without a clear consensus. Different research approaches have centered on analyzing the factors and consequences of VCO programs, such as economic modelling based on panel data, survey research based on traveler responses, and experimental research analyzing customer decisions in a simulated context. This study belongs to the latter group, because it tries to understand how different characteristics of an online ticket purchase website affect the willingness of a traveler to choose a sustainable option. The proposed behavioral model is based on several theories, including nudge theory, the dual-processing elaboration likelihood model (ELM), and cognitive dissonance theory. This randomized experiment aims to overcome the limitations of previous studies based on self-reported measures, which mainly study sustainable behavioral intention rather than actual decision-making. It also complements traditional self-reported independent variables by gathering objective information from an eye-tracking device. The experiment analyzes the influence of two characteristics of the online purchase website: i) the type of information regarding flight CO₂ emissions (quantitative vs. qualitative), and ii) the comparison framework for the sustainable purchase decision (negative: an alternative with more emissions than the average flight on the route, vs. positive: an alternative with fewer emissions than the average flight on the route). It is therefore a 2x2 experiment with four alternative scenarios.
A pretest was run before the actual experiment to refine the experiment features and check the manipulations. Afterward, a different sample of students answered the pre-test questionnaire aimed at recruiting the cases and measuring several pre-stimulus variables. One week later, the students came to the neurolab at the university to take part in the experiment, made their online purchase decision, and answered the post-test survey. A final sample of 21 students was gathered. The ethics committee of the institution approved the experiment. The results show that qualitative information generates more sustainable decisions (the less contaminating alternative) than quantitative information. Moreover, the evidence shows that subjects are more willing to make the sustainable choice in order to be more ecological (comparison of the average with the less contaminating alternative) than in order to be less contaminating (comparison of the average with the more contaminating alternative). There are also interesting differences in the information-processing variables from the eye tracker: both the total time to make the choice and the specific times by area of interest (AOI) differ depending on the assigned scenario. These results allow for a better understanding of the factors that condition a traveler's decision to be part of a VCO program and provide useful information for airline managers seeking to promote these programs and reduce environmental impact.

Keywords: voluntary carbon offset, airline, online purchase, carbon emission, sustainability, randomized experiment

Procedia PDF Downloads 72
12946 Microfabrication of Three-Dimensional SU-8 Structures Using Positive SPR Photoresist as a Sacrificial Layer for Integration of Microfluidic Components on Biosensors

Authors: Su Yin Chiam, Qing Xin Zhang, Jaehoon Chung

Abstract:

Complementary metal-oxide-semiconductor (CMOS) integrated circuits (ICs) have received increased attention in the biosensor community because CMOS technology provides cost-effective, high-performance signal processing at a mass-production level. In order to supply biological samples and reagents effectively to the sensing elements, there is increasing demand for the seamless integration of microfluidic components onto fabricated CMOS wafers by post-processing. Although PDMS microfluidic channels replicated from a separately prepared silicon mold can typically be aligned and bonded onto CMOS wafers, this remains challenging owing to the inherently limited alignment accuracy (> ± 10 μm) between the two layers. Here we present a new post-processing method that creates three-dimensional microfluidic components using two photoresists of different polarity: an epoxy-based negative SU-8 photoresist and a positive SPR220-7 photoresist. The positive photoresist serves as a sacrificial layer, and the negative photoresist is utilized as a structural material to generate three-dimensional structures. Because both photoresists are patterned using standard photolithography, the dimensions of the structures can be effectively controlled; moreover, the alignment accuracy is dramatically improved (< ± 2 μm), so the technique can be adopted as an alternative post-processing method. To validate the proposed method, we applied this technique to build cell-trapping structures. The SU-8 photoresist was mainly used to generate the structures, and the SPR photoresist was used as a sacrificial layer to generate a sub-channel in the SU-8, allowing fluid to pass through. The sub-channel generated by etching the sacrificial layer works as a cell-capturing site. The well-controlled dimensions enabled single-cell capture at each site, and the high-accuracy alignment placed the trapped cells exactly on the sensing units of the CMOS biosensors.

Keywords: SU-8, microfluidic, MEMS, microfabrication

Procedia PDF Downloads 516
12945 Arabic Quran Search Tool Based on Ontology

Authors: Mohammad Alqahtani, Eric Atwell

Abstract:

This paper reviews and classifies most of the important types of search techniques that have been applied to the Holy Quran, and then addresses their limitations. Additionally, it surveys most existing Quranic ontologies and their deficiencies. Finally, it describes a new search tool: a semantic search tool for the Quran based on Qur'anic ontologies. This tool will overcome the limitations of the existing Quranic search applications.

Keywords: holy Quran, natural language processing (NLP), semantic search, information retrieval (IR), ontology

Procedia PDF Downloads 568
12944 Design and Construction of a Maize Dehusking Machine for Small and Medium-Scale Farmers

Authors: Francis Ojo Ologunagba, Monday Olatunbosun Ale, Lewis A. Olutayo

Abstract:

The economic success of commercial agricultural product processing depends upon the adaptability of each processing stage to mechanization. In maize processing, one post-harvest operation that still poses a major challenge is dehusking. Therefore, a maize dehusking machine that could replace the prevalent traditional method of dehusking maize in developing countries, especially Nigeria, was designed, constructed, and tested at the Department of Agricultural and Bio-Environmental Engineering Technology, Rufus Giwa Polytechnic, Owo. The basic features of the machine are the feeding unit (hopper), housing frame, dehusking unit, drive mechanism, and discharge outlets. The machine was tested with maize of 50 mm average diameter at 13% moisture content and 2.5 mm machine roller clearance. Test results showed appreciable performance, with a dehusking efficiency of 92% and a throughput capacity of 200 kg/hr at a machine speed of 400 rpm. The estimated production cost of the machine at the time of construction was forty-five thousand, one hundred and eighty naira (₦45,180), excluding the cost of the electric motor. It is therefore recommended for small and medium-scale maize farmers and processors in Nigeria.

Keywords: construction, dehusking, design, efficiency, maize

Procedia PDF Downloads 318
12943 Combined Synchrotron Radiography and Diffraction for in Situ Study of Reactive Infiltration of Aluminum into Iron Porous Preform

Authors: S. Djaziri, F. Sket, A. Hynowska, S. Milenkovic

Abstract:

The use of Fe-Al based intermetallics as an alternative to Cr/Ni based stainless steels is very promising for industrial applications that use critical raw material parts under extreme conditions. However, the development of advanced Fe-Al based intermetallics with appropriate mechanical properties presents several challenges involving appropriate processing and microstructure control. A processing strategy is being developed that aims at producing a net-shape porous Fe-based preform, which is then infiltrated with molten Al or an Al alloy. In the present work, porous Fe-based preforms produced by two different methods (selective laser melting (SLM) and the Kochanek process (KE)) are studied during infiltration with molten aluminum. With the objective of elucidating the mechanisms underlying the formation of Fe-Al intermetallic phases during infiltration, an in-house furnace has been designed for in situ observation of infiltration at synchrotron facilities, combining x-ray radiography (XR) and x-ray diffraction (XRD) techniques. The feasibility of this approach has been demonstrated, and information about melt flow front propagation has been obtained. In addition, reactive infiltration has been achieved, in which a bi-phased intermetallic layer was identified between the solid Fe and liquid Al. In particular, a tongue-like Fe₂Al₅ phase adhering to the Fe and a needle-like Fe₄Al₁₃ phase adhering to the Al were observed. The growth of the intermetallic compound was found to depend on the temperature gradient along the preform as well as on the reaction time, which will be discussed in view of the different results obtained.

Keywords: combined synchrotron radiography and diffraction, Fe-Al intermetallic compounds, in-situ molten Al infiltration, porous solid Fe preforms

Procedia PDF Downloads 223
12942 The Role of Libraries in the Context of Indian Knowledge Based Society

Authors: Sanjeev Sharma

Abstract:

We are living in the information age. Information is important not only to the individual but also to researchers, scientists, academicians, and all others working in their respective fields. The 21st century, also known as the electronic era, has brought several changes to the way libraries work. In the present scenario, the acquisition of information resources and the implementation of new strategies have brought a revolution in library structures and principles. In the digital era, the role of the library has become more important as new information arrives every minute. The knowledge society wants to seek information at its desk. Libraries are managing electronic services and web-based information sources constantly and democratically. The basic objective of every library is to save the user's time, which depends on the quality and user-orientation of its services. With the advancement of information and communication technology, libraries should pay more attention to the development trends of the information society, which would help them adjust their development strategies to the information needs of the knowledge society. The knowledge-based society demands a re-definition of the position and objectives of all institutions that work with information, knowledge, and culture. The situation in the era of Digital India is changing at a fast pace. Everyone wants information 24x7, and libraries have been recognized as one of the key elements of open access to information, which is crucial not only to the individual but also to a democratic, knowledge-based information society. Libraries are especially important now that the whole concept of education increasingly focuses on independent e-learning. The citizens of India must be able to find and use relevant information. Here libraries enter the stage: the essential function of libraries is to acquire, organize, store, and preserve publicly available material, in print or non-print form, in such a way that, when it is needed, it can be found and put to use.

Keywords: knowledge, society, libraries, culture

Procedia PDF Downloads 138
12941 A Pedagogical Study of Computational Design in a Simulated Building Information Modeling-Cloud Environment

Authors: Jaehwan Jung, Sung-Ah Kim

Abstract:

Building Information Modeling (BIM) provides project stakeholders with a variety of information about the properties and geometry of every component, as a 3D object-based parametric building model. BIM represents a set of information and solutions that are expected to improve the collaborative work process and the quality of the building design. To improve collaboration among project participants, the BIM model should provide the necessary information to remote participants in real time and manage that information throughout the process. The purpose of this paper is to propose a process model for effective collaborative architectural design work in architectural design education in a BIM-Cloud environment.

Keywords: BIM, cloud computing, collaborative design, digital design education

Procedia PDF Downloads 426
12940 Part of Speech Tagging Using Statistical Approach for Nepali Text

Authors: Archit Yajnik

Abstract:

Part-of-speech tagging has always been a challenging task in natural language processing. This article presents POS tagging for Nepali text using a hidden Markov model (HMM) and the Viterbi algorithm. From the Nepali text, annotated corpus training and testing data sets are randomly separated, and both methods are employed on the data sets. The Viterbi algorithm is found to be computationally faster and more accurate compared to the HMM approach, achieving an accuracy of 95.43%. An error analysis of the cases where mismatches took place is discussed in detail.
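
The Viterbi recursion used above can be sketched on a toy two-tag model. The tag set, words, and probabilities below are invented for illustration; the paper's Nepali corpus and estimated probabilities are not reproduced here.

```python
def viterbi(words, tags, start_p, trans_p, emit_p):
    """Most likely tag sequence for `words` under a toy HMM.

    Unknown words get a small floor probability (1e-6) instead of zero.
    """
    # Initialise with start probabilities times first-word emissions.
    V = [{t: (start_p[t] * emit_p[t].get(words[0], 1e-6), [t]) for t in tags}]
    for w in words[1:]:
        layer = {}
        for t in tags:
            # Best previous tag: maximise prob * transition * emission.
            prob, path = max(
                (V[-1][prev][0] * trans_p[prev][t] * emit_p[t].get(w, 1e-6),
                 V[-1][prev][1] + [t])
                for prev in tags
            )
            layer[t] = (prob, path)
        V.append(layer)
    return max(V[-1].values())[1]

tags = ["N", "V"]
start_p = {"N": 0.6, "V": 0.4}
trans_p = {"N": {"N": 0.3, "V": 0.7}, "V": {"N": 0.6, "V": 0.4}}
emit_p = {"N": {"dogs": 0.5, "run": 0.1}, "V": {"dogs": 0.05, "run": 0.6}}
best = viterbi(["dogs", "run"], tags, start_p, trans_p, emit_p)
```

For real corpora the products should be replaced by sums of log-probabilities to avoid underflow on long sentences.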

Keywords: hidden Markov model, natural language processing, POS tagging, Viterbi algorithm

Procedia PDF Downloads 323
12939 Intelligent Rheumatoid Arthritis Identification System Based on Image Processing and Neural Classifier

Authors: Abdulkader Helwan

Abstract:

Rheumatoid arthritis is a chronic inflammatory disorder that affects the joints by damaging body tissues. There is therefore an urgent need for an effective intelligent identification system for knee rheumatoid arthritis, especially in its early stages. This paper develops a new intelligent system for the identification of rheumatoid arthritis of the knee utilizing image processing techniques and a neural classifier. The system involves two principal stages. The first is the image processing stage, in which the images are processed using techniques such as RGB-to-grayscale conversion, rescaling, median filtering, background extraction, image subtraction, segmentation using Canny edge detection, and feature extraction using pattern averaging. The extracted features are then used as inputs to the neural network, which classifies the X-ray knee images as normal or abnormal (arthritic) using a backpropagation learning algorithm, with the network trained on 400 normal and abnormal X-ray knee images. The system was tested on 400 X-ray images and showed good performance, achieving an identification rate of 97%.
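
The "pattern averaging" feature-extraction step is commonly implemented as block-mean downsampling, which compresses an image into a small feature vector for the neural classifier. The block size and example image below are assumptions, since the abstract does not give the exact parameters.

```python
import numpy as np

def pattern_average(img, block=2):
    """Block-mean 'pattern averaging': average each non-overlapping
    block x block region, producing a compact feature map."""
    h, w = img.shape
    assert h % block == 0 and w % block == 0, "image must tile evenly"
    return img.reshape(h // block, block, w // block, block).mean(axis=(1, 3))

img = np.array([[0., 2., 4., 6.],
                [2., 0., 6., 4.],
                [8., 8., 1., 1.],
                [8., 8., 1., 1.]])
features = pattern_average(img).ravel()  # flattened vector for the classifier
```

For a real X-ray, the same call with a larger block (e.g. averaging a 256x256 segmented region down to 32x32) yields the fixed-length input the backpropagation network expects.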

Keywords: rheumatoid arthritis, intelligent identification, neural classifier, segmentation, backpropagation

Procedia PDF Downloads 530
12938 Early Detection of Breast Cancer in Digital Mammograms Based on Image Processing and Artificial Intelligence

Authors: Sehreen Moorat, Mussarat Lakho

Abstract:

This paper proposes an artificial intelligence method using digital mammogram data for the detection of breast cancer. Many researchers have developed techniques for the early detection of breast cancer; early diagnosis helps to save many lives. Detection of breast cancer through mammography is an effective method, detecting the cancer before it can be felt and increasing the survival rate. In this paper, we propose an image processing technique for enhancing the image to detect graphical table data and markings. Texture features based on the gray-level co-occurrence matrix, together with intensity-based features, are extracted from the selected region. For classification, a neural-network-based supervised classifier system is used that can discriminate between benign and malignant cases; 68 digital mammograms were used to train the classifier. The results show that automated detection of breast cancer is beneficial for early diagnosis and increases the survival rates of breast cancer patients. The proposed system will help radiologists in the better interpretation of breast cancer.
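
The gray-level co-occurrence matrix (GLCM) features mentioned above can be sketched as follows. The single horizontal (0, 1) offset and the two descriptors shown (contrast and homogeneity) are illustrative choices; the abstract does not list the exact feature set used.

```python
import numpy as np

def glcm(img, levels):
    """Gray-level co-occurrence matrix for the horizontal (0, 1) offset,
    normalized to a probability distribution. `img` holds ints in [0, levels)."""
    pairs = np.stack([img[:, :-1].ravel(), img[:, 1:].ravel()], axis=1)
    M = np.zeros((levels, levels))
    for i, j in pairs:
        M[i, j] += 1
    return M / M.sum()

def contrast(P):
    """Sum of P(i, j) * (i - j)^2: high for strong local variation."""
    i, j = np.indices(P.shape)
    return float(np.sum(P * (i - j) ** 2))

def homogeneity(P):
    """Sum of P(i, j) / (1 + (i - j)^2): 1.0 for a perfectly uniform region."""
    i, j = np.indices(P.shape)
    return float(np.sum(P / (1.0 + (i - j) ** 2)))

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
P = glcm(img, levels=4)
```

In practice several offsets and angles are accumulated, and descriptors such as energy and correlation are added, to make the texture features rotation-tolerant.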

Keywords: medical imaging, cancer, processing, neural network

Procedia PDF Downloads 256
12937 Acceleration of Lagrangian and Eulerian Flow Solvers via Graphics Processing Units

Authors: Pooya Niksiar, Ali Ashrafizadeh, Mehrzad Shams, Amir Hossein Madani

Abstract:

There are many computationally demanding applications in science and engineering which need efficient algorithms implemented on high-performance computers. Recently, Graphics Processing Units (GPUs) have drawn much attention compared to traditional CPU-based hardware and have opened up new improvement venues in scientific computing. One particular application area is Computational Fluid Dynamics (CFD), in which mature CPU-based codes need to be converted to GPU-based algorithms to take advantage of this new technology. In this paper, numerical solutions of two classes of discrete fluid flow models via both CPU and GPU are discussed and compared. Test problems include an Eulerian model of a two-dimensional incompressible laminar flow case and a Lagrangian model of a two-phase flow field. The CUDA programming standard is used to employ an NVIDIA GPU with 480 cores, and a serial C++ code is run on a single core of an Intel quad-core CPU. Up to two orders of magnitude speedup is observed on the GPU for a certain range of grid resolutions or particle numbers. As expected, the Lagrangian formulation is better suited for parallel computation on the GPU, although the Eulerian formulation also shows a significant speedup.
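The data parallelism the GPU exploits can be illustrated with a single Jacobi sweep for a pressure Poisson equation, a typical kernel in incompressible flow solvers: every interior node is updated independently of the others, exactly the thread-per-cell pattern a CUDA kernel would use. This numpy sketch is illustrative and is not the authors' code:

```python
import numpy as np

def jacobi_step(p, b, h):
    """One Jacobi sweep for the pressure Poisson equation
    lap(p) = b on a uniform grid with spacing h. Each interior
    node depends only on the previous iterate, so all nodes can
    be updated in parallel (one GPU thread per cell)."""
    p_new = p.copy()  # boundary values are kept fixed
    p_new[1:-1, 1:-1] = 0.25 * (p[1:-1, 2:] + p[1:-1, :-2]
                                + p[2:, 1:-1] + p[:-2, 1:-1]
                                - h * h * b[1:-1, 1:-1])
    return p_new
```

In the Lagrangian case the parallel unit is one particle rather than one grid cell, which avoids the halo dependencies of the stencil above and is part of why that formulation maps even better onto the GPU.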

Keywords: CFD, Eulerian formulation, graphics processing units, Lagrangian formulation

Procedia PDF Downloads 405
12936 A Modified Shannon Entropy Measure for Improved Image Segmentation

Authors: Mohammad A. U. Khan, Omar A. Kittaneh, M. Akbar, Tariq M. Khan, Husam A. Bayoud

Abstract:

The Shannon entropy measure has been widely used for measuring uncertainty. In practical settings, however, a histogram is used to estimate the underlying distribution, and the histogram depends on the number of bins used. In this paper, a modification is proposed that makes the histogram-based Shannon entropy consistent. To demonstrate the benefits, two applications in medical image processing are considered. Simulations are carried out to show the superiority of the modified measure for the image segmentation problem. The improvement may be attributed to the robustness it shows to uneven backgrounds in images.
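The bin dependence can be seen directly: the plug-in histogram entropy grows roughly with the logarithm of the bin count, while adding the log of the bin width turns it into a differential-entropy estimate that stabilises as bins change. The sketch below shows this standard correction for illustration; it is not the specific modification proposed in the paper:

```python
import numpy as np

def histogram_entropy(x, bins):
    """Plain Shannon entropy of the histogram; it grows with the
    number of bins, so it is not a consistent estimate."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / len(x)
    return -np.sum(p * np.log(p))

def corrected_entropy(x, bins):
    """Bin-width-corrected estimate: adding log(width) gives an
    estimate of differential entropy, which is nearly invariant
    to the bin count once the bins are reasonably fine."""
    counts, edges = np.histogram(x, bins=bins)
    w = edges[1] - edges[0]
    p = counts[counts > 0] / len(x)
    return -np.sum(p * np.log(p)) + np.log(w)
```

For uniform data on [0, 1) the corrected estimate stays near the true differential entropy of 0 at both coarse and fine binnings, while the plain estimate shifts by roughly log of the bin-count ratio.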

Keywords: Shannon entropy, medical image processing, image segmentation, modification

Procedia PDF Downloads 493
12935 From Data Processing to Experimental Design and Back Again: A Parameter Identification Problem Based on FRAP Images

Authors: Stepan Papacek, Jiri Jablonsky, Radek Kana, Ctirad Matonoha, Stefan Kindermann

Abstract:

FRAP (Fluorescence Recovery After Photobleaching) is a widely used measurement technique for determining the mobility of fluorescent molecules within living cells. While the experimental setup and protocol for FRAP experiments are usually fixed, the data processing part is still under development. In this paper, we formulate and solve a data selection problem that enhances the processing of FRAP images. We introduce the concept of the irrelevant data set, i.e., data that barely reduce the confidence intervals of the estimated parameters and can thus be neglected. Based on sensitivity analysis, we both solve the problem of optimal data space selection and find specific conditions for optimizing an important experimental design factor, e.g., the radius of the bleach spot. Finally, a theorem stating that the integrated data approach is less precise than the full data case is proven; i.e., we show that the data set represented by the FRAP recovery curve leads to a larger confidence interval than the spatio-temporal (full) data.
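The theorem can be illustrated on a toy model via first-order sensitivity analysis: under independent measurement noise, the standard error of a rate parameter estimated from the spatially summed (recovery-curve) signal is never smaller than the one from the full spatio-temporal data. The decay model, spatial weights, and noise level below are illustrative assumptions, not the paper's model:

```python
import numpy as np

def stderr_k(k=0.5, sigma=0.01):
    """Standard errors of a rate k in the toy model
    y_i(t) = exp(-k * d_i * t), observed at spatial points d_i,
    from first-order sensitivities. Returns the standard error
    for (full spatio-temporal data, integrated recovery curve)."""
    t = np.linspace(0.1, 5.0, 50)
    d = np.array([0.5, 1.0, 1.5, 2.0])  # toy spatial weights
    S = np.array([-di * t * np.exp(-k * di * t) for di in d])  # dy/dk
    se_full = sigma / np.sqrt((S ** 2).sum())
    s_int = S.sum(axis=0)  # sensitivity of the summed signal
    se_int = sigma * np.sqrt(len(d)) / np.sqrt((s_int ** 2).sum())
    return se_full, se_int
```

By the Cauchy-Schwarz inequality the integrated standard error can only match the full-data one when all spatial sensitivities coincide, which mirrors the theorem stated in the abstract.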

Keywords: FRAP, inverse problem, parameter identification, sensitivity analysis, optimal experimental design

Procedia PDF Downloads 273
12934 Assessment of the Contribution of Geographic Information System Technology in Non Revenue Water: Case Study Dar Es Salaam Water and Sewerage Authority Kawe - Mzimuni Street

Authors: Victor Pesco Kassa

Abstract:

This research assesses the contribution of GIS technology to NRW (non-revenue water) management. It was conducted at Kawe, Mzimuni Street, in Dar es Salaam. The data were obtained from an existing source, DAWASA headquarters, and were interpreted and processed using ArcGIS software. The data reveal good coverage of DAWASA's water network at Mzimuni Street, with most residents connected to DAWASA's customer service. The collected data also show that DAWASA's customer geodatabase has been improved through GIS. With GIS, customer location maps can be prepared for site surveying, and these maps can show the different types of customers connected to DAWASA's water service. This is a clear contribution of GIS technology to addressing and managing the NRW problem in DAWASA. Finally, the study recommends that similar studies be conducted in other DAWASA zones, such as Temeke, Boko, and Bagamoyo, not only at Kawe Mzimuni Street. The study shows that ArcGIS software offers powerful tools for managing and processing geographic information in water and sanitation authorities such as DAWASA.

Keywords: DAWASA, NRW, Esri, EURA, ArcGIS

Procedia PDF Downloads 78
12933 Reading Knowledge Development and Its Phases with Generation Z

Authors: Onur Özdemir, M. Erhan Orhan

Abstract:

Knowledge Development (KD) is one of the important phases of Knowledge Management (KM). KD is the phase in which intelligence is used to see the big picture. In order to understand whether information is important or not, we have to use the intelligence cycle, which includes four main steps: aiming, collecting data, processing, and utilizing. KD also needs these steps. To make a precise decision, the decision maker has to be aware of his subordinates' ideas. If the decision maker ignores the ideas of his subordinates or of other participants in the organization, it is not possible for him to reach the target. KD is a way of using wisdom to assemble the puzzle. If the decision maker does not bring the puzzle pieces together, he cannot see the big picture, and this shows its effects on the battlefield. In order to understand the battlefield, the decision maker has to use the intelligence cycle, and KD is the main means within that cycle for converting information into knowledge. On the other hand, Generation Z, born after the millennium, are real game changers. They have different attitudes from their elders; their understanding of life is different, and freedom and independence have different meanings for them than for others. Decision makers have to consider these factors and rethink their decisions accordingly. This article examines the relation between KD and Generation Z. KD is the main method of managing targets, but if leaders neglect their people, the world will see many more movements like the Arab Spring and other insurgencies.

Keywords: knowledge development, knowledge management, generation Z, intelligence cycle

Procedia PDF Downloads 512
12932 Assessment of Seeding and Weeding Field Robot Performance

Authors: Victor Bloch, Eerikki Kaila, Reetta Palva

Abstract:

Field robots are an important tool for enhancing efficiency and decreasing the climatic impact of food production. A number of commercial field robots already exist; however, since the technology is still new, the robots' advantages and limitations, as well as methods for their optimal use, remain unclear. In this study, the performance of a commercial field robot for seeding and weeding was assessed. A 2-ha research sugar beet field with 0.5 m row width was used for testing, which included robotic sowing of sugar beet and weeding five times during the first two months of growth. About three and five percent of the field were used as untreated and chemically weeded control areas, respectively. Plant detection was based on the exact plant location, without image processing. The robot was equipped with six seeding and weeding tools, including passive between-row harrow hoes and active hoes cutting inside the rows between the plants, and it moved at a maximum speed of 0.9 km/h. The robot's performance was assessed by image processing. Field images were collected by an action camera with a resolution of 27 Mpixels mounted on the robot at a height of 2 m, and by a drone with a 16 Mpixel camera flying at a height of 4 m. To detect plants and weeds, a YOLO model was trained with transfer learning from two available datasets. A preliminary analysis of the entire field showed that in the areas treated by the robot, the average weed density varied across the field from 6.8 to 9.1 weeds/m² (compared with 0.8 in the chemically treated area and 24.3 in the untreated area), the average weed density inside rows was 2.0-2.9 weeds/m (compared with 0 in the chemically treated area), and the emergence rate was 90-95%. Information about the robot's performance is highly important for the application of robotics to field tasks.
With the help of the developed method, performance can be assessed several times during growth, in line with the robotic weeding frequency. Farmers using it can know the field condition and the efficiency of the robotic treatment over the whole field. Farmers and researchers could develop optimal strategies for using the robot, such as seeding and weeding timing, robot settings, and plant and field parameters and geometry. Robot producers can obtain quantitative information from an actual working environment and improve the robots accordingly.
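The density figures reported above come down to counting detections over a known ground area; a minimal sketch of that step follows, in which the labels, confidence threshold, and area value are illustrative assumptions rather than the study's actual settings:

```python
def weed_density(detections, image_area_m2):
    """Weeds per square metre from detector output.
    Each detection is a (label, confidence) pair; detections
    labelled 'weed' above a confidence threshold are counted
    and divided by the ground area the image covers."""
    weeds = sum(1 for label, conf in detections
                if label == "weed" and conf >= 0.5)
    return weeds / image_area_m2
```

Aggregating this per-image figure over the camera and drone footage gives the per-area averages quoted for the treated, chemically weeded, and untreated zones.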

Keywords: agricultural robot, field robot, plant detection, robot performance

Procedia PDF Downloads 81
12931 The Effects of Cardiovascular Risk on Age-Related Cognitive Decline in Healthy Older Adults

Authors: A. Badran, M. Hollocks, H. Markus

Abstract:

Background: Common risk factors for cardiovascular disease are associated with age-related cognitive decline. There has been much interest in treating modifiable cardiovascular risk factors in the hope of reducing cognitive decline. However, there is currently no validated neuropsychological test to assess the subclinical cognitive effects of vascular risk. The Brief Memory and Executive Test (BMET) is a clinical screening tool, which was originally designed to be sensitive and specific to Vascular Cognitive Impairment (VCI), an impairment characterised by decline in frontally-mediated cognitive functions (e.g. Executive Function and Processing Speed). Objective: To cross-sectionally assess the validity of the BMET as a measure of the subclinical effects of vascular risk on cognition, in an otherwise healthy elderly cohort. Methods: Data from 346 participants (57 ± 10 years) without major neurological or psychiatric disorders were included in this study, gathered as part of a previous multicentre validation study for the BMET. Framingham Vascular Age was used as a surrogate measure of vascular risk, incorporating several established risk factors. Principal Components Analysis of the subtests was used to produce common constructs: an index for Memory and another for Executive Function/Processing Speed. Univariate General Linear models were used to relate Vascular Age to performance on Executive Function/Processing Speed and Memory subtests of the BMET, adjusting for Age, Premorbid Intelligence and Ethnicity. Results: Adverse vascular risk was associated with poorer performance on both the Memory and Executive Function/Processing Speed indices, adjusted for Age, Premorbid Intelligence and Ethnicity (p=0.011 and p<0.001, respectively). Conclusions: Performance on the BMET reflects the subclinical effects of vascular risk on cognition, in age-related cognitive decline. 
Vascular risk is associated with decline in both Executive Function/Processing Speed and Memory groups of subtests. Future studies are needed to explore whether treating vascular risk factors can effectively reduce age-related cognitive decline.
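The index construction described above, taking the first principal component of the standardised subtest scores as a single composite measure, can be sketched with numpy. This is a sketch under the assumption of complete-case, standardised data; the BMET subtests themselves are not reproduced:

```python
import numpy as np

def composite_index(scores):
    """First principal component of standardised subtest scores
    (rows are participants, columns are subtests), used as one
    composite index such as Executive Function/Processing Speed."""
    Z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return Z @ Vt[0]  # participant scores on PC1
```

The resulting index can then be entered as the outcome in a general linear model with vascular age, chronological age, premorbid intelligence, and ethnicity as predictors, as in the analysis described.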

Keywords: age-related cognitive decline, vascular cognitive impairment, subclinical cerebrovascular disease, cognitive aging

Procedia PDF Downloads 463
12930 Extracting the Coupled Dynamics in Thin-Walled Beams from Numerical Data Bases

Authors: Mohammad A. Bani-Khaled

Abstract:

In this work, we use the discrete Proper Orthogonal Decomposition (POD) transform to characterize the properties of coupled dynamics in thin-walled beams by exploiting numerical databases obtained from finite element simulations. The outcomes of this work will improve our understanding of the linear and nonlinear coupled behavior of thin-walled beam structures. Thin-walled beams have widespread usage in modern engineering applications, in both large-scale structures (aeronautical structures) and nano-structures (nanotubes). Therefore, detailed knowledge of the properties of coupled vibrations and buckling in these structures is of great interest to the research community. Due to the geometric complexity of the overall structure, and in particular of the cross-sections, it is necessary to involve computational mechanics to numerically simulate the dynamics. When using numerical computational techniques, it is not necessary to oversimplify a model in order to solve the equations of motion. Computational dynamics methods produce databases of controlled resolution in time and space, and these numerical databases contain information on the properties of the coupled dynamics. In order to extract the system's dynamic properties and the strength of coupling among the various fields of the motion, processing techniques are required. The time Proper Orthogonal Decomposition transform is a powerful tool for processing such databases and will be used to study the coupled dynamics of basic thin-walled structures. These structures are ideal to form a basis for a systematic study of coupled dynamics in structures of complex geometry.
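The snapshot form of POD used on such databases reduces to a singular value decomposition of the mean-centred snapshot matrix; a minimal numpy sketch follows, with a synthetic single-mode field standing in for a finite element database:

```python
import numpy as np

def pod_modes(snapshots, n_modes):
    """Snapshot POD: columns of `snapshots` are the state vector
    at successive time steps. Subtract the temporal mean, take the
    SVD, and return the leading spatial modes together with the
    fraction of total variance ('energy') each mode captures."""
    X = snapshots - snapshots.mean(axis=1, keepdims=True)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    energy = s ** 2 / np.sum(s ** 2)
    return U[:, :n_modes], energy[:n_modes]
```

The energy fractions indicate how many modes dominate the response, and cross-inspection of the modes across displacement fields is one way to read off the strength of coupling between them.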

Keywords: coupled dynamics, geometric complexity, proper orthogonal decomposition (POD), thin walled beams

Procedia PDF Downloads 416
12929 A Process for Prevention of Browning in Fresh Cut Tender Jackfruit

Authors: Ramachandra Pradhan, Sandeep Singh Rama, Sabyasachi Mishra

Abstract:

Jackfruit (Artocarpus heterophyllus L.) in its tender form is consumed as a vegetable and is popular for its flavour, colour, and meat-like texture. In South Asian countries such as Bangladesh, India, Pakistan, and Indonesia, the market value of tender jackfruit is very high. However, due to a lack of technology, marketing and transportation of the fruit are a challenge. Processing activities such as washing, sorting, peeling, and cutting enhance oxidative stress in fresh-cut jackfruit. They also degrade the quality of fresh-cut tender jackfruit through increased microbial contamination, excessive tissue softening, depletion of phytochemicals, and browning. Hence, this study was conducted as a solution to the above problem. Fresh-cut tender jackfruit slices were processed using the independent parameters CaCl2 concentration (2-5%), citric acid concentration (1-2.5%), and treatment time (4-10 min); the dependent variables after treatment were the browning index (BI), colour change (ΔE), firmness (F), and overall acceptability (OAA). From the response variables, the best combination of independent variables was found to be 3% CaCl2 and 2% citric acid for 6 minutes. At these optimised processing conditions, browning of fresh-cut tender jackfruit can be prevented. This technology can be used by researchers, scientists, industries, etc., for further processing of tender jackfruit.

Keywords: tender jackfruit, browning index, firmness, texture

Procedia PDF Downloads 256