Search results for: computer processing of large databases
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12469

11959 Large Panel Technology Apartments of Yesterday and Today: Quality Aspects

Authors: Barbara Gronostajska

Abstract:

Currently, the housing conditions of buildings erected in large-panel technology are deteriorating. The article presents modernization solutions implemented through a variety of architectural interventions (adding balconies and staircases, connecting apartments) that yield compelling results and meet the needs and expectations of modern society.

Keywords: housing estate, apartments, flats, modernization, plate blocks

Procedia PDF Downloads 467
11958 Improved Image Retrieval for Efficient Localization in Urban Areas Using Location Uncertainty Data

Authors: Mahdi Salarian, Xi Xu, Rashid Ansari

Abstract:

Accurate localization of mobile devices based on camera-acquired visual media information usually requires a search over a very large GPS-referenced image database. This paper proposes an efficient method for limiting the search space of the image retrieval engine by extracting and leveraging additional media information about the Estimated Positional Error (EPE), addressing complexity and accuracy issues in the search and, in particular, compensating for GPS inaccuracy in dense urban areas. The improved performance is achieved by up to a hundred-fold reduction of the search area used in available reference methods, while providing improved accuracy. To test our procedure, we created a database by acquiring Google Street View (GSV) images for downtown Chicago; other available databases are unsuitable for our approach because they lack EPE for the query images. We tested the procedure using more than 200 query images with associated EPE, acquired mostly in the densest areas of Chicago with different phones and under varied conditions, such as low illumination and under rail tracks. The effectiveness of our approach and the effect of the size and sector angle of the search area are discussed, and experimental results demonstrate how the proposed method improves performance simply by utilizing data that is already available on mobile systems such as smartphones.
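
The core of the search-space limiting step can be illustrated with a short sketch. The following Python fragment (the field names and margin factor are assumptions for illustration, not values from the paper) keeps only the reference images whose geotag lies within a disc whose radius scales with the reported EPE:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 points."""
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def candidate_images(db, fix_lat, fix_lon, epe_m, margin=1.5):
    """Keep only reference images whose geotag falls inside a disc whose
    radius scales with the reported Estimated Positional Error (EPE)."""
    radius = margin * epe_m  # inflate the EPE to bound GPS underestimation
    return [img for img in db
            if haversine_m(fix_lat, fix_lon, img["lat"], img["lon"]) <= radius]

# db entries are assumed dicts: {"id": ..., "lat": ..., "lon": ...}
```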

Keywords: localization, retrieval, GPS uncertainty, bag of words

Procedia PDF Downloads 271
11957 The Influence of Different Technologies on the Infiltration Properties and Soil Surface Crusting Processing in the North Bohemia Region

Authors: Miroslav Dumbrovsky, Lucie Larisova

Abstract:

The infiltration characteristic of the soil surface is one of the major factors determining the potential soil degradation risk. The physical, chemical, and biological characteristics of soil are changed by tillage, and the infiltration ability of soil plays an important role in soil and water conservation. The subject of this contribution is an evaluation of the influence of conventional tillage and reduced tillage technology on soil surface crusting and on the infiltration properties of soil in the North Bohemia region. Field experimental work was carried out in the years 2013-2016 on Cambisol, a medium-heavy clayey soil. The research was conducted on sloping, erosion-endangered blocks of compacted arable land. The areas were chosen each year so that one experimental area was managed with conventional tillage technologies and the other with reduced tillage technologies. Intact soil samples were taken into Kopecký's cylinders at three landscape positions, at depths of 10 cm (representing topsoil) and 30 cm (representing subsoil). Cumulative infiltration was measured using a mini-disc infiltrometer near the sampling points. The Zhang method (1997), which provides an estimate of the unsaturated hydraulic conductivity K(h), was used for the evaluation of the mini-disc infiltrometer infiltration tests. The soil profile processed by conventional tillage showed a higher degree of compaction and soil crusting. The bulk density was between 1.10–1.67 g.cm⁻³, compared to the land processed by the reduced tillage technology, where values were between 0.80–1.29 g.cm⁻³. Unsaturated hydraulic conductivity values were about one-third higher under the reduced tillage technology.
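
The Zhang (1997) evaluation mentioned above fits the two-term model I(t) = C1·t + C2·√t to the measured cumulative infiltration and takes K(h) = C1/A, where A is a dimensionless coefficient that depends on the soil's van Genuchten parameters, the disc radius, and the applied suction. A minimal sketch of that fit (the data values and A below are hypothetical):

```python
import numpy as np

def zhang_unsat_K(t, I, A):
    """Fit cumulative infiltration I(t) = C1*t + C2*sqrt(t) (Zhang, 1997)
    by least squares and return K(h) = C1 / A, where A is the tabulated
    dimensionless coefficient for the soil, disc radius, and suction."""
    t = np.asarray(t, dtype=float)
    I = np.asarray(I, dtype=float)
    X = np.column_stack([t, np.sqrt(t)])
    (c1, c2), *_ = np.linalg.lstsq(X, I, rcond=None)
    return c1 / A

# Hypothetical example: times in s, cumulative infiltration in cm
t = [30, 60, 90, 120, 150, 180]
I = [0.21, 0.35, 0.47, 0.58, 0.67, 0.76]
print(zhang_unsat_K(t, I, A=2.4))
```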

Keywords: soil crusting processing, unsaturated hydraulic conductivity, cumulative infiltration, bulk density, porosity

Procedia PDF Downloads 225
11956 MXene-Based Self-Sensing of Damage in Fiber Composites

Authors: Latha Nataraj, Todd Henry, Micheal Wallock, Asha Hall, Christine Hatter, Babak Anasori, Yury Gogotsi

Abstract:

Multifunctional composites with enhanced strength and toughness for superior damage tolerance are essential for advanced aerospace and military applications. Detection of structural changes prior to visible damage may be achieved by incorporating fillers with tunable properties, such as two-dimensional (2D) nanomaterials with high aspect ratios and abundant surface-active sites. While 2D graphene, with its large surface area, good mechanical properties, and high electrical conductivity, seems ideal as a filler, its single-atom thickness can lead to bending and rolling during processing, requiring post-processing to bond it to polymer matrices. Lately, an emerging family of 2D transition metal carbides and nitrides, MXenes, has attracted much attention since their discovery in 2011. Metallic electronic conductivity and good mechanical properties, even with increased polymer content, coupled with hydrophilicity, make MXenes good candidates as filler materials in polymer composites and exceptional as multifunctional damage indicators in composites. Here, we systematically study MXene (Ti₃C₂) coatings on glass fibers for self-sensing fiber-reinforced polymer composites, using microscopy and micromechanical testing. Further testing is in progress through the investigation of local variations in optical, acoustic, and thermal properties within the damage sites in response to strain caused by mechanical loading.

Keywords: damage sensing, fiber composites, MXene, self-sensing

Procedia PDF Downloads 109
11955 Cognitive and Behavioral Disorders in Patients with Precuneal Infarcts

Authors: F. Ece Cetin, H. Nezih Ozdemir, Emre Kumral

Abstract:

Ischemic stroke of the precuneal cortex (PC) alone is extremely rare. This study aims to evaluate the clinical, neurocognitive, and behavioral characteristics of isolated PC infarcts. We assessed neuropsychological and behavioral findings in 12 patients with an isolated PC infarct among 3800 patients with ischemic stroke. To determine the most frequently affected brain locus, we first overlapped the ischemic areas of patients with specific cognitive disorders and of patients without them. Secondly, we compared both overlap maps using the 'subtraction plot' function of MRIcroGL. Patients showed various types of cognitive disorders; all experienced more than one category of cognitive disorder, except for two patients with only one. Lesion topographical analysis showed that damage within the anterior precuneal region may lead to consciousness disorders (25%), self-processing impairment (42%), and visuospatial disorders (58%), while lesions in the posterior precuneal region caused episodic and semantic memory impairment (33%). The whole precuneus was involved in at least one body awareness disorder. The cause of the stroke was cardioembolism in 5 patients (42%), large artery disease in 3 (25%), and unknown in 4 (33%). This study showed a wide variety of neuropsychological and behavioral disorders in patients with precuneal infarcts. Future studies are needed to achieve a proper definition of the function of the precuneus in relation to the extended cortical areas. Precuneal cortex infarcts were found to point to a source of embolism from the large arteries or the heart.

Keywords: cognition, pericallosal artery, precuneal cortex, ischemic stroke

Procedia PDF Downloads 116
11954 Scientific Production on Lean Supply Chains Published in Journals Indexed by SCOPUS and Web of Science Databases: A Bibliometric Study

Authors: T. Botelho de Sousa, F. Raphael Cabral Furtado, O. Eduardo da Silva Ferri, A. Batista, W. Augusto Varella, C. Eduardo Pinto, J. Mimar Santa Cruz Yabarrena, S. Gibran Ruwer, F. Müller Guerrini, L. Adalberto Philippsen Júnior

Abstract:

Lean Supply Chain Management (LSCM) is an emerging research field in Operations Management (OM). As a strategic model focused on reducing cost and waste while fulfilling the needs of customers, LSCM attracts great interest among researchers and practitioners. The purpose of this paper is to present an overview of the Lean Supply Chain literature, based on a bibliometric analysis of 57 papers published in journals indexed by the SCOPUS and/or Web of Science databases. The results indicate that the last three years (2015, 2016, and 2017) were the most productive for LSCM discussion, especially in the journals Supply Chain Management and International Journal of Lean Six Sigma. India, the USA, and the UK are the most productive countries; moreover, social network analysis detected cross-country studies conducted through collaboration among researchers, a practice that appears to play an increasingly important role in LSCM research. Despite limitations, such as the restricted indexed-journal database, bibliometric analysis helps to illuminate ongoing LSCM research efforts, including the most used technical procedures and the collaboration network, and reveals important research gaps, especially for researchers from developing countries.
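
As a rough illustration of the social network analysis used to detect cross-country collaboration, the sketch below (hypothetical records, not the study's data) builds a co-authorship graph and counts international ties:

```python
import itertools
import networkx as nx

# Hypothetical records: each paper lists (author, country) pairs
papers = [
    [("A. Silva", "Brazil"), ("R. Gupta", "India")],
    [("R. Gupta", "India"), ("J. Smith", "UK"), ("L. Chen", "USA")],
]

G = nx.Graph()
for authors in papers:
    for (a1, c1), (a2, c2) in itertools.combinations(authors, 2):
        G.add_edge(a1, a2, cross_country=(c1 != c2))

cross = sum(1 for *_, d in G.edges(data=True) if d["cross_country"])
print(f"{cross} of {G.number_of_edges()} collaboration ties are cross-country")
print("degree centrality:", nx.degree_centrality(G))
```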

Keywords: Lean Supply Chains, Bibliometric Study, SCOPUS, Web of Science

Procedia PDF Downloads 330
11953 On the Implementation of The Pulse Coupled Neural Network (PCNN) in the Vision of Cognitive Systems

Authors: Hala Zaghloul, Taymoor Nazmy

Abstract:

One of the great challenges of the 21st century is to build a robot that can perceive and act within its environment and communicate with people, while also exhibiting cognitive capabilities that lead to human-like performance. The Pulse Coupled Neural Network (PCNN) is a relatively new ANN model, derived from a mammalian neural model, with great potential in image processing as well as target recognition, feature extraction, speech recognition, combinatorial optimization, and compressed encoding. The PCNN has unique features among other types of neural networks, which make it a candidate to be an important approach for perception in cognitive systems. This work shows and emphasizes the potential of the PCNN to perform different tasks related to image processing. The main obstacle preventing the direct implementation of the technique is the need to find a way to control the PCNN parameters so that they perform a specific task. This paper evaluates the performance of the standard PCNN model for processing images with different properties, selects the important parameters that give significant results, and discusses approaches toward adapting the PCNN parameters to perform a specific task.
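
For reference, a minimal implementation of the standard PCNN iteration discussed above; the parameter values shown are illustrative defaults, not the ones selected in the paper:

```python
import numpy as np
from scipy.ndimage import convolve

def pcnn_segment(S, n_iter=20, beta=0.2, aF=0.1, aL=1.0, aT=0.3,
                 VF=0.5, VL=0.2, VT=20.0):
    """One run of the standard pulse coupled neural network over an image S
    (float array scaled to [0, 1]); returns the firing-epoch map."""
    K = np.array([[0.5, 1.0, 0.5],
                  [1.0, 0.0, 1.0],
                  [0.5, 1.0, 0.5]])        # feeding/linking kernel
    F = np.zeros_like(S); L = np.zeros_like(S)
    Y = np.zeros_like(S); T = np.ones_like(S)
    fired = np.zeros_like(S)
    for n in range(1, n_iter + 1):
        F = np.exp(-aF) * F + VF * convolve(Y, K, mode="constant") + S
        L = np.exp(-aL) * L + VL * convolve(Y, K, mode="constant")
        U = F * (1.0 + beta * L)           # internal activity
        Y = (U > T).astype(float)          # pulse output
        T = np.exp(-aT) * T + VT * Y       # dynamic threshold
        fired[(fired == 0) & (Y == 1)] = n # record first firing epoch
    return fired
```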

Keywords: cognitive system, image processing, segmentation, PCNN kernels

Procedia PDF Downloads 262
11952 Studying the Value-Added Chain for the Fish Distribution Process at Quang Binh Fishing Port in Vietnam

Authors: Van Chung Nguyen

Abstract:

The purpose of this study is to examine the current status of the value chain for fish distribution at Quang Binh Fishing Port, using 360 research samples whose subjects are fishermen, traders, retailers, and businesses. The research applies the value chain theoretical framework of Kaplinsky and Morris to quantify and describe the market channels and the actors participating in the value chain, and to analyze the value-adding process of these actors along the market channels. The analysis shows that fishermen catch fish with high economic efficiency, but processing enterprises and especially retailers are the agents that obtain the higher added value. The role of processing enterprises is not really clear, owing to outdated processing technology; in contrast, retailers obtain the highest added value. This shows that the added value of the fish supply chain at Quang Binh fishing port is still limited, leading to low output quality. Consequently, the selling price of fish on the market remains high relative to the abundant fish resources, leading to low consumption and limited exports given the quality achieved by processing enterprises. This reduces demand and fishing capacity, and productivity is lower than its potential. To improve the fish value chain at fishing ports, it is necessary to focus on improving product quality, strengthening linkages between actors, building brands and product consumption markets, and improving the capacity of export processing enterprises.

Keywords: Quang Binh fishing port, value chain, market, distributions channel

Procedia PDF Downloads 55
11951 Recent Development on Application of Microwave Energy on Process Metallurgy

Authors: Mamdouh Omran, Timo Fabritius

Abstract:

A growing interest in microwave heating has emerged recently. Many researchers have begun to pay attention to microwave energy as an alternative technique for processing various primary and secondary raw materials. Compared to conventional methods, microwave processing offers several advantages, such as selective heating, rapid heating, and volumetric heating. The present study gives a summary of our recent work on the use of microwave energy for the recovery of valuable metals from primary and secondary raw materials. The research focuses mainly on: the application of microwaves for the recovery and recycling of metals from the wastes of different metallurgical industries (i.e., electric arc furnace (EAF) dust, blast furnace (BF) and basic oxygen furnace (BOF) sludge); and the application of microwaves for upgrading and recovering valuable metals from primary raw materials (i.e., iron ore). The results indicated that microwave heating is a promising and effective technique for processing primary and secondary steelmaking wastes. After microwave treatment of iron ore for 60 s at 900 W, grindability increased by about 28.30%. Wet high-intensity magnetic separation (WHIMS) indicated that the magnetic separation increased from 34% to 98% after microwave treatment for 90 s at 900 W. In the case of EAF dust, after microwave processing at 1100 W for 20 min, zinc removal increased from 64% to ~97%, depending on the mixture ratio and treatment time.

Keywords: dielectric properties, microwave heating, raw materials, secondary raw materials

Procedia PDF Downloads 71
11950 A Multi-Criteria Model for Scheduling of a Stochastic Single Machine Problem with Outsourcing and Solving It through Application of Chance Constrained Programming

Authors: Homa Ghave, Parmis Shahmaleki

Abstract:

This paper presents a new multi-criteria stochastic mathematical model for single machine scheduling with outsourcing allowed. Multiple jobs are processed in batches, and for each batch, all of a job or a fraction of it can be outsourced. The jobs have stochastic processing times and lead times, and arrive randomly with deterministic due dates. Because of the stochastic nature of the processing and lead times, chance constrained programming is used to model the problem. The problem is first formulated as a stochastic program and then converted into a deterministic mixed integer linear program. The objectives considered in the model are to minimize the maximum tardiness and the outsourcing cost simultaneously. Several procedures have been developed to deal with the multi-criteria problem; in this paper, we utilize the concept of satisfaction functions to incorporate the manager's preferences. The proposed approach is tested on instances where the random variables are normally distributed.
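
The key step of chance constrained programming, turning a probabilistic constraint into a deterministic equivalent, can be shown in a few lines for a normally distributed processing time. This is the generic textbook transformation, not the paper's full model:

```python
from scipy.stats import norm

def deterministic_rhs(mu, sigma, alpha):
    """Deterministic equivalent of the chance constraint
    P(p <= b) >= alpha with p ~ N(mu, sigma^2):
    the constraint becomes  mu + z_alpha * sigma <= b."""
    return mu + norm.ppf(alpha) * sigma

# A job with mean processing time 10 h and sd 2 h, at a 95% service
# level, must be scheduled as if it took this many hours:
print(deterministic_rhs(10.0, 2.0, 0.95))  # ~13.29
```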

Keywords: single machine scheduling, multi-criteria mathematical model, outsourcing strategy, uncertain lead times and processing times, chance constrained programming, satisfaction function

Procedia PDF Downloads 250
11949 Password Cracking on Graphics Processing Unit Based Systems

Authors: N. Gopalakrishna Kini, Ranjana Paleppady, Akshata K. Naik

Abstract:

Password authentication is one of the most widely used methods of authenticating legal users of computers and defending against attackers. There are many different ways to authenticate users of a system, and many password cracking methods have been developed as well. This paper proposes how password cracking can best be performed on a CPU-GPGPU based system. The main objective of this work is to show how quickly a password can be cracked, given some knowledge of computer security and password cracking, if sufficient security is not incorporated into the system.
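
A minimal CPU-side sketch of the underlying idea: on a GPGPU, the per-candidate hashing below is what gets massively parallelised, with each thread testing a disjoint slice of the candidate space (the hash algorithm, wordlist, and target here are illustrative):

```python
import hashlib

def dictionary_attack(target_hash, wordlist, algo="sha256"):
    """Hash each candidate password and compare against the stolen digest;
    return the match or None."""
    for word in wordlist:
        digest = hashlib.new(algo, word.encode()).hexdigest()
        if digest == target_hash:
            return word
    return None

target = hashlib.sha256(b"sunshine").hexdigest()
print(dictionary_attack(target, ["letmein", "qwerty", "sunshine"]))
```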

Keywords: GPGPU, password cracking, secret key, user authentication

Procedia PDF Downloads 267
11948 Private Coded Computation of Matrix Multiplication

Authors: Malihe Aliasgari, Yousef Nejatbakhsh

Abstract:

The era of Big Data and the immensity of real-life datasets compel computation tasks to be performed in a distributed fashion, where the data is dispersed among many servers that operate in parallel. However, massive parallelization leads to computational bottlenecks due to faulty servers and stragglers. Stragglers are the few slow or delay-prone processors that can bottleneck an entire computation, because one has to wait for all the parallel nodes to finish. The problem of straggling processors has been well studied in the context of distributed computing. Recently, it has been pointed out that, for the important case of linear functions, it is possible to improve over repetition strategies in terms of the tradeoff between performance and latency by carrying out linear precoding of the data prior to processing. The key idea is that, by employing suitable linear codes operating over fractions of the original data, a function may be completed as soon as enough processors, their number depending on the minimum distance of the code, have completed their operations. Matrix-matrix multiplication over practically large data sets faces computational and memory-related difficulties, which is why such operations are carried out on distributed computing platforms. In this work, we study the problem of distributed matrix-matrix multiplication W = XY under storage constraints, i.e., when each server is allowed to store a fixed fraction of each of the matrices X and Y; this operation is a fundamental building block of many science and engineering fields, such as machine learning, image and signal processing, wireless communication, and optimization. Both non-secure and secure matrix multiplication are studied. We consider the setup in which the identity of the matrix of interest must be kept private from the workers, and we obtain the recovery threshold of the colluding model, that is, the number of workers that need to complete their tasks before the master server can recover the product W. We also study the problem of secure and private distributed matrix multiplication W = XY in which the matrix X is confidential, while the matrix Y is selected in a private manner from a library of public matrices. We present the best currently known trade-off between communication load and recovery threshold. In other words, we design an achievable PSGPD scheme for any arbitrary privacy level by concatenating a robust PIR scheme for arbitrary colluding workers and private databases with the proposed SGPD code, which provides a smaller computational complexity at the workers.
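
A small numerical sketch of the coding idea for the non-secure case: the row blocks of X are encoded with an (n, k) Vandermonde (MDS) code so that the master can decode W from any k worker results, tolerating n − k stragglers. This is a generic polynomial-code construction, not the paper's PSGPD scheme:

```python
import numpy as np

def coded_matmul(X, Y, n_workers=5, k=3, responders=(0, 2, 4)):
    """Straggler-tolerant W = X @ Y: X is split into k row blocks that are
    encoded with an (n, k) Vandermonde (MDS) code, so the product is
    recoverable from ANY k of the n worker results."""
    blocks = np.split(X, k, axis=0)                  # requires k | rows(X)
    evals = np.arange(1, n_workers + 1, dtype=float) # distinct code points
    V = np.vander(evals, k, increasing=True)         # n x k encoding matrix
    encoded = [sum(V[i, j] * blocks[j] for j in range(k))
               for i in range(n_workers)]
    # Each worker i computes encoded[i] @ Y; stragglers never respond.
    results = {i: encoded[i] @ Y for i in responders}
    ids = sorted(results)
    Vk = V[ids, :]                                   # k x k, invertible
    stacked = np.stack([results[i] for i in ids])
    decoded = np.einsum("ij,jrc->irc", np.linalg.inv(Vk), stacked)
    return np.concatenate(list(decoded), axis=0)

X = np.random.randn(6, 4); Y = np.random.randn(4, 3)
print(np.allclose(coded_matmul(X, Y), X @ Y))        # True
```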

Keywords: coded distributed computation, private information retrieval, secret sharing, stragglers

Procedia PDF Downloads 106
11947 External Retinal Prosthesis Image Processing System Using a One-Cue Saliency Map, Based on DSP

Authors: Yili Chen, Jixiang Fu, Zhihua Liu, Zhicheng Zhang, Rongmao Li, Nan Fu, Yaoqin Xie

Abstract:

A retinal prosthesis is designed to help the blind regain some sight. It is made up of an internal part and an external part. The external part consists of a camera, image processing, and an RF transmitter; the internal part consists of an RF receiver, an implant chip, and micro-electrodes. The image acquired by the camera must be processed with suitable strategies so that it corresponds to the stimuli applied to the electrodes. Nowadays the number of micro-electrodes is in the hundreds, and the mechanism by which the electrodes stimulate the optic nerve is not known; an easy working hypothesis is that each pixel in the image corresponds to an electrode. The question is therefore how to extract the important information from the image captured by the camera. Many strategies have been experimented with to obtain the most important information as quickly as possible, since this is a real-time system. Region of interest (ROI) extraction is a useful algorithm for this purpose. Our paper explains the details of the principles and functions of the ROI. Based on this, we simplified the ROI algorithm and used it in the external DSP image processing system of the retinal prosthesis. Results show that our image processing strategies are suitable for a real-time retinal prosthesis: they cut redundant information and help express the useful information in the low-resolution image.
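
A minimal sketch of a one-cue (intensity-contrast) saliency map and ROI cropping, in the spirit of the pipeline above; the blur scale, the Otsu step, and the 32×32 electrode-grid size are assumptions, not the paper's settings:

```python
import cv2
import numpy as np

def intensity_saliency(gray):
    """One-cue (intensity) saliency: centre-surround contrast taken as the
    absolute difference between the image and a heavily blurred version,
    normalised to [0, 255]."""
    blur = cv2.GaussianBlur(gray, (0, 0), sigmaX=12)
    sal = cv2.absdiff(gray, blur).astype(np.float32)
    return cv2.normalize(sal, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

def roi_downsample(gray, out_hw=(32, 32)):
    """Crop the most salient region and shrink it to the electrode grid size."""
    sal = intensity_saliency(gray)
    _, mask = cv2.threshold(sal, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    x, y, w, h = cv2.boundingRect(cv2.findNonZero(mask))
    return cv2.resize(gray[y:y + h, x:x + w], out_hw[::-1])
```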

Keywords: image processing, region of interest, saliency map, low-resolution image, expressing useful information, cutting redundant information

Procedia PDF Downloads 267
11946 REDUCER: An Architectural Design Pattern for Reducing Large and Noisy Data Sets

Authors: Apkar Salatian

Abstract:

To relieve the burden of reasoning on a point-to-point basis, in many domains there is a need to reduce large and noisy data sets into trends for qualitative reasoning. In this paper we propose and describe a new architectural design pattern called REDUCER for reducing large and noisy data sets, which can be tailored to particular situations. REDUCER consists of two consecutive processes: Filter, which takes the original data and removes outliers, inconsistencies, or noise; and Compression, which takes the filtered data and derives trends in it. In this seminal article, we also show how REDUCER has successfully been applied to three different case studies.
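
A toy realisation of the two REDUCER processes: the median-filter Filter stage and the slope-change Compression stage below are our illustrative choices, since the pattern prescribes the roles rather than specific algorithms:

```python
import numpy as np
from scipy.signal import medfilt

def reducer(series, kernel=5, tol=0.05):
    """REDUCER sketch. Filter: a median filter removes outliers/noise.
    Compression: collapse the filtered series into linear trend segments,
    starting a new segment when the local slope shifts by more than tol."""
    filtered = medfilt(np.asarray(series, dtype=float), kernel)
    slopes = np.diff(filtered)
    trends, start = [], 0
    for i in range(1, len(slopes)):
        if abs(slopes[i] - slopes[start]) > tol:
            trends.append((start, i, float(filtered[i] - filtered[start])))
            start = i
    trends.append((start, len(filtered) - 1,
                   float(filtered[-1] - filtered[start])))
    return trends  # list of (begin_idx, end_idx, net_change)
```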

Keywords: design pattern, filtering, compression, architectural design

Procedia PDF Downloads 199
11945 Large Scale Production of Polyhydroxyalkanoates (PHAs) from Waste Water: A Study of Techno-Economics, Energy Use, and Greenhouse Gas Emissions

Authors: Cora Fernandez Dacosta, John A. Posada, Andrea Ramirez

Abstract:

The biodegradable family of polymers, polyhydroxyalkanoates, are interesting substitutes for conventional fossil-based plastics. However, the manufacturing costs and environmental impacts associated with their production via intracellular bacterial fermentation are strongly dependent on the raw material used and on the energy consumption of the extraction process, limiting their potential for commercialization. Industrial wastewater is studied in this paper as a promising alternative feedstock for waste valorization. Based on results from laboratory and pilot-scale experiments, a conceptual process design, a techno-economic analysis, and a life cycle assessment are developed for the large-scale production of the most common type of polyhydroxyalkanoate, polyhydroxybutyrate. Intracellular polyhydroxybutyrate is obtained via fermentation by the microbial community present in industrial wastewater, and the downstream processing is based on chemical digestion with surfactant and hypochlorite. The economic-potential and environmental-performance results help identify bottlenecks and the best opportunities for scaling up the process prior to industrial implementation. The outcome of this research indicates that the fermentation of wastewater towards PHB presents advantages over traditional PHA production from sugars because of the null environmental burdens and financial costs of the raw material in the bioplastic production process. Nevertheless, process optimization is still required to compete with the petrochemical counterparts.

Keywords: circular economy, life cycle assessment, polyhydroxyalkanoates, waste valorization

Procedia PDF Downloads 437
11944 Robust Image Design Based Steganographic System

Authors: Sadiq J. Abou-Loukh, Hanan M. Habbi

Abstract:

This paper presents a steganographic system that hides the transmitted information without arousing suspicion and illustrates how the level of secrecy can be increased by using cryptography techniques. The proposed system is implemented by first encrypting the image file with a one-time pad key, and secondly by encrypting the message to be hidden, so that encryption is followed by image embedding. A new image file is then created from the original image using a four-triangles operation, and the new image is processed by one of two image processing techniques. The two proposed processing techniques are thresholding and differential predictive coding (DPC). Afterwards, encryption and decryption keys are generated by a functional key generator; each generated key is used only once. The encrypted text is hidden in the places that are not used for image processing and key generation. The system has a high embedding rate (0.1875 characters/pixel) for true color images (24-bit depth).
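
For illustration only, a generic least-significant-bit embedding sketch for a 24-bit true colour image; this is a textbook stand-in for the embedding step, not the paper's four-triangles/thresholding/DPC scheme:

```python
import numpy as np

def lsb_embed(pixels, message):
    """Hide the message bits in the least significant bit of each channel
    value. pixels: uint8 array (H, W, 3) of a 24-bit true colour image."""
    bits = np.unpackbits(np.frombuffer(message.encode(), dtype=np.uint8))
    flat = pixels.reshape(-1).copy()
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(pixels.shape)

def lsb_extract(pixels, n_chars):
    """Recover n_chars characters from the LSB plane."""
    bits = pixels.reshape(-1)[:n_chars * 8] & 1
    return np.packbits(bits).tobytes().decode()

img = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
stego = lsb_embed(img, "secret")
print(lsb_extract(stego, 6))  # -> "secret"
```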

Keywords: encryption, thresholding, differential predictive coding, four triangles operation

Procedia PDF Downloads 475
11943 The Use of Classifiers in Image Analysis of Oil Wells Profiling Process and the Automatic Identification of Events

Authors: Jaqueline Maria Ribeiro Vieira

Abstract:

Different strategies and tools are available in the oil and gas industry for detecting and analyzing tension and possible fractures in borehole walls. Most of these techniques are based on manual observation of the captured borehole images. While this strategy may be feasible and convenient with small images and little data, it becomes difficult and prone to errors when large databases of images must be treated, as the patterns may differ across the image area depending on many characteristics (drilling strategy, rock components, rock strength, etc.). Previously, we developed and proposed a novel strategy capable of detecting patterns in borehole images that may point to regions with tension and breakout characteristics, based on segmented images. In this work we propose the inclusion of data-mining classification strategies in order to create a knowledge database of the segmented curves. These classifiers allow that, after some period of use during which parts of borehole images corresponding to tension regions and breakout areas are manually tagged, the system will automatically indicate and suggest new candidate regions with higher accuracy. We suggest the use of different classifier methods in order to achieve different knowledge data set configurations.
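
A sketch of the proposed classification step, assuming each segmented curve has already been reduced to a feature vector; the features, values, and random-forest choice are hypothetical placeholders for whichever classifiers are configured:

```python
from sklearn.ensemble import RandomForestClassifier

# Hypothetical feature vectors for segmented borehole-image curves, e.g.
# (curve length, mean curvature, orientation, grey-level contrast)
X = [[410, 0.12, 85, 0.7], [388, 0.10, 88, 0.6],   # analyst-tagged breakout
     [120, 0.55, 14, 0.2], [135, 0.48, 20, 0.3]]   # analyst-tagged tension
y = ["breakout", "breakout", "tension", "tension"]

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict([[400, 0.11, 86, 0.65]]))  # suggests a candidate label
```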

Keywords: image segmentation, oil well visualization, classifiers, data-mining, visual computer

Procedia PDF Downloads 286
11942 Optimized Road Lane Detection Through a Combined Canny Edge Detection, Hough Transform, and Scalable Region Masking Toward Autonomous Driving

Authors: Samane Sharifi Monfared, Lavdie Rada

Abstract:

Nowadays, autonomous vehicles are developing rapidly toward facilitating human car driving. One of the main issues is road lane detection, for suitable guidance direction and car accident prevention. This paper aims to improve and optimize road lane detection based on a combination of camera calibration, the Hough transform, and Canny edge detection. The video processing is implemented using the OpenCV library, with the novelty of a scalable region masking. The aim of the study is to introduce automatic road lane detection techniques with minimal manual intervention from the user.
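
A condensed OpenCV sketch of the pipeline: Gaussian smoothing, Canny edges, a trapezoidal region mask whose vertices scale with the frame size (hence "scalable"), and the probabilistic Hough transform. The thresholds and mask proportions are illustrative, not the paper's tuned values:

```python
import cv2
import numpy as np

def detect_lanes(frame, horizon=0.6):
    """Canny edges + probabilistic Hough inside a scalable region mask.
    The mask is a trapezoid whose top edge sits at `horizon` * height,
    so it rescales with any input resolution instead of being hard-coded."""
    h, w = frame.shape[:2]
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(cv2.GaussianBlur(gray, (5, 5), 0), 50, 150)
    roi = np.array([[(0, h), (int(0.45 * w), int(horizon * h)),
                     (int(0.55 * w), int(horizon * h)), (w, h)]], np.int32)
    mask = np.zeros_like(edges)
    cv2.fillPoly(mask, roi, 255)
    lines = cv2.HoughLinesP(cv2.bitwise_and(edges, mask), 1, np.pi / 180,
                            threshold=40, minLineLength=40, maxLineGap=80)
    for x1, y1, x2, y2 in (lines.reshape(-1, 4) if lines is not None else []):
        cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 3)
    return frame
```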

Keywords: Hough transform, Canny edge detection, optimisation, scalable masking, camera calibration, image quality improvement, image processing, video processing

Procedia PDF Downloads 78
11941 Public Transportation Demand and Policy in Kabul, Afghanistan

Authors: Ahmad Samim Ranjbar, Shoshi Mizokami

Abstract:

Kabul is the heart of political, commercial, cultural, educational, and social life in Afghanistan, and it is the fifth fastest growing city in the world. Since the establishment of the new government in 2001, the lack of adequate employment opportunities and basic utility services in remote provinces has prompted people to move to Kabul and other urban areas. From 2001 to the present, the population has increased rapidly, and because of low incomes most residents tend to use public transport, especially buses. However, no proper bus system exists in Kabul city: because of the wars from 1992 to 2001, Kabul suffered damage and destruction of its transportation facilities, including pavements, sidewalks, traffic circles, drainage systems, traffic signs and signals, trolleybuses, and almost all of the public transit buses (e.g., the Millie bus). This research is a primary and very important phase of studying transportation in Kabul city, and an initial, important step toward using large buses there. The main purpose of this research is to find the demand of Kabul city residents for public transport (large bus) and to compare it with the actual supply from the government. The findings show that the demand of Kabul city residents for public transport (large bus) exceeds the supply from the government, meaning that the current public transportation (large bus) is not sufficient to serve the people of Kabul city. It is worth mentioning that, according to this research, there is no need to build a new road or an exclusive busway; instead, this research proposes that the government invest in public transportation and increase the number of large buses to handle the current demand for public transport.

Keywords: transportation, planning, public transport, large bus, Kabul, Afghanistan

Procedia PDF Downloads 279
11940 Translation Directionality: An Eye Tracking Study

Authors: Elahe Kamari

Abstract:

Research on translation processes has been conducted for more than 20 years, investigating various issues and using different research methodologies. Most recently, researchers have started to use eye tracking to study translation processes, on the premise that the observable, measurable data gained from eye tracking are indicators of unobservable cognitive processes happening in translators' minds during translation tasks. The aim of this study was to investigate directionality in translation processes using eye tracking. The following hypotheses were tested: 1) processing the target text requires more cognitive effort than processing the source text, in both directions of translation; 2) L2 translation tasks on the whole require more cognitive effort than L1 tasks; 3) the cognitive resources allocated to the processing of the source text are higher in L1 translation than in L2 translation; 4) the cognitive resources allocated to the processing of the target text are higher in L2 translation than in L1 translation; and 5) in both directions, non-professional translators invest more cognitive effort in translation tasks than do professional translators. The performance of a group of 30 male professional translators was compared with that of a group of 30 male non-professional translators. All the participants translated two comparable texts, one into their L1 (Persian) and the other into their L2 (English). The eye tracker measured gaze time, average fixation duration, total task length, and pupil dilation; these variables are assumed to measure the cognitive effort allocated to the translation task. The data derived from eye tracking confirmed only the first hypothesis, which was supported by all the relevant indicators: gaze time, average fixation duration, and pupil dilation. The second hypothesis, that L2 translation tasks require the allocation of more cognitive resources than L1 translation tasks, was not confirmed by all four indicators. The third hypothesis, that source text processing requires more cognitive resources in L1 translation than in L2 translation, and the fourth hypothesis, that target text processing requires more cognitive effort in L2 translation than in L1 translation, were not confirmed. It seems that source text processing in L2 translation can be just as demanding as in L1 translation. The final hypothesis, that non-professional translators allocate more cognitive resources to the same translation tasks than do professionals, was partially confirmed: one of the indicators, average fixation duration, indicated higher cognitive-effort-related values for professionals.

Keywords: translation processes, eye tracking, cognitive resources, directionality

Procedia PDF Downloads 443
11939 A Trends Analysis of Yacht Simulators

Authors: Jae-Neung Lee, Keun-Chang Kwak

Abstract:

This paper describes an analysis of international trends in yacht simulators and also gives background on yachts. Examples of image processing used in yacht simulators include counting the total number of vehicles, edge/target detection, detection and evasion algorithms, image matching using SIFT (scale invariant feature transform), and the application of median filtering and thresholding.
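
As an illustration of the SIFT matching technique cited among these trends, a minimal OpenCV sketch (assuming an OpenCV build that ships SIFT, i.e. version 4.4 or later):

```python
import cv2

def sift_match(img1, img2, ratio=0.75):
    """SIFT keypoints plus Lowe's ratio test to match a target template
    against a scene frame; returns the good matches and both keypoint sets."""
    sift = cv2.SIFT_create()
    k1, d1 = sift.detectAndCompute(img1, None)
    k2, d2 = sift.detectAndCompute(img2, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = [m for m, n in matcher.knnMatch(d1, d2, k=2)
            if m.distance < ratio * n.distance]
    return good, k1, k2
```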

Keywords: yacht simulator, simulator, trends analysis, SIFT

Procedia PDF Downloads 416
11938 Video Processing of a Football Game: Detecting Features of a Football Match for Automated Calculation of Statistics

Authors: Rishabh Beri, Sahil Shah

Abstract:

We applied a range of filters and processing steps in order to extract the various features of a football game, such as the field lines of the football pitch. Another important aspect was the detection of the players on the field and tagging them according to their teams, distinguished by jersey colours. The extracted information about the players and the field was combined to create a virtual field that consists of the playing field and the players mapped to their locations in it.
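
A sketch of the team-tagging step: given player bounding boxes from the detection stage, each box is assigned to a team by its dominant jersey colour in HSV space. The colour ranges and the box format are assumptions for illustration:

```python
import cv2
import numpy as np

def tag_players(frame, boxes, team_a_range, team_b_range):
    """Assign each detected player box to a team by counting how many pixels
    fall inside each team's HSV jersey-colour range."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    tags = []
    for (x, y, w, h) in boxes:
        patch = hsv[y:y + h, x:x + w]
        a = cv2.inRange(patch, *team_a_range).sum()
        b = cv2.inRange(patch, *team_b_range).sum()
        tags.append("team_A" if a >= b else "team_B")
    return tags

# Hypothetical ranges: red jerseys vs. white jerseys
red = (np.array([0, 120, 70]), np.array([10, 255, 255]))
white = (np.array([0, 0, 180]), np.array([180, 40, 255]))
```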

Keywords: Detect, Football, Players, Virtual

Procedia PDF Downloads 315
11937 An Image Processing Based Approach for Assessing Wheelchair Cushions

Authors: B. Farahani, R. Fadil, A. Aboonabi, B. Hoffmann, J. Loscheider, K. Tavakolian, S. Arzanpour

Abstract:

Wheelchair users spend long hours in a sitting position, and selecting the right cushion is highly critical in preventing pressure ulcers in that demographic. Pressure mapping systems (PMS) are typically used in clinical settings by therapists to identify the sitting profile and the pressure points in the sitting area, in order to select the cushion that fits the user best. A PMS is a flexible mat composed of distributed arrays of flexible sensors. The output of a PMS is a color-coded image that shows the intensity of the pressure concentration. Therapists use PMS images to compare how different cushions fit each user. This process is highly subjective and requires good visual memory for the best outcome. This paper aims to develop an image processing technique to analyze PMS images and provide an objective measure to assess cushions based on their pressure distribution mappings. We first review the skeletal anatomy of the human sitting area and its relation to the PMS image. This knowledge is then used to identify the important features that must be considered in image processing. We then develop an algorithm based on those features to analyze the images and rank them according to their fit to the user's needs.
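
As an example of the kind of objective measure intended, the sketch below summarises a PMS frame (a 2D pressure array) with simple dispersion and peak-loading statistics; these specific metrics are illustrative, not the paper's final feature set:

```python
import numpy as np

def pms_metrics(pressure, contact_threshold=5.0):
    """Objective summary of a pressure-mapping-system frame (2D array,
    e.g. mmHg per sensel): loading statistics over the contact area.
    A lower peak-pressure index and lower dispersion suggest a better fit."""
    contact = pressure[pressure > contact_threshold]
    return {
        "contact_cells": int(contact.size),
        "mean_pressure": float(contact.mean()),
        "peak_pressure": float(contact.max()),
        "peak_pressure_index": float(contact.max() / contact.mean()),
        "dispersion_cv": float(contact.std() / contact.mean()),
    }
```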

Keywords: dynamic cushion, image processing, pressure mapping system, wheelchair

Procedia PDF Downloads 152
11936 Rapid Soil Classification Using Computer Vision with Electrical Resistivity and Soil Strength

Authors: Eugene Y. J. Aw, J. W. Koh, S. H. Chew, K. E. Chua, P. L. Goh, Grace H. B. Foo, M. L. Leong

Abstract:

This paper presents an evaluation of various soil testing methods, such as the four-probe soil electrical resistivity method and the cone penetration test (CPT), that can complement a newly developed rapid soil classification scheme using computer vision, to improve the accuracy and productivity of on-site classification of excavated soil. In Singapore, excavated soils from the local construction industry are transported to Staging Grounds (SGs) to be reused as fill material for land reclamation. Excavated soils are mainly categorized into two groups (“Good Earth” and “Soft Clay”) based on particle size distribution (PSD) and water content (w) from soil investigation reports and on-site visual surveys, so that proper treatment and usage can be exercised. However, this process is time-consuming and labor-intensive; thus, a rapid classification method is needed at the SGs. Four-probe soil electrical resistivity and CPT were evaluated for their feasibility as additions to the computer vision system, to further develop this innovative, non-destructive, and instantaneous classification method. The computer vision technique comprises soil image acquisition using an industrial-grade camera; image processing and analysis via calculation of Grey Level Co-occurrence Matrix (GLCM) textural parameters; and decision-making using an Artificial Neural Network (ANN). A previous study found that the ANN model coupled with ρ can classify soils into “Good Earth” and “Soft Clay” in less than a minute, with an accuracy of 85% on selected representative soil images. To further improve the technique, the following three items were targeted for addition to the computer vision scheme: the apparent electrical resistivity of the soil (ρ), measured using a set of four probes arranged in Wenner's array; the soil strength, measured using a modified mini cone penetrometer; and w, measured using a set of time-domain reflectometry (TDR) probes. A laboratory proof-of-concept was conducted through a series of seven tests with three types of soils: “Good Earth”, “Soft Clay”, and a mix of the two. Validation was performed against the PSD and w of each soil type obtained from conventional laboratory tests. The results show that the ρ, w, and CPT measurements can be analyzed collectively to classify soils into “Good Earth” or “Soft Clay” and are feasible as complementary methods to the computer vision system.
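
A minimal sketch of the GLCM feature extraction step using scikit-image (versions 0.19+ spell the functions graycomatrix/graycoprops); the distances, angles, and property set are illustrative, not the scheme's chosen parameters:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(gray_u8):
    """GLCM textural parameters of a soil image (uint8 grey levels),
    averaged over distances and angles, for use as an ANN input vector."""
    glcm = graycomatrix(gray_u8, distances=[1, 2],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    return {p: graycoprops(glcm, p).mean()
            for p in ("contrast", "homogeneity", "energy", "correlation")}
```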

Keywords: computer vision technique, cone penetration test, electrical resistivity, rapid and non-destructive, soil classification

Procedia PDF Downloads 220
11935 Relationship among Teams' Information Processing Capacity and Performance in Information System Projects: The Effects of Uncertainty and Equivocality

Authors: Ouafa Sakka, Henri Barki, Louise Cote

Abstract:

Uncertainty and equivocality are defined in the information processing literature as two task characteristics that require different information processing responses from managers. As uncertainty often stems from a lack of information, addressing it is thought to require the collection of additional data. On the other hand, as equivocality stems from ambiguity and a lack of understanding of the task at hand, addressing it is thought to require rich communication between those involved. Past research has provided weak to moderate empirical support for these hypotheses. The present study contributes to this literature by defining uncertainty and equivocality at the project level and investigating their moderating effects on the association between several project information processing constructs and project performance. The information processing constructs considered are the amount of information collected by the project team, and the richness and frequency of the formal communications among team members when discussing the project's follow-up reports. Data on 93 information system development (ISD) project managers were collected in a questionnaire survey and analyzed via the Fisher test for correlation differences. The results indicate that the highest project performance levels were observed in projects characterized by high uncertainty and low equivocality, in which project managers were provided with detailed and updated information on project costs and schedules. In addition, our findings show that information about user needs and technical aspects of the project is less useful for managing projects where uncertainty and equivocality are high. Further, while the strongest positive effect of the interactive use of follow-up reports on performance occurred in projects where both uncertainty and equivocality levels were high, its weakest effect occurred when both of these were low.
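
The Fisher test for correlation differences compares two independent correlations through the r-to-z transformation; a minimal sketch (the r and n values below are illustrative, not the study's results):

```python
import numpy as np
from scipy.stats import norm

def fisher_r_diff(r1, n1, r2, n2):
    """Fisher r-to-z test: does the correlation in subgroup 1 (e.g. high
    uncertainty / low equivocality projects) differ from subgroup 2?"""
    z1, z2 = np.arctanh(r1), np.arctanh(r2)
    se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    z = (z1 - z2) / se
    return z, 2 * norm.sf(abs(z))  # z statistic and two-sided p-value

print(fisher_r_diff(0.62, 25, 0.18, 24))
```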

Keywords: uncertainty, equivocality, information processing model, management control systems, project control, interactive use, diagnostic use, information system development

Procedia PDF Downloads 271
11934 An Explanatory Study Approach Using Artificial Intelligence to Forecast Solar Energy Outcome

Authors: Agada N. Ihuoma, Nagata Yasunori

Abstract:

Artificial intelligence (AI) techniques play a crucial role in predicting the expected energy outcome and in the performance analysis, modeling, and control of renewable energy. Renewable energy is becoming more popular for economic and environmental reasons. In the face of global energy consumption and the increased depletion of most fossil fuels, the world faces the challenge of meeting ever-increasing energy demands. Therefore, incorporating artificial intelligence to predict solar radiation outcomes from intermittent sunlight is crucial to balance the supply of and demand for energy on loads, predict the performance and outcome of solar energy, enhance production planning and energy management, and ensure the proper sizing of parameters when generating clean energy. However, one of the major problems in forecasting is the algorithms used to control, model, and predict the performance of the energy systems, which are complicated and involve large computing power, differential equations, and time series. Also, unreliable (poor quality) solar radiation data for a geographical location, as well as insufficiently long series, can be a bottleneck. To overcome these problems, this study employs the Anaconda Navigator (Jupyter Notebook) environment for machine learning, which can combine larger amounts of data with fast, iterative processing and intelligent algorithms, allowing the software to learn automatically from patterns or features to predict the performance and outcome of solar energy, which in turn enables balancing supply and demand on loads as well as enhancing production planning and energy management.
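
A compact sketch of backward elimination around an OLS linear regression, in line with the keywords listed for this work (statsmodels-based; the significance level and the feature names in the comment are assumptions):

```python
import pandas as pd
import statsmodels.api as sm

def backward_elimination(X, y, alpha=0.05):
    """Drop the least significant regressor one at a time until every
    remaining p-value is below alpha; return the final OLS fit."""
    X = sm.add_constant(pd.DataFrame(X))
    while True:
        model = sm.OLS(y, X).fit()
        pvals = model.pvalues.drop("const")
        if pvals.empty or pvals.max() <= alpha:
            return model
        X = X.drop(columns=pvals.idxmax())  # eliminate the worst predictor

# X: columns such as irradiance, temperature, humidity; y: energy output
```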

Keywords: artificial Intelligence, backward elimination, linear regression, solar energy

Procedia PDF Downloads 147
11933 Business Continuity Risk Review for a Large Petrochemical Complex

Authors: Michel A. Thomet

Abstract:

A discrete-event simulation model was used to perform a Reliability-Availability-Maintainability (RAM) study of a large petrochemical complex which included sixteen process units and seven feeds and intermediate streams. All the feeds and intermediate streams have associated storage tanks, so that if a processing unit fails and shuts down, the downstream units can keep producing their outputs. This also helps the upstream units, which do not have to reduce their outputs but can store their excess production until the failed unit restarts. Each process unit and each pipe section carrying the feeds and intermediate streams has a probability of failure with an associated distribution and a Mean Time Between Failures (MTBF), as well as a distribution of the time to restore and a Mean Time To Restore (MTTR). The utilities supporting the process units can also fail, and have their own distributions with specific MTBF and MTTR. The model runs span ten years or more, and the runs are repeated several times to obtain statistically relevant results. One of the main results is the On-Stream Factor (OSF) of each process unit (the percentage of hours in a year when the unit is running in nominal conditions). One of the objectives of the study was to investigate whether the storage capacity for each of the feeds and intermediate streams was adequate. This was done by increasing the storage capacities in several steps and running the simulation to see if the OSFs improved and by how much. Other objectives were to see whether the failures of the utilities were an important factor in the overall OSF, and what could be done to reduce their failure rates through redundant equipment.
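
A stripped-down SimPy sketch of the RAM logic for a single unit: exponential up-times (MTBF) alternating with repairs (MTTR), accumulated into an On-Stream Factor. The rates are illustrative, and the real model adds storage tanks, pipe sections, and utilities:

```python
import random
import simpy

HOURS = 10 * 8760  # ten-year horizon; repeat runs for statistics
MTBF, MTTR = 2000.0, 48.0
up_time = 0.0

def process_unit(env):
    """Alternate exponential up-times and repair times; accumulate run hours."""
    global up_time
    while True:
        run = random.expovariate(1.0 / MTBF)
        yield env.timeout(run)
        up_time += run
        yield env.timeout(random.expovariate(1.0 / MTTR))  # unit down

env = simpy.Environment()
env.process(process_unit(env))
env.run(until=HOURS)
print(f"On-Stream Factor: {up_time / HOURS:.3f}")
```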

Keywords: business continuity, on-stream factor, petrochemical, RAM study, simulation, MTBF

Procedia PDF Downloads 201
11932 Intelligent Grading System of Apple Using Neural Network Arbitration

Authors: Ebenezer Obaloluwa Olaniyi

Abstract:

In this paper, an intelligent system has been designed to grade apples as either defective or healthy for production in food processing. The paper is divided into two phases. In the first phase, image processing techniques are employed to extract the necessary features of the apple. These techniques include grayscale conversion and segmentation, where a threshold value is chosen to separate the foreground of the images from the background; edge detection is also employed to bring out the features in the images. The extracted features are then fed into a neural network in the second phase of the paper, a classification phase where the neural network is employed to classify defective apples from healthy ones. In this phase, the network was trained with backpropagation and tested with feedforward passes. The recognition rate obtained from our system shows that it is more accurate and faster than previous work.
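
A compact sketch of the first (image processing) phase: grayscale loading, Otsu thresholding, and Canny edge detection, reduced to two simple descriptors. The descriptor choice and threshold values are assumptions, not the paper's exact feature set:

```python
import cv2

def apple_features(path):
    """Grayscale -> Otsu threshold (foreground/background split) -> Canny
    edges, mirroring the pre-processing phase described above."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, fg = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    edges = cv2.Canny(cv2.bitwise_and(gray, fg), 100, 200)
    # two illustrative descriptors for the network: foreground ratio
    # and edge density over the whole image
    return [(fg > 0).mean(), (edges > 0).mean()]
```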

Keywords: image processing, neural network, apple, intelligent system

Procedia PDF Downloads 382
11931 Interactive, Topic-Oriented Search Support by a Centroid-Based Text Categorisation

Authors: Mario Kubek, Herwig Unger

Abstract:

Centroid terms are single words that semantically and topically characterise text documents and so may serve as their very compact representation in automatic text processing. In the present paper, centroids are used to measure the relevance of text documents with respect to a given search query. Thus, a new graph-based paradigm for searching texts in large corpora is proposed and evaluated against keyword-based methods. The first promising experimental results demonstrate the usefulness of the centroid-based search procedure. It is shown that, in particular, the routing of search queries in interactive and decentralised search systems can be greatly improved by applying this approach. A detailed discussion of further fields of application completes this contribution.
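
A toy illustration of the centroid idea: build a term co-occurrence graph and pick the term with the smallest average distance to all other terms. This is a simplified stand-in for the paper's centroid computation:

```python
import itertools
import networkx as nx

def centroid_term(tokens, window=3):
    """Build a word co-occurrence graph over a sliding window and return
    the term with the smallest average shortest-path distance to all other
    terms -- a simple stand-in for a document's centroid term."""
    G = nx.Graph()
    for i in range(len(tokens) - window + 1):
        for a, b in itertools.combinations(set(tokens[i:i + window]), 2):
            G.add_edge(a, b)
    dist = dict(nx.all_pairs_shortest_path_length(G))
    return min(G.nodes,
               key=lambda w: sum(dist[w].values()) / max(len(dist[w]) - 1, 1))

print(centroid_term("search queries route through centroid terms of "
                    "documents in decentralised search systems".split()))
```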

Keywords: search algorithm, centroid, query, keyword, co-occurrence, categorisation

Procedia PDF Downloads 271
11930 Identification System for Grading Banana in Food Processing Industry

Authors: Ebenezer O. Olaniyi, Oyebade K. Oyedotun, Khashman Adnan

Abstract:

In the food industry, high-quality production is required within a limited time to meet demand. In this research work, we have developed a model which can replace the human operator, whose output in production is low and whose decisions are slow owing to individual differences in deciding between defective and healthy bananas. This model can perform the vision tasks of human operators in deciding whether a banana is defective or healthy for food production. The research work is divided into two phases. The first phase is image processing, where several techniques such as colour conversion, edge detection, thresholding, and morphological operations are employed to extract features for training and testing the network in the second phase. The features extracted in the first phase are used in the second phase, the classification phase, where a multilayer perceptron trained with the backpropagation algorithm is employed. After the network has learned and converged, it is tested with feedforward passes to determine its performance. From this experiment, a recognition rate of 97% was obtained, and the time taken was limited, which makes the system accurate for use in the food industry.
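
A minimal sketch of the second (classification) phase with a backpropagation-trained multilayer perceptron; the feature vectors below are hypothetical stand-ins for the extracted colour/edge/morphology features:

```python
from sklearn.neural_network import MLPClassifier

# Hypothetical feature vectors from the image-processing phase
# (e.g. foreground ratio, edge density, mean hue of dark blemishes)
X_train = [[0.61, 0.04, 0.10], [0.58, 0.05, 0.12],   # healthy
           [0.57, 0.19, 0.55], [0.60, 0.22, 0.60]]   # defective
y_train = ["healthy", "healthy", "defective", "defective"]

# Backpropagation-trained MLP; prediction is the feedforward pass
# used at test time.
mlp = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                    random_state=1).fit(X_train, y_train)
print(mlp.predict([[0.59, 0.20, 0.58]]))  # expected: ['defective']
```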

Keywords: banana, food processing, identification system, neural network

Procedia PDF Downloads 453