Search results for: distributed sensor networks
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5821

2581 Building an Ontology for Researchers: An Application of Topic Maps and Social Information

Authors: Yu Hung Chiang, Hei Chia Wang

Abstract:

In academia, it is important for researchers to find a suitable research domain. Many researchers consult conference calls and proceedings to discover new or interesting topics, to understand current research trends in their field, and to learn about cutting-edge developments in their specialty. However, published conference information is widely scattered online and difficult to consolidate. Many researchers rely on the search engines of journals or conference websites to filter this information, but such search engines have limitations: in particular, they cannot surface associated topics that may also be useful. Knowledge Management (KM) offers a way to address these issues. Ontologies are widely adopted in KM, but most existing ontology construction methods do not consider the social information linking target users. To make academic KM effective, this study proposes a method for constructing research Topic Maps using the Open Directory Project (ODP) and Social Information Processing (SIP). By capturing social information from conference websites, i.e., co-authorship and collaborator relationships, research topics can be associated among related researchers. Finally, the experiments show that the resulting Topic Maps help researchers find the information they need more easily and quickly and successfully construct associations between research topics.
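
As an illustration of the association step described above, the sketch below builds a minimal co-authorship graph from conference metadata and derives topic associations between connected researchers. The record layout, field names, and the one-hop association rule are illustrative assumptions, not the authors' implementation.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical conference records: each paper lists its authors and topics.
papers = [
    {"authors": ["A. Chen", "B. Wang"], "topics": ["topic maps", "ontology"]},
    {"authors": ["B. Wang", "C. Lee"], "topics": ["social information processing"]},
]

coauthors = defaultdict(set)           # researcher -> set of collaborators
researcher_topics = defaultdict(set)   # researcher -> topics they have published on

for paper in papers:
    for author in paper["authors"]:
        researcher_topics[author].update(paper["topics"])
    for a, b in combinations(paper["authors"], 2):
        coauthors[a].add(b)
        coauthors[b].add(a)

def associated_topics(researcher):
    """Topics reachable through direct collaborators (one SIP hop)."""
    own = researcher_topics[researcher]
    related = set()
    for collaborator in coauthors[researcher]:
        related |= researcher_topics[collaborator]
    return related - own

print(associated_topics("A. Chen"))  # -> {'social information processing'}
```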

Keywords: knowledge management, topic map, social information processing, ontology extraction

Procedia PDF Downloads 294
2580 NprRX Regulation on Surface Spreading Motility in Bacillus cereus

Authors: Yan-Shiang Chiou, Yi-Huang Hsueh

Abstract:

Bacillus cereus is a foodborne pathogen that causes two types of foodborne illness, the emetic and diarrheal syndromes. B. cereus consistently ranked among the top three causes of bacterial foodborne outbreaks in Taiwan between 2001 and 2010. Outbreaks caused by B. cereus have increased, and it now ranks as the second most common foodborne pathogen after Vibrio parahaemolyticus. This pathogen is difficult to control because it is ubiquitous in the environment, many strains are psychrotrophic, and its spores are heat resistant. Because complete elimination of biofilms is difficult, a better understanding of the molecular mechanisms of biofilm formation by B. cereus will help in developing better strategies to control this pathogen. Surface translocation can be an important factor in biofilm formation. In B. cereus, NprR is a quorum sensor; apo NprR is a dimer that changes to a tetramer in the presence of NprX. The small peptide NprX may induce a conformational change that switches the apo dimer to an active tetramer which specifically recognizes target DNA sequences. Our results show that mutation of nprRX causes a surface spreading deficiency. Mutation of flagella, pili, and surfactant genes (flgAB, bcpAB, krsABC) did not abolish spreading motility. In the nprRX mutant background, mutation of spo0A rescued the spreading deficiency. This suggests that spreading motility is not related to surfactant, pili, or flagella but to another, unknown mechanism, and that Spo0A, a sporulation initiation protein, inhibits spreading motility.

Keywords: Bacillus cereus, nprRX, spo0A, spreading motility

Procedia PDF Downloads 259
2579 The Location-Routing Problem with Pickup Facilities and Heterogeneous Demand: Formulation and Heuristics Approach

Authors: Mao Zhaofang, Xu Yida, Fang Kan, Fu Enyuan, Zhao Zhao

Abstract:

Nowadays, last-mile distribution plays an increasingly important role in the overall delivery chain and accounts for a large proportion of total distribution cost. Upgrading logistics networks and improving the layout of final distribution points has become one of the trends in the development of modern logistics. Because customer demand is discrete and heterogeneous in both nature and spatial distribution, which leads to a higher delivery failure rate and lower vehicle utilization, last-mile delivery has become a time-consuming and uncertain process. As a result, courier companies have introduced a range of innovative parcel storage facilities, including pick-up points and lockers. These facilities have not only improved the user experience but have also helped logistics and courier companies achieve economies of scale. Against the backdrop of the recent COVID-19 pandemic, contactless delivery has become a new hotspot, creating further opportunities for the development of collection services. A key issue for logistics companies is therefore how to design or redesign their last-mile distribution networks as integrated systems that incorporate pick-up points and lockers. This paper focuses on the introduction of self-pickup facilities in new logistics and distribution scenarios and on the heterogeneous demands of customers. We consider two types of demand, ordinary products and refrigerated products, together with the corresponding transportation vehicles. Taking into account the constraints associated with self-pickup points and lockers, we address the location-routing problem with self-pickup facilities and heterogeneous demands (LRP-PFHD). To solve this challenging problem, we propose a mixed integer linear programming (MILP) model that minimizes the total cost, comprising the facility opening cost, the variable transport cost, and the fixed transport cost. Due to the NP-hardness of the problem, we propose a hybrid adaptive large-neighbourhood search algorithm to solve LRP-PFHD. We evaluate the effectiveness and efficiency of the proposed algorithm using instances generated from benchmark instances. The results demonstrate that the hybrid adaptive large-neighbourhood search algorithm is more efficient than MILP solvers such as Gurobi for LRP-PFHD, especially for large-scale instances. In addition, we conduct a comprehensive analysis of important parameters (e.g., facility opening cost and transportation cost) to explore their impact on the results and suggest helpful managerial insights for courier companies.
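
The full LRP-PFHD formulation is not reproduced in the abstract; as a rough illustration of the cost structure it minimizes (facility opening cost plus fixed and variable transport cost), the sketch below sets up a deliberately simplified, assignment-only variant in PuLP. The sets, parameter values, and the reduction of routing to direct facility-customer assignment are illustrative assumptions, not the authors' model.

```python
import pulp

# Illustrative data: 2 candidate pickup facilities, 3 customers, 2 product types.
facilities = ["locker1", "pickup2"]
customers = ["c1", "c2", "c3"]
products = ["ordinary", "refrigerated"]

open_cost = {"locker1": 120.0, "pickup2": 90.0}
fixed_trip_cost = {"ordinary": 15.0, "refrigerated": 25.0}       # per served demand type
var_cost = {(f, c): 2.0 for f in facilities for c in customers}  # per unit shipped
demand = {("c1", "ordinary"): 4, ("c1", "refrigerated"): 1,
          ("c2", "ordinary"): 3, ("c2", "refrigerated"): 0,
          ("c3", "ordinary"): 2, ("c3", "refrigerated"): 2}
capacity = {"locker1": 8, "pickup2": 10}

m = pulp.LpProblem("simplified_LRP_PFHD", pulp.LpMinimize)
y = pulp.LpVariable.dicts("open", facilities, cat="Binary")
x = pulp.LpVariable.dicts("assign", [(f, c) for f in facilities for c in customers],
                          cat="Binary")

# Objective: opening cost + fixed transport cost + variable transport cost.
m += (pulp.lpSum(open_cost[f] * y[f] for f in facilities)
      + pulp.lpSum(fixed_trip_cost[p] * x[(f, c)]
                   for f in facilities for c in customers for p in products
                   if demand[(c, p)] > 0)
      + pulp.lpSum(var_cost[(f, c)] * sum(demand[(c, p)] for p in products) * x[(f, c)]
                   for f in facilities for c in customers))

for c in customers:   # every customer is served by exactly one facility
    m += pulp.lpSum(x[(f, c)] for f in facilities) == 1
for f in facilities:  # capacity is respected and only open facilities may serve
    m += pulp.lpSum(sum(demand[(c, p)] for p in products) * x[(f, c)]
                    for c in customers) <= capacity[f] * y[f]

m.solve(pulp.PULP_CBC_CMD(msg=False))
print([f for f in facilities if y[f].value() == 1])
```

In the paper itself, routing decisions and the hybrid adaptive large-neighbourhood search replace this crude assignment model, but the shape of the objective is the same.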

Keywords: city logistics, last-mile delivery, location-routing, adaptive large neighborhood search

Procedia PDF Downloads 88
2578 Economic Growth and Transport Carbon Dioxide Emissions in New Zealand: A Co-Integration Analysis of the Environmental Kuznets Curve

Authors: Mingyue Sheng, Basil Sharp

Abstract:

Greenhouse gas (GHG) emissions from national transport account for the largest share of emissions from energy use in New Zealand. Whether the environmental Kuznets curve (EKC) relationship exists between environmental degradation indicators from the transport sector and economic growth in New Zealand remains unclear. This paper explores the causal relationships between CO₂ emissions from the transport sector, fossil fuel consumption, and Gross Domestic Product (GDP) per capita in New Zealand, using annual data for the period 1977 to 2013. First, conventional unit root tests (the Augmented Dickey-Fuller and Phillips-Perron tests) and a unit root test allowing for a structural break (the Zivot-Andrews test) are employed to examine the stationarity of the variables. Second, the autoregressive distributed lag (ARDL) bounds test for co-integration is applied, followed by Granger causality tests to investigate causality among the variables. The empirical results reveal that, in the short run, there is unidirectional causality running from economic growth to transport CO₂ emissions, as well as bidirectional causality between transport CO₂ emissions and road energy consumption.
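
The stationarity and causality steps described above can be sketched with standard statsmodels routines. The variable names and the random placeholder series below stand in for the New Zealand data (transport CO₂, GDP per capita, road energy use), and the ARDL bounds test itself is omitted.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, grangercausalitytests

# Placeholder annual series, 1977-2013 (37 observations); replace with real data.
rng = np.random.default_rng(0)
years = np.arange(1977, 2014)
gdp_pc = pd.Series(np.cumsum(rng.normal(0.5, 1.0, years.size)), index=years)
co2_transport = pd.Series(np.cumsum(rng.normal(0.3, 1.0, years.size)), index=years)

# Step 1: Augmented Dickey-Fuller unit root test on each series (levels).
for name, series in [("GDP per capita", gdp_pc), ("transport CO2", co2_transport)]:
    stat, pvalue, *_ = adfuller(series, autolag="AIC")
    print(f"{name}: ADF stat = {stat:.3f}, p-value = {pvalue:.3f}")

# Step 2: Granger causality, GDP per capita -> transport CO2 (first differences).
data = pd.concat([co2_transport.diff(), gdp_pc.diff()], axis=1).dropna()
data.columns = ["d_co2", "d_gdp"]
# Tests whether lags of the second column help predict the first column.
grangercausalitytests(data[["d_co2", "d_gdp"]], maxlag=2)
```

The co-integration step would use a dedicated ARDL bounds-test implementation (e.g., the ARDL/UECM tooling available in recent statsmodels releases) on the level series.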

Keywords: economic growth, transport carbon dioxide emissions, environmental Kuznets curve, causality

Procedia PDF Downloads 304
2577 A Multi-Criteria Model for Scheduling of Stochastic Single Machine Problem with Outsourcing and Solving It through Application of Chance Constrained

Authors: Homa Ghave, Parmis Shahmaleki

Abstract:

This paper presents a new multi-criteria stochastic mathematical model for single machine scheduling with outsourcing allowed. Multiple jobs are processed in batches, and for each batch, all of a job or a portion of it can be outsourced. The jobs have stochastic processing times and lead times, have deterministic due dates, and arrive randomly. Because of the stochastic nature of the processing and lead times, chance constrained programming is used to model the problem. The problem is first formulated as a stochastic program and then converted into a deterministic mixed integer linear program. The model considers two objectives, minimizing the maximum tardiness and the outsourcing cost simultaneously. Several procedures have been developed to deal with multi-criteria problems; in this paper, we utilize the concept of satisfaction functions to incorporate the manager's preferences. The proposed approach is tested on instances where the random variables are normally distributed.
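
The reduction from the chance-constrained form to its deterministic equivalent rests on the normality assumption stated above. A minimal sketch of that standard reduction, written with generic symbols rather than the paper's exact notation, is:

```latex
% For a job j with normally distributed completion time C_j ~ N(mu_j, sigma_j^2),
% deterministic due date d_j, and maximum-tardiness variable T_max, the chance
% constraint requiring on-time completion (up to T_max) with probability alpha
\[
  \Pr\!\left(C_j \le d_j + T_{\max}\right) \ge \alpha
  \;\Longleftrightarrow\;
  \mu_j + z_{\alpha}\,\sigma_j \le d_j + T_{\max},
  \qquad z_{\alpha} = \Phi^{-1}(\alpha),
\]
% i.e., under normality the probabilistic constraint becomes a linear constraint
% in the means and standard deviations, which is what allows the stochastic model
% to be rewritten as a deterministic MILP.
```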

Keywords: single machine scheduling, multi-criteria mathematical model, outsourcing strategy, uncertain lead times and processing times, chance constrained programming, satisfaction function

Procedia PDF Downloads 267
2576 Hybrid GNN Based Machine Learning Forecasting Model For Industrial IoT Applications

Authors: Atish Bagchi, Siva Chandrasekaran

Abstract:

Background: According to World Bank national accounts data, the estimated global manufacturing value-added output in 2020 was 13.74 trillion USD. These manufacturing processes are monitored, modelled, and controlled by advanced, real-time, computer-based systems, e.g., Industrial IoT, PLC, SCADA, etc. These systems measure and manipulate a set of physical variables, e.g., temperature, pressure, etc. Despite the use of IoT, SCADA, etc., in manufacturing, studies suggest that unplanned downtime leads to economic losses of approximately 864 billion USD each year. Therefore, real-time, accurate detection, classification, and prediction of machine behaviour are needed to minimise financial losses. Although vast literature exists on time-series data processing using machine learning, the challenges faced by industry that lead to unplanned downtime are: current algorithms do not efficiently handle high-volume streaming data from industrial IoT sensors and have been tested only on static and simulated datasets; while existing algorithms can detect significant 'point' outliers, most do not handle contextual outliers (e.g., values within the normal range but occurring at an unexpected time of day) or subtle changes in machine behaviour; and machines are revamped periodically as part of planned maintenance programmes, which changes the assumptions on which the original AI models were created and trained. Aim: This research aims to deliver a Graph Neural Network (GNN) based hybrid forecasting model that interfaces with the real-time machine control system and can detect and predict machine behaviour and behavioural changes (anomalies) in real time. This research will help manufacturing industries and utilities, e.g., water, electricity, etc., reduce unplanned downtime and the consequential financial losses. Method: The data stored within a process control system, e.g., an Industrial IoT platform or Data Historian, is generally sampled during data acquisition from the sensor (source) and when persisting in the Data Historian, to optimise storage and query performance. The sampling may inadvertently discard values that contain subtle aspects of behavioural changes in machines. This research proposes a hybrid forecasting and classification model which combines the expressive and extrapolation capability of GNNs, enhanced with estimates of entropy and spectral changes in the sampled data and additional temporal context, to reconstruct the likely temporal trajectory of machine behavioural changes. The proposed real-time model belongs to the deep learning category of machine learning and interfaces with the sensors directly or through a 'Process Data Historian', SCADA, etc., to perform forecasting and classification tasks. Results: The model was interfaced with a Data Historian holding time-series data from 4 flow sensors within a water treatment plant for 45 days. The recorded sampling interval for a sensor varied from 10 sec to 30 min. Approximately 65% of the available data was used for training the model, 20% for validation, and the rest for testing. The model identified the anomalies within the water treatment plant and predicted the plant's performance. These results were compared with the data reported by the plant's SCADA-Historian system and the official data reported by the plant authorities. The model's accuracy was much higher (by 20%) than that reported by the SCADA-Historian system and matched the validated results declared by the plant auditors.
Conclusions: The research demonstrates that a hybrid GNN-based approach enhanced with entropy calculation and spectral information can effectively detect and predict a machine's behavioural changes. The model can interface with a plant's process control system in real time to perform forecasting and classification tasks, helping asset management engineers operate their machines more efficiently and reduce unplanned downtime. A series of trials is planned for this model in other manufacturing industries.
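
The entropy and spectral-change estimates that augment the GNN are not specified in detail in the abstract; a minimal sketch of how such features could be computed from a resampled sensor window (Shannon entropy over histogram bins and a spectral centroid from the FFT) is shown below. The window length, bin count, and feature choices are assumptions for illustration only.

```python
import numpy as np

def window_features(values, n_bins=16):
    """Entropy and spectral summary for one window of a sensor time series."""
    values = np.asarray(values, dtype=float)

    # Shannon entropy of the amplitude distribution (histogram-based estimate).
    counts, _ = np.histogram(values, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]
    entropy = float(-(p * np.log2(p)).sum())

    # Spectral centroid: where the signal's energy sits in frequency.
    spectrum = np.abs(np.fft.rfft(values - values.mean()))
    freqs = np.fft.rfftfreq(len(values), d=1.0)   # d = sampling interval (assumed 1)
    centroid = float((freqs * spectrum).sum() / (spectrum.sum() + 1e-12))

    return {"entropy": entropy, "spectral_centroid": centroid}

# Example: a flow-like signal whose behaviour changes halfway through the window.
t = np.arange(256)
signal = np.concatenate([np.sin(0.05 * t[:128]), np.sin(0.25 * t[128:]) + 0.5])
print(window_features(signal))
```

Features of this kind, computed per window, would then be attached to the graph nodes alongside the raw samples before the GNN forecasting step.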

Keywords: GNN, Entropy, anomaly detection, industrial time-series, AI, IoT, Industry 4.0, Machine Learning

Procedia PDF Downloads 152
2575 Perception and Knowledge of the Jordanian Society of Occupational Therapy

Authors: Wesam Darawsheh

Abstract:

Background: There is a scarcity of studies investigating the level of knowledge, awareness, and perception of Jordanians regarding occupational therapy (OT). Aim: To investigate the level of awareness of lay people, clients receiving services, and healthcare professionals regarding OT, identify common misconceptions about OT, and explore ways in which knowledge and awareness of OT can be increased. Methodology: A cross-sectional design was employed in which a survey was distributed in the Northern, Southern, Western, Eastern, and Middle (capital city: Amman) provinces of Jordan. The survey consisted of eight sections and 61 questions investigating the demographics of participants, self-evaluation of knowledge and awareness of OT, sources of knowledge about OT, perceptions of the aims, fields of practice, and settings of OT, misconceptions about OT, and suggestions for improving knowledge and awareness of OT. Results: A total of 829 participants were enrolled in this study: 459 lay people, 155 clients currently receiving OT services, and 215 healthcare professionals. About 57% of the participants had not heard of OT, and 48% of those who reported having heard of OT did not have sufficient knowledge about it. Several misconceptions are associated with OT. The statistical analysis was carried out using IBM SPSS software, Version 22.0 (SPSS, Chicago, USA). Conclusion: It is the responsibility of OTRs to increase knowledge and awareness of OT in Jordan; this is required for the profession to proliferate and to be given its due status.

Keywords: knowledge, occupational therapy misconceptions, healthcare professionals, lay people, Jordan

Procedia PDF Downloads 369
2574 Foot Recognition Using Deep Learning for Knee Rehabilitation

Authors: Rakkrit Duangsoithong, Jermphiphut Jaruenpunyasak, Alba Garcia

Abstract:

Foot recognition can be applied in many medical fields, such as gait pattern analysis and monitoring the knee exercises of patients in rehabilitation. Generally, a camera-based foot recognition system captures patient images in a controlled room and background to recognize the foot from a limited set of views. However, such a system can be inconvenient for monitoring knee exercises at home. To overcome these problems, this paper proposes a deep learning method using Convolutional Neural Networks (CNNs) for foot recognition. The results are compared with a traditional classification method using LBP and HOG features with kNN and SVM classifiers. According to the results, the deep learning method provides better accuracy, but at higher complexity, in recognizing foot images from online databases than the traditional classification method.
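
As a point of reference for the comparison described above, the traditional baseline (HOG features with an SVM classifier) can be sketched as follows. The image size, HOG parameters, and the random placeholder data are assumptions, not the study's configuration.

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Placeholder data: grayscale images (64x64) labelled foot / not-foot.
rng = np.random.default_rng(0)
images = rng.random((200, 64, 64))
labels = rng.integers(0, 2, size=200)

# HOG descriptor per image (orientation histograms over local cells).
features = np.array([
    hog(img, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
    for img in images
])

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)
print("baseline accuracy:", clf.score(X_test, y_test))
```

An LBP-plus-kNN baseline follows the same pattern with a different feature extractor and classifier, while the CNN replaces the hand-crafted features entirely.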

Keywords: foot recognition, deep learning, knee rehabilitation, convolutional neural network

Procedia PDF Downloads 165
2573 Automatic Measurement of Garment Sizes Using Deep Learning

Authors: Maulik Parmar, Sumeet Sandhu

Abstract:

The online fashion industry experiences high product return rates. Many returns are due to size/fit mismatches: the size scale on labels can vary across brands, the size parameters may not capture all fit measurements, or the product may have manufacturing defects. Warehouse quality checks of garment sizes can be semi-automated to improve speed and accuracy. This paper presents an approach for automatically measuring garment sizes from a single image of the garment, using deep learning to learn garment keypoints. The paper focuses on the waist size measurement of jeans and can be easily extended to other garment types and measurements. Experimental results show that this approach can greatly improve the speed and accuracy of today's manual measurement process.
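
Once the garment keypoints are predicted, turning them into a physical measurement is straightforward geometry. A minimal sketch, assuming two detected waist keypoints and a known pixel-to-centimetre scale (e.g., from a reference object in the image), is shown below; the coordinate values and calibration factor are hypothetical.

```python
import math

def waist_width_cm(left_kp, right_kp, pixels_per_cm):
    """Euclidean distance between the two waist keypoints, converted to cm."""
    dx = right_kp[0] - left_kp[0]
    dy = right_kp[1] - left_kp[1]
    return math.hypot(dx, dy) / pixels_per_cm

# Hypothetical model outputs (pixel coordinates) and calibration.
left_waist, right_waist = (412.0, 830.0), (988.0, 842.0)
print(round(waist_width_cm(left_waist, right_waist, pixels_per_cm=14.2), 1), "cm")
```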

Keywords: convolutional neural networks, deep learning, distortion, garment measurements, image warping, keypoints

Procedia PDF Downloads 317
2572 Parents' Attitude toward Compulsory Pre-School Education in Slovakia

Authors: Sona Lorencova, Beata Hornickova

Abstract:

Compulsory pre-school education in Slovakia will be established by the Education Act for all five-year-old children from September 2021. The implementation of this law will change pre-school education in our country from optional to compulsory, and children will be able to complete it either institutionally in school facilities or through individual education at the parent's request. The primary purpose of this change is for all children to achieve pre-school education before entering primary school, thus eliminating differences between children at school entry. The benefits of introducing compulsory pre-school education are obvious to the professional public. However, research has paid minimal attention to how this fundamental change in children's education is perceived by parents, who hold a prime position in the upbringing and education of their children. The aim of this study is to interpret the findings of quantitatively oriented research focused on parents' attitudes to the planned introduction of compulsory pre-school education in Slovakia. The data were obtained through questionnaires intended primarily for parents of pre-school children. In the distributed questionnaire, the degree of agreement or disagreement with individual items could be expressed on a 5-point Likert scale. The results present how compulsory pre-school education is perceived by the parental public in Slovakia and what prospects and limitations parents anticipate after its introduction.

Keywords: compulsory pre-school education, education act, childs' learning and development, kindergarten, parents' perspectives

Procedia PDF Downloads 165
2571 Rule Based Architecture for Collaborative Multidisciplinary Aircraft Design Optimisation

Authors: Nickolay Jelev, Andy Keane, Carren Holden, András Sóbester

Abstract:

In aircraft design, the jump from the conceptual to preliminary design stage introduces a level of complexity which cannot be realistically handled by a single optimiser, be that a human (chief engineer) or an algorithm. The design process is often partitioned along disciplinary lines, with each discipline given a level of autonomy. This introduces a number of challenges including, but not limited to: coupling of design variables; coordinating disciplinary teams; handling of large amounts of analysis data; reaching an acceptable design within time constraints. A number of classical Multidisciplinary Design Optimisation (MDO) architectures exist in academia specifically designed to address these challenges. Their limited use in the industrial aircraft design process has inspired the authors of this paper to develop an alternative strategy based on well established ideas from Decision Support Systems. The proposed rule based architecture sacrifices possibly elusive guarantees of convergence for an attractive return in simplicity. The method is demonstrated on analytical and aircraft design test cases and its performance is compared to a number of classical distributed MDO architectures.
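
The abstract does not detail the rule set, but the general shape of a rule based coordinator (discipline outputs checked against simple condition-action rules that adjust shared design variables) can be sketched as follows. The variables, rules, thresholds, and placeholder re-analysis step are purely illustrative assumptions, not the authors' architecture.

```python
# Shared design state updated by the disciplinary analyses, then by the rules.
state = {"wing_area": 120.0, "fuel_mass": 18000.0,
         "wing_stress_ratio": 1.08, "range_shortfall_km": 250.0}

# Condition -> action rules standing in for the decision-support logic.
rules = [
    (lambda s: s["wing_stress_ratio"] > 1.0,
     lambda s: s.update(wing_area=s["wing_area"] * 1.02)),   # relieve overstress
    (lambda s: s["range_shortfall_km"] > 0,
     lambda s: s.update(fuel_mass=s["fuel_mass"] + 500.0)),  # close the range gap
]

def coordinate(state, rules, max_iterations=10):
    for _ in range(max_iterations):
        fired = False
        for condition, action in rules:
            if condition(state):
                action(state)
                fired = True
        # In the real workflow, the disciplinary analyses would re-run here and
        # refresh wing_stress_ratio / range_shortfall_km before re-checking.
        state["wing_stress_ratio"] *= 0.97      # placeholder for re-analysis
        state["range_shortfall_km"] -= 100.0
        if not fired:
            break
    return state

print(coordinate(state, rules))
```

The appeal, as the abstract notes, is simplicity: the rules are transparent to the disciplinary teams, at the cost of giving up formal convergence guarantees.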

Keywords: Multidisciplinary Design Optimisation, Rule Based Architecture, Aircraft Design, Decision Support System

Procedia PDF Downloads 358
2570 Benefits of Construction Management Implications and Processes by Projects Managers on Project Completion

Authors: Mamoon Mousa Atout

Abstract:

Project managers in the construction industry usually face a difficult organizational environment, especially if the project is unique: the organization may lack the processes needed to practice construction management correctly, and the executive's technical managers may lack the experience to carry out their roles and responsibilities correctly. Project managers need to adopt best practices that allow them to work effectively and ensure that the project can be delivered without delay, while the executive's technical managers should follow a defined process to avoid any factor that might cause delay during the project life cycle. The purpose of this paper is to examine the awareness level of project managers regarding construction management processes, tools, techniques, and their implications for completing projects on time. The outcomes and results of the study are based on designed questionnaires and interviews conducted with many project managers. The method used in this paper is a quantitative study: a survey with a sample of 100 respondents, comprising nine questions to examine their level of awareness, was prepared and distributed in a construction company in Dubai. This research also identifies the necessary benefits of construction management processes that have to be adopted by project managers to mitigate the potential problems that might delay the project life cycle.

Keywords: construction management, project objectives, resource planning and scheduling, project completion

Procedia PDF Downloads 406
2569 Use of Artificial Intelligence Based Models to Estimate the Use of a Spectral Band in Cognitive Radio

Authors: Danilo López, Edwin Rivas, Fernando Pedraza

Abstract:

Currently, one of the major challenges in wireless networks is the optimal use of the radio spectrum, which is managed inefficiently. One solution to this problem is Cognitive Radio (CR), which enables secondary users to use the available licensed spectrum, well above the usage levels currently observed, by exploiting the channel opportunistically in the absence of primary users (PU). This article presents the results obtained when estimating, or predicting, the future use of a spectral transmission band (from the perspective of the PU) for a chaotic channel-arrival behavior. The method used to predict the time series that represents PU activity is ANFIS (Adaptive Neuro Fuzzy Inference System). The results obtained were compared to those delivered by the RNA (Artificial Neural Network) algorithm and show better performance in characterization (modeling and prediction) with the ANFIS methodology.

Keywords: ANFIS, cognitive radio, prediction primary user, RNA

Procedia PDF Downloads 425
2568 Structural, Electrochemical and Electrocatalysis Studies of a New 2D Metal-Organic Coordination Polymer of Ni (II) Constructed by Naphthalene-1,4-Dicarboxylic Acid; Oxidation and Determination of Fructose

Authors: Zohreh Derikvand

Abstract:

One new 2D metal-organic coordination polymer of Ni(II), namely [Ni2(ndc)2(DMSO)4(H2O)]n, where ndc = naphthalene-1,4-dicarboxylic acid and DMSO = dimethyl sulfoxide, has been synthesized and characterized by elemental analysis, spectral (IR, UV-Vis) and thermal (TG/DTG) analysis, and single crystal X-ray diffraction. Compound 1 possesses a 2D layer structure constructed from dinuclear nickel(II) building blocks in which two crystallographically independent Ni2+ ions are bridged by ndc2– ligands and a water molecule. The ndc2– ligands adopt μ3 bridging modes, linking the metal centers into a two-dimensional coordination framework. The two independent NiII cations are surrounded by dimethyl sulfoxide and naphthalene-1,4-dicarboxylate molecules in a distorted octahedral geometry. In the crystal structure of 1 there are non-classical hydrogen bonding arrangements and C-H–π stacking interactions. The electrochemical behavior of [Ni2(ndc)2(DMSO)4(H2O)]n (Ni-NDA) on the surface of a carbon nanotube (CNT) modified glassy carbon electrode (GCE) is described. The surface structure and composition of the sensor were characterized by scanning electron microscopy (SEM). The oxidation of fructose on the surface of the modified electrode was investigated with cyclic voltammetry and electrochemical impedance spectroscopy (EIS), and the results showed that the Ni-NDA/CNTs film displays excellent electrochemical catalytic activity towards fructose oxidation.

Keywords: naphthalene-1,4-dicarboxylic acid, crystal structure, coordination polymer, electrocatalysis, impedance spectroscopy

Procedia PDF Downloads 333
2565 White Light Emission through Downconversion of Terbium and Europium Doped CeF3 Nanophosphors

Authors: Mohit Kalra, Varun S., Mayuri Gandhi

Abstract:

CeF3 nanophosphors have been extensively investigated in recent years for lighting and numerous bio-applications. Down-conversion emissions in CeF3:Eu3+/Tb3+ phosphors were studied, with the aim of obtaining a white-light-emitting composition, using a simple co-precipitation method. The material was characterized by X-ray Diffraction (XRD), High Resolution Transmission Electron Microscopy (HR-TEM), Fourier Transform Infrared Spectroscopy (FT-IR), and Photoluminescence (PL). Uniformly distributed nanoparticles were obtained with an average particle size of 8-10 nm. Different doping concentrations were prepared, and a fluorescence study was carried out to optimize the dopant concentration for maximum luminescence intensity. Steady-state and time-resolved luminescence studies confirmed efficient energy transfer from the host to the activator ions. Different concentrations of Tb3+ and Eu3+ were doped to achieve a white-light-emitting phosphor for UV-based Light Emitting Diodes (LEDs). The nanoparticles showed the characteristic emission of the respective dopants (Eu3+, Tb3+) when excited at the 4f→5d transition of Ce3+. The chromaticity coordinates for these samples were calculated, and the CeF3 doped with Eu3+ and Tb3+ gave an emission very close to white light. These materials may find applications in optoelectronics and various bio-applications.
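
The chromaticity coordinates mentioned above follow from the standard CIE 1931 calculation: the emission spectrum is weighted by the colour-matching functions to give tristimulus values, which are then normalised. A minimal sketch is below; the spectrum and colour-matching-function arrays are assumed to be supplied by the user (e.g., tabulated CIE 1931 data on the same wavelength grid), not generated here.

```python
import numpy as np

def chromaticity_xy(emission, xbar, ybar, zbar):
    """CIE 1931 (x, y) from an emission spectrum sampled on the same uniformly
    spaced wavelength grid as the colour-matching functions xbar, ybar, zbar.
    The constant wavelength step cancels in the ratios, so plain sums suffice."""
    X = float(np.sum(emission * xbar))
    Y = float(np.sum(emission * ybar))
    Z = float(np.sum(emission * zbar))
    total = X + Y + Z
    return X / total, Y / total

# emission, xbar, ybar, zbar: equal-length arrays covering roughly 380-780 nm.
```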

Keywords: white light down-conversion, nanophosphors, LEDs, rare earth, cerium fluoride, lanthanides

Procedia PDF Downloads 408
2566 Cellulose Acetate/Polyacrylic Acid Filled with Nano-Hydroxapatite Composites: Spectroscopic Studies and Search for Biomedical Applications

Authors: E. M. AbdelRazek, G. S. ElBahy, M. A. Allam, A. M. Abdelghany, A. M. Hezma

Abstract:

Polymeric biocomposites of hydroxyapatite/polyacrylic acid were prepared, and their thermal and mechanical properties were improved by the addition of cellulose acetate. FTIR spectroscopy and X-ray diffraction analysis were employed to examine the physical and chemical characteristics of the biocomposites. Two organic/inorganic composite weight ratios (60/40 and 70/30), at which the material's crystallinity reaches a value appropriate for the intended applications, were studied; scanning electron microscopy revealed that the HAp nano-particles are uniformly distributed in the polymeric matrix. Kinetic parameters were determined from the weight loss data using non-isothermal thermogravimetric analysis (TGA), and the main degradation steps were described and discussed. The mechanical properties of the composites were evaluated by measuring tensile strength and elastic modulus. The data indicate that the addition of cellulose acetate can make the homogeneous composite scaffolds significantly more resistant to higher stress. The elastic modulus of the composites was also improved by the addition of cellulose acetate, making them more appropriate for bio-applications.

Keywords: biocomposite, chemical synthesis, infrared spectroscopy, mechanical properties

Procedia PDF Downloads 461
2565 Barriers Facing the Implementation of Lean Manufacturing in Libyan Manufacturing Companies

Authors: Mohamed Abduelmula, Martin Birkett, Chris Connor

Abstract:

Lean manufacturing has developed from being a set of tools and methods into a management philosophy which can be used to remove or reduce waste in manufacturing processes and so enhance the operational productivity of an enterprise. Several enterprises around the world have applied the lean manufacturing system and achieved great improvements. This paper investigates the barriers and obstacles that Libyan manufacturing companies face in implementing lean manufacturing. A mixed-method approach is suggested, starting with a questionnaire to obtain quantitative data, which is then used to develop semi-structured interviews to collect qualitative data. The findings of the questionnaire, and how these can be used to further develop the semi-structured interviews, are then discussed. The survey was distributed to 65 manufacturing companies in Libya, and a response rate of 64.6% was obtained. The results showed that there are five main barriers to implementing lean in Libya, namely organizational culture; skills, expertise, and training programmes; financial capability; top management; and communication. These barriers were also identified in the literature as significant obstacles to implementing lean in other countries' industries. Understanding the difficulties that face the implementation of lean manufacturing, as a new and modern system, and using this understanding to develop a suitable framework will help to improve the manufacturing sector in Libya.

Keywords: lean manufacturing, barriers, questionnaire, Libyan manufacturing companies

Procedia PDF Downloads 253
2564 Sub-Pixel Mapping Based on New Mixed Interpolation

Authors: Zeyu Zhou, Xiaojun Bi

Abstract:

Due to limited environmental parameters and the limited resolution of the sensor, mixed pixels are ubiquitous in remote sensing images and restrict their spatial resolution. Sub-pixel mapping technology can effectively improve the spatial resolution. However, the bilinear interpolation algorithm inevitably produces an edge blur effect, which leads to inaccurate sub-pixel mapping results. To avoid the edge blur effect that affects the sub-pixel mapping results during interpolation, this paper presents a new mixed, edge-directed interpolation algorithm which applies the covariance-adaptive interpolation algorithm on the edges of the low-resolution image and the bilinear interpolation algorithm in its smooth areas. Using the edge-directed interpolation algorithm, a super-resolved version of the low-resolution image is obtained, giving the percentage of each sub-pixel belonging to a certain class of the high-resolution image. We then treat these probability values as soft attribute estimates and carry out 'hard classification' at the sub-pixel scale, which yields the sub-pixel mapping result. In the experiments, we compare the algorithm given in this paper with the bilinear algorithm on the sub-pixel mapping results. The sub-pixel mapping method based on the edge-directed interpolation algorithm is found to have a better edge effect and higher mapping accuracy, which meets the original intention of this work. At the same time, the method does not require iterative computation or training samples, making it easier to implement.
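
The mixing rule at the heart of the method (edge-directed interpolation on edge pixels, plain bilinear elsewhere) can be sketched at a high level. The edge detector, threshold, and the bicubic stand-in used for the covariance-adaptive, edge-directed branch below are illustrative assumptions rather than the paper's exact algorithm.

```python
import numpy as np
from scipy.ndimage import sobel, zoom

def mixed_interpolation(fraction_map, scale=4, edge_threshold=0.15):
    """Upsample a class-fraction image, switching interpolators per region.

    Smooth regions: bilinear (order=1). Edge regions: placeholder for the
    covariance-adaptive, edge-directed interpolator described in the paper
    (approximated here by bicubic, order=3, purely for illustration).
    """
    grad = np.hypot(sobel(fraction_map, axis=0), sobel(fraction_map, axis=1))
    edge_mask = zoom((grad > edge_threshold).astype(float), scale, order=0) > 0.5

    smooth_up = zoom(fraction_map, scale, order=1)   # bilinear
    edge_up = zoom(fraction_map, scale, order=3)     # stand-in for edge-directed
    return np.where(edge_mask, edge_up, smooth_up)

# The upsampled fractions then act as soft attribute estimates: each coarse pixel's
# sub-pixels are assigned to the class with the highest fraction ("hard" step).
```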

Keywords: remote sensing images, sub-pixel mapping, bilinear interpolation, edge-directed interpolation

Procedia PDF Downloads 234
2563 Autonomous Ground Vehicle Navigation Based on a Single Camera and Image Processing Methods

Authors: Auday Al-Mayyahi, Phil Birch, William Wang

Abstract:

A vision-based navigation system for an autonomous ground vehicle (AGV) equipped with a single camera in an indoor environment is presented. The proposed navigation algorithm detects obstacles represented by coloured mini-cones placed in different positions inside a corridor. For recognition of the relative position and orientation of the AGV with respect to the coloured mini-cones, the features of the corridor structure are extracted using the single-camera vision system. The relative position, offset distance, and steering angle of the AGV from the coloured mini-cones are derived from the simple corridor geometry to obtain a mapped environment in real-world coordinates. The corridor is first captured as an image using the single camera, and image processing functions are then performed to identify the cones within the environment. A bounding box surrounding each cone allows the locations of the cones to be identified in the pixel coordinate system. Thus, by matching the mapped and pixel coordinates using a projection transformation matrix, the real offset distances between the camera and the obstacles are obtained. Real-time experiments in an indoor environment are carried out with a wheeled AGV in order to demonstrate the validity and effectiveness of the proposed algorithm.
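
The mapping from pixel coordinates to real-world corridor coordinates described above is a planar projective transformation. A minimal OpenCV sketch, assuming four ground-plane reference points whose corridor coordinates are known (all numbers below are placeholders), is:

```python
import numpy as np
import cv2

# Four reference points on the corridor floor: pixel coordinates -> metres.
pixel_pts = np.float32([[220, 470], [420, 470], [260, 300], [380, 300]])
world_pts = np.float32([[-0.5, 1.0], [0.5, 1.0], [-0.5, 3.0], [0.5, 3.0]])

H = cv2.getPerspectiveTransform(pixel_pts, world_pts)

def pixel_to_world(u, v):
    """Map the bottom-centre of a cone's bounding box to corridor coordinates."""
    pt = np.float32([[[u, v]]])                  # shape (1, 1, 2) as cv2 expects
    x, y = cv2.perspectiveTransform(pt, H)[0, 0]
    return float(x), float(y)                    # lateral offset, forward distance

offset_m, distance_m = pixel_to_world(335, 410)
steering_angle_deg = float(np.degrees(np.arctan2(offset_m, distance_m)))
print(offset_m, distance_m, steering_angle_deg)
```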

Keywords: autonomous ground vehicle, navigation, obstacle avoidance, vision system, single camera, image processing, ultrasonic sensor

Procedia PDF Downloads 304
2562 Evaluation of Interaction Between Fans and Celebrities in New Media

Authors: Mohadese Motahari

Abstract:

In general, we consider the phenomenon of "fandism", or extreme fandom, to be an aspect of fandom for a person, a group, or a collection that leads to extreme support for them. A fan, or "fanatic" (which literally means a "fanatical person"), is thus a person who is extremely interested in a certain topic or topics and has a special passion and fascination for that issue, sometimes beyond the scope of logic and the normal behavior of society. With the expansion of the media and the advancement of technology, the phenomenon of fandom has undergone many changes: it has not only become more intense, but a large economy has also formed alongside it, and it is becoming more important every day. This economy, which emerged with the formation of the first media, has now taken a different form with the development of media and social networks, as well as the change in the interaction between celebrities and audiences. Earning huge amounts of money through particular methods in every social network and every medium is achieved through fans and fandoms. In this article, we study the relationship between fans and famous people with reference to the economic debates surrounding it.

Keywords: fandism, famous people, social media, new media

Procedia PDF Downloads 93
2561 The Communication Library DIALOG for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

Modern experiments in high energy physics impose great demands on the reliability, efficiency, and data rate of Data Acquisition Systems (DAQ). This contribution focuses on the development and deployment of the new communication library DIALOG for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. The iFDAQ, utilizing a hardware event builder, is designed to read out data at the maximum rate of the experiment. The DIALOG library is a communication system for both distributed and mixed environments; it provides a network-transparent inter-process communication layer. Built on the high-performance and modern C++ framework Qt and its Qt Network API, the DIALOG library presents an alternative to the previously used DIM library. The DIALOG library was fully incorporated into all processes of the iFDAQ during the 2016 run. From the software point of view, it may be considered a significant improvement of the iFDAQ in comparison with the previous run. To extend the possibilities for debugging, online monitoring of communication among processes via the DIALOG GUI is a desirable feature. In the paper, we present the DIALOG library from several perspectives and discuss it in detail. Moreover, an efficiency measurement and a comparison with the DIM library with respect to the iFDAQ requirements are provided.

Keywords: data acquisition system, DIALOG library, DIM library, FPGA, Qt framework, TCP/IP

Procedia PDF Downloads 323
2560 Numerical Investigation of Static and Dynamic Responses of Fiber Reinforced Sand

Authors: Sandeep Kumar, Mahesh Kumar Jat, Rajib Sarkar

Abstract:

Soil reinforced with randomly distributed fibers is an attractive means of improving the performance of soil in a cost-effective manner. Static and dynamic characterization of fiber-reinforced soil has become important for evaluating adequate performance across all classes of geotechnical engineering problems. The present study investigates the behaviour of fiber-reinforced cohesionless soil through numerical simulation of a triaxial specimen. The numerical model has been validated against existing laboratory triaxial compression test results from the literature. A parametric study has been carried out to find the optimum fiber content for shear resistance. Cyclic triaxial testing has been simulated, and the stress-strain response of the fiber-reinforced sand has been examined for different fiber contents. Shear modulus and damping values of the fiber-reinforced sand are evaluated. It is observed from the results that, for 1.0 percent fiber content, the shear modulus increased 2.28 times and the damping ratio decreased 4.6 times. The influence of the amplitude of cyclic strain, confining pressure, and frequency of loading on the dynamic properties of fiber-reinforced sand has been investigated and presented.

Keywords: damping, fiber reinforced soil, numerical modelling, shear modulus

Procedia PDF Downloads 282
2559 Citation Analysis of New Zealand Court Decisions

Authors: Tobias Milz, L. Macpherson, Varvara Vetrova

Abstract:

The law is a fundamental pillar of human societies as it shapes, controls and governs how humans conduct business, behave and interact with each other. Recent advances in computer-assisted technologies such as NLP, data science and AI are creating opportunities to support the practice, research and study of this pervasive domain. It is therefore not surprising that there has been an increase in investments into supporting technologies for the legal industry (also known as “legal tech” or “law tech”) over the last decade. A sub-discipline of particular appeal is concerned with assisted legal research. Supporting law researchers and practitioners to retrieve information from the vast amount of ever-growing legal documentation is of natural interest to the legal research community. One tool that has been in use for this purpose since the early nineteenth century is legal citation indexing. Among other use cases, they provided an effective means to discover new precedent cases. Nowadays, computer-assisted network analysis tools can allow for new and more efficient ways to reveal the “hidden” information that is conveyed through citation behavior. Unfortunately, access to openly available legal data is still lacking in New Zealand and access to such networks is only commercially available via providers such as LexisNexis. Consequently, there is a need to create, analyze and provide a legal citation network with sufficient data to support legal research tasks. This paper describes the development and analysis of a legal citation network for New Zealand containing over 300,000 decisions from 125 different courts of all areas of law and jurisdiction. Using Python, the authors assembled web crawlers, scrapers and an OCR pipeline to collect and convert court decisions from openly available sources such as NZLII into uniform and machine-readable text. This facilitated the use of regular expressions to identify references to other court decisions from within the decision text. The data was then imported into a graph-based database (Neo4j) with the courts and their respective cases represented as nodes and the extracted citations as links. Furthermore, additional links between courts of connected cases were added to indicate an indirect citation between the courts. Neo4j, as a graph-based database, allows efficient querying and use of network algorithms such as PageRank to reveal the most influential/most cited courts and court decisions over time. This paper shows that the in-degree distribution of the New Zealand legal citation network resembles a power-law distribution, which indicates a possible scale-free behavior of the network. This is in line with findings of the respective citation networks of the U.S. Supreme Court, Austria and Germany. The authors of this paper provide the database as an openly available data source to support further legal research. The decision texts can be exported from the database to be used for NLP-related legal research, while the network can be used for in-depth analysis. For example, users of the database can specify the network algorithms and metrics to only include specific courts to filter the results to the area of law of interest.
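
The extraction-and-ranking pipeline described above can be sketched outside Neo4j with networkx. The citation regex below is a simplified stand-in for the patterns actually used on New Zealand neutral citations, and the example decision texts are invented.

```python
import re
import networkx as nx

# Simplified pattern for neutral citations such as "[2015] NZHC 123".
CITATION = re.compile(r"\[\d{4}\]\s+NZ[A-Z]+\s+\d+")

decisions = {
    "[2020] NZCA 45": "... as held in [2015] NZHC 123 and discussed in [2012] NZSC 9 ...",
    "[2015] NZHC 123": "... following [2012] NZSC 9 ...",
    "[2012] NZSC 9": "... no earlier citations in this toy example ...",
}

G = nx.DiGraph()
for case_id, text in decisions.items():
    G.add_node(case_id)
    for cited in CITATION.findall(text):
        if cited != case_id:
            G.add_edge(case_id, cited)   # edge: citing decision -> cited decision

# Most influential decisions by PageRank (mirrors the Neo4j analysis).
for case_id, score in sorted(nx.pagerank(G).items(), key=lambda kv: -kv[1]):
    print(case_id, round(score, 3))
```

In the actual system the same edges live in Neo4j, where Cypher queries and the graph data science algorithms play the role of networkx here.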

Keywords: case citation network, citation analysis, network analysis, Neo4j

Procedia PDF Downloads 112
2558 A Comprehensive Safety Analysis for a Pressurized Water Reactor Fueled with Mixed-Oxide Fuel as an Accident Tolerant Fuel

Authors: Mohamed Y. M. Mohsen

Abstract:

The viability of utilising mixed-oxide fuel (MOX) ((U₀.₉, rgPu₀.₁) O₂) as an accident-tolerant fuel (ATF) has been thoroughly investigated. MOX fuel provides the best example of a nuclear waste recycling process. The MCNPX 2.7 code was used to determine the main neutronic features, especially the radial power distribution, to identify the hot channel on which the thermal-hydraulic (TH) study was performed. Based on the computational fluid dynamics technique, the simulation of the rod-centered thermal-hydraulic subchannel model was implemented using COMSOL Multiphysics. TH analysis was utilised to determine the axially and radially distributed temperatures of the fuel and cladding materials, as well as the departure from the nucleate boiling ratio (DNBR) along the coolant channel. COMSOL Multiphysics can simulate reality by coupling multiphysics, such as coupling between heat transfer and solid mechanics. The main solid structure parameters, such as the von Mises stress, volumetric strain, and displacement, were simulated using this coupling. When the neutronic, TH, and solid structure performances of UO₂ and ((U₀.₉, rgPu₀.₁) O₂) were compared, the results showed considerable improvement and an increase in safety margins with the use of ((U₀.₉, rgPu₀.₁) O₂).

Keywords: mixed-oxide, MCNPX, neutronic analysis, COMSOL-multiphysics, thermal-hydraulic, solid structure

Procedia PDF Downloads 110
2557 Cracks Detection and Measurement Using VLP-16 LiDAR and Intel Depth Camera D435 in Real-Time

Authors: Xinwen Zhu, Xingguang Li, Sun Yi

Abstract:

Cracks are among the most common forms of damage in buildings, bridges, roads, and other structures, and may pose safety hazards; they occur in structures made of various materials. Traditional methods of manual detection and measurement are subjective, time-consuming, and labor-intensive, and are gradually becoming unable to meet the needs of modern development. In addition, crack detection and measurement need to be safe, considering space limitations and danger. Intelligent crack detection has therefore become a necessary area of research. In this paper, an efficient method for crack detection and quantification using 3D sensors, namely a LiDAR and a depth camera, is proposed. The method works even in a dark environment, which is common in real-world applications. The LiDAR rapidly spins to scan the surrounding environment and discovers cracks with its lasers thousands of times per second, providing a rich 3D point cloud in real time. The LiDAR provides quite accurate depth information: the distance of each point can be determined to within around ±3 cm, and top-range models can see beyond 100 m. However, this accuracy is still too coarse for some high-precision structures and materials. To measure the crack depth more accurately, a depth camera is needed, and the cracks are scanned by the depth camera at the same time. Finally, all data from the LiDAR and depth camera are analyzed, and the size of the cracks can be quantified successfully. The comparison shows that the minimum and mean absolute percentage errors between the measured and calculated widths are about 2.22% and 6.27%, respectively. The experiments and results are presented in this paper.
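
The error figures quoted above correspond to the standard absolute percentage error between measured and calculated crack widths. A short sketch of that comparison, with invented width values, is:

```python
import numpy as np

measured_mm = np.array([2.1, 3.4, 1.8, 5.0, 2.7])    # e.g., manual reference widths
calculated_mm = np.array([2.2, 3.1, 1.9, 5.3, 2.6])  # widths from LiDAR + depth camera

ape = np.abs(calculated_mm - measured_mm) / measured_mm * 100.0
print(f"min APE = {ape.min():.2f}%, mean APE (MAPE) = {ape.mean():.2f}%")
```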

Keywords: LiDAR, depth camera, real-time, detection and measurement

Procedia PDF Downloads 235
2556 Cooperative Coevolution for Neuro-Evolution of Feed Forward Networks for Time Series Prediction Using Hidden Neuron Connections

Authors: Ravneil Nand

Abstract:

Cooperative coevolution uses problem decomposition methods to solve a larger problem: the problem is broken down into a number of smaller sub-problems, the nature of which depends on the decomposition method. Different problem decomposition methods have their own strengths and limitations depending on the neural network used and the application problem. In this paper, we introduce a new problem decomposition method known as Hidden-Neuron Level Decomposition (HNL). The HNL method is compared with established problem decomposition methods on time series prediction. The results show that the proposed approach improves the results on some benchmark data sets when compared to the standalone method and is competitive with methods from the literature.

Keywords: cooperative coevolution, feed forward network, problem decomposition, neuron, synapse

Procedia PDF Downloads 343
2555 Sustainable Use of Fresh Groundwater Lens of Pleistocene Aquifer in Nam Dinh, Vietnam

Authors: Tran Thanh Le, Pham Trong Duc

Abstract:

The fresh groundwater lens of the Pleistocene aquifer in Nam Dinh was formed about 12,900 years ago. Currently, the Pleistocene aquifer is exploited continuously at an average of 154,163 m³/day, mainly in the districts of Nghia Hung and Hai Hau and parts of Truc Ninh, Y Yen, Nam Truc, and Giao Thuy. The groundwater level is still on a declining trend, and saltwater intrusion into this freshwater lens can occur if the growth rate in exploitation is maintained. This study focuses on sustainable groundwater use by means of four groups of criteria: groundwater quality and pollution; aquifer productivity and capacity; environmental impacts due to exploitation (groundwater level decline and land subsidence due to water exploitation); and social and economic impacts. Using a combination of methods, including field surveys, geophysics, hydrogeochemistry, isotopes, and numerical models, the safe groundwater exploitation threshold for the whole study area was determined to be 544,314 m³/day, and the actual exploitation amount is currently about 30% of this threshold. However, it should be noted that the exploitation level relative to the safe threshold differs between localities. From this result, a groundwater exploitation threshold map of the study area was established to support the management, licensing, and planning of groundwater exploitation.

Keywords: criteria, groundwater, fresh groundwater lens, pleistocene, Nam Dinh

Procedia PDF Downloads 162
2554 Analysis of Genomics Big Data in Cloud Computing Using Fuzzy Logic

Authors: Mohammad Vahed, Ana Sadeghitohidi, Majid Vahed, Hiroki Takahashi

Abstract:

In the genomics field, huge amounts of data have been produced by next-generation sequencers (NGS). Data volumes are growing very rapidly; it has been postulated that more than one billion bases will be produced per year in 2020. The growth rate of the produced data is much faster than Moore's law in computer technology. This makes it more difficult to deal with genomics data, for example storing the data, searching for information, and finding hidden information. It is therefore necessary to develop an analysis platform for genomics big data. Newly developed cloud computing enables us to deal with big data more efficiently. Hadoop is one of the frameworks for distributed computing and lies at the core of Big Data as a Service (BDaaS). Although many services, e.g., Amazon, have adopted this technology, there are few applications in the biology field. Here, we propose a new algorithm to deal more efficiently with genomics big data, e.g., sequencing data. Our algorithm consists of two parts: first, BDaaS is applied to handle the data more efficiently; second, a hybrid method of MapReduce and fuzzy logic is applied for data processing. This step can be parallelized in implementation. Our algorithm has great potential in the computational analysis of genomics big data, e.g., de novo genome assembly and sequence similarity search. We discuss our algorithm and its feasibility.
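
The hybrid MapReduce/fuzzy-logic step is not specified in detail in the abstract; a minimal, single-machine sketch of the idea (map k-mers out of reads, reduce to counts, then attach a fuzzy "abundance" membership to each k-mer) is shown below. The k-mer length, membership breakpoints, and in-memory execution are assumptions for illustration only; on Hadoop the map and reduce functions would run as distributed jobs.

```python
from collections import Counter
from itertools import chain

reads = ["ACGTACGTGG", "CGTACGTGGA", "TTACGTACGT"]   # toy sequencing reads
K = 4

def map_kmers(read):
    """Map step: emit every k-mer contained in one read."""
    return [read[i:i + K] for i in range(len(read) - K + 1)]

def fuzzy_abundance(count, low=2, high=4):
    """Fuzzy membership in the 'abundant' set: 0 below low, 1 above high, linear between."""
    if count <= low:
        return 0.0
    if count >= high:
        return 1.0
    return (count - low) / (high - low)

# Reduce step: aggregate the mapped k-mers into global counts.
counts = Counter(chain.from_iterable(map(map_kmers, reads)))

for kmer, count in counts.most_common(5):
    print(kmer, count, round(fuzzy_abundance(count), 2))
```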

Keywords: big data, fuzzy logic, MapReduce, Hadoop, cloud computing

Procedia PDF Downloads 303
2553 Urban Transformation as a Process for Inner-City Slums in Turkey the Experience of Gaziantep City, Turkey

Authors: Samer Katerji, Mustafa Ozakça, Esra Demircioğlu

Abstract:

Inner-city slums have become a global problem. They are widely distributed in separate zones throughout the urban fabric and threaten cities in physical, economic, and social terms. They often consist of illegal settlements with unsafe and unhealthy conditions, and over time they have grown rapidly, together with their problems. According to the United Nations, in some cities up to 80 percent of the population lives in slums, and fifty-five million new slum dwellers have been added to the global population since 2000. Both developed and developing countries have started to devise mechanisms for finding solutions suitable for solving inner-city slum problems. In turn, the planning agenda of Turkey has focused on urban transformation as a solution to inner-city slum problems since the 2000s, and the laws introduced after 2004 changed all of the provisions on urban transformation in the country. This paper explains the urban transformation approach as a qualified process for dealing with the inner-city slum problems of Turkey. It then highlights one of the earliest ongoing transformation projects in Gaziantep city, which has been adopted by the local municipalities. The study includes an assessment of the pros and cons of pursuing the project and identifies its potential consequences; this is intended to support the efforts of Gaziantep Municipality in developing and transforming slum areas.

Keywords: transformation, urban, slums, Gaziantep

Procedia PDF Downloads 503
2552 Information in Public Domain: How Far It Measures Government's Accountability

Authors: Sandip Mitra

Abstract:

Studies on governance and accountability have often stressed the need to release data in the public domain to increase transparency, as such data act as evidence of performance. However, inefficient handling, lack of capacity, and the dynamics of transfers (especially fund transfers) are important issues which need appropriate attention. E-governance alone cannot serve as a measure of transparency unless comprehensive planning is instituted. Studies on governance and public exposure have often triggered public opinion in favour of or against a government. The root of the problem (especially in local governments) lies in the management of governance. The participation of the people in the functioning of local government, the networks within and outside the locality, and the synergy with various layers of government are crucial to understanding the activities of any government. Unfortunately, data on such issues are not released in the public domain, and if they are released at all, the extraction of information is often hindered by complicated designs. A study has been undertaken with a few local governments in India, and the data have been analysed to substantiate these views.

Keywords: accountability, e-governance, transparency, local government

Procedia PDF Downloads 439