Search results for: Solve Elmstahl
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1706


566 Object Recognition System Operating from Different Types of Vehicles Using Raspberry Pi and OpenCV

Authors: Maria Pavlova

Abstract:

Nowadays, cameras can be mounted on many different vehicles, such as quadcopters, trains, and airplanes, and a camera can serve as the input sensor of many different systems. Object recognition, as an inseparable part of monitoring and control, can therefore be a key component of most intelligent systems. The aim of this paper is to focus on the object recognition process during vehicle movement. While the vehicle moves, the camera takes pictures of the environment without storing them in a database. When the camera detects a special object (for example, a human or an animal), the system saves the picture and sends it to a workstation in real time. This functionality is very useful in emergency or security situations where a specific object must be found. In another application, the camera can be mounted at a crossroad with little pedestrian traffic: when one or more persons approach the road, the traffic lights turn green so that they can cross. This paper presents a system that solves the aforementioned problems. The architecture of the object recognition system is presented; it includes the camera, the Raspberry Pi platform, a GPS receiver, a neural network, software, and a database. The camera takes the pictures, and object recognition is performed in real time using the OpenCV library on the Raspberry Pi. An additional feature of the system is the ability to record the GPS coordinates of a captured object's position. The results of this processing are sent to a remote station, so the location of the specific object is known. Using a neural network, the module can learn to solve problems from incoming data and become part of a larger intelligent system. The present paper focuses on the design and integration of image recognition as a part of smart systems.
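As a sketch of the capture-detect-report loop described in the abstract, the control flow might look like the following. The detector and GPS reader are hypothetical stand-ins (the abstract does not name a specific model); a real build would plug in an OpenCV detector and a serial GPS module on the Raspberry Pi.

```python
# Minimal sketch of the capture -> detect -> report loop from the abstract.
# `detect_special_object` and `read_gps` are hypothetical placeholders for
# an OpenCV detector (e.g. a HOG or DNN person detector) and a GPS receiver.
from dataclasses import dataclass

@dataclass
class Report:
    frame_id: int
    label: str
    lat: float
    lon: float

def detect_special_object(frame):
    # placeholder for an OpenCV detector; returns a label or None
    return "human" if frame.get("has_person") else None

def read_gps():
    # placeholder for a GPS receiver read
    return (42.5, 23.3)

def monitor(frames, send):
    """Scan frames; save and send only those containing a special object."""
    reports = []
    for i, frame in enumerate(frames):
        label = detect_special_object(frame)
        if label is not None:          # only detections are stored and sent
            lat, lon = read_gps()
            report = Report(i, label, lat, lon)
            reports.append(report)
            send(report)               # real-time push to the work station
    return reports
```

Frames without detections are simply discarded, matching the abstract's point that pictures are not stored in the database unless a special object appears.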

Keywords: camera, object recognition, OpenCV, Raspberry Pi

Procedia PDF Downloads 206
565 Identification of Spam Keywords Using Hierarchical Category in C2C E-Commerce

Authors: Shao Bo Cheng, Yong-Jin Han, Se Young Park, Seong-Bae Park

Abstract:

Consumer-to-Consumer (C2C) e-commerce has been growing at a very high speed in recent years. Since identical or nearly identical kinds of products compete with one another by relying on keyword search in C2C e-commerce, some sellers describe their products with spam keywords that are popular but not related to their products. Though such products get more chances to be retrieved and selected by consumers than those without spam keywords, the spam keywords mislead the consumers and waste their time. This problem has been reported in many commercial services like eBay and Taobao, but there has been little research on solving it. As a solution, this paper proposes a method to classify whether the keywords of a product are spam or not. The proposed method assumes that a keyword for a given product is more reliable if the keyword is commonly observed in the specifications of products that are the same as, or of the same kind as, the given product. This is because the hierarchical category of a product is, in general, determined precisely by the seller of the product, and so is the specification of the product. Since higher layers of the hierarchical category represent more general kinds of products, a reliability degree is determined differently for each layer. Hence, reliability degrees from the different layers of a hierarchical category become features for keywords, and they are used, together with features from the specifications alone, for classification of the keywords. Support Vector Machines are adopted as the basic classifier using these features, since they are powerful and widely used in many classification tasks. In the experiments, the proposed method is evaluated on a gold-standard dataset from Yi-han-wang, a Chinese C2C e-commerce site, and is compared with a baseline method that does not consider the hierarchical category. The experimental results show that the proposed method outperforms the baseline in F1-measure, which proves that spam keywords are effectively identified by a hierarchical category in C2C e-commerce.
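The layer-wise reliability feature described above can be sketched as follows. The category paths, catalog layout, and products are invented toy data; the paper does not publish its exact feature definition, so this is only an illustration of the idea that deeper category layers give more specific reliability scores, which would then feed an SVM.

```python
# Sketch of layer-wise keyword reliability: a keyword is more reliable if it
# occurs often in the specifications of products filed under the same
# hierarchical-category node. Catalog and specs are invented toy data.
def reliability(keyword, product_specs):
    """Fraction of products in a category group whose spec contains keyword."""
    if not product_specs:
        return 0.0
    hits = sum(1 for spec in product_specs if keyword in spec)
    return hits / len(product_specs)

def layer_features(keyword, category_path, catalog):
    """One reliability score per layer of the hierarchical category.

    catalog maps a category prefix (tuple) to the specs (sets of words) of
    all products filed under it; deeper prefixes give more specific groups.
    """
    feats = []
    for depth in range(1, len(category_path) + 1):
        group = catalog.get(tuple(category_path[:depth]), [])
        feats.append(reliability(keyword, group))
    return feats
```

The resulting feature vector (one score per layer, plus specification-only features) would be handed to the SVM classifier mentioned in the abstract.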

Keywords: spam keyword, e-commerce, keyword features, spam filtering

Procedia PDF Downloads 278
564 Retrospective Study of Positive Blood Cultures Carried out in the Microbiology Department of General Hospital of Ioannina in 2017

Authors: M. Gerasimou, S. Mantzoukis, P. Christodoulou, N. Varsamis, G. Kolliopoulou, N. Zotos

Abstract:

Purpose: Microbial infection of the blood is a serious condition in which bacteria invade the bloodstream and cause systemic disease. In such cases, blood cultures are performed. Blood cultures are a key diagnostic test for intensive care unit (ICU) patients. Material and method: The BacT/Alert system, which measures the carbon dioxide produced by metabolizing organisms, is used. A positive result in the BacT/Alert system is followed by culture on the following selective media: blood, MacConkey No. 2, chocolate, Mueller-Hinton, Chapman, and Sabouraud agar. The Gram staining method was used to differentiate bacterial species. The microorganisms were identified by biochemical techniques in the automated MicroScan (Siemens) system, followed by a sensitivity test on the same system using the minimum inhibitory concentration (MIC) technique. The sensitivity test was verified by a Kirby-Bauer-based test. Results: In 2017, the Laboratory of Microbiology received 3347 blood cultures. Of these, 170 came from the ICU, and 116 were found positive. Among these, S. epidermidis was identified in 42, A. baumannii in 27, K. pneumoniae in 12 (4 of them KPC, 'Klebsiella pneumoniae carbapenemase'), S. hominis in 8, E. faecium in 7, E. faecalis in 5, P. aeruginosa in 3, C. albicans in 3, S. capitis in 2, K. oxytoca in 2, P. mirabilis in 2, E. coli in 1, S. intermedius in 1, and S. lugdunensis in 1. Conclusions: The study of epidemiological data and microbial resistance phenotypes is essential for choosing a therapeutic regimen for the early treatment and limitation of multidrug-resistant strains, and it is also a crucial factor in solving diagnostic problems.

Keywords: blood culture, bloodstream, infection, intensive care unit

Procedia PDF Downloads 134
563 Diversifying from Petroleum Products to Arable Farming as a Source of Revenue Generation in Nigeria: A Case Study of Ondo West Local Government

Authors: A. S. Akinbani

Abstract:

Overdependence on petroleum is causing a setback to the Nigerian economy. A field survey was carried out to assess the profitability and production of selected arable crops in six selected towns and villages of Ondo, southwestern Nigeria. Data were collected from 240 arable crop farmers with the aid of both primary and secondary sources, using oral interviews and structured questionnaires; the data collected were analyzed using both descriptive and inferential statistics. Forty farmers were randomly selected from each location, giving a total of 240 respondents. Of the farmers interviewed, 84 had no formal education, 72 had primary education, 50 had attained secondary education, and 38 had gone beyond secondary education. The majority of the farmers hold less than 10 acres of land. The data collected from the field showed that 192 farmers practiced mixed cropping, including mixtures of yam, cowpea, cocoyam, vegetables, cassava, and maize, while only 48 farmers practiced monocropping. Among the sampled farmers, 93% agreed that arable production is profitable, while 7% disagreed. The findings show that managerial practices that conserve soil fertility and reduce labor costs, such as the planting of leguminous crops and herbicide application instead of weeding with a hand-held hoe, should be encouraged. All the respondents agreed that yam, cowpea, cocoyam, sweet potato, rice, maize, and vegetable production would solve the problem of hunger and raise the standard of living, compared with the petroleum products on which Nigeria relies as a means of livelihood.

Keywords: farmers, arable crop, cocoyam, respondents, maize

Procedia PDF Downloads 237
562 Device for Reversible Hydrogen Isotope Storage with Aluminum Oxide Ceramic Case

Authors: Igor P. Maximkin, Arkady A. Yukhimchuk, Victor V. Baluev, Igor L. Malkov, Rafael K. Musyaev, Damir T. Sitdikov, Alexey V. Buchirin, Vasily V. Tikhonov

Abstract:

Minimization of tritium diffusion leakage when developing devices that handle tritium-containing media is a key problem whose solution will allow an essential enhancement of radiation safety and a minimization of the diffusion losses of expensive tritium. One way to solve this problem is to use Al₂O₃ high-strength non-porous ceramic as the structural material of the bed body. This alumina ceramic offers high strength, but its main advantages are low hydrogen permeability (compared with the structural materials in current use) and high dielectric properties. The latter enables direct induction heating of a hydride-forming metal without essential heating of the pressure and containment vessel. The use of alumina ceramic and induction heating allows: an essential reduction of the tritium extraction time; a reduction of tritium diffusion leakage by several orders of magnitude; and a more complete extraction of tritium from metal hydrides, due to heating up to melting, in the event of final disposal of the device. The paper presents computational and experimental results for a tritium bed designed to absorb 6 liters of tritium. Titanium was used as the hydrogen isotope sorbent. Results of hydrogen release kinetics from the hydride-forming metal, strength tests, and cyclic service life tests are reported. Recommendations are also provided for the practical use of the given bed type.

Keywords: aluminum oxide ceramic, hydrogen pressure, hydrogen isotope storage, titanium hydride

Procedia PDF Downloads 387
561 How Different Perceived Affordances of Game Elements Shape Motivation and Performance in Gamified Learning: A Cognitive Evaluation Theory Perspective

Authors: Kibbeum Na

Abstract:

Previous gamification research has produced mixed results regarding the effectiveness of gamified learning. One possible explanation is that individuals perceive the game elements differently. Cognitive Evaluation Theory posits that external rewards can boost or undermine intrinsic motivation, depending on whether the rewards are perceived as informational or controlling. This research tested the hypothesis that game elements can be perceived as either informational feedback or external reward, and that the motivational impact differs accordingly. An experiment was conducted using an educational math puzzle to compare motivation and performance resulting from different perceived affordances of game elements. Participants were primed to perceive the game elements as either informational feedback or external reward, and the duration of an attempt to solve an unsolvable puzzle – a motivation indicator – and the puzzle score – a performance indicator – were measured first with the game elements incorporated and then without them. Badges and points were deployed as the main game elements. Results showed that, regardless of priming, a significant decrease in performance occurred when the game elements were removed, whereas a control group, who solved non-gamified math puzzles, maintained their performance. The performance drop upon gamification removal indicates that learners may perceive some game elements as controlling factors irrespective of the way they are presented. The results of the current study also imply that some game elements are better left unimplemented in order to preserve long-term performance. Further research delving into the extrinsic-reward-like nature of game elements and its impact on learning motivation is called for.

Keywords: Cognitive Evaluation Theory, game elements, gamification, motivation, motivational affordance, performance

Procedia PDF Downloads 85
560 Modeling and Simulation of Vibratory Behavior of Hybrid Smart Composite Plate

Authors: Salah Aguib, Noureddine Chikh, Abdelmalek Khabli, Abdelkader Nour, Toufik Djedid, Lallia Kobzili

Abstract:

This study presents the behavior of a hybrid smart sandwich plate with a magnetorheological elastomer (MRE) core. In order to improve the vibrational behavior of the plate, the pseudo-fibers formed by the effect of the magnetic field on the elastomer charged with ferromagnetic particles are oriented at 45° with respect to the direction of the magnetic field at 0°. The Ritz approach is used to solve the physical problem, and, in order to verify and compare the results obtained with it, an analysis using the finite element method was also carried out. The rheological properties of the MRE material at 0° and at 45° are determined experimentally. The studied elastomer is prepared from a mixture of silicone oil, RTV141A polymer, and iron particles at 30% of the total mixture; the mixture is stirred for about 15 minutes to obtain an elastomer paste with good homogenization. To develop the MRE, this paste is injected into an aluminum mold and subjected to a magnetic field. In our work, we chose a filling percentage of 30% to obtain the best characteristics of the MRE. The mechanical characteristics obtained with a dynamic mechanical viscoanalyzer (DMA) are used in the two numerical approaches. The natural frequencies and the modal damping of the sandwich plate are calculated and discussed for various magnetic field intensities, and the results obtained by the two methods are compared. These off-axis anisotropic MRE structures could open up new opportunities in various fields of aeronautics, aerospace, mechanical engineering, and civil engineering.
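The Ritz idea used for the plate can be illustrated on a much simpler toy problem (this is not the sandwich-plate model itself, just the method in miniature): estimating the fundamental eigenvalue of -u'' = λu on (0, 1) with u(0) = u(1) = 0 from a single trial function. The Rayleigh quotient of any admissible trial function is an upper bound on the exact value π² ≈ 9.8696.

```python
# Toy Rayleigh-Ritz estimate (illustration only, not the plate model):
# fundamental eigenvalue of -u'' = lambda*u on (0, 1), u(0) = u(1) = 0,
# using the single trial function u(x) = x(1 - x).
# Exactly: int (u')^2 = 1/3, int u^2 = 1/30, so the quotient is 10 >= pi^2.
def rayleigh_quotient(n=100000):
    h = 1.0 / n
    num = 0.0  # integral of (u')^2
    den = 0.0  # integral of u^2
    for i in range(n):
        x = (i + 0.5) * h          # midpoint rule
        du = 1.0 - 2.0 * x         # u'(x)
        u = x * (1.0 - x)
        num += du * du * h
        den += u * u * h
    return num / den
```

A richer trial basis (as in the plate analysis) drives the bound down toward the true eigenvalue, which is exactly what comparing the Ritz and FEM results checks.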

Keywords: hybrid smart sandwich plate, vibratory behavior, FEM, Ritz approach, MRE

Procedia PDF Downloads 52
559 An Improved Total Variation Regularization Method for Denoising Magnetocardiography

Authors: Yanping Liao, Congcong He, Ruigang Zhao

Abstract:

The application of magnetocardiography signals to detect cardiac electrical function is a new technology developed in recent years. The magnetocardiography signal is detected with Superconducting Quantum Interference Devices (SQUID) and has considerable advantages over electrocardiography (ECG). Extracting the magnetocardiography (MCG) signal, which is buried in noise, is difficult, and this is a critical issue to be resolved in cardiac monitoring systems and MCG applications. In order to remove the severe background noise, the Total Variation (TV) regularization method is proposed to denoise the MCG signal. The approach transforms the denoising problem into a minimization optimization problem, and the Majorization-Minimization (MM) algorithm is applied to iteratively solve it. However, the traditional TV regularization method tends to cause a staircase effect and lacks constraint adaptability. In this paper, an improved TV regularization method for denoising the MCG signal is proposed to improve the denoising precision. The improvement consists mainly of three parts. First, high-order TV is applied to reduce the staircase effect, and the corresponding second-derivative matrix is used in place of the first-order one. Then, the positions of the non-zero elements in the second-order derivative matrix are determined based on the peak positions detected by a detection window. Finally, adaptive constraint parameters are defined to eliminate noise and preserve the signal peak characteristics. Theoretical analysis and experimental results show that this algorithm can effectively improve the output signal-to-noise ratio and has superior performance.
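For reference, the classical first-order TV/MM iteration that the paper improves upon can be sketched as follows. This is a standard minimal implementation (dense linear solve for clarity, invented test signal), not the paper's high-order adaptive variant.

```python
import numpy as np

def tv_denoise_mm(y, lam, n_iter=50, eps=1e-8):
    """Majorization-minimization for min_x 0.5*||y - x||^2 + lam*||Dx||_1,
    with D the first-difference operator (classical first-order TV)."""
    y = np.asarray(y, dtype=float)
    N = y.size
    D = np.diff(np.eye(N), axis=0)   # (N-1) x N first-difference matrix
    DDT = D @ D.T
    Dy = D @ y
    x = y.copy()
    for _ in range(n_iter):
        # majorize |Dx| by a quadratic tangent at the current iterate
        Lam = np.diag((np.abs(D @ x) + eps) / lam)
        z = np.linalg.solve(Lam + DDT, Dy)
        x = y - D.T @ z
    return x
```

On a noisy piecewise-constant signal this iteration recovers the steps but exhibits exactly the staircase bias on smooth regions that motivates the high-order extension in the abstract.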

Keywords: constraint parameters, derivative matrix, magnetocardiography, regular term, total variation

Procedia PDF Downloads 139
558 [Keynote Talk]: Applying p-Balanced Energy Technique to Solve Liouville-Type Problems in Calculus

Authors: Lina Wu, Ye Li, Jia Liu

Abstract:

We are interested in solving Liouville-type problems to explore constancy properties for maps or differential forms on Riemannian manifolds. Geometric structures on manifolds, the existence of constancy properties for maps or differential forms, and the energy growth of maps or differential forms are intertwined. In this article, we concentrate on the discovery of solutions to Liouville-type problems where the manifolds are Euclidean spaces (i.e., flat Riemannian manifolds) and the maps become real-valued functions. Liouville-type results on vanishing properties for functions are obtained. The original contribution of our research is to extend the q-energy of a function from finite in the Lq space to infinite in a non-Lq space by applying the p-balanced technique, where q = p = 2. Calculation tools such as Hölder's inequality and tests for series have been used to evaluate limits and integrals for the function energy. The calculation ideas and computational techniques for solving Liouville-type problems shown in this article, which are applied in Euclidean spaces, can be generalized into a successful algorithm that works for both maps and differential forms on Riemannian manifolds. This algorithm has a far-reaching impact on the research of solving Liouville-type problems in general settings involving infinite energy. The p-balanced technique in this algorithm provides a clue to success on the road of q-energy extension from finite to infinite.
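Since the argument leans on Hölder's inequality to control energy integrals, it may help to state the generic estimate being invoked (this is the textbook inequality, not the authors' specific derivation):

```latex
% Hölder's inequality on a measurable set $B$ with $1/p + 1/p' = 1$:
\int_{B} |f\,g| \, d\mu
  \;\le\;
  \left( \int_{B} |f|^{p} \, d\mu \right)^{1/p}
  \left( \int_{B} |g|^{p'} \, d\mu \right)^{1/p'} .
% For p = p' = 2 (the case q = p = 2 in the abstract) this reduces to the
% Cauchy--Schwarz inequality, bounding the 2-energy of a product by the
% product of the individual 2-energies.
```

Estimates of this shape are what allow an energy integral over a growing family of balls to be compared with the q-energy, finite or not.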

Keywords: differential forms, Hölder's inequality, Liouville-type problems, p-balanced growth, p-harmonic maps, q-energy growth, tests for series

Procedia PDF Downloads 219
557 Improving Fault Tolerance and Load Balancing in Heterogeneous Grid Computing Using Fractal Transform

Authors: Saad M. Darwish, Adel A. El-Zoghabi, Moustafa F. Ashry

Abstract:

The popularity of the Internet and the availability of powerful computers and high-speed networks as low-cost commodity components are changing the way we use computers today. These technical opportunities have led to the possibility of using geographically distributed and multi-owner resources to solve large-scale problems in science, engineering, and commerce. Recent research on these topics has led to the emergence of a new paradigm known as Grid computing. To achieve the promising potential of tremendous distributed resources, effective and efficient load balancing algorithms are fundamentally important. Unfortunately, load balancing algorithms in traditional parallel and distributed systems, which usually run on homogeneous and dedicated resources, cannot work well in the new circumstances. In this paper, the concept of a fast fractal transform in heterogeneous grid computing, based on an R-tree and the domain-range entropy, is proposed to improve fault tolerance and load balancing algorithms by accounting for connectivity, communication delay, network bandwidth, resource availability, and resource unpredictability. A novel two-dimensional figure of merit is suggested to describe the network effects on load balancing and fault tolerance estimation. Fault tolerance is enhanced by adaptively decreasing the replication time and message cost, while load balance is enhanced by adaptively decreasing the mean job response time. Experimental results show that the proposed method yields superior performance over other methods.

Keywords: Grid computing, load balancing, fault tolerance, R-tree, heterogeneous systems

Procedia PDF Downloads 470
556 The Study of Internship Performances: Comparison of Information Technology Interns across Student Types and Background Profiles

Authors: Shutchapol Chopvitayakun

Abstract:

The internship program is a compulsory course in many undergraduate programs in Thailand. It gives many senior students, as interns, opportunities to practice their working skills in real organizations and chances to face real-world working problems. Interns also learn how to solve those problems through direct and indirect experience. In many schools, this program is a well-structured course with a contract or agreement made with real business organizations. Moreover, the program offers interns opportunities to get jobs, after completing it, at the organization where the internship takes place. Interns also learn how to work as a team and how to associate with colleagues, trainers, and superiors of each organization in terms of social hierarchy, self-responsibility, and self-discipline. This research focuses on senior students of Suan Sunandha Rajabhat University, Thailand, majoring in the information technology program, who practiced their working skills in internship programs in real business organizations in 2015-2016. Interns are categorized into two types: normal program and special program. Special-program students study on weekday evenings from Monday to Friday or at weekends, and most of them work a full-time or part-time job; normal-program students study during weekday working hours, and most of them do not work. The differences between these characteristics and the outcomes of internship performance were studied and analyzed in this research. This work applied statistical analysis to find out whether the internship performances of the two intern types differ statistically or not.
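A comparison of the two intern types' performance could be run, for example, with Welch's two-sample t-test; the abstract does not specify which test was used, so the sketch below (pure standard library, invented scores, large-sample normal approximation for the p-value) is only illustrative.

```python
# Welch's two-sample t-test for comparing mean internship scores of the two
# intern types. The p-value uses a large-sample normal approximation to the
# t distribution, which is adequate for rough screening.
from math import sqrt
from statistics import mean, variance, NormalDist

def welch_t(sample_a, sample_b):
    na, nb = len(sample_a), len(sample_b)
    ma, mb = mean(sample_a), mean(sample_b)
    va, vb = variance(sample_a), variance(sample_b)   # sample variances
    t = (ma - mb) / sqrt(va / na + vb / nb)
    p_approx = 2.0 * (1.0 - NormalDist().cdf(abs(t)))  # two-sided, approximate
    return t, p_approx
```

With real data, an exact t-distribution p-value (and the Welch-Satterthwaite degrees of freedom) would be preferable; the approximation here only keeps the sketch dependency-free.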

Keywords: internship, intern, senior student, information technology program

Procedia PDF Downloads 250
555 A Study from Language and Culture Perspective of Human Needs in Chinese and Vietnamese Euphemism Languages

Authors: Quoc Hung Le Pham

Abstract:

Human beings are motivated to satisfy both physiological and psychological needs. Among the fundamental needs, bodily excretion is the most basic one; physiological excretion refers to the final products discharged from the body. This physiological process is a common human phenomenon. Bodily secretion, for instance, is totally natural, but people of various nationalities have through the ages avoided naming it directly. Terms like 'shit' are often negatively regarded as dirty, smelly, and vulgar, and they lead people to negative thinking; it is in the psychology of human beings to avoid such unsightly terms, especially in social situations where one has to take care of one's image. The best way to solve this is to use euphemism: people prefer to say 'answering nature's call' or 'to pass a motion' instead. Since both the Chinese and Vietnamese nations use euphemisms to refer to bodily secretions, this research takes this phenomenon as its object and aims to explore the similarities and dissimilarities between the euphemisms of the two languages. The niche of this paper is the physiological phenomenon of excretion. Preliminary results show that the factors most deeply affecting expressions for bodily secretions are linguistic and cultural. On the linguistic side, both languages use assonance to refer to human discharge, while the dissimilarities lie in metonymy, loan words, and personification. On the cultural side, the convergences are metonymy and the use of semantically contrary word euphemisms, while the difference is that Chinese euphemism uses allusion but Vietnamese euphemism does not.

Keywords: cultural factors, euphemism, human needs, language factors

Procedia PDF Downloads 278
554 Analysis of the Influence of Frequency Variation on the Characterization of Nano-Particles in the Pretreatment of Bioethanol from Oil Palm Stem (Elaeis guineensis Jacq.) Using a Sonication Method with Alkaline Peroxide Activators for the Improvement of Cellulose

Authors: Luristya Nur Mahfut, Nada Mawarda Rilek, Ameiga Cautsarina Putri, Mujaroh Khotimah

Abstract:

The use of bioethanol from lignocellulosic material has begun to be developed. In Indonesia, the most abundant lignocellulosic material is the stem of the oil palm, which contains 32.22% cellulose; Indonesia produces approximately 300,375,000 tons of palm stems each year. To produce bioethanol from lignocellulosic material, the first process is pretreatment. Until now, however, lignocellulosic pretreatment methods have been ineffective: the particle size and the pretreatment method are less than optimal, leading to an insufficient breakdown of lignin; consequently, the increase in cellulose content is not significant, resulting in a low bioethanol yield. To solve this problem, this research implemented a pretreatment process using an ultrasonication method, in order to produce a higher pulp yield with nano-sized particles and thereby obtain a higher ethanol yield from palm stems. The research used a randomized block design (RAK) composed of one factor, the ultrasonic wave frequency, with three variants (30 kHz, 40 kHz, and 50 kHz), and a constant NaOH concentration. The analysis conducted in this research covers the influence of the wave frequency on the increase in cellulose content and on the change of particle size at the nanometer scale during the pretreatment process, using the PSA (Particle Size Analyzer) and Chesson methods. For the analysis of the results and the best treatment, ANOVA and the LSD (BNT) test with a 5% confidence interval were used. The best treatment was obtained with combination X3 (sonication frequency of 50 kHz), giving lignin (19.6%), cellulose (59.49%), and hemicellulose (11.8%) contents with a particle size of 385.2 nm (18.8%).
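The ANOVA comparison of the frequency treatments can be sketched with the one-way F statistic below. The cellulose-yield numbers are invented for illustration; the study itself used a randomized block design with an LSD (BNT) follow-up test, which this minimal sketch omits.

```python
# One-way ANOVA F statistic, as would be used to compare cellulose yields
# across the three sonication frequencies. Data values are invented; the
# actual study used a randomized block design with an LSD follow-up.
from statistics import mean

def anova_f(groups):
    """F = (between-group mean square) / (within-group mean square)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean([x for g in groups for x in g])
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

A large F (relative to the F distribution's critical value at the chosen significance level) indicates that at least one frequency treatment differs, after which a pairwise test such as LSD locates the difference.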

Keywords: bioethanol, pretreatment, palm stem, cellulose

Procedia PDF Downloads 315
553 Efficient Implementation of a Finite Volume Multi-Resolution WENO Scheme on Adaptive Cartesian Grids

Authors: Yuchen Yang, Zhenming Wang, Jun Zhu, Ning Zhao

Abstract:

An easy-to-implement and robust finite volume multi-resolution Weighted Essentially Non-Oscillatory (WENO) scheme is proposed on adaptive Cartesian grids in this paper. This multi-resolution WENO scheme is combined with the ghost-cell immersed boundary method (IBM) and a wall-function technique to solve the Navier-Stokes equations. Unlike k-exact finite volume WENO schemes, which involve large amounts of extra storage, repeatedly solving the matrix generated in a least-squares method, or calculating optimal linear weights on adaptive Cartesian grids, the present methodology adds only a very small overhead and can be easily implemented in existing edge-based computational fluid dynamics (CFD) codes with minor modifications. Moreover, the linear weights of this adaptive finite volume multi-resolution WENO scheme can be any positive numbers on the condition that their sum is one. This bypasses the calculation of the optimal linear weights, and such a multi-resolution WENO scheme avoids dealing with negative linear weights on adaptive Cartesian grids. Some benchmark viscous problems are numerically solved to show the efficiency and good performance of this adaptive multi-resolution WENO scheme. Compared with a second-order edge-based method, the presented method can be implemented on an adaptive Cartesian grid with slight modifications for high-Reynolds-number problems.
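For context, the building block the multi-resolution scheme simplifies is the classical fifth-order WENO-JS reconstruction at a cell interface, sketched below. This is the standard textbook scheme with the optimal linear weights (1/10, 6/10, 3/10), not the paper's multi-resolution variant, which replaces those weights with any positive numbers summing to one.

```python
# Classical fifth-order WENO-JS reconstruction of the interface value
# u_{i+1/2} from five cell averages (f0..f4 = u_{i-2}..u_{i+2}).
def weno5(f0, f1, f2, f3, f4, eps=1e-6):
    # three third-order candidate reconstructions
    q0 = (2*f0 - 7*f1 + 11*f2) / 6.0
    q1 = (-f1 + 5*f2 + 2*f3) / 6.0
    q2 = (2*f2 + 5*f3 - f4) / 6.0
    # Jiang-Shu smoothness indicators
    b0 = 13/12*(f0 - 2*f1 + f2)**2 + 0.25*(f0 - 4*f1 + 3*f2)**2
    b1 = 13/12*(f1 - 2*f2 + f3)**2 + 0.25*(f1 - f3)**2
    b2 = 13/12*(f2 - 2*f3 + f4)**2 + 0.25*(3*f2 - 4*f3 + f4)**2
    # nonlinear weights built from the optimal linear weights (0.1, 0.6, 0.3)
    a0, a1, a2 = 0.1/(eps + b0)**2, 0.6/(eps + b1)**2, 0.3/(eps + b2)**2
    s = a0 + a1 + a2
    return (a0*q0 + a1*q1 + a2*q2) / s
```

On smooth data all three candidate stencils agree, so any convex combination of them is exact for linear profiles; that is precisely why the linear weights can be freed up in the multi-resolution formulation.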

Keywords: adaptive mesh refinement method, finite volume multi-resolution WENO scheme, immersed boundary method, wall-function technique

Procedia PDF Downloads 136
552 Study of Composite Materials for the AISHa Containment Chamber

Authors: G. Costa, F. Noto, L. Celona, F. Chines, G. Ciavola, G. Cuttone, S. Gammino, O. Leonardi, S. Marletta, G. Torrisi

Abstract:

The ion sources for accelerators devoted to medical applications must provide intense ion beams with high reproducibility, stability, and brightness. AISHa (Advanced Ion Source for Hadron-therapy) is a compact ECRIS whose hybrid magnetic system consists of a permanent Halbach-type hexapole magnet and a set of independently energized superconducting coils. These coils will be enclosed in a compact cryostat with two cryocoolers for LHe-free operation. The AISHa ion source has been designed by taking into account the typical requirements of hospital-based facilities, where the minimization of the mean time between failures (MTBF) is a key point, together with maintenance operations, which should be fast and easy. It is intended to be a multipurpose device operating at 18 GHz, in order to achieve higher plasma densities. It should provide enough versatility for the future needs of hadron therapy, including the ability to run at larger microwave power to produce different species and highly charged ion beams. The source is potentially interesting for any hadron-therapy center using heavy ions. In this paper, we describe an innovative solution for the plasma containment chamber that solves our insulation and structural problems. We analyze the materials chosen for this purpose (glass fibers and carbon fibers), and we illustrate the whole process (spinning, curing, and machining) of the assembly of the chamber. The glass fibers and carbon fibers are used to reinforce polymer matrices, giving rise to structural composites and molded composites.

Keywords: hadron-therapy, carbon fiber, glass fiber, vacuum-bag, ECR, ion source

Procedia PDF Downloads 195
551 On the Significance of Preparing a Professional Literature Review in EFL Context

Authors: Fahimeh Marefat, Marzieh Marefat

Abstract:

The present research is inspired by the comment that 'a substantive, thorough, sophisticated literature review is a precondition for doing substantive, thorough, sophisticated research'. This study reports on an action research project aimed at solving the problem of preparing students to write a literature review (LR) that is more than mere cut and paste. More specifically, the study was initiated to discover whether equipping students with tools for writing an LR has an impact on the quality of their research and on their view of the LR's significance. The participants were twenty-four Iranian TEFL students at Allameh Tabataba'i University, taking their advanced writing course with the lead researcher. We met once a week for 90 minutes over five weeks, followed by individual consultations. Working through a process approach and implementing tasks, the lead researcher ran workshops with different controlled assignments and subsequent activities to lead students to practice appropriate source use across multiple drafts: from choosing the topic, finding sources, forming questions, and preparing quotation, paraphrase, and summary note cards, to outlining and, most importantly, introducing the tools to evaluate prior research and offer their own take on it, and finally synthesizing and incorporating the notes into the body of the LR section of their papers. The LR scoring rubric was applied, and a note was emailed to the students asking about their views. The results indicated that awareness raising and detailed explicit instruction improved the LR quality compared with the students' previous projects. Interestingly, they acknowledged how the LR shaped all stages of their research, further support for the notion of 'being scholars before researchers'. The key to success is mastery over the literature, which translates into extensive reading and critically appraising it.

Keywords: controlled tasks, critical evaluation, review of literature, writing synthesis

Procedia PDF Downloads 339
550 A Multi-Objective Reliable Location-Inventory Capacitated Disruption Facility Problem with Penalty Cost Solved with Efficient Metaheuristic Algorithms

Authors: Elham Taghizadeh, Mostafa Abedzadeh, Mostafa Setak

Abstract:

In a logistics network, opened facilities are expected to work continuously over a long time horizon without any failure, but in real-world problems facilities may face disruptions. This paper studies a reliable joint inventory-location problem to optimize the costs of facility location, customer assignment, and inventory management decisions when facilities face failure risks and stop working. In our model, we assume that when a facility is out of work, its customers may be reassigned to other operational facilities; otherwise, they must endure high penalty costs associated with losing service. To bring the model closer to real-world problems, it is formulated on the basis of the p-median problem, and the facilities are considered to have limited capacities. We define a new binary variable (Z_is) indicating that customers are not assigned to any facility. The problem involves a bi-objective model: the first objective minimizes the sum of facility construction costs and expected inventory holding costs, while the second minimizes the maximum expected customer cost under normal and failure scenarios. To solve this model, the NSGA-II and MOSS algorithms are applied to find the Pareto-archive solutions. Response Surface Methodology (RSM) is also applied to optimize the NSGA-II algorithm parameters. We compare the performance of the two algorithms with three metrics, and the results show that NSGA-II is more suitable for our model.
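The Pareto-archive bookkeeping behind NSGA-II can be sketched as extracting the non-dominated set of bi-objective cost vectors. The objective vectors below are invented toy data (both objectives minimized), and this shows only the dominance logic, not the full NSGA-II machinery of crowding distance and genetic operators.

```python
# Extract the Pareto (non-dominated) set for a minimization problem --
# the core test behind NSGA-II's first front and a Pareto archive.
def dominates(a, b):
    """a dominates b: no worse in every objective, strictly better in one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Keep only points not dominated by any other point."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]
```

In the paper's setting, each point would be a (construction + holding cost, maximum expected customer cost) pair produced by a candidate solution.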

Keywords: joint inventory-location problem, facility location, NSGAII, MOSS

Procedia PDF Downloads 509
549 Optimal Construction Using Multi-Criteria Decision-Making Methods

Authors: Masood Karamoozian, Zhang Hong

Abstract:

The necessity and complexity of the decision-making process, the interference of various factors, and the need to consider all relevant factors in a problem are very obvious nowadays. Hence, researchers have shown interest in multi-criteria decision-making methods. In this research, the Analytical Hierarchy Process (AHP), Simple Additive Weighting (SAW), and Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) multi-criteria decision-making methods have been used to solve the problem of selecting optimal construction systems. The systems evaluated are Light Steel Frames (LSF), Insulating Concrete Form (ICF), the Ordinary Construction System (OCS), and the Prefabricated Concrete System (PRCS), all drawn from case-study designs by the Zhang Hong studio at Southeast University, Nanjing. Data were crowdsourced using a questionnaire with a sample of 200 people, distributed among experts, university centers, and conferences. According to the results of the research, the different decision-making methods led to broadly the same ranking: with all three methods, the Prefabricated Concrete System (PRCS) ranked first and the Light Steel Frame (LSF) system second. The PRCS also ranked first in terms of performance and economic criteria, while the LSF system ranked first in terms of environmental criteria.
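
To make the TOPSIS step concrete, here is a minimal self-contained sketch; the decision matrix, weights, and benefit flags are illustrative assumptions, not the survey data collected in the study.

```python
import math

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j] is True if criterion j
    is better when larger. Returns a closeness score per alternative."""
    n_crit = len(matrix[0])
    # vector-normalize each criterion column, then apply the weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n_crit)] for row in matrix]
    cols = list(zip(*v))
    ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    anti = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)  # distance to the ideal solution
        d_neg = math.dist(row, anti)   # distance to the anti-ideal solution
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# rows: PRCS, LSF, ICF, OCS; columns: performance, economy, environment
matrix = [[9, 9, 8], [8, 7, 9], [6, 6, 6], [5, 7, 5]]
scores = topsis(matrix, weights=[0.4, 0.3, 0.3], benefit=[True, True, True])
```

The alternative with the highest closeness score is ranked first; with these placeholder inputs the ordering mirrors the PRCS-first, LSF-second outcome reported above.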

Keywords: multi-criteria decision making, AHP, SAW, TOPSIS

Procedia PDF Downloads 93
548 Preliminary Studies of Transient Stability for the 380 kV Connection West-Central of Saudi Electricity Company

Authors: S. Raja Mohamed, M. H Shwehdi, D. Devaraj

Abstract:

This paper presents and discusses the performance of the newly planned 380 kV transmission line under steady-state and transient conditions. Dynamic modeling and analysis of this inter-tie, which is proposed to transfer energy from west to south and vice versa, are demonstrated and discussed. The west-central-south inter-tie links Al-Aula-Zaba-Tabuk-Tubarjal-Jawf-Hail. It is essential to investigate transient overvoltages to ensure steady and stable transmission over such an inter-tie. The Saudi Electricity Company (SEC) has been improving its grid to make the whole country an interconnected system. The east, central, and west regions are already interconnected, yet each is mostly fed by its local generation. The SEC is planning to establish many inter-ties to strengthen the transient stability of its grid. This paper studies one of the important 380 kV links, 220 km between Tabuk and Tubarjal, which is a step towards connecting the west with the south region. Modeling and analysis using simulation software are carried out under different scenarios, and methods to stabilize the line and increase its power transmission are also discussed. Improvement of power system transients has been achieved with FACTS elements such as Static Var Compensators (SVC), which have received wide interest since many technical studies have proven their effect on damping system oscillations and enhancing stability. Transients at each main generating or load bus are examined in all inter-tie links. A brief review of possible means to solve the transient overvoltage problem using different FACTS element models is also presented.

Keywords: transient stability, static var compensator, central-west interconnected system, damping controller, Saudi Electricity Company

Procedia PDF Downloads 590
547 Resource Allocation and Task Scheduling with Skill Level and Time Bound Constraints

Authors: Salam Saudagar, Ankit Kamboj, Niraj Mohan, Satgounda Patil, Nilesh Powar

Abstract:

Task assignment and scheduling is a challenging operations research problem when there is a limited number of resources and a comparatively higher number of tasks. The cost management team at Cummins needs to assign tasks based on deadlines and must prioritize some tasks as per business requirements. Moreover, there is a constraint that tasks must be assigned according to each individual's skill level, which may vary by task. Another constraint is that the scheduled tasks should be evenly distributed in terms of working hours, which adds further complexity to the problem. The proposed greedy approach first assigns each task based on management priority and then by the closest deadline. This is followed by an iterative selection of an available resource with the least allocated total working hours for each task, i.e., finding the local optimal choice for each task with the goal of approximating the global optimum. The greedy task allocation is compared with a variant of the Hungarian algorithm, and it is observed that the proposed approach gives a more equal allocation of working hours among the resources. A comparative study against manual task allocation is also carried out, and it is noted that the visibility of the task timeline increased from 2 months to 6 months. An interactive dashboard app was created for the greedy assignment and scheduling approach, and tasks with a horizon of more than 2 months that were initially waiting in a queue without a delivery date are now analyzed effectively by the business, with expected timelines for completion.

Keywords: assignment, deadline, greedy approach, Hungarian algorithm, operations research, scheduling

Procedia PDF Downloads 133
546 A U-Net Based Architecture for Fast and Accurate Diagram Extraction

Authors: Revoti Prasad Bora, Saurabh Yadav, Nikita Katyal

Abstract:

In the context of educational data mining, extracting information from images containing both text and diagrams is an important use case. Document analysis therefore requires extracting the diagrams from such images so that the text and diagrams can be processed separately. To the authors' best knowledge, none of the many approaches for extracting tables, figures, etc., meets the need for real-time processing with the high accuracy required in many applications. In the education domain, diagrams can have varied characteristics, e.g., line-based geometric diagrams, chemical bonds, and mathematical formulas. Two broad categories of approaches try to solve similar problems: traditional computer vision approaches and deep learning approaches. The traditional computer vision approaches mainly leverage connected components and distance-transform-based processing and hence perform well only in very limited scenarios. The existing deep learning approaches leverage either the YOLO or the Faster R-CNN architecture and suffer from a performance-accuracy tradeoff. This paper proposes a U-Net based architecture that formulates diagram extraction as a segmentation problem. The proposed method provides similar accuracy with a much faster extraction time compared to the mentioned state-of-the-art approaches. Further, the segmentation mask in this approach allows the extraction of diagrams of irregular shapes.

Keywords: computer vision, deep-learning, educational data mining, faster-RCNN, figure extraction, image segmentation, real-time document analysis, text extraction, U-Net, YOLO

Procedia PDF Downloads 115
545 The Triple Threat: Microplastic, Nanoplastic, and Macroplastic Pollution and Their Cumulative Impacts on Marine Ecosystem

Authors: Tabugbo B. Ifeyinwa, Josephat O. Ogbuagu, Okeke A. Princewill, Victor C. Eze

Abstract:

The increasing amount of plastic pollution in marine settings poses a substantial risk to ecosystem functioning and the preservation of biodiversity. This comprehensive analysis combines the most recent data on the environmental effects of macroplastic, microplastic, and nanoplastic pollution within marine ecosystems. Our goal is to provide a comprehensive understanding of the cumulative impacts of plastic waste on marine life by outlining the origins, processes, and ecological repercussions associated with each size category of plastic debris. Whereas macroplastics primarily contribute physical harm through entanglement and ingestion by marine fauna, microplastics and nanoplastics exert subtler, chemically mediated effects: they can cross biological barriers and affect health at the cellular and whole-organism levels. The review underlines a vital need for research that crosses disciplinary boundaries to untangle the intricate interactions between the various sizes of plastic pollution and marine animals, evaluate the long-term ecological repercussions, and identify effective measures for mitigation. We also urge governmental interventions and worldwide cooperation to address this pervasive environmental concern; in particular, we identify significant knowledge gaps in the detection and effect assessment of nanoplastics. To protect marine biodiversity and preserve ecosystem services, this review highlights how urgent it is to address the broad spectrum of plastic pollution.

Keywords: macroplastic pollution, marine ecosystem, microplastic pollution, nanoplastic pollution

Procedia PDF Downloads 40
544 Inversion of the Spectral Analysis of Surface Waves Dispersion Curves through the Particle Swarm Optimization Algorithm

Authors: A. Cerrato Casado, C. Guigou, P. Jean

Abstract:

In this investigation, the particle swarm optimization (PSO) algorithm is used to perform the inversion of the dispersion curves in the spectral analysis of surface waves (SASW) method. This inverse problem usually presents complicated solution spaces with many local minima that make convergence to the correct solution difficult. PSO is a metaheuristic method that was originally designed to simulate social behavior but has demonstrated powerful capabilities to solve inverse problems with complex solution spaces and a high number of variables. The dispersion curves of the synthetic soils are constructed by the vertical flexibility coefficient method, which is especially convenient for soils whose stiffness does not increase gradually with depth. The reason is that these soil profiles are not normally dispersive, since the dominant mode of Rayleigh waves usually does not coincide with the fundamental mode. Multiple synthetic soil profiles have been tested to show the characteristics of the convergence process and assess the accuracy of the final soil profile. In addition, the inversion procedure is applied to multiple real soils, and the final profile is compared with the available information. The combination of the vertical flexibility coefficient method to obtain the dispersion curve and the PSO algorithm to carry out the inversion proves to be a robust procedure that is able to provide good solutions for complex soil profiles even with scarce prior information.
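
A minimal PSO sketch illustrating the inversion engine is given below; the quadratic surrogate misfit and the inertia/acceleration parameters are common textbook defaults, not the dispersion-curve misfit or tuning used in this investigation.

```python
import random

def pso(f, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize f over box bounds with a basic particle swarm."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                     # each particle's best
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]    # swarm best
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # clamp the new position into the search box
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# a surrogate misfit standing in for the dispersion-curve residual,
# with its minimum (value 0) at layer parameters (1, 2)
misfit = lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2
best, best_val = pso(misfit, bounds=[(-5, 5), (-5, 5)])
```

In the actual SASW inversion, each particle position would encode a candidate soil profile (layer thicknesses and stiffnesses) and f would measure the misfit between its theoretical and the experimental dispersion curve.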

Keywords: dispersion, inverse problem, particle swarm optimization, SASW, soil profile

Procedia PDF Downloads 167
543 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem

Authors: Ouafa Amira, Jiangshe Zhang

Abstract:

Clustering is an unsupervised machine learning technique whose aim is to extract data structures in which similar data objects are grouped in the same cluster while dissimilar objects are grouped in different clusters. Clustering methods are widely utilized in fields such as image processing, computer vision, and pattern recognition. Fuzzy c-means (FCM) clustering is one of the best-known fuzzy clustering methods. It is based on solving an optimization problem in which a given cost function is minimized. This minimization aims to decrease the dissimilarity inside clusters, where dissimilarity is measured by the distances between data objects and cluster centers. The degree to which a data point belongs to a cluster is measured by a membership function taking values in the interval [0, 1]. In FCM clustering, the membership degrees are constrained so that the sum of a data object's memberships over all clusters equals one. This constraint can cause several problems, especially when the data lie in a noisy space. Regularization introduces additional information in order to solve an ill-posed optimization problem, and it has been incorporated into fuzzy c-means clustering. In this study, we focus on regularization by a relative entropy approach, where the optimization problem again minimizes the dissimilarity inside clusters. Our objective is to find an appropriate membership degree for each data object, because appropriate membership degrees lead to an accurate clustering result. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that the proposed model achieves good accuracy.
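
For reference, the standard (unregularized) fuzzy c-means iteration that the relative-entropy model modifies can be sketched as follows; the 1-D data set is synthetic and for illustration only, and m is the usual fuzzifier exponent.

```python
import random

def fcm(points, c=2, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy c-means on 1-D data; returns centers and memberships."""
    rng = random.Random(seed)
    centers = rng.sample(points, c)
    u = [[0.0] * c for _ in points]                 # membership matrix
    for _ in range(n_iter):
        # membership update: u_ik from inverse distance ratios,
        # so each row sums to one (the sum-to-one constraint)
        for i, x in enumerate(points):
            d = [abs(x - ck) + 1e-12 for ck in centers]
            for k in range(c):
                u[i][k] = 1.0 / sum((d[k] / d[j]) ** (2.0 / (m - 1.0))
                                    for j in range(c))
        # center update: membership-weighted means
        centers = [sum(u[i][k] ** m * x for i, x in enumerate(points))
                   / sum(u[i][k] ** m for i in range(len(points)))
                   for k in range(c)]
    return centers, u

points = [0.9, 1.0, 1.1, 4.9, 5.0, 5.1]
centers, u = fcm(points)
```

The relative-entropy regularizer discussed above adds a term to the cost function that changes this membership update; the alternating structure of the iteration stays the same.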

Keywords: clustering, fuzzy c-means, regularization, relative entropy

Procedia PDF Downloads 247
542 Design and Analysis of a Piezoelectric Linear Motor Based on Rigid Clamping

Authors: Chao Yi, Cunyue Lu, Lingwei Quan

Abstract:

Piezoelectric linear motors offer great electromagnetic compatibility, high positioning accuracy, and a compact structure with no deceleration mechanism, which makes them promising for micro-miniature precision drive systems. However, most piezoelectric motors employ flexible clamping, which has insufficient rigidity and is difficult to use in rapid positioning. Another problem is that this clamping method seriously affects the vibration efficiency of the vibrating unit. In order to solve these problems, this paper proposes a piezoelectric stack linear motor based on double-ended rigid clamping. First, a piezoelectric linear motor with a length of only 35.5 mm is designed. The motor mainly consists of a stator, a driving foot, a ceramic friction strip, a linear guide, a pre-tightening mechanism, and a base. This structure is much simpler and smaller than most similar motors, and it is easy to assemble as well as to control precisely. In addition, the properties of the piezoelectric stack are reviewed, and in order to obtain an elliptical motion trajectory at the driving foot, a driving scheme with a longitudinal-shear composite stack is innovatively proposed. Finally, impedance analysis and speed performance testing were performed on the prototype. The motor reaches a measured speed of up to 25.5 mm/s under a signal voltage of 120 V at a frequency of 390 Hz. The results show that the proposed piezoelectric stack linear motor achieves great performance: it runs smoothly over a large speed range, making it suitable for precision control in medical imaging, aerospace, precision machinery, and many other fields.
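
The elliptical driving-foot trajectory produced by exciting the longitudinal and shear stacks 90 degrees out of phase can be sketched as follows; the amplitudes are illustrative assumptions, and only the 390 Hz drive frequency is taken from the abstract.

```python
import math

def trajectory(t, a_long=1.0, a_shear=0.6, freq=390.0):
    """Displacement (tangential, normal) of the driving foot at time t.
    The shear stack drives the tangential axis, the longitudinal stack
    the normal axis, with a 90-degree phase difference between them."""
    w = 2 * math.pi * freq
    x = a_shear * math.sin(w * t)          # tangential (shear stack)
    y = a_long * math.cos(w * t)           # normal (longitudinal stack)
    return x, y

# sampling one drive period traces the ellipse (x/a_shear)^2 + (y/a_long)^2 = 1
period = 1 / 390.0
pts = [trajectory(k * period / 100) for k in range(100)]
```

The normal component periodically presses the foot against the friction strip while the tangential component moves it forward, which is how the elliptical motion is rectified into linear travel of the slider.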

Keywords: piezoelectric stack, linear motor, rigid clamping, elliptical trajectory

Procedia PDF Downloads 140
541 LCA and Multi-Criteria Analysis of Fly Ash Concrete Pavements

Authors: Marcela Ondova, Adriana Estokova

Abstract:

Rapid industrialization results in the increased use of natural resources and brings serious ecological and environmental imbalance due to the dumping of industrial wastes. Principles of sustainable construction have to be accepted with regard to the consumption of natural resources and the production of harmful emissions. Cement is a raw material of great importance in the building industry, and large amounts of it are used today in the construction of concrete pavements. Considering raw material costs and the CO2 emissions of cement production, replacing cement in concrete mixtures with more sustainable materials is necessary, and people all over the world are looking for solutions to reduce this environmental impact. Over the last ten years, the image of fly ash has completely changed from a polluting waste to a resource material that can solve major problems of cement use. Fly ash concretes are proposed as a potential approach for achieving substantial reductions in cement. Fly ash is known to improve the workability of concrete, extend the life cycle of concrete roads, and reduce energy use and greenhouse gases as well as the amount of coal combustion products that must be disposed of in landfills. Life cycle assessment also proved that a concrete pavement with fly ash cement replacement is considerably more environmentally friendly than a standard concrete road. In addition, fly ash is a cheap raw material, so cost savings are guaranteed. The strength properties and the resistance to frost and de-icing salts, which are important characteristics in the construction of concrete pavements, have also reached the required standards. In terms of human health, a concrete cover with fly ash cannot be considered more dangerous than a cover without fly ash. The final multi-criteria analysis likewise indicated that a concrete with fly ash is clearly the proper solution.

Keywords: life cycle assessment, fly ash, waste, concrete pavements

Procedia PDF Downloads 393
540 The Term of Intellectual Property and Artificial Intelligence

Authors: Yusuf Turan

Abstract:

According to the World Intellectual Property Organization, “Intellectual property (IP) refers to creations of the mind, such as inventions; literary and artistic works; designs; and symbols, names and images used in commerce.” There are two important points in this definition: intellectual property is the result of intellectual activities carried out by one or more PERSONS, and it constitutes INNOVATION. A brief examination of the history and development of the relevant definitions shows that these two points have remained constant, and that intellectual property law and rights have been shaped around them. As the scope of the term intellectual property expands with the development of technology, especially in the field of artificial intelligence, questions such as “Can artificial intelligence be an inventor?” need to be resolved within that expanding scope. In recent years, it was ruled in the USA that the artificial intelligence named DABUS did not meet the definition of an “individual” and therefore could not be an inventor. With developing technology, it is obvious that we will encounter such situations much more frequently in the field of intellectual property. While expanding the scope, we must determine how to decide who performs the mental activity or creativity that we regard as indispensable to inventorship. All of these problems and novel situations make it clear that not only intellectual property law and rights but also their definitions need to be updated and improved. Ignoring situations that fall outside the scope of the current term of intellectual property is not enough to solve the problem and brings uncertainty.
The fact that laws and definitions that have operated on the same theories for years exclude today's innovative technologies from their scope contradicts intellectual property, which is presented as a new and innovative field. Today, as artificial intelligence creates poetry, paintings, animations, music, and even theater works, it must be recognized that the definition of intellectual property needs to be revised.

Keywords: artificial intelligence, innovation, the term of intellectual property, right

Procedia PDF Downloads 56
539 An MIPSSTWM-based Emergency Vehicle Routing Approach for Quick Response to Highway Incidents

Authors: Siliang Luan, Zhongtai Jiang

Abstract:

The risk of highway incidents is commonly recognized as a major concern for transportation authorities due to their hazardous consequences and negative influence. Emergency management decision makers must respond to these unpredictable events as soon as possible. In this paper, we focus on path planning for emergency vehicles, one of the most significant processes for avoiding congestion and reducing rescue time. A Mixed-Integer Linear Programming model with Semi-Soft Time Windows (MIPSSTWM) is formulated to plan an optimal route, considering the time consumed on both the arcs and the nodes of the urban road network and the highway network, which is especially relevant in developing countries with enormous populations. Here, the arcs represent road segments, and the nodes include the intersections of the urban road network and the on-ramps and off-ramps of the highway network. This research attempts to develop a comprehensive and executable strategy for emergency vehicle routing in heavy traffic conditions. A Cuckoo Search (CS) algorithm, designed by imitating the obligate brood-parasitic behavior of cuckoos together with Lévy flights (LF), is proposed to solve this hard combinatorial problem. Using a Chinese city as our case study, the numerical results demonstrate that the approach applied in this paper outperforms a previous method that does not consider the nodes of the road network in a real-world situation. Meanwhile, the CS algorithm also shows better accuracy and validity than the traditional algorithm.
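
The Lévy-flight step at the heart of Cuckoo Search is commonly generated with Mantegna's algorithm, sketched below; the beta exponent and sampling setup are standard defaults from the Cuckoo Search literature, not the tuned values used in this paper.

```python
import math
import random

def levy_step(beta=1.5, rng=random):
    """One heavy-tailed step length drawn via Mantegna's algorithm."""
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma_u = (num / den) ** (1 / beta)   # scale of the numerator Gaussian
    u = rng.gauss(0.0, sigma_u)
    v = rng.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / beta)

rng = random.Random(42)
steps = [levy_step(rng=rng) for _ in range(2000)]
# most steps are short, but occasional long jumps help escape local optima
```

In Cuckoo Search, each "cuckoo" perturbs a candidate route by such a step; the mix of many small moves and rare large jumps is what gives the algorithm its global search behavior.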

Keywords: emergency vehicle, path planning, CS algorithm, urban traffic management, urban planning

Procedia PDF Downloads 63
538 Feasibility Study of MongoDB and Radio Frequency Identification Technology in Asset Tracking System

Authors: Mohd Noah A. Rahman, Afzaal H. Seyal, Sharul T. Tajuddin, Hartiny Md Azmi

Abstract:

Considering real-world situations, higher academic institutions, small, medium, and large companies, and public and private sectors alike experience inventory or asset shrinkage due to theft, loss, or inventory tracking errors. This happens because of absent or poor security systems and measures in these organizations. Implementing Radio Frequency Identification (RFID) technology in a manual or existing web-based system or web application can deter and eventually solve certain major issues while providing better data retrieval and data access. Such a system can furthermore be enhanced into a mobile-based system or application, and the availability of internet connections can aid better service. The involvement of these technologies brings various benefits to individuals and organizations in terms of accessibility, availability, mobility, efficiency, effectiveness, real-time information, and security. This paper looks deeper into the integration of mobile devices with RFID technologies for the purpose of asset tracking and control. This is followed by the development and utilization of MongoDB as the main database to store data and its association with the RFID technology. Finally, a web-based system is developed that can be viewed in a mobile format, built with Hypertext Preprocessor (PHP), MongoDB, Hyper-Text Markup Language 5 (HTML5), Android, JavaScript, and AJAX.

Keywords: RFID, asset tracking system, MongoDB, NoSQL

Procedia PDF Downloads 288
537 Vehicle Speed Estimation Using Image Processing

Authors: Prodipta Bhowmik, Poulami Saha, Preety Mehra, Yogesh Soni, Triloki Nath Jha

Abstract:

In India, the smart city concept is growing day by day, and better traffic management and monitoring systems are an important requirement for smart city development. Nowadays, road accidents increase as more vehicles take to the road, with reckless driving responsible for a huge number of them, so an efficient traffic management system is required for all kinds of roads to control traffic speed. The speed limit varies from road to road. Radar systems were used previously, but due to their high cost and limited precision, they have not become favored in traffic management. Traffic management faces different types of problems every day, and how to solve them has become a research topic. This paper proposes a computer vision and machine learning-based automated system for detecting, tracking, and estimating the speed of multiple vehicles using image processing. Detecting vehicles and estimating their speed from real-time video is difficult, and the objective of this paper is to do both as accurately as possible. A real-time video is first captured, frames are extracted from the video, vehicles are detected in the frames, the detected vehicles are tracked, and finally the speed of the moving vehicles is estimated. The goal of this method is to develop a cost-friendly system that can detect multiple types of vehicles at the same time.
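
Once a tracker (such as a centroid tracker) yields per-frame vehicle positions, the speed estimate reduces to scaling pixel displacement by a calibrated ground resolution and the frame rate; in this sketch the metres-per-pixel scale and the frame rate are assumed values, not calibration data from the paper.

```python
import math

def estimate_speed_kmph(track, metres_per_pixel=0.05, fps=30.0):
    """track: list of (x, y) centroid positions in consecutive frames."""
    if len(track) < 2:
        return 0.0
    # total pixel path length along the track
    dist_px = sum(math.dist(track[i], track[i + 1])
                  for i in range(len(track) - 1))
    metres = dist_px * metres_per_pixel
    seconds = (len(track) - 1) / fps
    return metres / seconds * 3.6          # m/s -> km/h

# centroid moves 10 px per frame: 0.5 m per frame at 30 fps = 54 km/h
track = [(100 + 10 * k, 240) for k in range(6)]
speed = estimate_speed_kmph(track)
```

In a full pipeline, the metres-per-pixel scale comes from camera calibration (e.g., known lane-marking lengths), and averaging over several frames, as here, smooths out per-frame detection jitter.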

Keywords: OpenCV, Haar Cascade classifier, DLIB, YOLOV3, centroid tracker, vehicle detection, vehicle tracking, vehicle speed estimation, computer vision

Procedia PDF Downloads 63