Search results for: match filter (MF)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1290

480 Biodiversity Indices for Macrobenthic Community Structures of Mangrove Forests, Khamir Port, Iran

Authors: Mousa Keshavarz, Abdul-Reza Dabbagh, Maryam Soyuf Jahromi

Abstract:

The diversity of mangrove macrobenthos assemblages in the mudflat and mangrove ecosystems of Port Khamir, Iran was investigated for one year. During this period, we measured physicochemical properties of the water (temperature, salinity, pH, DO) and the density and distribution of the macrobenthos. We sampled a total of 9 transects, at three different topographic levels along the intertidal zone at three stations. Assemblages were compared at class level. The five most diverse and abundant classes were Foraminifers (54%), Gastropods (23%), Polychaetes (10%), Bivalves (8%) and Crustaceans (5%), respectively. Overall densities were 1869 ± 424 ind/m2 (26%) in spring, 2544 ± 383 ind/m2 (36%) in summer, 1482 ± 323 ind/m2 (21%) in autumn and 1207 ± 80 ind/m2 (17%) in winter. Along the intertidal zone, the overall relative density of individuals at the high, intermediate, and low topographic levels was 40, 30, and 30%, respectively. Biodiversity indices were used to compare the different classes: Gastropoda (Shannon index: 0.33) and Foraminifera (Simpson index: 0.28) scored highest. Other bio-indices were also calculated. With the exception of bivalves, filter feeders were associated with coarser sediments at higher intertidal levels, while deposit feeders were associated with finer sediments at lower levels. Salinity was the most important factor acting on community structure, while DO and pH had little influence.
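The two indices quoted above follow the standard Shannon and Simpson formulas. A minimal sketch in Python using the class percentages reported in this abstract as input proportions; note the abstract reports per-class index values, whereas this sketch computes the community-level forms, and the exact variants (natural-log Shannon, dominance-form Simpson) are assumptions:

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln(p_i)) over class proportions."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def simpson_index(counts):
    """Simpson dominance D = sum(p_i^2); diversity is often reported as 1 - D."""
    total = sum(counts)
    return sum((c / total) ** 2 for c in counts)

# Class proportions reported in the abstract (percent of individuals)
abundances = {"Foraminifera": 54, "Gastropoda": 23, "Polychaeta": 10,
              "Bivalvia": 8, "Crustacea": 5}
print(f"Shannon H' = {shannon_index(abundances.values()):.2f}")
print(f"Simpson D  = {simpson_index(abundances.values()):.2f}")
```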

Keywords: macrobenthos, biodiversity, mangrove forest, Khamir Port

Procedia PDF Downloads 376
479 Signal Estimation and Closed Loop System Performance in Atrial Fibrillation Monitoring with Communication Channels

Authors: Mohammad Obeidat, Ayman Mansour

Abstract:

In this paper, a unique issue arising from feedback control of an atrial fibrillation monitoring system with embedded communication channels has been investigated. One of the important measures of the performance of a feedback closed-loop system is its disturbance and noise attenuation factor: it is important that the feedback system can attenuate such disturbances on the atrial fibrillation heart rate signals. Communication channels depend on network traffic conditions and deliver different throughput, implying that the sampling intervals may change. Since the signal estimate is updated on the arrival of new data, its dynamics actually change with the sampling interval. Consequently, the interaction among sampling, signal estimation, and the controller introduces new issues in a remotely controlled atrial fibrillation system. This paper treats a remotely controlled atrial fibrillation system with one communication channel, which connects the heart rate and rhythm measurements to the remote controller. A typical and optimal signal estimation scheme is represented by a signal averaging filter with its time constant derived from the step size of the signal estimation algorithm.
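A signal averaging filter of the kind described can be sketched as a first-order exponential filter whose forgetting factor is recomputed from each network-dependent sampling interval. The time constant `tau` and the exponential weighting below are illustrative assumptions, not the paper's derivation:

```python
import math

def averaging_filter(samples, timestamps, tau):
    """First-order exponential averaging filter whose forgetting factor is
    recomputed from each (possibly irregular) sampling interval, mimicking
    estimator dynamics that change with network-induced sampling variation."""
    est = samples[0]
    estimates = [est]
    for k in range(1, len(samples)):
        dt = timestamps[k] - timestamps[k - 1]   # channel-dependent interval
        alpha = math.exp(-dt / tau)              # weight of the old estimate
        est = alpha * est + (1.0 - alpha) * samples[k]
        estimates.append(est)
    return estimates

# Example: heart-rate samples arriving over a congested channel
hr = [92, 95, 130, 97, 96, 128, 94]              # bpm, with AF-like spikes
t = [0.0, 1.0, 2.5, 3.0, 4.8, 5.5, 7.0]          # irregular arrival times (s)
print(averaging_filter(hr, t, tau=2.0))
```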

Keywords: atrial fibrillation, communication channels, closed loop, estimation

Procedia PDF Downloads 378
478 Energy Performance Gaps in Residences: An Analysis of the Variables That Cause Energy Gaps and Their Impact

Authors: Amrutha Kishor

Abstract:

Today, with rising global warming and the depletion of resources, every industry is moving toward sustainability and energy efficiency. As part of this movement, it is now obligatory for architects to play their part by creating energy predictions for their designs. But in many cases, these predictions do not reflect the real quantities of energy consumed by newly built buildings in operation. These discrepancies can be described as ‘Energy Performance Gaps’. This study aims to determine their underlying reasons. Seven houses designed by Allan Joyce Architects, UK, from 1998 until 2019 were considered for this study. The data from the residents’ energy bills were cross-referenced with the predictions made with the software SefairaPro and with energy reports. Results indicated that the predictions did not match the actual energy usage. An account of how energy was used in these seven houses was made by means of personal interviews. The main factors considered in the study were occupancy patterns, heating systems and usage, lighting profile and usage, and appliances’ profile and usage. The study found that the main reason for energy gaps was the discrepancy between predicted and actual occupant usage and patterns of energy consumption. This study is particularly useful for energy-conscious architectural firms to fine-tune their approach to designing houses and analysing their energy performance. As the findings reveal that energy usage in homes varies based on the way residents use the space, it helps deduce the most efficient technological combinations. This information can be used to set guidelines for future policies and regulations related to energy consumption in homes. This study can also be used by the developers of simulation software to understand how architects use their product and drive improvements in its future versions.

Keywords: architectural simulation, energy efficient design, energy performance gaps, environmental design

Procedia PDF Downloads 118
477 Information Extraction for Short-Answer Question for the University of the Cordilleras

Authors: Thelma Palaoag, Melanie Basa, Jezreel Mark Panilo

Abstract:

Checking short-answer questions and essays, whether on paper or in electronic form, is a tiring and tedious task for teachers. Evaluating a student’s output requires knowledge across a wide array of domains, and scoring the work is often a critical task. Several attempts have been made in the past few years to create automated writing assessment software, but these have received negative reactions from teachers and students alike due to unreliable scoring, lack of feedback, and other shortcomings. The study aims to create an application that can check short-answer questions by incorporating information extraction. Information extraction is a subfield of Natural Language Processing (NLP) in which a chunk of text (technically known as unstructured text) is broken down to gather the necessary bits of data and/or keywords (structured text) to be further analyzed or utilized by query tools. The proposed system shall extract keywords or phrases from an individual’s answers and match them against a corpus of words (as defined by the instructor), which shall be the basis for evaluating the answer. The proposed system shall also enable the teacher to provide feedback and re-evaluate the student’s output for writing elements that the computer cannot fully evaluate, such as creativity and logic. Teachers can formulate, design, and check short-answer questions efficiently by defining keywords or phrases as parameters and assigning weights for checking answers. With the proposed system, the teacher’s time in checking and evaluating student output is lessened, making the teacher more productive and the task easier.
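A minimal sketch of the keyword-matching core described above: instructor-defined phrases with weights are matched against an answer, and low-scoring answers are flagged for teacher review. The rubric, threshold, and function names are hypothetical:

```python
import re

def score_answer(answer, keyword_weights, threshold=0.6):
    """Match instructor-defined keywords/phrases against a student answer
    and return a weighted score plus the terms that need teacher review."""
    text = answer.lower()
    earned, matched, missed = 0.0, [], []
    for phrase, weight in keyword_weights.items():
        if re.search(r"\b" + re.escape(phrase.lower()) + r"\b", text):
            earned += weight
            matched.append(phrase)
        else:
            missed.append(phrase)
    score = earned / sum(keyword_weights.values())
    return {"score": score, "matched": matched, "missed": missed,
            "flag_for_teacher": score < threshold}

rubric = {"natural language processing": 2.0, "unstructured text": 1.0,
          "keywords": 1.0}
print(score_answer("NLP breaks unstructured text into keywords.", rubric))
```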

Keywords: information extraction, short-answer question, natural language processing, application

Procedia PDF Downloads 428
476 Patient-Specific Design Optimization of Cardiovascular Grafts

Authors: Pegah Ebrahimi, Farshad Oveissi, Iman Manavi-Tehrani, Sina Naficy, David F. Fletcher, Fariba Dehghani, David S. Winlaw

Abstract:

Despite advances in modern surgery, congenital heart disease remains a medical challenge and a major cause of infant mortality. Cardiovascular prostheses are routinely used in surgical procedures to address congenital malformations, for example establishing a pathway from the right ventricle to the pulmonary arteries in pulmonary valvar atresia. Current off-the-shelf options, including human and adult products, have limited biocompatibility and durability, and their fixed size necessitates multiple subsequent operations to upsize the conduit to match the patient’s growth over their lifetime. Non-physiological blood flow is another major problem, reducing the longevity of these prostheses. These limitations call for better designs that take into account the hemodynamic and anatomical characteristics of different patients. We have integrated tissue engineering techniques with modern medical imaging and image processing tools, along with mathematical modeling, to optimize the design of cardiovascular grafts in a patient-specific manner. Computational Fluid Dynamics (CFD) analysis is done on models constructed from each individual patient’s data. This allows for improved geometrical design and better hemodynamic performance. Tissue engineering strives to provide a material that grows with the patient and mimics the durability and elasticity of the native tissue. Simulations also give insight into the performance of the tissues produced in our lab and reduce the need for costly and time-consuming methods of evaluating the grafts. We are also developing a methodology for the fabrication of the optimized designs.

Keywords: computational fluid dynamics, cardiovascular grafts, design optimization, tissue engineering

Procedia PDF Downloads 242
475 Design of a Real Time Heart Sounds Recognition System

Authors: Omer Abdalla Ishag, Magdi Baker Amien

Abstract:

Physicians use the stethoscope to listen to patients’ heart sounds in order to make a diagnosis. However, determining heart conditions with an acoustic stethoscope is a difficult task that requires special training of medical staff. This study developed an accurate model for analyzing the phonocardiograph signal based on a PC and a DSP processor. The system has been realized in two phases: offline and real time. In the offline phase, 30 heart sound files were collected from medical students and the doctor's world website. For the experimental (real-time) phase, an electronic stethoscope was designed and implemented, and signals were recorded from 30 volunteers, of whom 17 were normal cases and 13 had various pathologies; these 30 acquired signals were preprocessed using an adaptive filter to remove lung sounds. The background noise was removed from both offline and real data using the wavelet transform, then graphical and statistical feature vector elements were extracted, and finally a look-up table was used to classify the heart sound cases. The implemented system showed accuracies of 90% and 80% and sensitivities of 87.5% and 82.4% for offline and real data, respectively. The whole system has been designed on a TMS320VC5509a DSP platform.
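The wavelet-based denoising step can be sketched with PyWavelets; the db4 wavelet, decomposition level, and universal soft threshold below are common defaults assumed for illustration, not the parameters reported by the authors:

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Remove background noise by soft-thresholding detail coefficients,
    with the universal threshold estimated from the finest-scale details."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(signal)))        # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)

# Synthetic phonocardiogram: gated low-frequency bursts plus noise
t = np.linspace(0, 2, 4000)
pcg = np.sin(2 * np.pi * 40 * t) * (np.sin(2 * np.pi * 1.2 * t) > 0.9)
clean = wavelet_denoise(pcg + 0.3 * np.random.randn(t.size))
```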

Keywords: code composer studio, heart sounds, phonocardiograph, wavelet transform

Procedia PDF Downloads 445
474 Continuous Blood Pressure Measurement from Pulse Transit Time Techniques

Authors: Chien-Lin Wang, Cha-Ling Ko, Tainsong Chen

Abstract:

Blood pressure (BP) is one of the vital signs and is an index that helps determine the stability of life. In this respect, some spinal cord injury patients need to take the tilt table test. While doing the test, the posture changes abruptly, which may cause a patient’s BP to change abnormally. This may cause patients to feel discomfort, and even feel as though their life is threatened. Therefore, if a continuous non-invasive BP assessment system were built, it could help to alert health care professionals in the process of rehabilitation when the BP value is out of range. In our research, BP assessment by the pulse transit time (PTT) technique was developed. In the system, we use a self-made photoplethysmograph (PPG) sensor and filter circuit to detect two PPG signals and to calculate the time difference between them. The BP can then be immediately assessed from the trend line. According to the results of this study, the relationship between systolic BP and PTT has a highly negative linear correlation (R2 = 0.8). Further, we used the trend line to assess the BP value and compared it to a commercial sphygmomanometer (Omron MX3); the error of the system was found to be in the range of ±10%, which is within the permissible error range of a commercial sphygmomanometer. Continuous blood pressure measurement by the pulse transit time technique may have the potential to become a convenient method for clinical rehabilitation.
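A rough sketch of the PTT pipeline: the delay between the two PPG signals is estimated by cross-correlation and mapped to systolic BP through a fitted linear trend line. The calibration coefficients below are placeholders, not the study's values:

```python
import numpy as np

def pulse_transit_time(ppg1, ppg2, fs):
    """Estimate PTT (s) as the delay between two PPG waveforms via
    cross-correlation of the zero-mean signals."""
    a = ppg1 - np.mean(ppg1)
    b = ppg2 - np.mean(ppg2)
    xcorr = np.correlate(b, a, mode="full")
    lag = np.argmax(xcorr) - (len(a) - 1)
    return lag / fs

# Hypothetical per-subject calibration line SBP = m*PTT + c; the strong
# negative correlation (R^2 ~ 0.8) is reported in the abstract, but these
# coefficients are placeholders, not the paper's values.
m, c = -250.0, 160.0
fs = 500.0
ptt = pulse_transit_time(np.sin(np.linspace(0, 6.28, 500)),
                         np.sin(np.linspace(0, 6.28, 500) - 0.3), fs)
print(f"PTT = {ptt*1000:.1f} ms, estimated SBP = {m*ptt + c:.0f} mmHg")
```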

Keywords: continuous blood pressure measurement, PPG, pulse transit time, transit velocity

Procedia PDF Downloads 353
473 A Methodology for Investigating Public Opinion Using Multilevel Text Analysis

Authors: William Xiu Shun Wong, Myungsu Lim, Yoonjin Hyun, Chen Liu, Seongi Choi, Dasom Kim, Kee-Young Kwahk, Namgyu Kim

Abstract:

Recently, many users have begun to frequently share their opinions on diverse issues using various social media. Therefore, numerous governments have attempted to establish or improve national policies according to the public opinion captured from various social media. In this paper, we indicate several limitations of the traditional approaches to analyzing public opinion on science and technology and provide an alternative methodology to overcome these limitations. First, we distinguish between the science and technology analysis phase and the social issue analysis phase to reflect the fact that public opinion can be formed only when a certain science and technology is applied to a specific social issue. Next, we successively apply a start list and a stop list to acquire clarified and interesting results. Finally, to identify the most appropriate documents that fit a given subject, we develop a new logical filter concept that consists not only of mere keywords but also of logical relationships among the keywords. This study then analyzes the possibilities for the practical use of the proposed methodology through its application to discover core issues and public opinions from 1,700,886 documents comprising SNS, blogs, news, and discussions.
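A minimal sketch of such a logical filter: a rule is a nested expression of keywords joined by AND/OR/NOT, evaluated recursively against each document. The rule syntax here is an assumption for illustration:

```python
def logical_filter(doc, rule):
    """Evaluate a nested keyword rule against a document.

    A rule is either a keyword string or a tuple ('AND'|'OR'|'NOT', [subrules]),
    so topical keywords can be tied together by logical relationships rather
    than matched independently.
    """
    if isinstance(rule, str):
        return rule.lower() in doc.lower()
    op, subrules = rule
    results = (logical_filter(doc, r) for r in subrules)
    if op == "AND":
        return all(results)
    if op == "OR":
        return any(results)
    if op == "NOT":
        return not any(results)
    raise ValueError(f"unknown operator: {op}")

# Documents about nuclear power *policy*, excluding sports noise
rule = ("AND", ["nuclear", ("OR", ["policy", "regulation"]),
                ("NOT", ["baseball"])])
print(logical_filter("New nuclear regulation announced today", rule))  # True
```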

Keywords: big data, social network analysis, text mining, topic modeling

Procedia PDF Downloads 294
472 A Framework for Early Differential Diagnosis of Tropical Confusable Diseases Using the Fuzzy Cognitive Map Engine

Authors: Faith-Michael E. Uzoka, Boluwaji A. Akinnuwesi, Taiwo Amoo, Flora Aladi, Stephen Fashoto, Moses Olaniyan, Joseph Osuji

Abstract:

The overarching aim of this study is to develop a soft-computing system for the differential diagnosis of tropical diseases. These conditions are of concern to health bodies, physicians, and the community at large because of their mortality rates and the difficulty of early diagnosis, due to the fact that they present with overlapping symptoms and thus become ‘confusable’. We report on the first phase of our study, which focuses on the development of a fuzzy cognitive map (FCM) model for early differential diagnosis of tropical diseases. We used malaria as a case disease to show the effectiveness of the FCM technology as an aid to the medical practitioner in the diagnosis of tropical diseases. Our model takes cognizance of manifested symptoms and other non-clinical factors that could contribute to symptom manifestation. Our model showed 85% accuracy in diagnosis, as against the physicians’ initial hypothesis, which stood at 55% accuracy. It is expected that the next stage of our study will provide a multi-disease, multi-symptom model that also improves efficiency by utilizing a decision support filter that works on an algorithm mimicking the physician’s diagnosis process.
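The standard FCM inference rule updates each concept's activation as a squashed weighted sum of its causes. A toy sketch with an assumed four-concept map; the concepts and weights are illustrative, not the study's model:

```python
import numpy as np

def fcm_infer(weights, state, steps=20, lam=1.0):
    """Iterate a fuzzy cognitive map: each concept's activation becomes the
    squashed weighted sum of its causes, A <- f(A + W^T A)."""
    f = lambda x: 1.0 / (1.0 + np.exp(-lam * x))   # sigmoid squashing
    for _ in range(steps):
        state = f(state + weights.T @ state)
    return state

# Toy 4-concept map: [fever, chills, mosquito exposure, malaria]
# weights[i, j] = causal influence of concept i on concept j (assumed values)
W = np.array([[0.0, 0.0, 0.0, 0.6],
              [0.0, 0.0, 0.0, 0.5],
              [0.0, 0.0, 0.0, 0.8],
              [0.4, 0.3, 0.0, 0.0]])
initial = np.array([0.9, 0.7, 1.0, 0.0])   # observed symptom activations
print(fcm_infer(W, initial))                # converged concept activations
```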

Keywords: medical diagnosis, tropical diseases, fuzzy cognitive map, decision support filters, malaria differential diagnosis

Procedia PDF Downloads 319
471 Auto Calibration and Optimization of Large-Scale Water Resources Systems

Authors: Arash Parehkar, S. Jamshid Mousavi, Shoubo Bayazidi, Vahid Karami, Laleh Shahidi, Arash Azaranfar, Ali Moridi, M. Shabakhti, Tayebeh Ariyan, Mitra Tofigh, Kaveh Masoumi, Alireza Motahari

Abstract:

Water resource systems modelling has constantly been a challenge throughout history. As methodological innovation evolves alongside computer science, researchers are likely to confront ever more complex and larger water resources systems due to new challenges regarding increased water demands, climate change and human interventions, socio-economic concerns, and environmental protection and sustainability. In this research, an automatic calibration scheme has been applied to Gilan’s large-scale water resource model using mathematical programming. The water resource model’s calibration is developed in order to attune unknown water return flows from demand sites in the complex Sefidroud irrigation network and other related areas. The calibration procedure is validated by comparing several gauged river outflows from the system in the past with model results. The calibration results are reasonable, presenting a rational insight into the system. Subsequently, the optimized unknown parameters were used in a basin-scale linear optimization model with the ability to evaluate the system’s performance against a reduced-inflow scenario in the future. Results showed an acceptable match between predicted and observed outflows from the system at selected hydrometric stations. Moreover, an efficient operating policy was determined for the Sefidroud dam, leading to a minimum water shortage in the reduced-inflow scenario.

Keywords: auto-calibration, Gilan, large-scale water resources, simulation

Procedia PDF Downloads 335
470 Ambiguity Resolution for Ground-based Pulse Doppler Radars Using Multiple Medium Pulse Repetition Frequency

Authors: Khue Nguyen Dinh, Loi Nguyen Van, Thanh Nguyen Nhu

Abstract:

In this paper, we propose an adaptive method to resolve ambiguities, together with a ghost-target removal process, to extract targets detected by a ground-based pulse-Doppler radar using medium pulse repetition frequency (PRF) waveforms. The ambiguity resolution method is an adaptive implementation of the coincidence algorithm, implemented on a two-dimensional (2D) range-velocity matrix to resolve range and velocity ambiguities simultaneously, with a proposed clustering filter to enhance the anti-error ability of the system. Here we consider the scenario of multiple-target environments. The ghost-target removal process, which is based on the power after Doppler processing, is proposed to mitigate ghost detections and enhance the performance of ground-based radars using a short PRF schedule in multiple-target environments. Simulation results on a ground-based pulse-Doppler radar model are presented to show the effectiveness of the proposed approach.
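The coincidence idea can be sketched in one dimension for range: each folded measurement is unfolded by every feasible multiple of its PRF's unambiguous range, and only ranges on which all PRFs agree survive. The PRF values and tolerance below are assumptions; the paper applies the same principle on a 2D range-velocity matrix:

```python
def resolve_range(measured, r_unamb, max_range, tol=50.0):
    """Coincidence method for range ambiguity resolution.

    measured[k] is the folded range (m) seen on PRF k, whose unambiguous
    range is r_unamb[k]. Unfold each measurement by every feasible multiple
    and report candidates on which all PRFs coincide within tol.
    """
    candidate_sets = []
    for r, ru in zip(measured, r_unamb):
        n_max = int(max_range // ru) + 1
        candidate_sets.append({round(r + n * ru) for n in range(n_max)})

    hits = []
    for c in sorted(candidate_sets[0]):
        if all(any(abs(c - other) <= tol for other in s)
               for s in candidate_sets[1:]):
            hits.append(c)
    return hits

# Three medium PRFs with unambiguous ranges 15, 18, 25 km (assumed values)
true_range = 52_000.0
r_unamb = [15_000.0, 18_000.0, 25_000.0]
folded = [true_range % ru for ru in r_unamb]
print(resolve_range(folded, r_unamb, max_range=100_000.0))  # -> [52000]
```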

Keywords: ambiguity resolution, coincidence algorithm, medium PRF, ghosting removal

Procedia PDF Downloads 151
469 The Trigger-DAQ System in the Mu2e Experiment

Authors: Antonio Gioiosa, Simone Doanti, Eric Flumerfelt, Luca Morescalchi, Elena Pedreschi, Gianantonio Pezzullo, Ryan A. Rivera, Franco Spinella

Abstract:

The Mu2e experiment at Fermilab aims to measure the charged-lepton flavour violating neutrino-less conversion of a negative muon into an electron in the field of an aluminum nucleus. With the expected experimental sensitivity, Mu2e will improve the previous limit by four orders of magnitude. The Mu2e data acquisition (DAQ) system provides hardware and software to collect digitized data from the tracker, calorimeter, cosmic ray veto, and beam monitoring systems. Mu2e’s trigger and data acquisition system (TDAQ) uses otsdaq as its solution. Developed at Fermilab, otsdaq uses the artdaq DAQ framework and the art analysis framework under the hood for event transfer, filtering, and processing. otsdaq is an online DAQ software suite with a focus on flexibility and scalability, providing a multi-user, web-based interface accessible through the Chrome or Firefox web browsers. The detector readout controllers (ROCs) of the tracker and calorimeter stream zero-suppressed data continuously to the data transfer controller (DTC). Data are then read over the PCIe bus to a software filter algorithm that selects events, which are finally combined with the data flux that comes from the cosmic ray veto system (CRV).

Keywords: trigger, DAQ, Mu2e, Fermilab

Procedia PDF Downloads 155
468 One-off Separation of Multiple Types of Oil-in-Water Emulsions with Surface-Engineered Graphene-Based Multilevel Structure Materials

Authors: Han Longxiang

Abstract:

In the process of treating industrial oily wastewater with complex components, the traditional treatment methods (flotation, coagulation, microwave heating, etc.) often entail high operating costs, secondary pollution, and other problems. To solve these problems, materials with high flux and stability applied to the separation of surfactant-stabilized emulsions have gained huge attention in the treatment of oily wastewater. Nevertheless, four kinds of stable oil-in-water emulsions can be formed with different surfactants (surfactant-free, anionic, cationic, and non-ionic), and previous advanced materials can only separate one or several of them, not all of them effectively in one step. Herein, graphene-based multilevel filter materials (GMFM), prepared by a facile synthesis method, can efficiently separate oil-in-water emulsions stabilized by different surfactants under gravity alone. The prepared materials, stable over 20 cycles, show a high flux of ~5000 L m-2 h-1 with a high separation efficiency of >99.9%. GMFM can effectively separate emulsions stabilized by mixed surfactants as well as oily wastewater from factories. The results indicate that GMFM has a wide range of applications in oil-in-water emulsion separation in industry and environmental science.

Keywords: emulsion, filtration, graphene, one-step

Procedia PDF Downloads 80
467 Design of Raw Water Reservoir on Sandy Soil

Authors: Venkata Ramana Pamu

Abstract:

This paper is a case study of a 5310 ML capacity Raw Water Reservoir (RWR) situated in the Indian state of Rajasthan, which is part of the Rajasthan Rural Water Supply & Fluorosis Mitigation Project. The RWR embankment was constructed from locally available material on the natural ground profile. The height of the embankment varied from 2 m to 10 m because the existing ground level varied. A reservoir depth of 9 m, including a 1.5 m freeboard, and 1V:3H slopes on both the upstream and downstream sides were provided. Proper soil investigations and tests were done, and it was confirmed that the existing soil is sandy silt. The excavated earth was used as filling material for embankment construction; because of this, controlling seepage from upstream to downstream was a challenging task. Slope stability and seismic analyses of the embankment were done by conventional methods for both the full reservoir condition and rapid drawdown. A horizontal filter at toe level was provided, along with PCC (Plain Cement Concrete) blocks and an HDPE (High Density Polyethylene) lining on the upstream side, as a remedy to control seepage. The HDPE lining was also provided at bed level in the storage area of the reservoir. Mulching was done for downstream slope protection.

Keywords: raw water reservoir, seepage, seismic analysis, slope stability

Procedia PDF Downloads 497
466 Visibility Measurements Using a Novel Open-Path Optical Extinction Analyzer

Authors: Nabil Saad, David Morgan, Manish Gupta

Abstract:

Visibility has become a key component of air quality and is regulated in many areas by environmental laws such as the EPA Clean Air Act and Regional Haze Rule. Typically, visibility is calculated by estimating the optical absorption and scattering of both gases and aerosols. A major component of the aerosols’ climatic effect is due to their scattering and absorption of solar radiation, which are governed by their optical and physical properties. However, the accurate assessment of this effect on global warming, climate change, and air quality is made difficult by uncertainties in the calculation of the single scattering albedo (SSA). Experimental complications arise in the determination of the single scattering albedo of an aerosol particle, since it requires the simultaneous measurement of both scattering and extinction. In fact, aerosol optical absorption in particular is a difficult measurement to perform, and it is often associated with large uncertainties when using filter methods or difference methods. In this presentation, we demonstrate the use of a new open-path Optical Extinction Analyzer (OEA) in conjunction with a nephelometer and two particle sizers, emphasizing the benefits that co-employment of the OEA offers in deriving the complex refractive index of aerosols and their single scattering albedo. Various use cases, data reproducibility, and instrument calibration will also be presented to highlight the value proposition of this novel open-path OEA.
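The quantity at issue reduces to a ratio of co-located measurements: SSA = b_scat / b_ext, with absorption obtained by difference, which is why pairing a nephelometer with an extinction analyzer constrains it directly. A minimal sketch with illustrative (not measured) coefficients:

```python
def single_scattering_albedo(b_scat, b_ext):
    """SSA = scattering / extinction; absorption follows by difference."""
    if b_ext <= 0 or b_scat > b_ext:
        raise ValueError("need 0 < b_scat <= b_ext")
    return b_scat / b_ext

# Illustrative values in inverse megameters (Mm^-1), not measured data
b_scat, b_ext = 85.0, 100.0
ssa = single_scattering_albedo(b_scat, b_ext)
print(f"SSA = {ssa:.2f}, absorption = {b_ext - b_scat:.1f} Mm^-1")
```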

Keywords: aerosols, extinction, visibility, albedo

Procedia PDF Downloads 90
465 Exchange Rate Variation and Balance of Payments: The Nigerian Experience (1970-2012)

Authors: Vitus Onyebuchim Onyemailu, Olive Obianuju Okalibe

Abstract:

The study examined the relationship between exchange rate variation and the balance of payments in Nigeria from 1970 to 2012. Using time-series econometric measures such as Granger causality and ordinary least squares (OLS), the study found that exchange rate movements, especially the depreciation of the naira, did not contribute significantly to the balance of payments over the years of the study. The Granger results conform to the Marshall-Lerner short- and long-run propositions that exchange rate devaluation enhances the balance of payments. On disaggregation, the exchange rate Granger-causes the current and capital account balances, given the Nigerian data from 1970 to 2012. Overall, in the long-run OLS regression analysis with the exchange rate in semi-log functional form, exchange rate variation did not record a significant effect in the balance of payments equation. This also held in the current (trade) balance equation, which does not match the Marshall-Lerner propositions. The capital account balance, in contrast, reported a significant impact of exchange rate variability. Finally, in the exchange rate determination equation, many fundamentals were considered, including the lagged exchange rate, which recorded a positive and significant influence on the present exchange rate. This means that players in the financial markets usually outplay the authorities’ policy stances through their speculative tendencies. The work therefore recommends that the authorities provide an enabling environment for the production of goods and services to flourish, in order to take advantage of the steady devaluation of the currency; this can be done by providing infrastructure and supporting science and technology. When this is done, Nigeria will be able to compete against the rest of the world.
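The Granger step can be sketched with statsmodels; the series below are synthetic stand-ins for the 1970-2012 data, so only the mechanics, not the paper's estimates, are shown:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

# Hypothetical annual series standing in for the 1970-2012 data;
# column order matters: the test asks "does column 2 Granger-cause column 1".
rng = np.random.default_rng(0)
n = 43
exch = np.cumsum(rng.normal(0.5, 1.0, n))             # naira/USD rate (toy)
bop = 0.6 * np.roll(exch, 1) + rng.normal(0, 1.0, n)  # lags the rate

df = pd.DataFrame({"bop": bop, "exch": exch}).iloc[1:]  # drop wrapped row
grangercausalitytests(df[["bop", "exch"]].values, maxlag=2)
```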

Keywords: exchange rate variation, balance of payments, current account, capital account, Marshall-Lerner hypothesis

Procedia PDF Downloads 397
464 Decision Support Tool for Selecting Appropriate Sustainable Rainwater Harvesting Based System in Ibadan, Nigeria

Authors: Omolara Lade, David Oloke

Abstract:

The approach to water management worldwide is currently in transition, with a shift from centralised infrastructure to greater consideration of decentralised technologies, such as rainwater harvesting (RWH). However, in Nigeria, implementation of sustainable water management, such as RWH systems, is inefficient, and social, environmental and technical barriers, concerns and knowledge gaps exist that currently restrict its widespread utilisation. This inefficiency contributes to water scarcity, water-borne diseases, and loss of lives and property due to flooding. Meanwhile, several RWH technologies have been developed to improve sustainable water management through both demand and storm-water management. Such technologies involve the use of reinforced cement concrete (RCC) storage tanks, surface water reservoirs and ground-water recharge pits as storage systems. A framework was developed to assess the significance and extent of water management problems, match the problems with existing RWH-based solutions and develop a robust ready-to-use decision support tool that can quantify the costs and benefits of implementing several RWH-based storage systems. The methodology adopted was a mixed-method approach, involving a detailed literature review, followed by a questionnaire survey of household respondents, Nigerian architects and civil engineers, and a focus group discussion with stakeholders. Eighteen selection attributes were defined and three alternatives identified in this research. The questionnaires were analysed using SPSS, Excel and selected statistical methods to derive the attribute weightings for the tool. Following this, three case studies were modelled using RainCycle software. From the results, the MDA model chose the RCC tank as the most appropriate storage system for RWH.

Keywords: rainwater harvesting, modelling, hydraulic assessment, whole life cost, decision support system

Procedia PDF Downloads 371
463 Benefits of Automobile Electronic Technology in the Logistics Industry in Third World Countries

Authors: Jonathan Matyenyika

Abstract:

In recent years, automobile manufacturers have increasingly produced vehicles equipped with cutting-edge automotive electronic technology to match today’s fast-paced digital world; this has brought various benefits to the different business sectors that use these vehicles to turn a profit. This paper focuses on the benefits that vehicles equipped with automotive electronics bring to the logistics industry. Vehicle manufacturers have introduced new electronic features to enhance and improve overall performance, efficiency, safety and driver comfort, and some of these features have proved beneficial to logistics operators. Starting with the introduction of adaptive cruise control in long-distance haulage vehicles: to see how this system benefits drivers, we carried out research in the form of interviews with long-distance truck drivers, with the main question being what major difference they have experienced since they started to operate vehicles equipped with this technology. Most stated that they are less tired and able to drive longer distances compared with vehicles not equipped with the system. As a result, they can deliver faster and take on the next assignment, thus improving efficiency and bringing in more revenue for the logistics company. Secondly, the introduction of electric hybrid technology: this system allows the vehicle to be propelled by electric power stored in on-board batteries instead of fossil fuel. Consequently, vehicles become cheaper to run, as electricity is more affordable than fossil fuel. The merging of electronic systems into vehicles has proved to be of great benefit, and this research shows that it can benefit the logistics industry in many ways.

Keywords: logistics, manufacturing, hybrid technology, haulage vehicles

Procedia PDF Downloads 57
462 Developing Structured Sizing Systems for Manufacturing Ready-Made Garments of Indian Females Using Decision Tree-Based Data Mining

Authors: Hina Kausher, Sangita Srivastava

Abstract:

In India, there is a lack of a standard, systematic sizing approach for producing ready-made garments. Garment manufacturing companies use their own size tables, created by modifying international sizing charts for ready-made garments. The purpose of this study is to tabulate anthropometric data that covers the variety of figure proportions in both height and girth. Data on 3,000 subjects were collected through an anthropometric survey of females between the ages of 16 and 80 years from some states of India, to produce a sizing system suitable for clothing manufacture and retailing. These data were used for the statistical analysis of body measurements, the formulation of sizing systems and body measurement tables. The factor analysis technique was used to filter the control body dimensions from a large number of variables, and decision tree-based data mining was used to cluster the data. A standard, structured sizing system can facilitate pattern grading and garment production; moreover, it can improve buying ratios and upgrade size allocations to retail segments.
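The decision tree step can be sketched with scikit-learn. Here a classifier is fitted to toy anthropometric records with hypothetical size labels; the real study derives clusters from 3,000 measured subjects and factor-selected control dimensions:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy anthropometric records in cm (assumed distributions, not survey data)
rng = np.random.default_rng(1)
n = 300
height = rng.normal(155, 7, n)
bust = rng.normal(90, 8, n)
# Hypothetical size labels from simple girth cut-offs, for illustration only
size = np.where(bust < 85, "S", np.where(bust < 95, "M", "L"))

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20)
tree.fit(np.column_stack([height, bust]), size)
print(export_text(tree, feature_names=["height", "bust"]))  # size-group rules
```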

Keywords: anthropometric data, data mining, decision tree, garments manufacturing, sizing systems, ready-made garments

Procedia PDF Downloads 133
461 Expert System: Debugging Using MD5 Process Firewall

Authors: C. U. Om Kumar, S. Kishore, A. Geetha

Abstract:

An operating system (OS) is software that manages computer hardware and software resources by providing services to computer programs. One of the important user expectations of an operating system is that it defend information from unauthorized access, disclosure, modification, inspection, recording or destruction. An operating system is always vulnerable to attacks by malware such as computer viruses, worms, Trojan horses, backdoors, ransomware, spyware, adware, scareware and more. Anti-virus software was therefore created to ensure security against prominent computer viruses by applying a dictionary-based approach, but anti-virus programs are not guaranteed to provide security against the new viruses proliferating every day. To address this issue and secure the computer system, our proposed expert system concentrates on authorizing processes, marked by the administrator as wanted or unwanted, for execution. The expert system maintains a database consisting of the hash codes of the processes that are to be allowed. These hash codes are generated using the MD5 message-digest algorithm, a widely used cryptographic hash function. The administrator approves the processes that may be executed on clients in a Local Area Network by implementing a client-server architecture, and only the processes that match entries in the database table will be executed, whereby many malicious processes are restricted from infecting the operating system. An add-on advantage of the proposed expert system is that it limits CPU usage and minimizes resource utilization. Thus, data and information security is ensured by our system, along with increased performance of the operating system.
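The core whitelist check is straightforward with Python's hashlib; the paths and helper names below are hypothetical examples, and the client-server distribution of the table is omitted:

```python
import hashlib

def md5_of_file(path, chunk_size=65536):
    """MD5 digest of an executable, streamed so large binaries stay cheap."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def is_authorized(path, whitelist):
    """Allow execution only if the binary's hash appears in the admin's table."""
    return md5_of_file(path) in whitelist

# The administrator builds the whitelist from known-good binaries, e.g.:
#   whitelist = {md5_of_file(p) for p in approved_paths}
# and distributes it to LAN clients; a client then gates each process with:
#   if not is_authorized("/usr/bin/example", whitelist): block_process()
# ("approved_paths" and "block_process" are hypothetical names.)
```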

Keywords: virus, worm, Trojan horse, backdoors, ransomware, spyware, adware, scareware, sticky software, process table, MD5, CPU usage and resource utilization

Procedia PDF Downloads 427
459 Development and Evaluation of Virtual Basketball Game Using Motion Capture Technology

Authors: Shunsuke Aoki, Taku Ri, Tatsuya Yamazaki

Abstract:

These days, along with the development of e-sports, video games as a competitive sport are attracting attention. However, in many cases, the action on the screen does not match the player’s real motion, and incorporating player motion is needed to increase the realism and excitement of sports games. Therefore, in this study, the authors propose a method to recognize player motion using motion capture technology and develop a virtual basketball game. The virtual basketball game consists of a screen with nine targets, players, depth sensors, and no ball. The players mime a two-handed basketball shot without a ball, aiming at one of the nine targets on the screen. Time-series data of the three-dimensional coordinates of the player’s joints are captured by the depth sensor; 20 joints are measured for each player to estimate the shooting motion in real time. The trajectory of the thrown virtual ball is calculated based on the time-series data, and hitting the target is judged as success or failure. The virtual basketball game can be played by 2 to 4 players as a competitive game. The developed game was exhibited to the public for evaluation during the authors’ university open campus days; 339 visitors participated in the exhibition and enjoyed the virtual basketball game over the two days. A questionnaire survey on the developed game was conducted among the visitors who experienced it. As a result of the survey, about 97.3% of the players found the game interesting, regardless of whether they had experienced actual basketball before. In addition, women were found to be comfortable with the shooting motion. A virtual game with motion capture technology has the potential to become a universal entertainment bridging e-sports and actual sports.
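The shot evaluation can be sketched as release-velocity estimation from the wrist-joint time series followed by a drag-free ballistic check against a target; the gravity-only model, hit radius, and sample values are assumptions:

```python
import numpy as np

GRAVITY = 9.81  # m/s^2

def hits_target(p0, v0, target_center, radius=0.3):
    """Propagate a drag-free ballistic arc from the release point and
    check whether it passes within radius of the target centre."""
    for t in np.arange(0.0, 3.0, 0.01):
        p = p0 + v0 * t + np.array([0.0, -0.5 * GRAVITY * t**2, 0.0])
        if np.linalg.norm(p - target_center) < radius:
            return True
    return False

# Last three wrist-joint samples from the depth sensor: (x, up, depth) in m
wrist = np.array([[0.0, 1.64, -0.30], [0.0, 1.82, -0.15], [0.0, 2.00, 0.00]])
t_samp = np.array([0.00, 0.04, 0.08])
v0 = (wrist[-1] - wrist[-2]) / (t_samp[-1] - t_samp[-2])  # finite difference
p0 = wrist[-1]                                             # release point
target = np.array([0.0, 2.5, 3.0])                         # one screen target
print(hits_target(p0, v0, target))                         # True
```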

Keywords: basketball, motion capture, questionnaire survey, video game

Procedia PDF Downloads 126
458 Quantifying Stakeholders’ Values of Technical and Vocational Education and Training Provision in Nigeria

Authors: Lidimma Benjamin, Nimmyel Gwakzing, Wuyep Nanyi

Abstract:

Technical and Vocational Education and Training (TVET) has many stakeholders, each with their own values and interests, and its quality depends on how well it aligns with them. This study focuses on the diversity of values and interests within and across groups of stakeholders by quantifying the value that stakeholders attach to several quality attributes of TVET, and by finding out to what extent TVET stakeholders differ in their values. The five stakeholder groups are parents, students, teachers, policy makers, and workplace training supervisors. The nine attributes are employer appreciation of students, graduation rate, students’ computer skills, mentoring hours in workplace learning/the Students Industrial Work Experience Scheme (SIWES), challenge, structure, students’ appreciation of teachers, schooling hours, and attention to civic education. 346 respondents (comprising parents, students, teachers, policy makers, and workplace training supervisors) were repeatedly asked to rank sets of 4 programs, each with a specific value on the nine quality indicators. Conjoint analysis was used to obtain the values that the stakeholders assigned to the nine attributes when evaluating the quality of TVET programs, with rank-ordered logistic regression as the statistical tool for estimating the values respondents assigned to the attributes. The similarities and diversity in the values and interests of the different stakeholders will be of use to both the Nigerian government and TVET colleges in improving the overall quality of education and the match between vocational programs and their stakeholders. Conjoint analysis permits the simultaneous evaluation and combination of information on product attributes; such an approach models the decision environment by confronting a respondent with choices close to real-life choices and is therefore more realistic than traditional survey methods.

Keywords: TVET, vignette study, conjoint analysis, quality perception, educational stakeholders

Procedia PDF Downloads 80
457 Energy Analysis of Sugarcane Production: A Case Study in Metehara Sugar Factory in Ethiopia

Authors: Wasihun Girma Hailemariam

Abstract:

Energy is one of the key elements required for every agricultural activity, especially for large-scale agricultural production such as sugarcane cultivation, which is mostly used to produce sugar and bioethanol. In such resource-intensive activities, analyzing the energy of the production system and looking for alternatives that can reduce the energy inputs of the sugarcane production process are steps forward in resource management. The purpose of this study was to determine the input energy (direct and indirect) per hectare in the sugarcane production sector of the Metehara sugar factory in Ethiopia. The total energy consumption of the production system was 61,642 MJ/ha-yr, a cumulative value of the different inputs (direct and indirect) in the production system. The contribution of these different inputs is discussed, and a scenario of substituting the most influential input with an alternative of equivalent nutrient content is examined. In this study, the most influential input for energy consumption was the application of organic fertilizer, which accounted for 50% of the total. Filter cake, a residue from sugar production in the factory, was used to substitute the organic fertilizer, and the resulting reduction in the energy consumption of sugarcane production is discussed.

Keywords: energy analysis, organic fertilizer, resource management, sugarcane

Procedia PDF Downloads 158
456 High-Accuracy Satellite Image Analysis and Rapid DSM Extraction for Urban Environment Evaluations (Tripoli-Libya)

Authors: Abdunaser Abduelmula, Maria Luisa M. Bastos, José A. Gonçalves

Abstract:

The modeling of the earth’s surface and the evaluation of the urban environment with 3D models is an important research topic. New stereo capabilities of high-resolution optical satellite imagery, such as the tri-stereo mode of Pleiades, combined with new image matching algorithms, are now available and can be applied to urban area analysis. In addition, photogrammetry software packages have gained new, more efficient matching algorithms, such as SGM, as well as improved filters to deal with shadow areas, and can achieve denser and more precise results. This paper describes a comparison between 3D data extracted from tri-stereo and dual-stereo satellite images, combined with pixel-based matching and the Wallis filter. The aim was to improve the accuracy of 3D models, especially in urban areas, in order to assess whether satellite images are appropriate for a rapid evaluation of urban environments. The results showed that the 3D models achieved with Pleiades tri-stereo outperformed, both in terms of accuracy and detail, the result obtained from a GeoEye pair. The assessment was made against reference digital surface models derived from high-resolution aerial photography. This suggests that tri-stereo images can be successfully used for the proposed urban change analyses.

Keywords: 3D models, environment, matching, pleiades

Procedia PDF Downloads 330
455 Predicting Match Outcomes in Team Sport via Machine Learning: Evidence from National Basketball Association

Authors: Jacky Liu

Abstract:

This paper develops a team sports outcome prediction system with potential for wide-ranging applications across various disciplines. Despite significant advancements in predictive analytics, existing studies in sports outcome prediction have considerable limitations, including insufficient feature engineering and underutilization of advanced machine learning techniques, among others. To address these issues, we extend the Sports Cross Industry Standard Process for Data Mining (SRP-CRISP-DM) framework and propose a unique, comprehensive predictive system, using National Basketball Association (NBA) data as an example to test this extended framework. Our approach follows a holistic methodology in feature engineering, employing both time-series and non-time-series data, as well as conducting exploratory data analysis and feature selection. Furthermore, we contribute to the discourse on target variable choice in team sports outcome prediction, asserting that point spread prediction yields higher profits than game-winner prediction. Using machine learning algorithms, particularly XGBoost, results in a significant improvement in the predictive accuracy of team sports outcomes. Applied to point spread betting strategies, it offers an astounding annual return of approximately 900% on an initial investment of $100. Our findings not only contribute to the academic literature but have critical practical implications for sports betting. Our study advances the understanding of team sports outcome prediction, a burgeoning area in complex system prediction, and paves the way for potential profitability and more informed decision making in sports betting markets.
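The modeling step can be sketched with the XGBoost regressor on a point spread target; the features and data below are synthetic stand-ins for the paper's engineered NBA features:

```python
import numpy as np
from xgboost import XGBRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Toy engineered features per game (e.g. rolling averages, rest days),
# standing in for the paper's time-series and non-time-series features.
rng = np.random.default_rng(7)
n = 2000
X = rng.normal(size=(n, 6))
spread = 4.0 * X[:, 0] - 2.5 * X[:, 1] + rng.normal(0, 3, n)  # home margin

X_tr, X_te, y_tr, y_te = train_test_split(X, spread, test_size=0.2,
                                          shuffle=False)  # respect time order
model = XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
model.fit(X_tr, y_tr)
print("MAE:", mean_absolute_error(y_te, model.predict(X_te)))
```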

Keywords: machine learning, team sports, game outcome prediction, sports betting, profits simulation

Procedia PDF Downloads 102
454 Cascaded Transcritical/Supercritical CO2 Cycles and Organic Rankine Cycles to Recover Low-Temperature Waste Heat and LNG Cold Energy Simultaneously

Authors: Haoshui Yu, Donghoi Kim, Truls Gundersen

Abstract:

Low-temperature waste heat is abundant in the process industries, and large amounts of Liquefied Natural Gas (LNG) cold energy are discarded without being recovered properly in LNG terminals. Power generation is an effective way to utilize low-temperature waste heat and LNG cold energy simultaneously. Organic Rankine Cycles (ORCs) and CO2 power cycles are promising technologies to convert low-temperature waste heat and LNG cold energy into electricity. If waste heat and LNG cold energy are utilized simultaneously in one system, the performance may outperform separate systems utilizing low-temperature waste heat and LNG cold energy individually. Low-temperature waste heat acts as the heat source and LNG regasification acts as the heat sink in the combined system. Due to the large temperature difference between the heat source and the heat sink, cascaded power cycle configurations are proposed in this paper. Cascaded power cycles can improve the energy efficiency of the system considerably. The cycle operating at a higher temperature to recover waste heat is called the top cycle, and the cycle operating at a lower temperature to utilize LNG cold energy is called the bottom cycle in this study. The top cycle condensation heat is used as the heat source in the bottom cycle. The top cycle can be an ORC, transcritical CO2 (tCO2) cycle or supercritical CO2 (sCO2) cycle, while the bottom cycle can only be an ORC due to its low temperature range. However, the thermodynamic paths of the tCO2 and sCO2 cycles are different from that of an ORC: the tCO2 and sCO2 cycles perform better than an ORC for sensible waste heat recovery due to a better temperature match with the waste heat source. Different combinations of the tCO2 cycle, sCO2 cycle and ORC are compared to screen the best configurations of the cascaded power cycles. The influence of the working fluid and the operating conditions is also investigated in this study. Each configuration is modeled and optimized in Aspen HYSYS. The results show that the cascaded tCO2/ORC performs better compared with the cascaded ORC/ORC and cascaded sCO2/ORC for the case study.
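As a back-of-envelope check on why cascading helps, idealized Carnot bounds can be combined across the two temperature ranges; the temperatures below are assumed, and the real study optimizes full cycle models in Aspen HYSYS rather than ideal bounds:

```python
def carnot_efficiency(t_hot, t_cold):
    """Upper-bound (Carnot) conversion efficiency between two temperatures (K)."""
    return 1.0 - t_cold / t_hot

# Waste heat at ~200 C drives the top cycle; LNG regasification at ~111 K
# absorbs the bottom cycle's condensation heat. The intermediate temperature
# (top-cycle condensation = bottom-cycle heat source) is an assumed 280 K.
t_hot, t_mid, t_cold = 473.0, 280.0, 111.0

eta_top = carnot_efficiency(t_hot, t_mid)
eta_bot = carnot_efficiency(t_mid, t_cold)
# Cascade: the bottom cycle is driven by the heat the top cycle rejects
eta_cascade = eta_top + (1.0 - eta_top) * eta_bot
print(f"top {eta_top:.2%}, bottom {eta_bot:.2%}, cascade {eta_cascade:.2%}")
```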

Keywords: LNG cold energy, low-temperature waste heat, organic Rankine cycle, supercritical CO₂ cycle, transcritical CO₂ cycle

Procedia PDF Downloads 260
453 Non-Targeted Adversarial Image Classification Attack-Region Modification Methods

Authors: Bandar Alahmadi, Lethia Jackson

Abstract:

Machine learning models are used today in many real-life applications. The safety and security of such models is important, so that their results are as accurate as possible. One challenge of machine learning model security is the adversarial examples attack. Adversarial examples are designed by an attacker to cause a machine learning model to misclassify its input. We propose a method to generate adversarial examples to attack image classifiers. We modify successfully classified images so that a classifier misclassifies them after the modification. In our method, we do not update the whole image; instead, we detect the important region, modify it, place it back in the original image, and then run it through a classifier. The algorithm modifies the detected region using two methods. First, it adds an abstract image matrix behind the detected image matrix. Then, it performs a rotation attack, rotating the detected region around its axes and embedding the trace of the image in the image background. Finally, the attacked region is placed back in its original position, from where it was removed, and a smoothing filter is applied to blend the background with the foreground. We tested our method on a cascade classifier, and the algorithm is efficient; the classifier confidence dropped to almost zero. We also tried it on a CNN (Convolutional Neural Network) with higher settings, and the algorithm worked successfully.
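The region-modification step can be sketched with NumPy/SciPy: rotate only a detected region, paste it back, and smooth the seam. The region box, rotation angle, and blur strength here are illustrative assumptions, not the paper's settings:

```python
import numpy as np
from scipy import ndimage

def rotate_region_attack(image, box, angle=25.0):
    """Rotate only the detected salient region about its centre, paste it
    back, then lightly blur so foreground blends with background."""
    y0, y1, x0, x1 = box
    region = image[y0:y1, x0:x1].copy()
    rotated = ndimage.rotate(region, angle, reshape=False, mode="nearest")
    attacked = image.copy()
    attacked[y0:y1, x0:x1] = rotated
    return ndimage.gaussian_filter(attacked, sigma=0.8)  # smoothing filter

# Grayscale toy image with a bright "object" where a detector would fire
img = np.zeros((64, 64))
img[20:40, 20:40] = 1.0
adv = rotate_region_attack(img, box=(18, 42, 18, 42))
# adv would then be fed back to the classifier to check for misclassification
```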

Keywords: adversarial examples, attack, computer vision, image processing

Procedia PDF Downloads 339
452 Combination between Intrusion Systems and Honeypots

Authors: Majed Sanan, Mohammad Rammal, Wassim Rammal

Abstract:

Today, security is a major concern. Intrusion detection and prevention systems and honeypots can be used to moderate attacks. Many researchers have proposed various IDSs (Intrusion Detection Systems) from time to time. Some of these combine the features of two or more IDSs and are called hybrid intrusion detection systems; most combine a signature-based detection methodology with an anomaly-based detection methodology. For a signature-based IDS, if an attacker attacks slowly and in an organized way, the attack may go undetected, as signatures include factors based on the duration of events and the attacker’s actions do not match them. Sometimes there is no updated signature for an unknown attack, or an attacker strikes while the database is being updated; thus, signature-based IDSs fail to detect unknown attacks. Anomaly-based IDSs suffer from many false-positive readings. So there is a need to hybridize IDSs that can overcome each other’s shortcomings. In this paper, we propose a new approach to the IDS that is more efficient than the traditional IDS. The IDS is based on honeypot technology and an anomaly-based detection methodology. We designed an architecture for the IDS in a packet tracer and then implemented it in real time. We discuss the experiments performed: both the honeypot and the anomaly-based IDS have some shortcomings, but if these two technologies are hybridized, the newly proposed Hybrid Intrusion Detection System (HIDS) is capable of overcoming these shortcomings with much-enhanced performance. We present a modified HIDS that combines the positive features of two different detection methodologies: honeypot methodology and anomaly-based intrusion detection. In the experiment, we ran each intrusion detection system individually first and then together, recording the data from time to time. From the data, we conclude that the resulting IDS is much better at detecting intrusions than the existing IDSs.
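A minimal sketch of the hybrid rule set: any contact with the honeypot is hostile by definition, while other traffic is scored against a learned baseline. The rate-based anomaly feature and thresholds are assumptions for illustration, not the paper's configuration:

```python
import statistics

class HybridIDS:
    """Sketch of a hybrid IDS: connections to the honeypot are treated as
    hostile by definition, and an anomaly detector flags traffic whose
    rate deviates strongly from a learned baseline."""

    def __init__(self, honeypot_ips, baseline_rates, z_threshold=3.0):
        self.honeypot_ips = set(honeypot_ips)
        self.mean = statistics.mean(baseline_rates)
        self.std = statistics.stdev(baseline_rates)
        self.z_threshold = z_threshold

    def inspect(self, src_ip, dst_ip, packets_per_sec):
        if dst_ip in self.honeypot_ips:            # honeypot: no false positives
            return "ALERT: honeypot touched by " + src_ip
        z = (packets_per_sec - self.mean) / self.std
        if abs(z) > self.z_threshold:              # anomaly-based rule
            return f"ALERT: anomalous rate from {src_ip} (z={z:.1f})"
        return "ok"

ids = HybridIDS(honeypot_ips={"10.0.0.99"},
                baseline_rates=[40, 55, 48, 52, 60, 45])
print(ids.inspect("192.168.1.7", "10.0.0.99", 10))   # honeypot alert
print(ids.inspect("192.168.1.8", "10.0.0.5", 400))   # anomaly alert
```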

Keywords: security, intrusion detection, intrusion prevention, honeypot, anomaly-based detection, signature-based detection, cloud computing, kfsensor

Procedia PDF Downloads 382
451 Liquid-Liquid Plug Flow Characteristics in Microchannel with T-Junction

Authors: Anna Yagodnitsyna, Alexander Kovalev, Artur Bilsky

Abstract:

The efficiency of certain technological processes in two-phase microfluidics, such as emulsion production, nanomaterial synthesis, nitration, and extraction, depends on the two-phase flow regimes in the microchannels. For practical application in chemistry and biochemistry, it is very important to predict the expected flow pattern for a large variety of fluids and channel geometries. In the case of immiscible liquids, plug flow is a typical and optimal regime for chemical reactions, and it needs to be predicted from empirical data or correlations. In this work, the flow patterns of immiscible liquid-liquid flow in a rectangular microchannel with a T-junction are investigated. Three liquid-liquid flow systems are considered, viz. kerosene – water, paraffin oil – water and castor oil – paraffin oil. Different flow patterns, such as parallel flow, slug flow, plug flow, dispersed (droplet) flow, and rivulet flow, are observed at different velocity ratios. A new flow pattern, parallel flow with a steady wavy interface (serpentine flow), has been found. It is shown that flow pattern maps based on Weber numbers for the different liquid-liquid systems do not match well. The Weber number multiplied by the Ohnesorge number is proposed as a parameter to generalize the flow maps; flow maps based on this parameter superpose well for all the liquid-liquid systems of this work and of other experiments. Plug length and velocity are measured for the plug flow regime. When the dispersed liquid wets the channel walls, the plug length cannot be predicted by known empirical correlations. By means of the particle tracking velocimetry technique, instantaneous velocity fields in the plug flow regime were measured. Flow circulation inside the plug was calculated using the velocity data, which can be useful for mass flux prediction in chemical reactions.
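The proposed generalization parameter is simply the product We*Oh computed per liquid-liquid system. A minimal sketch with assumed (typical) fluid properties, not the measured values of this work:

```python
import math

def weber(rho, velocity, length, sigma):
    """We = rho * v^2 * L / sigma -- inertia vs. interfacial tension."""
    return rho * velocity**2 * length / sigma

def ohnesorge(mu, rho, sigma, length):
    """Oh = mu / sqrt(rho * sigma * L) -- viscosity vs. inertia/capillarity."""
    return mu / math.sqrt(rho * sigma * length)

# Kerosene-water in a 200 um channel (typical property values, assumed)
rho, mu = 998.0, 1.0e-3          # water density (kg/m^3), viscosity (Pa s)
sigma = 0.040                    # interfacial tension (N/m), assumed
L, v = 200e-6, 0.05              # channel scale (m), superficial velocity (m/s)

We = weber(rho, v, L, sigma)
Oh = ohnesorge(mu, rho, sigma, L)
print(f"We = {We:.3e}, Oh = {Oh:.3e}, We*Oh = {We*Oh:.3e}")
```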

Keywords: flow patterns, hydrodynamics, liquid-liquid flow, microchannel

Procedia PDF Downloads 394