Search results for: data comparison
27803 Router 1X3 - RTL Design and Verification
Authors: Nidhi Gopal
Abstract:
Routing is the process of moving a packet of data from source to destination; it enables messages to pass from one computer to another and eventually reach the target machine. A router is a networking device that forwards data packets between computer networks. It is connected to two or more data lines from different networks (as opposed to a network switch, which connects data lines from one single network). This paper mainly emphasizes the study of the router device, its top-level architecture, and how the various sub-modules of the router, i.e. the register, FIFO, FSM, and synchronizer, are synthesized, simulated, and finally connected to the top module.
Keywords: data packets, networking, router, routing
Procedia PDF Downloads 814
27802 Comparison between Ultra-High-Performance Concrete and Ultra-High-Performance-Glass Concrete
Authors: N. A. Soliman, A. F. Omran, A. Tagnit-Hamou
Abstract:
Finely ground waste glass has been successfully used by the authors to develop and patent an ecological ultra-high-performance concrete (UHPC), named ultra-high-performance-glass concrete (UHPGC). After its successful development in the laboratory, the current research presents a comparison between traditional UHPC and UHPGC produced in a large-scale pilot-plant mixer, in terms of rheological, mechanical, and durability properties. The rheology of the UHPGCs was improved owing to the non-absorptive nature of the glass particles. The mechanical performance of UHPGC was comparable and very close to that of traditional UHPC due to the pozzolanic reactivity of the amorphous waste glass. The UHPGC also showed excellent durability: negligible permeability (chloride-ion ≈ 20 Coulombs in the RCPT test), high abrasion resistance (volume loss index less than 1.3), and almost no freeze-thaw deterioration even after 1000 freeze-thaw cycles. The enhancement in the strength and rigidity of the UHPGC mixture can be attributed to the inclusion of glass particles, which have very high strength and elastic modulus.
Keywords: ground glass pozzolan, large-scale production, sustainability, ultra-high performance glass concrete
Procedia PDF Downloads 157
27801 Explanatory Variables for Crash Injury Risk Analysis
Authors: Guilhermina Torrao
Abstract:
An extensive number of studies have been conducted to determine the factors that influence crash injury risk (CIR); however, uncertainties inherent in the selected variables have been neglected. A review of the existing literature is required not only to obtain an overview of the variables and measures but also to ascertain the implications of comparing studies without a systematic view of variable taxonomy. Therefore, the aim of this literature review is to examine and report on peer-reviewed studies in the field of crash analysis and to understand the implications of broad variations in variable selection in CIR analysis. The objective of this study is to demonstrate the variance in variable selection and classification when modeling injury risk involving occupants of light vehicles by presenting an analytical review of the literature. Based on data collected from 64 journal publications reported over the past 21 years, the analytical review discusses the variables selected by each study across an organized list of predictors for CIR analysis and provides a better understanding of the contribution of accident and vehicle factors to injuries sustained by occupants of light vehicles. A cross-comparison analysis demonstrates that almost half of the studies (48%) did not consider vehicle design specifications (e.g., vehicle weight), whereas, for those that did, vehicle age/model year was the most frequently selected explanatory variable, used by 41% of the studies. Among studies that included a speed risk factor in their analyses, the majority (64%) used the legal speed limit as a proxy for vehicle speed at the moment of a crash, imposing limitations on CIR analysis and modeling. Despite the proven efficiency of airbags in minimizing injury impact following a crash, only 22% of studies included airbag deployment data. A major contribution of this study is to highlight the uncertainty linked to explanatory variable selection and to identify opportunities for improvement in future studies in the field of road injuries.
Keywords: crash, exploratory, injury, risk, variables, vehicle
Procedia PDF Downloads 135
27800 A Performance Analysis Study for Cloud Based ERP Systems
Authors: Burak Erkayman
Abstract:
Manufacturing and service organizations need ERP systems to integrate many functions, from purchasing to storage and from production planning to cost calculation. Using ERP systems to integrate information provides companies with remarkable advantages in terms of profitability, productivity, and process efficiency. Cloud computing is one of the most significant changes in information and communication technology, and developments in cloud computing are attracting the business world to take advantage of this field. Cloud computing offers much more storage, greater cost savings, and faster data transfer rates. In addition, it presents new business models, new fields of study, and practicable solutions for general use. These developments make the migration of ERP systems to the cloud inevitable. In this study, the performance of ERP systems in a cloud environment is analyzed through various performance criteria, and a comparison between traditional and cloud ERP systems is presented. At the end of the study, the transformation and future of ERP systems are discussed.
Keywords: cloud-ERP, ERP system performance, information system transformation
Procedia PDF Downloads 529
27799 A Double Acceptance Sampling Plan for Truncated Life Test Having Exponentiated Transmuted Weibull Distribution
Authors: A. D. Abdellatif, A. N. Ahmed, M. E. Abdelaziz
Abstract:
The main purpose of this paper is to design a double acceptance sampling plan under a time-truncated life test when the product lifetime follows an exponentiated transmuted Weibull distribution. Here, the motive is to meet both the consumer's risk and the producer's risk simultaneously at the specified quality levels, while the termination time is fixed in advance. A comparison between the results of the double and single acceptance sampling plans is conducted. We demonstrate the applicability of our results to real data sets.
Keywords: double sampling plan, single sampling plan, producer's risk, consumer's risk, exponentiated transmuted Weibull distribution, time-truncated experiment, single, double, Marshall-Olkin
Procedia PDF Downloads 488
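The acceptance logic of such a double plan can be made concrete with a short numerical sketch. The snippet below is a minimal illustration, not the authors' design procedure: it assumes one common parameterization of the exponentiated transmuted Weibull CDF, F(t) = [(1+lam)*G(t) - lam*G(t)^2]^alpha with G(t) = 1 - exp(-(t/theta)^beta), and evaluates the lot-acceptance probability of a double plan (n1, c1; n2, c2) under binomial sampling; all parameter values are hypothetical.

```python
from math import exp
from scipy.stats import binom

def etw_cdf(t, beta, theta, lam, alpha):
    """One common form of the exponentiated transmuted Weibull CDF
    (assumed here for illustration): F(t) = [(1+lam)G - lam*G^2]^alpha,
    with G the two-parameter Weibull CDF."""
    g = 1.0 - exp(-((t / theta) ** beta))
    return ((1.0 + lam) * g - lam * g * g) ** alpha

def double_plan_acceptance(p, n1, c1, n2, c2):
    """Lot-acceptance probability of a double sampling plan.

    Accept if the first sample of n1 items has d1 <= c1 failures;
    reject outright if d1 > c2; otherwise draw n2 more items and
    accept if d1 + d2 <= c2."""
    pa = binom.cdf(c1, n1, p)                      # accept on first sample
    for d1 in range(c1 + 1, c2 + 1):               # undecided region
        pa += binom.pmf(d1, n1, p) * binom.cdf(c2 - d1, n2, p)
    return pa

# Hypothetical plan and distribution parameters (for illustration only).
p_fail = etw_cdf(t=500, beta=1.5, theta=1000, lam=0.3, alpha=2.0)
print(f"P(item fails before truncation time): {p_fail:.4f}")
print(f"P(accept lot): {double_plan_acceptance(p_fail, 20, 1, 20, 3):.4f}")
```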
27798 Noise Reduction in Web Data: A Learning Approach Based on Dynamic User Interests
Authors: Julius Onyancha, Valentina Plekhanova
Abstract:
One of the significant issues facing web users is the amount of noise in web data, which hinders the process of finding useful information related to their dynamic interests. Current research considers noise to be any data that does not form part of the main web page and proposes noise reduction tools that mainly focus on eliminating noise related to the content and layout of web data. This paper argues that not all data that form part of the main web page are of interest to a user, and not all noise data are actually noise to a given user. Therefore, learning the noise web data allocated to user requests ensures not only a reduction of the noise level in a web user profile but also a decrease in the loss of useful information, and hence improves the quality of the web user profile. A Noise Web Data Learning (NWDL) tool/algorithm capable of learning noise web data in a web user profile is proposed. The proposed work considers the elimination of noise data in relation to dynamic user interest. In order to validate the performance of the proposed work, an experimental design is presented. The results obtained are compared with current algorithms applied in the noise web data reduction process. The experimental results show that the proposed work considers the dynamic change of user interest prior to the elimination of noise data. The proposed work contributes towards improving the quality of a web user profile by reducing the amount of useful information eliminated as noise.
Keywords: web log data, web user profile, user interest, noise web data learning, machine learning
Procedia PDF Downloads 265
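The NWDL algorithm itself is not specified in this abstract, so the sketch below is only a loose, hypothetical illustration of the core idea: that "noise" should be judged against an evolving interest profile rather than against page layout alone. It scores page blocks by cosine similarity to an interest vector updated after each request; every name, the threshold, and the update rule are assumptions.

```python
import numpy as np

def cosine(a, b):
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

class DynamicInterestFilter:
    """Toy interest-relative noise filter (illustrative only).

    Page blocks are bag-of-words vectors; a block counts as noise for
    *this* user only if it is dissimilar to the current interest profile,
    which drifts toward whatever the user actually requests."""

    def __init__(self, vocab_size, threshold=0.2, decay=0.9):
        self.profile = np.zeros(vocab_size)
        self.threshold = threshold
        self.decay = decay  # how quickly old interests fade

    def update(self, request_vec):
        # Exponential moving average of the user's request vectors.
        self.profile = self.decay * self.profile + (1 - self.decay) * request_vec

    def split_blocks(self, blocks):
        # Keep blocks similar to the current interests; the rest is
        # treated as noise *for this user at this moment*.
        keep = [b for b in blocks if cosine(b, self.profile) >= self.threshold]
        noise = [b for b in blocks if cosine(b, self.profile) < self.threshold]
        return keep, noise
```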
27797 Endocardial Ultrasound Segmentation Using Level Set Method
Authors: Daoudi Abdelaziz, Mahmoudi Saïd, Chikh Mohamed Amine
Abstract:
This paper presents a fully automatic segmentation method for the left ventricle at end-systole (ES) and end-diastole (ED) in ultrasound images, by means of an implicit deformable model (level set) based on the geodesic active contour model. A pre-processing Gaussian smoothing stage is applied to the image, which is essential for a good segmentation. Before the segmentation phase, we automatically locate the area of the left ventricle by using a detection approach based on the Hough transform. The result obtained is then used to automate the initialization of the level set model. This initial curve (zero level set) deforms to search for the endocardial border in the image. Finally, a quantitative evaluation was performed on a data set composed of 15 subjects, with a comparison to ground truth (manual segmentation).
Keywords: level set method, Hough transform, Gaussian smoothing, left ventricle, ultrasound images
Procedia PDF Downloads 465
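The described pipeline (Gaussian smoothing, Hough-based localization of the left ventricle, then a geodesic active contour evolved from that initialization) can be approximated with off-the-shelf tools. The sketch below uses scikit-image and is not the authors' implementation; the radius range, iteration count, and contour parameters are assumed values.

```python
import numpy as np
from skimage.feature import canny
from skimage.filters import gaussian
from skimage.transform import hough_circle, hough_circle_peaks
from skimage.segmentation import (inverse_gaussian_gradient,
                                  morphological_geodesic_active_contour)

def segment_lv(image):
    # 1) Pre-processing: Gaussian smoothing of the ultrasound frame.
    smoothed = gaussian(image, sigma=2.0)

    # 2) Localization: circular Hough transform to find the LV cavity.
    edges = canny(smoothed, sigma=1.0)
    radii = np.arange(15, 60, 2)                  # assumed radius range (px)
    accum = hough_circle(edges, radii)
    _, cx, cy, r = hough_circle_peaks(accum, radii, total_num_peaks=1)

    # 3) Initial zero level set: a disk around the detected cavity.
    yy, xx = np.mgrid[:image.shape[0], :image.shape[1]]
    init_ls = ((yy - cy[0]) ** 2 + (xx - cx[0]) ** 2 < r[0] ** 2).astype(np.int8)

    # 4) Geodesic active contour evolves toward the endocardial border.
    gimage = inverse_gaussian_gradient(smoothed)
    return morphological_geodesic_active_contour(
        gimage, 200, init_level_set=init_ls, smoothing=2, balloon=-1)
```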
27796 An Analytic Comparison between Arabic and English Prosodies: Poetical Feet and Meters
Authors: Jamil Jafari, Sharafat Karimi
Abstract:
The Arabic language has a complicated system of prosody invented by the great grammarian Khalil Ibn Ahmad Farahidi, who extracted 15 meters from his innovative five circles; these meters were used in the Arabic poetry of the 7th and 8th centuries. Later, his student Akhfash added one more meter to his tutor's meters, so overall we now have 16 different meters in Arabic poetry. These meters are formed by various combinations of 8 different feet, and each foot is a combination of rudimentary units called sabab and watad, which are themselves combinations of moving (/) and silent (ʘ) letters. In English, on the other hand, we deal with another system of metrical prosody: feet consist of stressed and unstressed syllables and are of six types: iamb, trochee, dactyl, anapest, spondee, and pyrrhic. Using the descriptive-analytic method, this research aims at comparing the Arabic and English systems of metrical prosody to investigate their similarities and differences. The results show that both systems are quantitative and both rely on syllables within a foot. But unlike Arabic, English uses a different rhyme system, the number of feet in a line differs from Arabic, and its feet are combinations of stressed and unstressed syllables, while those of Arabic are combinations of moving and silent letters.
Keywords: Arabic prosody, English prosody, foot, meter, poetry
Procedia PDF Downloads 146
27795 Data Mining and Knowledge Management Application to Enhance Business Operations: An Exploratory Study
Authors: Zeba Mahmood
Abstract:
Modern business organizations are adopting technological advancements to achieve a competitive edge and satisfy their consumers. Developments in the field of information technology have changed the way business is conducted today. Business operations now rely more on the data they obtain, and this data is continuously increasing in volume. Data stored in different locations is difficult to find and use without the effective implementation of data mining and knowledge management techniques. Organizations that smartly identify, obtain, and then convert data into useful formats for decision making and operational improvements create additional value for their customers and enhance their operational capabilities. Marketing and customer relationship departments of firms use data mining techniques to make relevant decisions; this paper emphasizes the identification of the different data mining and knowledge management techniques that are applied in different business industries. The challenges and issues in the execution of these techniques are also discussed and critically analyzed in this paper.
Keywords: knowledge, knowledge management, knowledge discovery in databases, business, operational, information, data mining
Procedia PDF Downloads 538
27794 Indexing and Incremental Approach Using Map Reduce Bipartite Graph (MRBG) for Mining Evolving Big Data
Authors: Adarsh Shroff
Abstract:
Big data is a collection of data sets so large and complex that they become difficult to process using database management tools. Operations such as search, analysis, and visualization are performed on big data using data mining, the process of extracting patterns or knowledge from large data sets. In recent years it has been observed that data mining results become stale and obsolete over time as the underlying data evolve. Incremental processing is a promising approach to refreshing mining results: it utilizes previously saved states to avoid the expense of re-computation from scratch. This project uses i2MapReduce, an incremental processing extension to MapReduce, the most widely used framework for mining big data. i2MapReduce performs key-value pair level incremental processing rather than task-level re-computation, supports not only one-step computation but also more sophisticated iterative computation, which is widely used in data mining applications, and incorporates a set of novel techniques to reduce the I/O overhead of accessing preserved fine-grain computation states. To assess the mining results, i2MapReduce is evaluated using a one-step algorithm and three iterative algorithms with diverse computation characteristics for efficient mining.
Keywords: big data, map reduce, incremental processing, iterative computation
Procedia PDF Downloads 351
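i2MapReduce itself is a MapReduce-level system; purely as an illustration of key-value pair level incremental processing, the toy sketch below keeps per-key reduce state for a word count and applies only the delta of a changed input instead of recomputing from scratch. The state layout and delta format are assumptions, not the system's actual design.

```python
from collections import Counter, defaultdict

class IncrementalWordCount:
    """Toy key-value-level incremental computation (illustrative only).

    Instead of re-running the whole job when one document changes, we
    subtract that document's old map output from the preserved per-key
    state and add its new output; the delta touches only keys that
    actually changed."""

    def __init__(self):
        self.doc_outputs = {}                  # doc_id -> Counter of words
        self.state = defaultdict(int)          # preserved reduce state

    def upsert(self, doc_id, text):
        new = Counter(text.split())            # map phase for this doc
        old = self.doc_outputs.get(doc_id, Counter())
        for word in set(old) | set(new):       # key-value-level delta
            self.state[word] += new[word] - old[word]
            if self.state[word] == 0:
                del self.state[word]
        self.doc_outputs[doc_id] = new

wc = IncrementalWordCount()
wc.upsert("d1", "big data mining")
wc.upsert("d1", "big data analytics")          # only changed keys updated
print(dict(wc.state))                          # {'big': 1, 'data': 1, 'analytics': 1}
```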
27793 The Accuracy of an In-House Developed Computer-Assisted Surgery Protocol for Mandibular Micro-Vascular Reconstruction
Authors: Christophe Spaas, Lies Pottel, Joke De Ceulaer, Johan Abeloos, Philippe Lamoral, Tom De Backer, Calix De Clercq
Abstract:
We aimed to evaluate the accuracy of an in-house developed, low-cost computer-assisted surgery (CAS) protocol for osseous free-flap mandibular reconstruction. All patients who underwent primary or secondary mandibular reconstruction with a free (solely or composite) osseous flap, either a fibula free flap or an iliac crest free flap, between January 2014 and December 2017 were evaluated. The low-cost protocol consisted of a virtual surgical plan, a pre-bent custom reconstruction plate, and an individualized free-flap positioning guide. The accuracy of the protocol was evaluated by comparing the postoperative outcome with the 3D virtual plan, based on measurement of the following parameters: intercondylar distance, mandibular angle (axial and sagittal), inner angular distance, anterior-posterior distance, length of the fibular/iliac crest segments, and osteotomy angles. A statistical analysis of the obtained values was performed. Virtual 3D surgical planning and cutting guide design were performed with Proplan CMF® software (Materialise, Leuven, Belgium) and IPS Gate (KLS Martin, Tuttlingen, Germany). Segmentation of the DICOM data as well as outcome analysis were done with BrainLab iPlan® software (Brainlab AG, Feldkirchen, Germany). A cost analysis of the protocol was also done. Twenty-two patients (11 fibula / 11 iliac crest) were included and analyzed. Based on voxel-based registration on the cranial base, 3D virtual planning landmark parameters did not differ significantly from those measured on the actual treatment outcome (p-values > 0.05). A cost evaluation of the in-house developed CAS protocol revealed a 1750 euro cost reduction in comparison with a standard CAS protocol with a patient-specific reconstruction plate. Our results indicate that an accurate transfer of the planning with our in-house developed low-cost CAS protocol is feasible at a significantly lower cost.
Keywords: CAD/CAM, computer-assisted surgery, low-cost, mandibular reconstruction
Procedia PDF Downloads 141
27792 Dry Relaxation Shrinkage Prediction of Bordeaux Fiber Using a Feed Forward Neural Network
Authors: Baeza S. Roberto
Abstract:
Knitted fabric suffers deformation in its dimensions due to stretching and tension factors, transverse and longitudinal respectively, during processing on rectilinear knitting machines, so a dry relaxation shrinkage procedure and a thermal prefixing action are performed to obtain stable conditions in the knitting. This paper presents dry relaxation shrinkage prediction for Bordeaux fiber using a feed-forward neural network and linear regression models. Six operational alternatives for shrinkage were predicted. A comparison of the results was performed, finding that the neural network models achieved higher levels of explained variability and better prediction. The presence of different repose periods is included. The models were obtained with the neural toolbox of Matlab and with Minitab software, using real data from a knitting company in southern Guanajuato. The results allow predicting the dry relaxation shrinkage of each operational alternative.
Keywords: neural network, dry relaxation, knitting, linear regression
Procedia PDF Downloads 585
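The paper's models were fitted in Matlab and Minitab on proprietary mill data; as a rough stand-in, the sketch below fits a small feed-forward network and a linear regression to synthetic process data with scikit-learn and compares explained variability (R²). The feature names, network size, and response function are assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Synthetic stand-in for process data: [tension, stretch, repose hours].
X = rng.uniform([1.0, 0.5, 0.0], [5.0, 3.0, 48.0], size=(300, 3))
# Hypothetical nonlinear shrinkage response (%) plus measurement noise.
y = 2.0 + 0.8 * X[:, 0] + 0.05 * X[:, 1] ** 2 - 0.3 * np.log1p(X[:, 2])
y += rng.normal(0, 0.1, size=len(y))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

linear = LinearRegression().fit(X_tr, y_tr)
net = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
).fit(X_tr, y_tr)

for name, model in [("linear regression", linear), ("feed-forward NN", net)]:
    print(f"{name}: R^2 = {r2_score(y_te, model.predict(X_te)):.3f}")
```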
27791 Analyzing Large Scale Recurrent Event Data with a Divide-And-Conquer Approach
Authors: Jerry Q. Cheng
Abstract:
Currently, in analyzing large-scale recurrent event data, there are many challenges, such as memory limitations and unscalable computing time. In this research, a divide-and-conquer method is proposed using parametric frailty models. Specifically, the data are randomly divided into many subsets, and the maximum likelihood estimator from each individual data set is obtained. Then a weighted method is proposed to combine these individual estimators into the final estimator. It is shown that this divide-and-conquer estimator is asymptotically equivalent to the estimator based on the full data. Simulation studies are conducted to demonstrate the performance of the proposed method. The approach is applied to a large real dataset of repeated heart-failure hospitalizations.
Keywords: big data analytics, divide-and-conquer, recurrent event data, statistical computing
Procedia PDF Downloads 166
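The combination step is easy to illustrate with a toy model. The sketch below uses an exponential event-rate model rather than the paper's parametric frailty models, so each subset MLE has a closed form, and it combines the subset estimates by inverse-variance weighting, one natural instance of a weighted combination; the weighting rule here is an assumption.

```python
import numpy as np

def dac_rate_estimate(gap_times, n_subsets=10, rng=None):
    """Divide-and-conquer MLE of an exponential event rate (toy model).

    Each subset yields a closed-form MLE lambda_k = 1 / mean(gaps) with
    approximate variance lambda_k^2 / n_k; the final estimate is the
    inverse-variance weighted average of the subset estimates."""
    rng = rng or np.random.default_rng(0)
    shuffled = rng.permutation(gap_times)
    estimates, weights = [], []
    for chunk in np.array_split(shuffled, n_subsets):
        lam = 1.0 / chunk.mean()               # subset MLE
        var = lam ** 2 / len(chunk)            # asymptotic variance of MLE
        estimates.append(lam)
        weights.append(1.0 / var)
    return np.average(estimates, weights=weights)

rng = np.random.default_rng(42)
gaps = rng.exponential(scale=1 / 0.8, size=1_000_000)  # true rate 0.8
print(f"full-data MLE     : {1.0 / gaps.mean():.4f}")
print(f"divide-and-conquer: {dac_rate_estimate(gaps, n_subsets=20, rng=rng):.4f}")
```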
27790 Optimization of a Hybrid PV-Diesel Minigrid System: A Case Study of Vimtim-Mubi, Nigeria
Authors: Julius Agaka Yusufu, Tsutomu Dei, Hanif Ibrahim Awal
Abstract:
This study develops an optimal PV-diesel hybrid power system tailored to the specific energy landscape of Vimtim, Mubi, Nigeria, utilizing real-world wind speed, solar radiation, and diesel cost data. Employing HOMER simulation, the research assesses the technical and financial viability of this hybrid configuration. Additionally, a rigorous performance comparison is conducted between the PV-diesel system and the conventional grid-connected alternative. The comparison offers crucial insights into the potential advantages and economic feasibility of adopting hybrid renewable energy solutions in regions grappling with energy access and reliability challenges, with implications for sustainable electrification efforts in similar communities worldwide.
Keywords: Vimtim-Nigeria, HOMER, renewable energy, PV-diesel hybrid system
Procedia PDF Downloads 86
27789 Physicochemical Attributes of Pectin Hydrogel and Its Wound Healing Activity
Authors: Nor Khaizan Anuar, Nur Karimah Aziz, Tin Wui Wong, Ahmad Sazali Hamzah, Wan Rozita Wan Engah
Abstract:
The physicochemical attributes and wound healing activity of a pectin hydrogel in rat models, following partial-thickness thermal injury, were investigated. The pectin hydrogel was prepared by the solvent evaporation method with the aid of glutaraldehyde as crosslinking agent and glycerol as plasticizer. The physicochemical properties were evaluated mainly using differential scanning calorimetry (DSC) and Fourier transform infrared (FTIR) spectroscopy, while the wound healing activity was examined through macroscopic images, wound size reduction, and histological evaluation using haematoxylin and eosin (H&E) staining over 14 days. The DSC and FTIR analyses suggested that the pectin hydrogel exhibited a higher extent of polymer-polymer interaction at the O-H functional group in comparison to unprocessed pectin. This was indicated by the increase of the endothermic enthalpy value from 139.35 ± 13.06 J/g for unprocessed pectin to 156.23 ± 2.86 J/g for the pectin hydrogel, as well as the decrease of the FTIR wavenumber corresponding to O-H from 3432.07 ± 0.49 cm⁻¹ for unprocessed pectin to 3412.62 ± 13.06 cm⁻¹ for the pectin hydrogel. Rats treated with the pectin hydrogel had significantly smaller wound sizes (Student's t-test, p < 0.05) compared to the untreated group from day 7 until day 14. H&E staining indicated that wounds receiving the pectin hydrogel had more fibroblasts, blood vessels, and collagen bundles on day 14 in comparison to the untreated rats.
Keywords: pectin, physicochemical, rats, wound
Procedia PDF Downloads 360
27788 Comparative Evaluation of Accuracy of Selected Machine Learning Classification Techniques for Diagnosis of Cancer: A Data Mining Approach
Authors: Rajvir Kaur, Jeewani Anupama Ginige
Abstract:
With recent trends in big data and advancements in information and communication technologies, the healthcare industry is transitioning from being clinician-oriented to technology-oriented. Many people around the world die of cancer because the disease is not diagnosed at an early stage. Nowadays, computational methods in the form of machine learning (ML) are used to develop automated decision support systems that can diagnose cancer with high confidence in a timely manner. This paper carries out a comparative evaluation of a selected set of ML classifiers on two existing datasets: breast cancer and cervical cancer. The ML classifiers compared in this study are Decision Tree (DT), Support Vector Machine (SVM), k-Nearest Neighbor (k-NN), Logistic Regression, Ensemble (Bagged Tree), and Artificial Neural Network (ANN). The evaluation is carried out based on the standard evaluation metrics precision (P), recall (R), F1-score, and accuracy. The experimental results show that the ANN achieved the highest accuracy (99.4%) when tested on the breast cancer dataset. On the other hand, when these ML classifiers were tested on the cervical cancer dataset, the Ensemble (Bagged Tree) technique gave better accuracy (93.1%) in comparison to the other classifiers.
Keywords: artificial neural networks, breast cancer, classifiers, cervical cancer, f-score, machine learning, precision, recall
Procedia PDF Downloads 277
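A comparable evaluation loop is easy to reproduce. The sketch below runs the same six classifier families over scikit-learn's bundled breast cancer dataset and reports the paper's four metrics; it is a generic re-creation for illustration, not the authors' experimental setup, and hyperparameters are left at assumed defaults.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import (accuracy_score, f1_score,
                             precision_score, recall_score)
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

classifiers = {
    "DT": DecisionTreeClassifier(random_state=0),
    "SVM": make_pipeline(StandardScaler(), SVC()),
    "k-NN": make_pipeline(StandardScaler(), KNeighborsClassifier()),
    "LogReg": make_pipeline(StandardScaler(), LogisticRegression(max_iter=5000)),
    "Bagged Tree": BaggingClassifier(DecisionTreeClassifier(), random_state=0),
    "ANN": make_pipeline(StandardScaler(),
                         MLPClassifier(max_iter=2000, random_state=0)),
}

for name, clf in classifiers.items():
    pred = clf.fit(X_tr, y_tr).predict(X_te)
    print(f"{name:>11}: P={precision_score(y_te, pred):.3f} "
          f"R={recall_score(y_te, pred):.3f} "
          f"F1={f1_score(y_te, pred):.3f} "
          f"Acc={accuracy_score(y_te, pred):.3f}")
```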
27787 Fintech Credit and Bank Efficiency Two-Way Relationship: A Comparison Study across Country Groupings
Authors: Tan Swee Liang
Abstract:
This paper studies the two-way relationship between fintech credit and banking efficiency using generalized method of moments (GMM) estimation in structural equation modeling (SEM). Banking system efficiency, defined as the ability to produce the existing level of outputs with minimal inputs, is measured using input-oriented data envelopment analysis (DEA), where the whole banking system of an economy is treated as a single DMU. Banks are considered intermediaries between depositors and borrowers, utilizing inputs (deposits and overhead costs) to provide outputs (credit to the private sector and earnings). The interrelationship between fintech credit and bank efficiency is analyzed to determine the impact in different country groupings (ASEAN, Asia, and OECD), in particular the banking system's response to fintech credit platforms. Our preliminary results show that banks do respond to the greater pressure caused by fintech platforms by enhancing their efficiency, but differently across the different groups. The author's earlier research on ASEAN-5, which identified high bank overhead costs (as a share of total assets) as a determinant of economic growth, suggests that expenses may not have been channeled efficiently to income-generating activities. One practical implication of the findings is that policymakers should enable alternative financing, such as fintech credit, as a warning or encouragement for banks to improve their efficiency.
Keywords: fintech lending, banking efficiency, data envelopment analysis, structural equation modeling
Procedia PDF Downloads 91
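Input-oriented DEA reduces to one small linear program per decision-making unit: shrink the DMU's inputs by a factor theta while a nonnegative combination of peers still matches its outputs. The sketch below solves the standard input-oriented CCR model with SciPy on made-up bank data; it illustrates the measurement idea only, not the paper's estimation pipeline.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU `o`.

    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs).
    Variables [theta, lambda_1..lambda_n]; minimize theta subject to
      sum_j lambda_j * x_j <= theta * x_o   (inputs)
      sum_j lambda_j * y_j >= y_o           (outputs)
      lambda >= 0."""
    n, m = X.shape
    _, s = Y.shape
    c = np.r_[1.0, np.zeros(n)]                       # minimize theta
    # Input rows:  -theta*x_o + sum_j lambda_j * x_ij <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    # Output rows: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[o]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

# Hypothetical banking systems: inputs = [deposits, overheads],
# outputs = [private credit, earnings] (arbitrary units).
X = np.array([[100.0, 12.0], [80.0, 15.0], [120.0, 9.0]])
Y = np.array([[90.0, 8.0], [60.0, 7.0], [110.0, 10.0]])
for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_input_efficiency(X, Y, o):.3f}")
```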
27786 Empirical Study for the Project and the Project Management Dimensions Comparison between SMEs and Large Companies
Authors: Amina Oukennou, Zitouni Beidouri, Otmane Bouksour
Abstract:
Small to medium-sized enterprises (SMEs) are a very important component of the economy. They are present across all industries all over the world and are considered an engine of future growth in the economy. Project management is an international economic factor impacting all types of enterprises, including SMEs. This paper aims to measure the weight of using projects and project management in Moroccan SMEs in comparison with large companies. The study is based on interviews with experts: project managers, managers, directors, and consultants. They were asked questions measuring the weight of using projects, the level of using project management, and the resources employed. Eighteen Moroccan companies from a range of industries and sizes were consulted. All the companies consider projects a key element in their strategy. Most of them affirm the great usefulness of the 'project' approach, especially for external activities. The main differences lie in the duration and size of the projects used. Despite the commonly shared idea about the importance of project management, the interviewed persons believe that project management knowledge has the same importance as, or less than, technical knowledge. All the companies affirm the need for a simpler version of project management, whose content varies from one company to another.
Keywords: project dimension, project management, small to medium-sized enterprise, Morocco
Procedia PDF Downloads 316
27785 Experimental and Computational Fluid Dynamics Analysis of Horizontal Axis Wind Turbine
Authors: Saim Iftikhar Awan, Farhan Ali
Abstract:
Wind power has now become one of the most important resources of renewable energy, and the machine that extracts kinetic energy from the wind is the wind turbine. This work analyzes the electrical power output of a horizontal-axis wind turbine to check the efficiency of different turbine configurations, find the one that gives maximum output, and compare experimental and Computational Fluid Dynamics (CFD) results. Different experiments were performed to find the configuration that gives maximum electrical power output by changing parameters such as the number of blades, blade shape, and wind speed. Experimentation was done first, and then the same configuration was designed in 3D CAD software. After a series of experiments, it was found that the turbine with four blades at an angle of 75° gives maximum power output, and an increase in wind speed increases the power output. The models designed in CAD software were imported into ANSYS Fluent to predict mechanical power. This mechanical power was then converted into electrical power, and the results were approximately the same in both cases. In the end, a comparison was made between the results of the experiments and ANSYS Fluent.
Keywords: computational analysis, power efficiency, wind energy, wind turbine
Procedia PDF Downloads 159
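For context, the mechanical-to-electrical conversion mentioned above follows the standard wind-power relations; the power coefficient, efficiency symbol, and worked numbers below are illustrative assumptions rather than the paper's measurements.

```latex
% Available wind power through rotor area A at wind speed v:
%   P_wind = (1/2) * rho * A * v^3
% Mechanical shaft power with power coefficient C_p (Betz limit ~0.593);
% electrical output with an assumed drivetrain/generator efficiency eta:
\[
P_{\mathrm{elec}} \;=\; \eta\, C_p \,\tfrac{1}{2}\rho A v^{3},
\qquad\text{e.g. } \rho = 1.225~\mathrm{kg/m^3},\; A = 1~\mathrm{m^2},\;
v = 10~\mathrm{m/s},\; C_p = 0.35,\; \eta = 0.8
\;\Rightarrow\; P_{\mathrm{elec}} \approx 171.5~\mathrm{W}.
\]
```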
27784 The Comparison of Chromium Ions Release from Stainless Steel 18-8 between Artificial Saliva and Black Tea Leaves Extracts
Authors: Nety Trisnawaty, Mirna Febriani
Abstract:
The use of stainless steel wires in the field of dentistry is widespread, especially for orthodontic and prosthodontic treatment. The oral cavity is an ideal environment for corrosion, which can be caused by saliva. Corrosion of stainless steel wires can be prevented by using an organic or inorganic corrosion inhibitor; one organic inhibitor that can be used is black tea leaf extract. The aim is to compare the chromium ion release from stainless steel between artificial saliva and black tea leaf extract. In this research, we used artificial saliva, black tea leaf extract, stainless steel wire, and an atomic absorption spectrophotometer for testing. The samples were soaked for 1, 3, 7, and 14 days in artificial saliva and in black tea leaf extract. The results showed a difference in chromium ion release between samples soaked in artificial saliva and in black tea leaf extract on days 1, 3, 7, and 14. Statistically, an independent t-test with p < 0.05 showed a significant difference. The longer the soaking duration, the more chromium ions were released. The conclusion of this study is that black tea leaf extract can inhibit the corrosion rate of stainless steel wires.
Keywords: chromium ion, stainless steel, artificial saliva, black tea leaves extracts
Procedia PDF Downloads 280
27783 Examining the Predictors of Non-Urgent Emergency Department Visits: A Population-Based Study
Authors: Maher El-Masri, Jamie Crawley, Judy Bornais, Abeer Omar
Abstract:
Background: Misuse of the emergency department (ED) for non-urgent healthcare results in unnecessary crowding that can lead to long ED waits and delays in treatment, diversion of ambulances to other hospitals, poor health outcomes for patients, and increased risk of death. Objectives: The main purpose of this study was to explore the independent predictors of non-urgent ED visits in the Erie St. Clair LHIN. Secondary purposes include a comparison of the rates of non-urgent ED visits between urban and rural hospitals. Design: A secondary analysis of archived population-based data on 597,373 ED visits in southwestern Ontario. Results: The results suggest that older (OR = 0.992; 95% CI 0.992-0.993) and female patients (OR = 0.940; 95% CI 0.929-0.950) were less likely to visit the ED for non-urgent causes. Non-urgent ED visits during the winter, spring, and fall were 13%, 5.8%, and 7.5% lower, respectively, than during the summer. The data further suggest that non-urgent visits were 19.6% and 21.3% less likely to occur during evening and overnight shifts, respectively, compared to the day shift. Non-urgent visits were 2.76 times more likely to present to small community hospitals than to large community hospitals. Health care providers were 1.92 times more likely to refer patients with non-urgent health problems to the ED than patients, family members, or caretakers were to make that decision themselves. Conclusion: Our study highlights a number of important factors associated with inappropriate use of the ED for non-urgent health problems. Knowledge of these factors could be used to address the issue of unnecessary ED crowding.
Keywords: emergency department, non-urgent visits, predictors, logistic regression
Procedia PDF Downloads 247
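Odds ratios like those reported above come straight out of a logistic regression: exponentiating a fitted coefficient gives the multiplicative change in the odds of a non-urgent visit per unit of that predictor. The sketch below shows the mechanics on simulated visit records with statsmodels; the variable names and effect sizes are invented for illustration and do not reproduce the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 50_000
df = pd.DataFrame({
    "age": rng.uniform(18, 90, n),
    "female": rng.integers(0, 2, n),
    "evening_shift": rng.integers(0, 2, n),
})
# Simulated ground truth roughly mimicking the reported directions.
logit = 0.5 - 0.008 * df["age"] - 0.06 * df["female"] - 0.22 * df["evening_shift"]
df["non_urgent"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["age", "female", "evening_shift"]])
fit = sm.Logit(df["non_urgent"], X).fit(disp=0)

# exp(coef) = odds ratio; exp of the CI bounds gives the OR's 95% CI.
report = pd.DataFrame({"OR": np.exp(fit.params),
                       "2.5%": np.exp(fit.conf_int()[0]),
                       "97.5%": np.exp(fit.conf_int()[1])})
print(report.round(3))
```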
27782 Adoption of Big Data by Global Chemical Industries
Authors: Ashiff Khan, A. Seetharaman, Abhijit Dasgupta
Abstract:
The new era of big data (BD) is influencing chemical industries tremendously, providing several opportunities to reshape the way they operate and helping them shift towards intelligent manufacturing. Despite the availability of free software and the large amount of real-time data generated and stored in process plants, chemical industries are still in the early stages of big data adoption. The industry is just starting to realize the importance of the large amount of data it owns for making the right decisions and supporting its strategies. This article explores the importance of the professional competencies and data science capabilities that influence BD adoption in chemical industries, to help the sector move towards intelligent manufacturing quickly and reliably. The article uses a literature review and identifies potential applications in the chemical industry for moving from conventional methods to a data-driven approach. The scope of this document is limited to the adoption of BD in chemical industries and the variables identified in this article. To achieve this objective, government, academia, and industry must work together to overcome all present and future challenges.
Keywords: chemical engineering, big data analytics, industrial revolution, professional competence, data science
Procedia PDF Downloads 86
27781 Channel Estimation for Orthogonal Frequency Division Multiplexing Systems over Doubly Selective Channels Based on the DCS-DCSOMP Algorithm
Authors: Linyu Wang, Furui Huo, Jianhong Xiang
Abstract:
The Doppler shift generated by high-speed movement and multipath effects in the channel are the main causes of a time-frequency doubly-selective (DS) channel, in which there is severe inter-carrier interference (ICI). Channel estimation for an orthogonal frequency division multiplexing (OFDM) system over a DS channel is very difficult. The simultaneous orthogonal matching pursuit algorithm under distributed compressive sensing theory (DCS-SOMP) has been used for channel estimation in OFDM systems over DS channels. However, the reconstruction accuracy of the DCS-SOMP algorithm is not high enough in the low-SNR regime. To solve this problem, in this paper we propose an improved DCS-SOMP algorithm based on an inner-product-difference comparison operation (DCS-DCSOMP). The reconstruction accuracy is improved by increasing the number of candidate indexes and designing comparison conditions on the inner-product differences. We combine the DCS-DCSOMP algorithm with the basis expansion model (BEM) to reduce the complexity of channel estimation. Simulation results show the effectiveness of the proposed algorithm and its advantages over other algorithms.
Keywords: OFDM, doubly selective, channel estimation, compressed sensing
Procedia PDF Downloads 96
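The proposed method builds on simultaneous OMP, which recovers jointly sparse vectors by picking, at each step, the atom whose correlations summed across all measurement channels are largest. The sketch below implements that SOMP baseline in NumPy; the paper's inner-product-difference candidate expansion and BEM coupling are not reproduced here.

```python
import numpy as np

def somp(A, Y, sparsity):
    """Simultaneous OMP: recover X with a common support from Y = A @ X.

    A: (m, n) sensing matrix; Y: (m, K), one column per channel/snapshot.
    At each iteration the atom maximizing the correlation energy summed
    over all K columns joins the support, then X is refit by least squares."""
    m, n = A.shape
    residual = Y.copy()
    support = []
    for _ in range(sparsity):
        corr = np.abs(A.conj().T @ residual).sum(axis=1)  # joint correlation
        corr[support] = 0.0                               # skip chosen atoms
        support.append(int(np.argmax(corr)))
        X_s, *_ = np.linalg.lstsq(A[:, support], Y, rcond=None)
        residual = Y - A[:, support] @ X_s                # update residual
    X = np.zeros((n, Y.shape[1]), dtype=Y.dtype)
    X[support] = X_s
    return X, sorted(support)

# Tiny self-check with a jointly 3-sparse complex signal.
rng = np.random.default_rng(1)
A = rng.normal(size=(32, 128)) / np.sqrt(32)
true_support = [5, 40, 99]
X_true = np.zeros((128, 4), dtype=complex)
X_true[true_support] = rng.normal(size=(3, 4)) + 1j * rng.normal(size=(3, 4))
Y = A @ X_true + 0.01 * rng.normal(size=(32, 4))
_, est = somp(A, Y, sparsity=3)
print("recovered support:", est)  # expect [5, 40, 99]
```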
27780 Comparison of E-Waste Management in Switzerland and in Australia: A Qualitative Content Analysis
Authors: Md Tasbirul Islam, Pablo Dias, Nazmul Huda
Abstract:
E-waste, or waste electrical and electronic equipment (WEEE), is one of the fastest-growing waste streams across the globe. This paper compares the e-waste management systems of Switzerland and Australia in terms of four features: legislative initiatives, disposal practices, collection, and financing mechanisms. Qualitative content analysis is employed as the research method. Data were collected from published academic research papers, industry reports, and web sources. In addition, a questionnaire survey was conducted in Australia to understand public awareness and opinions on these features. The results of the study provide valuable insights for policymakers in Australia developing a better e-waste management system, in line with the public consensus and the state-of-the-art operational strategies currently practiced in Switzerland.
Keywords: e-waste management, WEEE, awareness, pro-environmental behavior, Australia, Switzerland
Procedia PDF Downloads 281
27779 Study and Simulation of a Severe Dust Storm over West and Southwest of Iran
Authors: Saeed Farhadypour, Majid Azadi, Habibolla Sayyari, Mahmood Mosavi, Shahram Irani, Aliakbar Bidokhti, Omid Alizadeh Choobari, Ziba Hamidi
Abstract:
In recent decades, the frequency of dust events has increased significantly in the west and southwest of Iran. First, dust events during the period 1990-2013 are surveyed using historical dust data collected at 6 weather stations scattered over the west and southwest of Iran. After statistical analysis of the observational data, one of the most severe dust storm events in the region, which occurred from 3rd to 6th July 2009, is selected and analyzed. The WRF-Chem model is used to simulate the amount of PM10 and how it is transported to these areas. The initial and lateral boundary conditions for the model are obtained from GFS data with 0.5° × 0.5° spatial resolution. In the simulation, two aerosol schemes (GOCART and MADE/SORGAM) with 3 options (chem_opt = 106, 300, and 303) were evaluated. The statistical analysis of the historical data showed that the southwest of Iran has a high frequency of dust events: Bushehr station had the highest frequency among the stations and Urmia station the lowest. Over the period 1990-2013, the years 2009 and 1998, with 3221 and 100 events respectively, had the highest and lowest numbers of dust events; by monthly variation, June and July had the highest frequency of dust events and December the lowest. In addition, the model results showed that the MADE/SORGAM scheme predicted the values and trends of PM10 better than the other schemes and showed better performance in comparison with the observations. Finally, the PM10 distribution and surface wind maps obtained from the numerical modeling showed the formation of dust plumes in Iraq and Syria and their transport to the west and southwest of Iran. Comparing the MODIS satellite image acquired on 4th July 2009 with the model output at the same time showed the good ability of WRF-Chem to simulate the spatial distribution of dust.
Keywords: dust storm, MADE/SORGAM scheme, PM10, WRF-Chem
Procedia PDF Downloads 272
27778 Partially-Averaged Navier-Stokes for Computations of Flow around Three-Dimensional Ahmed Bodies
Authors: Maryam Mirzaei, Sinisa Krajnovic´
Abstract:
The paper reports a study on the prediction of flows around simplified vehicles using Partially-Averaged Navier-Stokes (PANS). Numerical simulations are performed for two simplified vehicles: a slanted-back Ahmed body at Re = 30,000 and a square-back Ahmed body at Re = 300,000. A comparison of the resolved and modeled physical flow scales is made with corresponding LES and experimental data for a better understanding of the performance of the PANS model. The PANS model is compared for coarse and fine grid resolutions, and it is shown that even a coarse-grid PANS simulation is able to produce flow predictions fairly close to those from a well-resolved LES simulation. The results indicate the possibility of improving the predictions by employing a finer grid resolution.
Keywords: partially-averaged Navier-Stokes, large eddy simulation, PANS, LES, Ahmed body
Procedia PDF Downloads 600
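For readers unfamiliar with PANS, the split between resolved and modeled scales is governed by two filter parameters. The definitions below follow the common Girimaji formulation and are given for context only; they are not quoted from this paper.

```latex
% PANS filter parameters: ratios of unresolved to total turbulent
% kinetic energy and dissipation.
\[
f_k = \frac{k_u}{k}, \qquad f_\varepsilon = \frac{\varepsilon_u}{\varepsilon}.
\]
% f_k = 1 recovers RANS (all turbulence modeled); f_k -> 0 approaches
% DNS (all turbulence resolved). A finer grid permits a smaller f_k,
% i.e. a larger fraction of physically resolved scales.
```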
27777 Testing the Simplification Hypothesis in Constrained Language Use: An Entropy-Based Approach
Authors: Jiaxin Chen
Abstract:
Translations have been described as more simplified than non-translations, featuring less diversified and more frequent lexical items and simpler syntactic structures. Such simplified linguistic features have been identified in other bilingualism-influenced language varieties, including non-native and learner language use. It has therefore been proposed that translation could be studied within a broader framework of constrained language, with simplification being one of the universal features shared by constrained language varieties due to similar cognitive-physiological and social-interactive constraints. Yet contradictory findings have also been presented. To address this issue, this study adopts Shannon's entropy-based measures to quantify complexity in language use. Entropy measures the level of uncertainty or unpredictability in message content, and it has been adapted in linguistic studies to quantify linguistic variance, including morphological diversity and lexical richness. In this study, the complexity of lexical and syntactic choices is captured by word-form entropy and POS-form entropy, and a comparison is made between constrained and non-constrained language use to test the simplification hypothesis. The entropy-based method is employed because it captures both the frequency of linguistic choices and the evenness of their distribution, which are unavailable with traditional indices. Another advantage of the entropy-based measure is that it is reasonably stable across languages and thus allows a reliable comparison among studies on different language pairs. As for the data for the present study, one established corpus (CLOB) and two self-compiled corpora are used to represent native written English and two constrained varieties (L2 written English and translated English), respectively. Each corpus consists of around 200,000 tokens. Genre (press) and text length (around 2,000 words per text) are comparable across corpora. More specifically, word-form entropy and POS-form entropy are calculated as indicators of lexical and syntactic complexity, and ANOVA tests are conducted to explore whether there is any corpus effect. It is hypothesized that both L2 written English and translated English have lower entropy compared to non-constrained written English. The similarities and divergences between the two constrained varieties may indicate constraints shared by, and peculiar to, each variety.
Keywords: constrained language use, entropy-based measures, lexical simplification, syntactical simplification
Procedia PDF Downloads 94
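Word-form entropy of a corpus is just Shannon entropy over its token-frequency distribution, H = -Σ p(w) log2 p(w), which rises with both the number of distinct forms and the evenness of their use. A minimal computation is sketched below; the two toy "corpora" are invented to show the simplification contrast, not drawn from the study's data.

```python
from collections import Counter
from math import log2

def shannon_entropy(tokens):
    """H = -sum p(w) log2 p(w) over the token-frequency distribution.
    Higher H means more distinct forms, used more evenly."""
    counts = Counter(tokens)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Toy illustration: a repetitive "constrained" sample vs a varied one.
constrained = "the man saw the man and the man left".split()
varied = "the linguist observed a translator who quietly departed".split()

print(f"constrained sample: H = {shannon_entropy(constrained):.3f} bits")
print(f"varied sample:      H = {shannon_entropy(varied):.3f} bits")
# POS-form entropy works the same way, with POS tags in place of words.
```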
27776 Development of a Software System for Management and Genetic Analysis of Biological Samples for Forensic Laboratories
Authors: Mariana Lima, Rodrigo Silva, Victor Stange, Teodiano Bastos
Abstract:
Due to the high reliability achieved by DNA tests since the 1980s, this kind of test has allowed the identification of a growing number of criminal cases, including old, unsolved cases that now have a chance to be solved with this technology. Currently, the use of genetic profile databases is a typical method of increasing the scope of genetic comparison. Forensic laboratories must process, analyze, and generate genetic profiles for a growing number of samples, which requires time and a great storage capacity. Therefore, it is essential to develop methodologies, supported by software tools, capable of organizing the work and minimizing the time spent on both biological sample processing and the analysis of genetic profiles. Thus, the present work aims at developing a software system for forensic genetics laboratories that allows sample, criminal case, and local database management, minimizes the time spent in the workflow, and helps to compare genetic profiles. For the development of this software system, all data related to the storage and processing of samples, the workflows, and the requirements incorporated in the system have been considered. The system uses the following languages: HTML, CSS, and JavaScript as Web technologies, with the Node.js platform as the server, which offers great efficiency for data input and output. In addition, the data are stored in a relational database (MySQL), which is free, allowing better acceptance by users. The software system developed here brings more agility to the workflow and the analysis of samples, contributing to the rapid insertion of genetic profiles into the national database and to increasing the resolution of crimes. The next step of this research is its validation, so that it operates in accordance with current Brazilian national legislation.
Keywords: database, forensic genetics, genetic analysis, sample management, software solution
Procedia PDF Downloads 370
27775 Trends in Endoscopic versus Open Treatment of Carpal Tunnel Syndrome in Rheumatoid Arthritis Patients
Authors: Arman Kishan, Sanjay Kubsad, Steve Li, Mark Haft, Duc Nguyen, Dawn Laporte
Abstract:
Objective: Carpal tunnel syndrome (CTS) can be managed surgically with endoscopic or open carpal tunnel release (CTR). Rheumatoid arthritis (RA) is a known risk factor for CTS and is believed to be related to compression of the median nerve secondary to inflammation. We aimed to analyze national trends, outcomes, and patient-specific comorbidities associated with endoscopic CTR (ECTR) and open CTR (OCTR) in patients with RA. Methods: A retrospective cohort study was conducted using the PearlDiver database, identifying 683 RA patients undergoing ECTR and 4234 undergoing OCTR between 2010 and 2014. Demographic data, comorbidities, and complication rates were analyzed. Univariate and multivariable analyses assessed differences between the treatment methods. Results: Patients with RA undergoing ECTR, in comparison to OCTR, had no significant differences in medical comorbidities such as hypertension, obesity, chronic kidney disease, hypothyroidism, and diabetes mellitus. Patients in the ECTR group had a risk ratio of 1.44 (95% CI: 1.10-1.89, p = 0.01) of requiring repeat procedures within 90 days of the initial procedure. Five-year trends in ECTR and OCTR procedures showed compound annual growth rates of 5.6% and 13.1%, respectively. Conclusion: The endoscopic and open approaches to CTR are important considerations in surgical planning. RA and ECTR have previously been identified as independent risk factors for revision CTR. Our study identified an elevated 90-day risk of repeat procedures in the ECTR group in comparison to the OCTR group. Additionally, the growth of OCTR procedures has outpaced that of ECTR procedures over the same period, likely in response to the trend of ECTR leading to higher rates of repeat procedures. The need for revision following ECTR in patients with RA could be related to chronic inflammation leading to transverse carpal ligament thickening and concomitant tenosynovitis. Future directions could include further characterization of the repeat procedures performed in this subset of patients.
Keywords: endoscopic treatment of carpal tunnel syndrome, open treatment of carpal tunnel syndrome, rheumatoid arthritis, trends analysis, carpal tunnel syndrome
Procedia PDF Downloads 66
27774 Infrared Thermography as an Informative Tool in Energy Audit and Software Modelling of Historic Buildings: A Case Study of the Sheffield Cathedral
Authors: Ademuyiwa Agbonyin, Stamatis Zoras, Mohammad Zandi
Abstract:
This paper investigates the extent to which building energy modelling can be informed by preliminary information provided by infrared thermography, using a thermal imaging camera in a walkthrough audit. The case-study building is Sheffield Cathedral, built in the early 1400s. Based on an informative qualitative report generated from the thermal images taken at the site, the regions showing significant heat loss are input into a computer model of the cathedral within the Integrated Environmental Solutions (IES) Virtual Environment software, which performs an energy simulation to determine quantitative heat losses through the building envelope. Building data such as material thermal properties and building plans were provided by the architects, Thomas Ford and Partners Ltd. The results of the modelling revealed the portions of the building with the highest heat loss, and these aligned with those suggested by the thermal camera. Retrofit options for the building are also considered; however, they may not see implementation due to a desire to conserve the architectural heritage of the building. The results show that thermal imaging in a walk-through audit serves as a useful guide for the energy modelling process. Hand calculations were also performed as a 'control' to estimate losses, providing a second set of data points for comparison.
Keywords: historic buildings, energy retrofit, thermal comfort, software modelling, energy modelling
Procedia PDF Downloads 170
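The hand calculations mentioned in the thermography study above typically rest on the standard steady-state fabric heat-loss relation; the U-value, area, and temperature difference in the worked example below are illustrative assumptions, not figures from the cathedral audit.

```latex
% Steady-state conductive heat loss through a building element:
%   Q = U * A * (T_in - T_out)
\[
Q = U A \,\Delta T,
\qquad\text{e.g. } U = 2.1~\mathrm{W/m^2K},\;
A = 50~\mathrm{m^2},\; \Delta T = 15~\mathrm{K}
\;\Rightarrow\; Q = 2.1 \times 50 \times 15 = 1575~\mathrm{W}.
\]
```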