Search results for: Signal Processing
1978 Fuzzy-Machine Learning Models for the Prediction of Fire Outbreak: A Comparative Analysis
Authors: Uduak Umoh, Imo Eyoh, Emmauel Nyoho
Abstract:
This paper compares fuzzy-machine learning algorithms, namely Support Vector Machine (SVM) and K-Nearest Neighbor (KNN), for predicting cases of fire outbreak. The paper uses a fire outbreak dataset with three features (temperature, smoke, and flame). The data is pre-processed using the Interval Type-2 Fuzzy Logic (IT2FL) algorithm. IT2FL, Min-Max normalization, and Principal Component Analysis (PCA) are used to predict feature labels in the dataset, normalize the dataset, and select relevant features, respectively. The output of the pre-processing is a dataset with two principal components (PC1 and PC2). The pre-processed dataset is then used in the training of the aforementioned machine learning models. K-fold cross-validation (with K = 10) is used to evaluate the performance of the models using the metrics ROC (Receiver Operating Characteristic) curve, specificity, and sensitivity. The models are also tested with 20% of the dataset. The validation results show that KNN is the better model for fire outbreak detection with an ROC value of 0.99878, followed by SVM with an ROC value of 0.99753.
Keywords: Machine Learning Algorithms, Interval Type-2 Fuzzy Logic, Fire Outbreak, Support Vector Machine, K-Nearest Neighbour, Principal Component Analysis
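The normalization and KNN steps described above can be sketched in miniature. The following is an illustrative Python sketch, not the authors' implementation: it applies Min-Max normalization and a majority-vote KNN classifier to hypothetical (temperature, smoke) readings; the sensor values and labels are invented for illustration.

```python
import math

def min_max_normalize(rows):
    """Scale each feature column to [0, 1], as in the paper's pre-processing step."""
    cols = list(zip(*rows))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    return [[(v - l) / (h - l) if h > l else 0.0 for v, l, h in zip(r, lo, hi)]
            for r in rows]

def knn_predict(train_x, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    dists = sorted((math.dist(x, query), y) for x, y in zip(train_x, train_y))
    votes = [y for _, y in dists[:k]]
    return max(set(votes), key=votes.count)

# Hypothetical (temperature, smoke) readings labelled fire (1) / no fire (0).
raw = [[20, 0.1], [22, 0.2], [80, 0.9], [85, 0.8], [78, 0.95], [25, 0.15]]
labels = [0, 0, 1, 1, 1, 0]
norm = min_max_normalize(raw)
# Normalize the query together with the data so it uses the same ranges.
query = min_max_normalize(raw + [[79, 0.85]])[-1]
pred = knn_predict(norm, labels, query, k=3)
```

A full reproduction would add the IT2FL labelling, PCA projection, and 10-fold cross-validation around this core.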
Procedia PDF Downloads 182
1977 On the Qarat Kibrit Salt Dome Faulting System South of Adam, Oman: In Search of Uranium Anomalies
Authors: Alaeddin Ebrahimi, Narasimman Sundararajan, Bernhard Pracejus
Abstract:
Development of salt domes, often rising from depths of some 10 km or more, causes intense faulting of the surrounding host rocks (salt tectonics). The fractured rocks then present ideal space for oil, which can migrate and get trapped. If such moving hydrocarbons pass uranium-carrying rock units (e.g., shales), uranium is collected and enriched by organic carbon compounds. Brines from the salt body are also ideal carriers for oxidized uranium species and will further dislocate uranium when in contact with uranium-enriched oils. Uranium then has the potential to mineralize in the vicinity of the dome (blue halite is evidence for radiation having affected salt deposits elsewhere in the world). Based on this knowledge, the Qarat Kibrit salt dome was investigated using the well-established very low frequency electromagnetic (VLF-EM) geophysical method along five traverses approximately 250 m in length (10 m intervals) in order to identify subsurface fault systems. In-phase and quadrature components of the VLF-EM signal were recorded at two different transmitter frequencies (24.0 and 24.9 kHz). The images of the Fraser filtered response of the in-phase components indicate a conductive zone (fault) in the southeast and southwest of the study area. The Karous-Hjelt current density pseudo-section delineates subsurface faults at depths between 10 and 40 m. The stacked profiles of the Fraser filtered responses brought out two plausible trends/directions of faults. However, no evidence of uranium enrichment has been recorded in this area.
Keywords: salt dome, uranium, fault, in-phase component, quadrature component, Fraser filter, Karous-Hjelt current density
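The Fraser filtering mentioned above is a simple four-point difference operator that turns sign crossovers in the VLF-EM in-phase profile into peaks over conductors. The sketch below uses one common formulation (sign conventions vary between authors) on a synthetic profile; it is an illustration, not the processing chain used in the study.

```python
def fraser_filter(inphase):
    """Fraser-style difference filter for VLF-EM in-phase data: converts
    crossovers into peaks. Output point i corresponds to the midpoint of
    the 4-sample window. One common formulation; signs vary by author."""
    return [(inphase[i] + inphase[i + 1]) - (inphase[i + 2] + inphase[i + 3])
            for i in range(len(inphase) - 3)]

# Synthetic in-phase profile with a sign crossover (between samples 2 and 3),
# as would occur over a conductive fault zone.
profile = [3, 2, 1, -1, -2, -3, -2, -1]
filtered = fraser_filter(profile)
# The filtered maximum sits at the window centred on the crossover.
```

In field practice the filtered profiles from adjacent traverses are stacked, as in the abstract, to trace the strike of the conductor.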
Procedia PDF Downloads 240
1976 Biodistribution of Fluorescence-Labelled Epidermal Growth Factor Protein from Slow Release Nanozolid Depots in Mouse
Authors: Stefan Gruden, Charlott Brunmark, Bo Holmqvist, Erwin D. Brenndorfer, Martin Johansson, Jian Liu, Ying Zhao, Niklas Axen, Moustapha Hassan
Abstract:
Aim: The study was designed to evaluate the ability of the calcium sulfate-based NanoZolid® drug delivery technology to locally release the epidermal growth factor (EGF) protein while maintaining its biological activity. Methods: Depots of NanoZolid-formulated EGF protein labelled with a near-infrared dye (EGF-NIR), or EGF-NIR dissolved in PBS, were injected subcutaneously into mice bearing subcutaneously inoculated EGF receptor (EGFR) positive human A549 lung cancer tumors. The release and biodistribution of the EGF-NIR were investigated in vivo longitudinally up to 96 hours post-administration, utilizing whole-body fluorescence imaging. In order to confirm the in vivo findings, histological analysis of tumor cryosections was performed to investigate the EGF-NIR fluorescent signal and the EGFR expression level by immunofluorescence labelling. Results: The in vivo fluorescence imaging showed a controlled release profile of the EGF-NIR loaded in the NanoZolid depots compared to free EGF-NIR. Histological analysis of the tumors further demonstrated a prevailing distribution of EGF-NIR in regions with high levels of EGFR expression. Conclusion: Calcium sulfate-based depots can be used to formulate EGF while maintaining its biological activity, e.g., receptor binding capability. This may have good clinical potential for local delivery of biomolecules to enhance treatment efficacy and minimize systemic adverse effects.
Keywords: bioresorbable, calcium sulfate, controlled release, NanoZolid
Procedia PDF Downloads 165
1975 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm
Authors: Ameur Abdelkader, Abed Bouarfa Hafida
Abstract:
Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and predicted variables: past occurrences are exploited to predict and derive the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis when applied to large amounts of data. In fact, because of its volume, its nature (semi-structured or unstructured) and its variety, it is impossible to analyze big data efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of calculation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and a version of the extended algorithm is then defined in order to make it applicable to huge quantities of data.
Keywords: predictive analysis, big data, predictive analysis algorithms, CART algorithm
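The core of the CART algorithm the abstract extends is a greedy split search: at each node, choose the feature threshold that minimizes the weighted Gini impurity of the two children. A minimal single-feature sketch of that splitting rule (toy data, not the authors' extended distributed version):

```python
def gini(labels):
    """Gini impurity of a set of class labels."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels)) if n else 0.0

def best_split(xs, ys):
    """Exhaustive search for the single-feature threshold minimizing the
    weighted Gini impurity of the two children (the CART splitting rule)."""
    best = (float("inf"), None)
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        best = min(best, (score, t))
    return best

# Two well-separated classes along one feature: the split lands between them.
xs = [1, 2, 3, 10, 11, 12]
ys = [0, 0, 0, 1, 1, 1]
score, threshold = best_split(xs, ys)
```

It is exactly this per-node exhaustive scan that becomes the bottleneck on big data and motivates the parallelized extension the paper proposes.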
Procedia PDF Downloads 142
1974 Object Detection Based on Plane Segmentation and Features Matching for a Service Robot
Authors: António J. R. Neves, Rui Garcia, Paulo Dias, Alina Trifan
Abstract:
With the aging of the world population and the continuous growth in technology, service robots are increasingly explored nowadays as alternatives to healthcare givers or personal assistants for elderly or disabled people. Any service robot should be capable of interacting with its human companion, receiving commands, navigating through the environment, either known or unknown, and recognizing objects. This paper proposes an approach for object recognition based on the use of depth information and color images for a service robot. We present a study on two of the most used methods for object detection, where 3D data is used to detect the position of the objects to be classified that are found on horizontal surfaces. Since most of the objects of interest accessible to service robots lie on these surfaces, the proposed 3D segmentation reduces the processing time and simplifies the scene for object recognition. The first approach for object recognition is based on color histograms, while the second is based on the use of the SIFT and SURF feature descriptors. We present comparative experimental results obtained with a real service robot.
Keywords: object detection, feature descriptors, SIFT, SURF, depth images, service robots
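The first recognition approach above, color histograms, can be illustrated with a tiny pure-Python sketch: quantize pixels into a joint RGB histogram and compare histograms by intersection. The pixel data and bin count are invented for illustration; a real system would build histograms from segmented object regions.

```python
def color_histogram(pixels, bins=4):
    """Quantize (r, g, b) pixels (0-255 per channel) into a normalized
    joint histogram with `bins` levels per channel."""
    hist = {}
    for r, g, b in pixels:
        key = (r * bins // 256, g * bins // 256, b * bins // 256)
        hist[key] = hist.get(key, 0) + 1
    n = len(pixels)
    return {k: v / n for k, v in hist.items()}

def intersection(h1, h2):
    """Histogram intersection similarity in [0, 1]; 1 means identical."""
    return sum(min(h1.get(k, 0.0), h2.get(k, 0.0)) for k in set(h1) | set(h2))

# Hypothetical pixel sets: a stored red object, a red query, a blue distractor.
red_model = color_histogram([(200, 30, 30)] * 90 + [(180, 40, 40)] * 10)
red_query = color_histogram([(205, 25, 35)] * 95 + [(255, 255, 255)] * 5)
blue_obj  = color_histogram([(30, 30, 200)] * 100)
sim_same = intersection(red_model, red_query)
sim_diff = intersection(red_model, blue_obj)
```

The SIFT/SURF route replaces this global color signature with local keypoint descriptors matched by nearest-neighbour distance, which is more robust to viewpoint changes but costlier.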
Procedia PDF Downloads 546
1973 Climate Change and the Role of Foreign-Invested Enterprises
Authors: Xuemei Jiang, Kunfu Zhu, Shouyang Wang
Abstract:
In this paper, we select China as a case and employ a time series of unique input-output tables distinguishing firm ownership and processing exports to evaluate the role of foreign-invested enterprises (FIEs) in China's rapid carbon dioxide emission growth. The results suggest that FIEs contributed 11.55% of the growth of economic output in China between 1992 and 2010, but accounted for only 9.65% of the growth of carbon dioxide emissions. In relative terms, until 2010 FIEs still emitted much less than Chinese-owned enterprises (COEs) when producing the same amount of output, although COEs experienced much faster technology upgrades. In an ideal scenario where we assume the final demands remain unchanged and COEs completely mirror the advanced technologies of FIEs, more than 2000 Mt of carbon dioxide emissions would have been avoided in China in 2010. From a policy perspective, widespread FIEs are a very effective and efficient channel for encouraging technology transfer from developed to developing countries.
Keywords: carbon dioxide emissions, foreign-invested enterprises, technology transfer, input-output analysis, China
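The input-output accounting underlying this kind of study rests on the Leontief model: gross output x satisfies x = Ax + f for technical-coefficient matrix A and final demand f, and emissions are an intensity vector dotted with x. A minimal two-sector sketch with invented numbers (not the paper's tables):

```python
def leontief_output(A, f):
    """Solve x = A x + f, i.e. x = (I - A)^-1 f, for a 2-sector economy.
    The 2x2 inverse is written out explicitly for clarity."""
    a, b = 1 - A[0][0], -A[0][1]
    c, d = -A[1][0], 1 - A[1][1]
    det = a * d - b * c
    x1 = (d * f[0] - b * f[1]) / det
    x2 = (-c * f[0] + a * f[1]) / det
    return [x1, x2]

# Hypothetical technical coefficients and final demand for two sectors.
A = [[0.1, 0.2],
     [0.3, 0.1]]
f = [100.0, 50.0]
x = leontief_output(A, f)

# Emissions = intensity (kg CO2 per unit of gross output) dotted with output.
intensity = [0.5, 1.2]
emissions = sum(e * xi for e, xi in zip(intensity, x))
```

The paper's counterfactual (COEs mirroring FIE technology) amounts to replacing one set of intensity coefficients with another and recomputing this product.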
Procedia PDF Downloads 398
1972 Synthesis of Human Factors Theories and Industry 4.0
Authors: Andrew Couch, Nicholas Loyd, Nathan Tenhundfeld
Abstract:
The rapid emergence of technology observably induces disruptive effects that carry implications for internal organizational dynamics as well as external market opportunities, strategic pressures, and threats. An examination of the historical tendencies of technology innovation shows that the body of managerial knowledge for addressing such disruption is underdeveloped. Fundamentally speaking, the impacts of innovation are unique and situationally oriented. Hence, the appropriate managerial response becomes a complex function that depends on the nature of the emerging technology, the posturing of internal organizational dynamics, the rate of technological growth, and much more. This research considers a particular case of mismanagement, the BP Texas City Refinery explosion of 2005, which exhibits notable failures when examined against human factors principles. Moreover, this research considers the modern technological climate (shaped by Industry 4.0 technologies) and seeks to arrive at an appropriate conceptual lens by which human factors principles and Industry 4.0 may be favorably integrated. In this manner, the careful examination of these phenomena helps to better support the sustainment of human factors principles despite the disruptive impacts imparted by technological innovation. In essence, human factors considerations are assessed through the application of principles that stem from usability engineering, the Swiss Cheese Model of accident causation, human-automation interaction, signal detection theory, alarm design, and other areas. Notably, this stream of research supports a broader framework that seeks to guide organizations amid the uncertainties of Industry 4.0 toward higher levels of adoption, implementation, and transparency.
Keywords: Industry 4.0, human factors engineering, management, case study
Procedia PDF Downloads 68
1971 Hydrologic Balance and Surface Water Resources of the Cheliff-Zahrez Basin
Authors: Mehaiguene Madjid, Touhari Fadhila, Meddi Mohamed
Abstract:
The Cheliff basin offers a good hydrological case study, since several aspects of its hydrology and hydraulic installations remain unclear. Our study of the Cheliff basin is therefore divided into two principal parts. The first is the spatial evaluation of precipitation: understanding how the water resource is replenished presupposes a good knowledge of the structure of the precipitation fields over the studied area. With the goal of better knowledge of water resources and their integrated management, we judged it necessary to establish a precipitation map of the Cheliff basin, which provides a good understanding of the evolution of the water resource in the basin and will serve as a basis for any hydraulic planning study there. The precipitation map thus answers a direct need to place at the disposal of researchers a reference document for the region, one that will subsequently be completed and updated. The second part is the hydrological study: based on statistical processing of hydrometric data, it leads us to specify the terms of the hydrological balance and to clarify the fundamental aspects of the annual, seasonal, and extreme flows, and thus of their variability, as well as the surface water resources.
Keywords: hydrological assessment, surface water resources, Cheliff, Algeria
Procedia PDF Downloads 304
1970 LGG Architecture for Brain Tumor Segmentation Using Convolutional Neural Network
Authors: Sajeeha Ansar, Asad Ali Safi, Sheikh Ziauddin, Ahmad R. Shahid, Faraz Ahsan
Abstract:
The most aggressive form of brain tumor is called glioma. Glioma is a kind of tumor that arises from the glial tissue of the brain and occurs quite often. A fully automatic 2D-CNN model for brain tumor segmentation is presented in this paper. We performed pre-processing steps to remove noise and intensity variances using N4ITK and standard intensity correction, respectively. We used the Keras open-source library with Theano as the backend for fast implementation of the CNN model, the BRATS 2015 MRI dataset to evaluate the proposed model, and the SimpleITK open-source library to analyze the images. Moreover, we extracted random 2D patches for the proposed 2D-CNN model for efficient brain segmentation. We extracted 2D patches instead of 3D ones because of the lower dimensional information present in 2D, which helps to reduce computational time. The Dice Similarity Coefficient (DSC) is used as the performance measure for the evaluation of the proposed method. Our method achieved DSC scores of 0.77 for complete, 0.76 for core, and 0.77 for enhanced tumor regions. These results are comparable with methods that already implement a 2D CNN architecture.
Keywords: brain tumor segmentation, convolutional neural networks, deep learning, LGG
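The DSC metric used above has a simple closed form, DSC = 2|A ∩ B| / (|A| + |B|), for a predicted and a ground-truth binary mask. A minimal sketch on flattened toy masks (the masks are invented; real evaluation runs over full MRI volumes):

```python
def dice_coefficient(pred, truth):
    """Dice similarity coefficient between two binary masks given as
    flattened 0/1 lists: DSC = 2|A ∩ B| / (|A| + |B|)."""
    inter = sum(p and t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    return 2.0 * inter / total if total else 1.0  # both empty: perfect match

# Toy 8-voxel masks: 3 overlapping positives, 4 positives in each mask.
pred  = [1, 1, 1, 0, 0, 0, 1, 0]
truth = [1, 1, 0, 0, 0, 1, 1, 0]
dsc = dice_coefficient(pred, truth)
```

A score of 0.77, as reported for complete tumor regions, means the overlap is 77% of the average mask size.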
Procedia PDF Downloads 182
1969 An Online Adaptive Thresholding Method to Classify Google Trends Data Anomalies for Investor Sentiment Analysis
Authors: Duygu Dere, Mert Ergeneci, Kaan Gokcesu
Abstract:
Google Trends data has gained increasing popularity in the applications of behavioral finance, decision science and risk management. Because of Google's wide range of use, the Trends statistics provide significant information about investor sentiment and intention, which can be used as decisive factors in the corporate and risk management fields. However, an anomaly, i.e., a significant increase or decrease in a certain query, cannot be detected by state-of-the-art computational applications due to the random baseline noise of the Trends data, which is modelled as additive white Gaussian noise (AWGN). Since the baseline noise power changes gradually over time, an adaptive thresholding method is required to track and learn the baseline noise for a correct classification. To this end, we introduce an online method to classify meaningful deviations in Google Trends data. Through extensive experiments, we demonstrate that our method can successfully classify various anomalies for plenty of different data.
Keywords: adaptive data processing, behavioral finance, convex optimization, online learning, soft minimum thresholding
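The idea of tracking a drifting AWGN baseline and flagging deviations can be sketched with exponential moving averages. This is a simplified illustration, not the authors' soft-minimum thresholding method: the detector learns the baseline mean and power online from non-anomalous samples and flags any sample more than k estimated standard deviations away.

```python
class AdaptiveThreshold:
    """Online anomaly flagger: tracks the baseline mean and variance with
    exponential moving averages, so the threshold adapts as the noise
    baseline drifts. `k` sets how many estimated standard deviations a
    sample must exceed to be flagged. (Sketch; not the paper's method.)"""

    def __init__(self, alpha=0.05, k=4.0):
        self.alpha, self.k = alpha, k
        self.mean, self.var = 0.0, 1.0

    def update(self, x):
        dev = x - self.mean
        anomaly = abs(dev) > self.k * self.var ** 0.5
        if not anomaly:  # learn the baseline only from normal samples
            self.mean += self.alpha * dev
            self.var += self.alpha * (dev * dev - self.var)
        return anomaly

det = AdaptiveThreshold()
# Hypothetical query-volume deviations: quiet baseline, then one spike.
stream = [0.1, -0.2, 0.05, 0.15, -0.1, 0.0, 9.0, 0.1]
flags = [det.update(x) for x in stream]
```

Excluding flagged samples from the baseline update keeps a single spike from inflating the learned noise power, which is the essence of anomaly-robust online tracking.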
Procedia PDF Downloads 167
1968 Structural Damage Detection in a Steel Column-Beam Joint Using Piezoelectric Sensors
Authors: Carlos H. Cuadra, Nobuhiro Shimoi
Abstract:
The application of piezoelectric sensors to detect structural damage due to seismic action on building structures is investigated. A plate-type piezoelectric sensor was developed and proposed for this task. A film-type piezoelectric sheet was attached to a steel plate and covered by a layer of glass. The glass is fixed with a special silicone glue that requires the application of ultraviolet rays for its hardening. The steel plate was then set up at a steel column-beam joint of a test specimen, where it is subjected to bending moment when the specimen undergoes monotonic and cyclic loading. The structural behavior of the test specimen during cyclic loading was verified using a finite element model, and good agreement was found between the two results on load-displacement characteristics. The cross section of the steel elements (beam and column) is a box section of 100 mm × 100 mm with a thickness of 6 mm. This steel section is specified by the Japanese Industrial Standards as carbon steel square tube for general structures (STKR400). The column and beam elements are joined perpendicularly using a fillet weld, so the resulting test specimen has a T shape. When large deformation occurs, the glass plate of the sensor device cracks, and at that instant the piezoelectric material emits a voltage signal, which serves as the indicator of a certain level of deformation or damage. The applicability of this piezoelectric sensor to detect structural damage was verified; however, additional analysis and experimental tests are required to establish standard parameters of the sensor system.
Keywords: piezoelectric sensor, static cyclic test, steel structure, seismic damages
Procedia PDF Downloads 123
1967 The Shannon Entropy and Multifractional Markets
Authors: Massimiliano Frezza, Sergio Bianchi, Augusto Pianese
Abstract:
Introduced by Shannon in 1948 in the field of information theory as the average rate at which information is produced by a stochastic set of data, the concept of entropy has gained much attention as a measure of the uncertainty and unpredictability associated with a dynamical system, eventually depicted by a stochastic process. In particular, the Shannon entropy measures the degree of order/disorder of a given signal and provides useful information about the underlying dynamical process. It has found widespread application in a variety of fields, such as cryptography, statistical physics and finance. In this regard, many contributions have employed different measures of entropy in an attempt to characterize financial time series in terms of market efficiency, market crashes and/or financial crises. The Shannon entropy has also been considered as a measure of the risk of a portfolio or as a tool in asset pricing. This work investigates the theoretical link between the Shannon entropy and the multifractional Brownian motion (mBm), a stochastic process which has recently been the focus of renewed interest in finance as a driving model of stochastic volatility. In particular, after exploring the current state of research in this area and highlighting some of the key results and open questions that remain, we show a well-defined relationship between the Shannon (log)entropy and the memory function H(t) of the mBm. In detail, we allow both the length of the time series and the time scale to change over the analysis to study how the relation modifies itself. On the one hand, applications are developed after generating surrogates of mBm trajectories based on different memory functions; on the other hand, an empirical analysis of several international stock indexes, which confirms the previous results, concludes the work.
Keywords: Shannon entropy, multifractional Brownian motion, Hurst-Holder exponent, stock indexes
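For a sampled signal, the Shannon entropy discussed above is typically estimated from a histogram: H = -Σ p_i log2 p_i over the bin probabilities. A minimal sketch (histogram estimator with invented bin count; not the paper's estimator for mBm increments):

```python
import math

def shannon_entropy(samples, bins=8):
    """Histogram estimate of the Shannon entropy H = -sum p log2 p, in bits.
    A disordered signal fills many bins (high H); a near-constant one
    concentrates in few bins (low H)."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins or 1.0  # guard against a constant signal
    counts = [0] * bins
    for s in samples:
        counts[min(int((s - lo) / width), bins - 1)] += 1
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

uniform = [i / 100 for i in range(100)]   # spread over all bins: H near 3 bits
constantish = [0.0] * 99 + [1.0]          # almost no uncertainty: H near 0
```

With 8 bins the maximum attainable entropy is log2(8) = 3 bits; in the paper's setting, how this estimate varies with the sample length and time scale is what ties it to the memory function H(t).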
Procedia PDF Downloads 110
1966 Cost Sensitive Feature Selection in Decision-Theoretic Rough Set Models for Customer Churn Prediction: The Case of Telecommunication Sector Customers
Authors: Emel Kızılkaya Aydogan, Mihrimah Ozmen, Yılmaz Delice
Abstract:
In recent years, the telecommunications sector has undergone continuous change and development in the global market. In this sector, churn analysis techniques are commonly used for analysing why some customers terminate their service subscriptions prematurely. In addition, customer churn is of utmost significance in this sector since it causes considerable business loss. Many companies conduct various studies in order to prevent losses while increasing customer loyalty. Although a large quantity of accumulated data is available in this sector, its usefulness is limited by data quality and relevance. In this paper, a cost-sensitive feature selection framework is developed with the aim of obtaining the feature reducts to predict customer churn. The framework is a cost-based optional pre-processing stage that removes redundant features for churn management. In addition, this cost-based feature selection algorithm is applied in a telecommunication company in Turkey, and the results obtained with this algorithm are presented.
Keywords: churn prediction, data mining, decision-theoretic rough set, feature selection
Procedia PDF Downloads 446
1965 Surface Quality Improvement of Abrasive Waterjet Cutting for Spacecraft Structure
Authors: Tarek M. Ahmed, Ahmed S. El Mesalamy, Amro M. Youssef, Tawfik T. El Midany
Abstract:
Abrasive waterjet (AWJ) machining is considered one of the most powerful cutting processes. It can be used for cutting heat-sensitive, hard and reflective materials. Aluminum 2024 is a high-strength alloy which is widely used in the aerospace and aviation industries. This paper aims to improve the AWJ cutting quality of this aluminum alloy by investigating the effect of AWJ control parameters on surface geometry quality. Design of experiments (DoE) is used for establishing an experimental matrix. Statistical modeling is used to present the relation between the cutting parameters (pressure, speed, and distance between the nozzle and cut surface) and the responses (taper angle and surface roughness). The results revealed a tangible improvement in productivity by using AWJ processing. The taper kerf angle can be improved by decreasing standoff distance and speed and increasing water pressure, while decreasing cutting speed, pressure and standoff distance improves the surface roughness within the operating window of cutting parameters.
Keywords: abrasive waterjet machining, machining of aluminum alloy, non-traditional cutting, statistical modeling
Procedia PDF Downloads 250
1964 SIF Computation of Cracked Plate by FEM
Authors: Sari Elkahina, Zergoug Mourad, Benachenhou Kamel
Abstract:
The main purpose of this paper is to compare computations of the stress intensity factor (SIF) for a cracked thin plate of aluminum alloys 7075-T6 and 2024-T3, used in aeronautic structures, under uniaxial loading. The evaluation is based on the finite element method with a virtual power principle, through two techniques: extrapolation and G-θ. The first consists of extrapolating the nodal displacements near the crack tip using a refined triangular mesh with T3 and T6 special elements, while the second consists of determining the energy release rate G through the G-θ method by differentiation of the potential energy, which corresponds numerically to post-processing the elastic solution of a cracked solid by a contour integration computed via Gauss points. The SIF results obtained from the extrapolation and G-θ methods are compared to an analytical solution in a particular case. To illustrate the influence of the mesh type and the position of the integration contour, simulations are presented and analyzed.
Keywords: crack tip, SIF, finite element method, concentration technique, displacement extrapolation, aluminum alloy 7075-T6 and 2024-T3, energy release rate G, G-θ method, Gauss point numerical integration
Procedia PDF Downloads 337
1963 Computing Continuous Skyline Queries without Discriminating between Static and Dynamic Attributes
Authors: Ibrahim Gomaa, Hoda M. O. Mokhtar
Abstract:
Most existing skyline query algorithms have focused on querying static points in static databases; however, with the expanding number of sensors, wireless communications and mobile applications, the demand for continuous skyline queries has increased. Unlike traditional skyline queries, which only consider static attributes, continuous skyline queries include dynamic attributes as well as static ones. Since skyline query computation is based on checking the domination of skyline points over all dimensions, both the static and dynamic attributes must be considered without separation. In this paper, we present an efficient algorithm for computing continuous skyline queries without discriminating between static and dynamic attributes. In brief, our algorithm proceeds as follows. First, it excludes the points which will not be in the initial skyline result; this pruning phase reduces the required number of comparisons. Second, the association between the spatial positions of data points is examined; this phase gives an idea of where changes in the result might occur and consequently enables us to efficiently update the skyline result (continuous update) rather than computing the skyline from scratch. Finally, an experimental evaluation is provided which demonstrates the accuracy, performance and efficiency of our algorithm over other existing approaches.
Keywords: continuous query processing, dynamic database, moving object, skyline queries
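The domination check at the heart of any skyline computation can be made concrete with a naive baseline (quadratic scan, not the paper's pruned and incrementally updated algorithm). The hotel data and the "smaller is better" convention are invented for illustration; note that static and dynamic attributes are treated uniformly as dimensions, exactly as the abstract argues.

```python
def dominates(p, q):
    """p dominates q if p is no worse in every dimension and strictly
    better in at least one (here: smaller is better)."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def skyline(points):
    """Naive skyline: keep every point not dominated by any other point."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# (distance-to-user, price): distance is the dynamic attribute, price static.
hotels = [(2.0, 50), (1.0, 80), (3.0, 40), (2.5, 90)]
result = skyline(hotels)
# (2.5, 90) is dominated by (2.0, 50); the other three are incomparable.
```

In the continuous setting the dynamic dimension (distance) changes as objects move, so the paper's contribution is updating `result` incrementally instead of rerunning this scan.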
Procedia PDF Downloads 210
1962 Enhancing the Recruitment Process through Machine Learning: An Automated CV Screening System
Authors: Kaoutar Ben Azzou, Hanaa Talei
Abstract:
Human resources is an important department in any organization, as it manages the life cycle of employees from recruitment and training to retirement or termination of contracts. The recruitment process starts with a job opening, followed by a selection of the best-fit candidates from all applicants. Matching the best profile to a job position traditionally requires manually looking through many CVs, hours of work that can sometimes lead to choosing a profile that is not the best fit. The work presented in this paper aims at reducing the workload of HR personnel by automating the preliminary stages of the candidate screening process, thereby fostering a more streamlined recruitment workflow. It introduces an automated system designed to help with the recruitment process by scanning candidates' CVs, extracting pertinent features, and employing machine learning algorithms to decide the most fitting job profile for each candidate. Our work employs natural language processing (NLP) techniques to identify and extract key features from the unstructured text of a CV, such as education, work experience, and skills. Subsequently, the system utilizes these features to match candidates with job profiles, leveraging the power of classification algorithms.
Keywords: automated recruitment, candidate screening, machine learning, human resources management
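The extract-then-match pipeline above can be sketched in a few lines. This is a deliberately simplified illustration, not the paper's system: the skill vocabulary, the profiles, and the overlap-count matcher are all invented stand-ins for the learned NLP feature extractors and classifiers the abstract describes.

```python
import re

# Hypothetical skill vocabulary and job profiles; a real system would learn
# these from labelled CVs rather than hard-coding them.
SKILLS = {"python", "java", "sql", "excel", "photoshop"}
PROFILES = {"data_analyst": {"python", "sql", "excel"},
            "designer": {"photoshop"}}

def extract_skills(cv_text):
    """Lowercase and tokenize the CV text, keeping tokens in the vocabulary."""
    tokens = set(re.findall(r"[a-z+#]+", cv_text.lower()))
    return tokens & SKILLS

def best_profile(cv_text):
    """Rank job profiles by overlap between extracted and required skills."""
    skills = extract_skills(cv_text)
    return max(PROFILES, key=lambda p: len(skills & PROFILES[p]))

cv = "Experienced analyst; daily work with Python, SQL and Excel dashboards."
profile = best_profile(cv)
```

Replacing the keyword lookup with embeddings and the overlap count with a trained classifier turns this toy into the architecture the paper proposes.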
Procedia PDF Downloads 56
1961 Words of Peace in the Speeches of the Egyptian President, Abdulfattah El-Sisi: A Corpus-Based Study
Authors: Mohamed S. Negm, Waleed S. Mandour
Abstract:
The present study aims primarily at investigating words of peace (lexemes of peace) in the formal speeches of the Egyptian president Abdulfattah El-Sisi over a two-year span, from 2018 to 2019. This paper attempts not only to shed light on the contextual use of the antonyms war and peace, but also to underpin the analysis quantitatively through current methods of corpus linguistics. As such, the researchers have deployed a corpus-based approach in collecting, encoding, and processing 30 presidential speeches over the stated period (23,411 words and 25,541 tokens in total). Further, semantic fields and collocational networks are identified and compared statistically. Results have shown a significant propensity for adopting peace, including its relevant collocation network, textually and therefore ideationally, at the expense of the war concept, which in most cases surfaces euphemistically through the noun conflict. The president has not justified the action of war with an honorable cause or a valid reason. Such results indicate a positive sociopolitical mindset on the part of the Egyptian president and, moreover, reveal national and international fair dealing on arising issues.
Keywords: CADS, collocation network, corpus linguistics, critical discourse analysis
Procedia PDF Downloads 155
1960 Improved Super-Resolution Using Deep Denoising Convolutional Neural Network
Authors: Pawan Kumar Mishra, Ganesh Singh Bisht
Abstract:
Super-resolution is a technique used in computer vision to construct high-resolution images from a single low-resolution image. It is used to increase the frequency content, recover lost details and remove the downsampling artifacts and noise introduced by the camera during image acquisition. High-resolution images and videos are a desired part of all image processing tasks and their analysis in most digital imaging applications. The goal of super-resolution is to combine non-redundant information from single or multiple low-resolution frames to generate a high-resolution image. Many methods have been proposed in which multiple images of the same scene, with different variations in transformation, are used as low-resolution inputs; this is called multi-image super-resolution. Another family of methods is single-image super-resolution, which tries to learn the redundancy present in an image and reconstruct the lost information from a single low-resolution image. The use of deep learning is currently one of the state-of-the-art approaches for reconstructing high-resolution images. In this research, we propose Deep Denoising Super Resolution (DDSR), a deep neural network for effectively reconstructing a high-resolution image from a low-resolution image.
Keywords: resolution, deep-learning, neural network, de-blurring
Procedia PDF Downloads 517
1959 Prioritizing the Factors Effective on Decreasing the Rate of Accidents on Freeways in Iran between 2013-2015
Authors: Mansour Hadji Hosseinlou, Alireza Mahdavi
Abstract:
Transportation is one of the needs of any society; it has developed as societies have advanced economically and socially, and it is one of the symbols of civilization today. Although it is very useful, it also leads to many serious harms and injuries. The development of communication systems and the building of new roads have resulted in an increasing rate of accidents; in practice, this increasing rate has diminished the advantages of transportation. Traffic accidents are a leading cause of death and of serious financial and bodily harm, and their significant social, economic and cultural consequences threaten societies seriously. Iran's ground transportation system is one of the most accident-prone in the world, and the mortality rate and financial harm cost the country a great deal at the national level. We have therefore assembled a data collection from the recorded statistics of accidents that occurred on freeways from 2013 to 2015. These statistics are recorded in different related databases, mainly those of the police and the road transportation system. The data are separated and arranged in tables, and after preparing, processing and prioritizing the factors, the resulting collection is presented to departments, managers and researchers to help them suggest practical solutions.
Keywords: freeways' accidents, human causes, death, tiredness, drowsiness
Procedia PDF Downloads 193
1958 Investigation of Detectability of Orbital Objects/Debris in Geostationary Earth Orbit by Microwave Kinetic Inductance Detectors
Authors: Saeed Vahedikamal, Ian Hepburn
Abstract:
Microwave Kinetic Inductance Detectors (MKIDs) are considered one of the most promising photon detectors of the future in many astronomical applications, such as exoplanet detection. The advantages of MKIDs stem from their single-photon sensitivity (ranging from UV to optical and near infrared), photon energy resolution and high temporal capability (~microseconds). There has been substantial progress in the development of these detectors, and MKIDs with megapixel arrays are now possible. The unique capability of recording an incident photon and its energy (or wavelength) while also registering its time of arrival to within a microsecond enables an array of MKIDs to produce a four-dimensional data block comprising x and y spatial axes, a per-pixel spectral axis z, and a per-pixel temporal axis t. This offers the possibility that the spectrum and brightness variation of any detected piece of space debris as a function of time might provide a unique identifier or fingerprint. Such a fingerprint signal from any object identified in multiple detections by different observers has the potential to determine the orbital features of the object and be used for tracking. Modelling performed so far shows that with a 20 cm telescope located at an astronomical observatory (e.g., La Palma, Canary Islands) we could detect sub-centimetre objects at GEO. By considering a Lambertian sphere with a 10% reflectivity (the albedo of the Moon), we anticipate the following for a GEO object: a 10 cm object imaged in a 1-second capture; a 1.2 cm object for a 70-second integration; or a 0.65 cm object for a 4-minute integration. We present details of our modelling and the potential instrument for a dedicated GEO surveillance system.
Keywords: space debris, orbital debris, detection system, observation, microwave kinetic inductance detectors, MKID
Procedia PDF Downloads 98
1957 Religiosity and Social Factors on Alcohol Use among South African University Students
Authors: Godswill Nwabuisi Osuafor, Sonto Maria Maputle
Abstract:
Background: Numerous studies have found that religiosity and social factors modulate alcohol use among university students. However, empirical studies examining the protective effects of religiosity and other social factors on alcohol use and abuse in South African universities are scarce. The aim of this study was therefore to assess the protective effects of religiosity and the roles of social factors in alcohol use among university students. Methodology: A survey on alcohol use among 416 university students was conducted in 2014 using a structured questionnaire. Data were collected on religiosity and contextual variables. Students were classified as practicing intrinsic or extrinsic religiosity based on their responses to the religiosity measures. Descriptive, chi-square and binary logistic analyses were used to process the data. Results: Alcohol use was associated with religiosity, religion, sex, family history of alcohol use, and experimenting with alcohol. Reported alcohol abuse was significantly predicted by sex, family history of alcohol use, and experimenting with alcohol. Religiosity mediated lower alcohol use, whereas a family history of alcohol use and experimenting with alcohol promoted alcohol use and abuse. Conclusion: Families, religious groups, and societal factors may be the specific niches for intervention on alcohol use among university students.
Keywords: religiosity, alcohol use, protective factors, university students
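The chi-square association test used in analyses of this kind can be sketched in a few lines (a generic illustration with made-up counts, not the study's data; the row/column labels are hypothetical):

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table.

    table = [[a, b], [c, d]] of observed counts, e.g. rows =
    family history of alcohol use (yes/no), columns = drinks (yes/no).
    """
    n = sum(sum(row) for row in table)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = sum(table[i]) * sum(r[j] for r in table) / n
            chi2 += (observed - expected) ** 2 / expected
    return chi2

def odds_ratio(table):
    """Cross-product odds ratio, the effect size behind a logistic model."""
    (a, b), (c, d) = table
    return (a * d) / (b * c)

# Hypothetical counts: students with a family history of alcohol use
# appear far more likely to drink.
counts = [[80, 20], [90, 110]]
print(round(chi_square_2x2(counts), 2), round(odds_ratio(counts), 2))
```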
Procedia PDF Downloads 397
1956 Thermal and Mechanical Properties of Powder Injection Molded Alumina Nano-Powder
Authors: Mostafa Rezaee Saraji, Ali Keshavarz Panahi
Abstract:
In this work, the processing steps for producing alumina parts from nano-powder using the powder injection molding (PIM) technique were investigated, and the thermal conductivity and flexural strength of the samples were determined as functions of sintering temperature and holding time. In the first step, a feedstock with 58 vol.% of alumina nano-powder with an average particle size of 100 nm was prepared using the Extrumixing method to obtain appropriate homogeneity. This feedstock was injection molded into a two-cavity mold of rectangular shape. After the injection molding step, thermal and solvent debinding methods were used to debind the molded samples, which were then sintered at different sintering temperatures and holding times. The results show that the flexural strength and thermal conductivity of the samples increased with increasing sintering temperature and holding time; at a sintering temperature of 1600 °C and a holding time of 5 h, the flexural strength and thermal conductivity of the sintered samples reached maximum values of 488 MPa and 40.8 W/mK, respectively.
Keywords: alumina nano-powder, thermal conductivity, flexural strength, powder injection molding
Procedia PDF Downloads 329
1955 Insulation and Architectural Design to Have Sustainable Buildings in Iran
Authors: Ali Bayati, Jamileh Azarnoush
Abstract:
As the world's population grows, the consumption of fossil fuels has increased dramatically, and many believe that most atmospheric pollution comes from burning them. Managing the flow of natural resources into cities is one of the great challenges of resource-consumption management. Reducing civil energy consumption in megacities plays a key role in addressing serious problems such as air pollution, greenhouse-gas emissions, global warming, and damage to the ozone layer. In the construction industry, we should use materials that require the least energy to produce and transport, as well as materials that need the least energy and expense to recycle. The type of material used, the way it is processed, the use of regional materials, and adaptation to the environment are all critical; moreover, insulation should be used and maintained over the long term. Accordingly, this article investigates new ways to reduce environmental pollution and save energy by using materials that are not harmful to the environment, fully insulated building materials, sustainable and diversified buildings, suitable urban design, and more efficient use of solar energy.
Keywords: building design, construction masonry, insulation, sustainable construction
Procedia PDF Downloads 540
1954 Investigating the Vehicle-Bicyclists Conflicts using LIDAR Sensor Technology at Signalized Intersections
Authors: Alireza Ansariyar, Mansoureh Jeihani
Abstract:
Light Detection and Ranging (LiDAR) sensors are capable of recording traffic data, including the number of passing vehicles and bicyclists, their speeds, and the number of conflicts between the two road-user groups. In order to collect real-time traffic data and investigate the safety of different road users, a LiDAR sensor was installed at the Cold Spring Ln – Hillen Rd intersection in Baltimore City. The frequency and severity of the collected real-time conflicts were analyzed, and the results show that 122 conflicts were recorded over a 10-month interval from May 2022 to February 2023. Using an innovative image-processing algorithm, a new safety Measure of Effectiveness (MOE) was proposed to recognize the zones that are critical for bicyclists as they enter the intersection. Considering the trajectories of the conflicts, the analysis demonstrated that conflicts in the northern approach (zone N) are more frequent and more severe. Additionally, sunny weather is more likely to be associated with severe vehicle-bike conflicts.
Keywords: LiDAR sensor, post encroachment time threshold (PET), vehicle-bike conflicts, measure of effectiveness (MOE), weather condition
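The post-encroachment-time (PET) logic named in the keywords can be sketched as follows (a minimal illustration; the 5-second threshold, the severity bands, and the zone-event structure are assumptions, not the paper's calibrated values):

```python
from dataclasses import dataclass

PET_THRESHOLD_S = 5.0  # assumed conflict threshold, in seconds

@dataclass
class ZoneEvent:
    road_user: str   # "vehicle" or "bike"
    t_enter: float   # time entering the conflict zone (s)
    t_exit: float    # time leaving the conflict zone (s)

def post_encroachment_time(first: ZoneEvent, second: ZoneEvent) -> float:
    """PET: the gap between the first user leaving the conflict zone
    and the second user entering it. Smaller gaps mean nearer misses."""
    return second.t_enter - first.t_exit

def classify_conflict(pet: float) -> str:
    if pet < 0:
        return "overlap"           # both users in the zone at once
    if pet < 1.5:
        return "severe conflict"   # illustrative severity band
    if pet < PET_THRESHOLD_S:
        return "conflict"
    return "no conflict"

car = ZoneEvent("vehicle", t_enter=10.0, t_exit=11.2)
bike = ZoneEvent("bike", t_enter=12.1, t_exit=13.0)
print(classify_conflict(post_encroachment_time(car, bike)))
```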
Procedia PDF Downloads 236
1953 Characterization and Monitoring of the Yarn Faults Using Diametric Fault System
Authors: S. M. Ishtiaque, V. K. Yadav, S. D. Joshi, J. K. Chatterjee
Abstract:
The DIAMETRIC FAULTS system has been developed to capture bi-directional images of yarn continuously and sequentially, and to provide a detailed classification of faults. A novel mathematical framework built on the acquired bi-directional images forms the basis of fault classification into four broad categories: Thick1, Thick2, Thin, and Normal Yarn. A discretised version of the Radon transformation is used to convert the bi-directional images into one-dimensional signals. The images were divided into training and test sets. A Karhunen–Loève Transformation (KLT) basis is computed from the signals of the training images for each fault class, taking the six highest-energy eigenvectors. The fault class of a test image is identified by taking the Euclidean distance of its signal from its projection onto the KLT basis of each fault class in the training set. Euclidean distance, applied using various techniques, is used to classify an unknown fault class, and an accuracy of about 90% is achieved in detecting the correct fault class. The four broad fault classes were further divided into four subgroups based on user-set boundary limits for fault length and fault volume, where the fault cross-sectional area and the fault length define the total fault volume. A distinct distribution of faults is found in terms of their volume and physical dimensions, which can be used for monitoring yarn faults. The configuration-based characterization and classification show that spun-yarn faults arising from mass variation exhibit distinct characteristics in terms of their contours, sizes, and shapes, apart from their frequency of occurrence.
Keywords: Euclidean distance, fault classification, KLT, Radon Transform
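The KLT-subspace classification step described above can be sketched as follows (a toy reconstruction: synthetic 1-D signals stand in for the Radon-transformed yarn images, only two of the four classes are shown, and all signal shapes and dimensions are invented for illustration):

```python
import numpy as np

def klt_basis(signals, k=6):
    """Class mean and top-k eigenvectors (KLT basis) of the signal covariance."""
    X = np.asarray(signals, dtype=float)
    mean = X.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov((X - mean).T))
    # keep the k highest-energy eigenvectors, as in the abstract
    basis = eigvecs[:, np.argsort(eigvals)[::-1][:k]]
    return mean, basis

def distance_to_class(signal, mean, basis):
    """Euclidean distance of a signal from its projection onto the
    class KLT subspace (the reconstruction residual)."""
    centered = np.asarray(signal, dtype=float) - mean
    projection = basis @ (basis.T @ centered)
    return float(np.linalg.norm(centered - projection))

def classify(signal, class_models):
    return min(class_models,
               key=lambda c: distance_to_class(signal, *class_models[c]))

# Synthetic training signals: a "thick place" bump vs a "thin place" dip.
rng = np.random.default_rng(0)
base = np.ones(32)
bump = np.r_[np.zeros(12), np.ones(8), np.zeros(12)]
thick = [base + 2.0 * bump + 0.05 * rng.standard_normal(32) for _ in range(20)]
thin = [base - 0.8 * bump + 0.05 * rng.standard_normal(32) for _ in range(20)]

models = {"Thick1": klt_basis(thick, k=3), "Thin": klt_basis(thin, k=3)}
test_signal = base + 2.0 * bump
print(classify(test_signal, models))
```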
Procedia PDF Downloads 265
1952 The Effect of the Internal Organization Communications' Effectiveness through Employee's Performance of Faculty of Management Science, Suan Sunandha Rajabhat University
Authors: Malaiphan Pansap, Surasit Vithayarat
Abstract:
The purpose of this study was to examine the relationship between the effectiveness of internal organizational communications and employee performance in the Faculty of Management Science, Suan Sunandha Rajabhat University, and to study solutions for communication within the organization. A questionnaire was used to collect information from 136 staff members and instructors, and the data were analyzed using frequency, percentage, mean, and standard deviation with statistical processing software. The results show that the aspects of organizational communication affecting employee performance are senders who lack the speaking and writing skills needed to prepare and convince their audience before delivering a message, and messages of which the organization is not always informed. The employees believe that good organizational communication has a positive impact on the development of the organization, because employees feel involved and part of the organization; by cooperating to achieve its goals, employees can work in the same direction and meet those goals quickly.
Keywords: employee’s performance, faculty of management science, internal organization communications’ effectiveness, management accounting, Suan Sunandha Rajabhat University
Procedia PDF Downloads 239
1951 Early Recognition and Grading of Cataract Using a Combined Log Gabor/Discrete Wavelet Transform with ANN and SVM
Authors: Hadeer R. M. Tawfik, Rania A. K. Birry, Amani A. Saad
Abstract:
The eyes are considered the most sensitive and important organs of the human body; thus, any eye disorder affects the patient in all aspects of life. Cataract is one such disorder, and it leads to blindness if not treated correctly and quickly. This paper presents a model for automatic detection, classification, and grading of cataracts based on image-processing techniques and artificial intelligence. The proposed system is developed to ease the cataract diagnosis process for both ophthalmologists and patients. The wavelet transform combined with the 2D Log Gabor wavelet transform was used for feature extraction on a dataset of 120 eye images, followed by a classification process that divided the image set into three classes: normal, early stage, and advanced stage. The two classifiers, a support vector machine (SVM) and an artificial neural network (ANN), were compared on the same dataset of 120 eye images. SVM gave better results than ANN, with an accuracy of 96.8% versus 92.3%.
Keywords: cataract, classification, detection, feature extraction, grading, log-gabor, neural networks, support vector machines, wavelet
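The wavelet feature-extraction stage can be illustrated with a one-level 2D Haar decomposition (a generic stand-in: the paper combines the discrete wavelet transform with 2D Log Gabor filters, which are omitted here; the sub-band energy features are a common, not paper-specific, choice):

```python
import numpy as np

def haar2d(image):
    """One-level 2D Haar decomposition into LL, LH, HL, HH sub-bands.

    Coefficients are scaled by 1/sqrt(2) at each step so the transform
    is orthonormal and total energy is preserved, which gives a handy
    sanity check on the features.
    """
    a = np.asarray(image, dtype=float)
    lo_r = (a[0::2] + a[1::2]) / np.sqrt(2)   # pairwise rows, low-pass
    hi_r = (a[0::2] - a[1::2]) / np.sqrt(2)   # pairwise rows, high-pass
    ll = (lo_r[:, 0::2] + lo_r[:, 1::2]) / np.sqrt(2)
    lh = (lo_r[:, 0::2] - lo_r[:, 1::2]) / np.sqrt(2)
    hl = (hi_r[:, 0::2] + hi_r[:, 1::2]) / np.sqrt(2)
    hh = (hi_r[:, 0::2] - hi_r[:, 1::2]) / np.sqrt(2)
    return ll, lh, hl, hh

def energy_features(image):
    """Per-sub-band energies: a compact feature vector for SVM/ANN input."""
    return [float(np.sum(band ** 2)) for band in haar2d(image)]

img = np.arange(64, dtype=float).reshape(8, 8)
feats = energy_features(img)
# Orthonormal transform: sub-band energies sum to the image energy.
print(np.isclose(sum(feats), np.sum(img ** 2)))
```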
Procedia PDF Downloads 332
1950 Optimization of Bio-Based Lightweight Mortars Containing Wood Waste
Authors: Valeria Corinaldesi, Nicola Generosi, Daniele Berdini
Abstract:
In this study, wood waste from processing by-products was used to replace natural sand in the production of bio-based lightweight mortars. Manufacturers of wood products and furniture typically generate sawdust and side-cut pieces from cutting, drilling, and milling operations. Three percentages of quartz-sand substitution were tried: 2.5%, 5%, and 10% by volume. The wood by-products were pre-soaked in a calcium hydroxide aqueous solution to mineralize the wood and avoid undesirable effects on the bio-based building materials. The bio-based mortars were characterized by means of compression and bending tests, free drying shrinkage tests, resistance to water vapour permeability, water capillary absorption, and thermal conductivity measurements. The results show that a maximum dosage of 5% wood by-products should be used to avoid an excessive loss of mechanical strength in the bio-based mortar. On the other hand, with the proper dosage of a water-reducing admixture, adequate mechanical performance can be achieved even with a 10% wood waste addition.
Keywords: bio-based mortar, energy efficiency, lightweight mortar, thermal insulation, wood waste
Procedia PDF Downloads 5
1949 Polarity Classification of Social Media Comments in Turkish
Authors: Migena Ceyhan, Zeynep Orhan, Dimitrios Karras
Abstract:
People in modern societies continuously share their experiences, emotions, and thoughts on different areas of life. The information reaches almost everyone in real time and can have an important impact on shaping people's way of living. This phenomenon is well recognized and exploited by market representatives, who try to profit the most from this medium. Given the abundance of information, people and organizations look for efficient tools that filter the countless data into important information, ready for analysis. This paper is a modest contribution to this field, describing the process of automatically classifying social media comments in the Turkish language as positive or negative. Once the data are gathered and preprocessed, feature sets of selected single words or groups of words are built according to the characteristics of the language used in the texts. These features are later used to train and test a system with different machine learning algorithms (Naïve Bayes, Sequential Minimal Optimization, J48, and Bayesian Linear Regression). The resultant high accuracies can be important feedback for decision-makers in improving business strategies accordingly.
Keywords: feature selection, machine learning, natural language processing, sentiment analysis, social media reviews
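One of the algorithms mentioned, Naïve Bayes, can be sketched for binary polarity classification in a few lines (a minimal bag-of-words version with toy English tokens; the paper's Turkish preprocessing, feature selection, and toolchain are not reproduced):

```python
import math
from collections import Counter, defaultdict

class NaiveBayesPolarity:
    """Multinomial Naive Bayes with add-one (Laplace) smoothing."""

    def fit(self, docs, labels):
        self.classes = set(labels)
        self.priors = Counter(labels)              # class frequencies
        self.word_counts = defaultdict(Counter)    # per-class word counts
        for doc, label in zip(docs, labels):
            self.word_counts[label].update(doc.split())
        self.vocab = {w for c in self.word_counts.values() for w in c}
        return self

    def predict(self, doc):
        def log_posterior(c):
            total = sum(self.word_counts[c].values())
            score = math.log(self.priors[c] / sum(self.priors.values()))
            for word in doc.split():
                # smoothed per-class word likelihood
                score += math.log((self.word_counts[c][word] + 1)
                                  / (total + len(self.vocab)))
            return score
        return max(self.classes, key=log_posterior)

train_docs = ["great product loved it", "awful service never again",
              "loved the fast delivery", "terrible awful quality"]
train_labels = ["positive", "negative", "positive", "negative"]
model = NaiveBayesPolarity().fit(train_docs, train_labels)
print(model.predict("loved it great delivery"))
```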
Procedia PDF Downloads 146