Search results for: processing map
1832 [Keynote Speech]: Bridge Damage Detection Using Frequency Response Function
Authors: Ahmed Noor Al-Qayyim
Abstract:
Over the past decades, bridge structures have come to be regarded as critical components of transportation networks, owing to rapid urban sprawl. Failures of bridges under operating conditions have drawn attention to the need to update the default bridge inspection methodology. Structural health monitoring (SHM) based on the vibration response has emerged as a promising method for evaluating the condition of structures. Rapid developments in sensor technology and in vibration-based condition assessment techniques have made SHM an efficient and economical way to assess bridges. SHM is set up to assess the state of designated bridges and to anticipate probable failures. This paper presents the Frequency Response Function method, which uses vibration test data captured from structures to evaluate their condition. Furthermore, the main steps of assessing a bridge using vibration data are presented. The Frequency Response Function method is applied to experimental data from a full-scale bridge.
Keywords: bridge assessment, health monitoring, damage detection, frequency response function (FRF), signal processing, structure identification
Procedia PDF Downloads 348
1831 Mapping of Urban Green Spaces Towards a Balanced Planning in a Coastal Landscape
Authors: Rania Ajmi, Faiza Allouche Khebour, Aude Nuscia Taibi, Sirine Essasi
Abstract:
Urban green spaces (UGS) are an important contributor to, and can be a significant part of, sustainable development. A spatial method was employed to assess and map the spatial distribution of UGS in five districts of Sousse, Tunisia. Ecological management of UGS is an essential factor for the sustainable development of the city; hence the municipality of Sousse has decided to support the districts according to the characteristics of their green spaces. To implement this policy, (1) a new GIS web application was developed, (2) the various green spaces were implemented in the application, (3) a spatial mapping of UGS was realized using Quantum GIS, and (4) data processing and statistical analysis were executed in the R programming language using RStudio. The intersection of the spatial and statistical analyses highlighted an imbalance in the spatial distribution of UGS in the study area. The discontinuity between the coast and the city's green spaces was not designed in a spirit of network and connection, hence the lack of a greenway connecting these spaces to the city. Finally, this GIS support will be used by decision-makers to assess and monitor green spaces in the city of Sousse and will contribute to improving the well-being of the local population.
Keywords: distributions, GIS, green space, imbalance, spatial analysis
Procedia PDF Downloads 204
1830 Detection of Autistic Children's Voice Based on Artificial Neural Network
Authors: Royan Dawud Aldian, Endah Purwanti, Soegianto Soelistiono
Abstract:
In this research, we have developed an automatic procedure to classify children's voices as normal or autistic using modern computational technology, namely artificial neural networks. The advantage of this approach is its capability for processing and storing data. Digital voice features are obtained from the coefficients of linear predictive coding, computed with the autocorrelation method, and transformed into the frequency domain using the fast Fourier transform; these serve as the input of an artificial neural network trained with the back-propagation method, so that normal and autistic children's voices are distinguished automatically. The results show that the classification success rate is 100% for the normal children's voice test data and 100% for the autistic children's voice test data; the success rate of the back-propagation classification system over the entire test set is therefore 100%.
Keywords: autism, artificial neural network, back-propagation, linear predictive coding, fast Fourier transform
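As a sketch of the feature-extraction stage described above (not the authors' actual code), the autocorrelation-method LPC coefficients can be computed with the Levinson-Durbin recursion, and the frequency-domain features with an FFT; the model order and the synthetic test signal below are illustrative assumptions.

```python
import numpy as np

def lpc_autocorr(frame, order):
    """LPC coefficients of a voice frame via the autocorrelation
    (Levinson-Durbin) method; returns [1, a1, ..., a_order]."""
    n = len(frame)
    r = np.correlate(frame, frame, mode="full")[n - 1:n + order]
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] + sum(a[j] * r[i - j] for j in range(1, i))
        k = -acc / err
        new_a = a.copy()
        for j in range(1, i):
            new_a[j] = a[j] + k * a[i - j]
        new_a[i] = k
        a = new_a
        err *= (1.0 - k * k)
    return a

def spectrum_features(frame):
    """Frequency-domain features: magnitude of the FFT of the frame."""
    return np.abs(np.fft.rfft(frame))

# Illustrative check: a first-order autoregressive signal
# x[n] = 0.9 x[n-1] + w[n] should yield a1 close to -0.9.
rng = np.random.default_rng(0)
w = rng.standard_normal(20000)
x = np.zeros_like(w)
for n in range(1, len(x)):
    x[n] = 0.9 * x[n - 1] + w[n]
a = lpc_autocorr(x, order=1)
```

In a full pipeline, such LPC and FFT vectors computed per frame would be concatenated and fed to the back-propagation network as its input layer.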
Procedia PDF Downloads 461
1829 Impact of Sericin Treatment on Perfection Dyeing of Polyester Viscose Blend
Authors: Omaima G. Allam, O. A. Hakeim, K. Haggag, N. S. Elshemy
Abstract:
Over the past two decades, the use of microwave dielectric heating has become a powerful means of enhancing chemical processes. The potential benefit of applying these modern treatment methods lies in reaching optimum treatment conditions and the best results, especially in hydrophobicity, moisture content, and dyeing performance, while maintaining the physical and chemical properties of each textile. Moreover, polyester fibres are sometimes spun together with natural fibres to produce a cloth with blended properties. In the present work, polyester/viscose blend fabrics (60/40) were pretreated with 4 g/l of KOH for 2 min under microwave irradiation at a liquor ratio of 1:25. Subsequently, the fabrics were immersed in different concentrations of sericin (10, 30, 50 g/l). Treated fabrics were dyed with commercial dye samples: C.I. Reactive Red 84 (Dye 1), C.I. Acid Blue 203 (Dye 2), and C.I. Reactive Violet 5 (Dye 3). Colour values were determined as well as fastness properties. Likewise, the physical properties of untreated and treated fabrics, such as moisture content (%), tensile strength, and elongation (%), were evaluated. The untreated and treated fabrics were characterized by infrared spectroscopy (FTIR) and scanning electron microscopy.
Keywords: polyester viscose blend fabric, sericin, dyes, colour value
Procedia PDF Downloads 238
1828 Disaster Management Using Wireless Sensor Networks
Authors: Akila Murali, Prithika Manivel
Abstract:
Disasters are defined as a serious disruption of the functioning of a community or a society, involving widespread human, material, economic, or environmental impacts. The number of people suffering food crises as a result of natural disasters has tripled in the last thirty years. Economic losses due to natural disasters have increased by a factor of eight over the past four decades, caused by the increased vulnerability of global society and by an increase in the number of weather-related disasters. Efficient disaster detection and alerting systems could reduce the loss of life and property. In the event of a disaster, another important issue is a good search and rescue system with high levels of precision, timeliness, and safety for both the victims and the rescuers. Wireless sensor network technology has the capability of quickly capturing, processing, and transmitting critical data in real time with high resolution. This paper studies the capacity of sensors and a wireless sensor network to collect, collate, and analyze valuable data in an ordered manner to help with disaster management.
Keywords: alerting systems, disaster detection, ad hoc network, WSN technology
Procedia PDF Downloads 404
1827 Fuzzy-Machine Learning Models for the Prediction of Fire Outbreak: A Comparative Analysis
Authors: Uduak Umoh, Imo Eyoh, Emmauel Nyoho
Abstract:
This paper compares fuzzy-machine learning algorithms, namely Support Vector Machine (SVM) and K-Nearest Neighbor (KNN), for predicting cases of fire outbreak. The paper uses a fire outbreak dataset with three features (temperature, smoke, and flame). The data is pre-processed using an Interval Type-2 Fuzzy Logic (IT2FL) algorithm to assign feature labels, while min-max normalization and Principal Component Analysis (PCA) are used to normalize the dataset and select relevant features, respectively. The output of the pre-processing is a dataset with two principal components (PC1 and PC2). The pre-processed dataset is then used to train the aforementioned machine learning models. K-fold cross-validation (with K=10) is used to evaluate the performance of the models using the metrics of ROC (Receiver Operating Characteristic) curve area, specificity, and sensitivity. The models are also tested with 20% of the dataset. The validation results show that KNN is the better model for fire outbreak detection, with an ROC value of 0.99878, followed by SVM with an ROC value of 0.99753.
Keywords: machine learning algorithms, interval type-2 fuzzy logic, fire outbreak, support vector machine, k-nearest neighbour, principal component analysis
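The normalization, PCA, KNN, and K-fold steps above can be sketched in a few dozen lines of NumPy (a simplified stand-in, not the paper's pipeline; the synthetic sensor readings and all parameter values are illustrative assumptions):

```python
import numpy as np

def min_max(X):
    """Min-max normalization of each feature to [0, 1]."""
    return (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

def pca_project(X, n_components=2):
    """Project centered data onto its first principal components via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def knn_predict(X_train, y_train, X_test, k=3):
    """Majority vote among the k nearest training points."""
    preds = []
    for x in X_test:
        nearest = np.argsort(np.linalg.norm(X_train - x, axis=1))[:k]
        preds.append(np.bincount(y_train[nearest]).argmax())
    return np.array(preds)

def kfold_accuracy(X, y, folds=10, k=3, seed=0):
    """Mean accuracy over a K-fold cross-validation."""
    idx = np.random.default_rng(seed).permutation(len(X))
    scores = []
    for test_idx in np.array_split(idx, folds):
        mask = np.ones(len(X), dtype=bool)
        mask[test_idx] = False
        y_pred = knn_predict(X[mask], y[mask], X[~mask], k)
        scores.append(float((y_pred == y[~mask]).mean()))
    return float(np.mean(scores))

# Synthetic (temperature, smoke, flame) readings: 'fire' rows shifted up.
rng = np.random.default_rng(1)
no_fire = rng.normal([25.0, 0.1, 0.0], 2.0, size=(100, 3))
fire = rng.normal([70.0, 0.8, 1.0], 2.0, size=(100, 3))
X = pca_project(min_max(np.vstack([no_fire, fire])), 2)
y = np.array([0] * 100 + [1] * 100)
acc = kfold_accuracy(X, y, folds=10, k=3)
```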
Procedia PDF Downloads 182
1826 Motor Controller Implementation Using Model Based Design
Authors: Cau Tran, Tu Nguyen, Tien Pham
Abstract:
Model-based design (MBD) is a mathematical and visual technique for addressing design issues in the fields of communications, signal processing, and complex control systems, and it is utilized in many automotive, aerospace, industrial, and motion control applications. With model-based design, virtual models are at the center of the software development process, including the creation of embedded software. In this study, the LAT (limited angle torque) motor is modeled in a simulation environment, and the LAT motor control is designed with a cascade structure comprising a speed and a current control loop, each governed by a controller with a PID structure. Based on traditional design principles and on motor parameters that match the design goals, the PID controllers are created for the model. The MBD approach is then used to build the embedded software for motor control. The paper is divided into three sections: the first introduces the design process and the benefits and drawbacks of the MBD technique; the second covers the design of control software for LAT motors; the last presents the experimental results.
Keywords: model based design, limited angle torque, intellectual property core, hardware description language, controller area network, user datagram protocol
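The PID structure used in the cascade loops can be illustrated with a minimal discrete-time sketch (not the MBD-generated code; the gains, sample time, and first-order motor model below are assumptions chosen only to show the loop closing):

```python
class PID:
    """Discrete PID controller with a fixed sample time dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Closed-loop check against an assumed first-order speed model
# dv/dt = (u - v) / tau, integrated with the same sample time.
pid = PID(kp=2.0, ki=5.0, kd=0.0, dt=0.001)
tau, v, setpoint = 0.05, 0.0, 1.0
for _ in range(5000):                # 5 s of simulated time
    u = pid.step(setpoint, v)
    v += (u - v) / tau * 0.001
```

The integral term removes the steady-state offset a pure proportional loop would leave; in an MBD flow the same structure would be a block-diagram subsystem from which the embedded code is generated.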
Procedia PDF Downloads 94
1825 Design of Labview Based DAQ System
Authors: Omar A. A. Shaebi, Matouk M. Elamari, Salaheddin Allid
Abstract:
The Information Computing System of Monitoring (ICSM) for the research reactor of the Tajoura Nuclear Research Centre (TNRC) has been out of service since early 1991. According to the regulations, the computer is necessary to operate the reactor up to its maximum power (10 MW). Funding was secured via the IAEA to develop a modern computer-based data acquisition system to replace the old computer. This paper presents the development of a LabVIEW-based data acquisition system that allows automated measurements using National Instruments hardware and its LabVIEW software. The developed system consists of an SCXI-1001 chassis housing four SCXI-1100 modules, each supporting 32 channels. The chassis is interfaced with the PC using an NI PCI-6023 DAQ card. LabVIEW, developed by National Instruments, is used to run and operate the DAQ system. LabVIEW is a graphical programming environment suited for high-level design; it allows integrating different signal processing components or subsystems within a graphical framework. The results demonstrate the system's capabilities in monitoring variables and in acquiring and saving data, as well as the capability of LabVIEW to control the DAQ.
Keywords: data acquisition, LabVIEW, signal conditioning, National Instruments
Procedia PDF Downloads 494
1824 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm
Authors: Ameur Abdelkader, Abed Bouarfa Hafida
Abstract:
Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and predicted variables: past occurrences are exploited to predict and derive the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis when faced with a large amount of data. In fact, because of its volume, its nature (semi-structured or unstructured), and its variety, it is impossible to analyze big data efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of computation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and a version of the extended algorithm is then defined in order to make it applicable to huge quantities of data.
Keywords: predictive analysis, big data, predictive analysis algorithms, CART algorithm
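The per-node work that dominates CART, and that a big-data extension would parallelize or distribute, is the exhaustive search for the best split; a serial sketch with the Gini impurity (the toy data is illustrative, not from the paper) is:

```python
def gini(labels):
    """Gini impurity of a set of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for c in labels:
        counts[c] = counts.get(c, 0) + 1
    return 1.0 - sum((k / n) ** 2 for k in counts.values())

def best_split(rows, labels):
    """Exhaustive search for the (feature, threshold) pair minimizing the
    weighted Gini impurity of the two children -- the step a distributed
    CART would compute per data partition and then merge."""
    n, n_features = len(rows), len(rows[0])
    best = (None, None, float("inf"))
    for f in range(n_features):
        for threshold in sorted({r[f] for r in rows}):
            left = [labels[i] for i in range(n) if rows[i][f] <= threshold]
            right = [labels[i] for i in range(n) if rows[i][f] > threshold]
            score = (len(left) * gini(left) + len(right) * gini(right)) / n
            if score < best[2]:
                best = (f, threshold, score)
    return best

# Toy data: the second feature separates the classes perfectly at 0.2.
rows = [[1.0, 0.1], [2.0, 0.2], [1.5, 0.9], [0.5, 0.8]]
labels = [0, 0, 1, 1]
feature, threshold, score = best_split(rows, labels)
```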
Procedia PDF Downloads 142
1823 Object Detection Based on Plane Segmentation and Features Matching for a Service Robot
Authors: António J. R. Neves, Rui Garcia, Paulo Dias, Alina Trifan
Abstract:
With the aging of the world population and the continuous growth in technology, service robots are more and more explored nowadays as alternatives to healthcare givers or personal assistants for elderly or disabled people. Any service robot should be capable of interacting with its human companion, receiving commands, navigating through the environment, either known or unknown, and recognizing objects. This paper proposes an approach for object recognition based on the use of depth information and color images for a service robot. We present a study of two of the most used methods for object detection, where 3D data is used to detect the position of the objects to be classified, which are found on horizontal surfaces. Since most of the objects of interest accessible to service robots lie on these surfaces, the proposed 3D segmentation reduces the processing time and simplifies the scene for object recognition. The first approach to object recognition is based on color histograms, while the second is based on the SIFT and SURF feature descriptors. We present comparative experimental results obtained with a real service robot.
Keywords: object detection, feature descriptors, SIFT, SURF, depth images, service robots
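The first, histogram-based approach can be sketched in pure NumPy (a minimal stand-in for the robot's pipeline; the bin count and synthetic object crops are assumptions):

```python
import numpy as np

def color_histogram(image, bins=8):
    """Normalized joint RGB histogram of an HxWx3 uint8 image."""
    hist, _ = np.histogramdd(
        image.reshape(-1, 3).astype(float),
        bins=(bins,) * 3,
        range=((0, 256),) * 3,
    )
    return hist.ravel() / hist.sum()

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; 1 means identical color distributions."""
    return float(np.minimum(h1, h2).sum())

# Two synthetic object crops: a mostly-red one and a mostly-blue one.
red = np.zeros((16, 16, 3), dtype=np.uint8)
red[..., 0] = 200
blue = np.zeros((16, 16, 3), dtype=np.uint8)
blue[..., 2] = 200
h_red, h_blue = color_histogram(red), color_histogram(blue)
```

A recognizer would compare the histogram of each segmented object against stored model histograms and pick the highest intersection.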
Procedia PDF Downloads 546
1822 Climate Change and the Role of Foreign-Invested Enterprises
Authors: Xuemei Jiang, Kunfu Zhu, Shouyang Wang
Abstract:
In this paper, we take China as a case study and employ a time series of unique input-output tables distinguishing firm ownership and processing exports to evaluate the role of foreign-invested enterprises (FIEs) in China's rapid growth in carbon dioxide emissions. The results suggest that FIEs contributed 11.55% of the growth in China's economic output between 1992 and 2010, but accounted for only 9.65% of the growth in carbon dioxide emissions. In relative terms, until 2010 FIEs still emitted much less than Chinese-owned enterprises (COEs) when producing the same amount of output, although COEs experienced much faster technology upgrades. In an ideal scenario in which final demands remain unchanged and COEs completely mirror the advanced technologies of FIEs, more than 2000 Mt of carbon dioxide emissions would have been avoided in China in 2010. From a policy perspective, widespread FIEs are a very effective and efficient channel for encouraging technology transfer from developed to developing countries.
Keywords: carbon dioxide emissions, foreign-invested enterprises, technology transfer, input-output analysis, China
Procedia PDF Downloads 398
1821 Hydrologic Balance and Surface Water Resources of the Cheliff-Zahrez Basin
Authors: Mehaiguene Madjid, Touhari Fadhila, Meddi Mohamed
Abstract:
The Cheliff basin offers a good hydrological case study, since several aspects of its behaviour and of its hydraulic installations remain unclear and will need to be elucidated in the future. Our study of the Cheliff basin is divided into two principal parts. The first is the spatial evaluation of precipitation: understanding how the water resource is replenished presupposes a good knowledge of the structure of the precipitation fields over the studied area. With the goal of a sound knowledge of the water resources and their integrated management, it was judged necessary to establish a precipitation map of the Cheliff basin, both for a good understanding of the evolution of the water resource in the basin and to serve as a basis for any hydraulic planning study there. The precipitation map thus answers a direct need to place at researchers' disposal a reference document for the region, one that will subsequently be completed and updated. The second part is the hydrological study: based on statistical processing of the hydrometric data, it leads us to specify the terms of the hydrological balance and to clarify the fundamental aspects of the annual, seasonal, and extreme flows, and thus of their variability, as well as of the surface water resources.
Keywords: hydrological assessment, surface water resources, Cheliff, Algeria
Procedia PDF Downloads 304
1820 Stability Analysis and Controller Design of Further Development of Miniaturized Mössbauer Spectrometer II for Space Applications with Focus on the Extended Lyapunov Method – Part I –
Authors: Mohammad Beyki, Justus Pawlak, Robert Patzke, Franz Renz
Abstract:
In the context of planetary exploration, the MIMOS II (miniaturized Mössbauer spectrometer) serves as a proven and reliable measuring instrument. The transmission behaviour of the electronics in the Mössbauer spectroscopy is newly developed and optimized. For this purpose, the overall electronics is split into three parts. This paper deals exclusively with the first part of the signal chain, for the evaluation of photons in experiments with gamma radiation. In parallel with the analysis of the electronics, a new method for the stability analysis of linear and non-linear systems is presented: the extended method of Lyapunov's stability criteria. The design helps to weigh advantages and disadvantages against other simulated circuits in order to optimize MIMOS II for terrestrial and extraterrestrial measurements. Finally, after the stability analysis, the controller design according to Ackermann is performed, achieving the best possible optimization of the output variable through a skillful pole assignment.
Keywords: Mössbauer spectroscopy, electronic signal amplifier, light processing technology, photocurrent, trans-impedance amplifier, extended Lyapunov method
Procedia PDF Downloads 99
1819 LGG Architecture for Brain Tumor Segmentation Using Convolutional Neural Network
Authors: Sajeeha Ansar, Asad Ali Safi, Sheikh Ziauddin, Ahmad R. Shahid, Faraz Ahsan
Abstract:
The most aggressive form of brain tumor is called glioma, a kind of tumor that arises from the glial tissue of the brain and occurs quite often. A fully automatic 2D-CNN model for brain tumor segmentation is presented in this paper. We performed pre-processing steps to remove noise and intensity variances using N4ITK and standard intensity correction, respectively. We used the Keras open-source library with Theano as backend for fast implementation of the CNN model, and the BRATS 2015 MRI dataset to evaluate it. Furthermore, we used the SimpleITK open-source library in our proposed model to analyze images. We extracted random 2D patches for the proposed 2D-CNN model for efficient brain segmentation; extracting 2D patches instead of 3D reduces the dimensional information to be processed, which helps in reducing computational time. The Dice Similarity Coefficient (DSC) is used as the performance measure for the evaluation of the proposed method. Our method achieved DSC scores of 0.77 for the complete, 0.76 for the core, and 0.77 for the enhanced tumor regions. These results are comparable with those of methods based on already implemented 2D CNN architectures.
Keywords: brain tumor segmentation, convolutional neural networks, deep learning, LGG
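The Dice Similarity Coefficient used as the performance measure is computed directly from binary masks; a small sketch (with tiny synthetic masks, not BRATS data) is:

```python
import numpy as np

def dice_coefficient(prediction, ground_truth):
    """DSC = 2|A intersect B| / (|A| + |B|) for two binary masks."""
    a = np.asarray(prediction, dtype=bool)
    b = np.asarray(ground_truth, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:              # both masks empty: perfect agreement
        return 1.0
    return float(2.0 * np.logical_and(a, b).sum() / denom)

# Two overlapping 2D masks: 1 voxel in common, 2 voxels in each mask,
# so DSC = 2*1 / (2 + 2) = 0.5.
pred = np.array([[1, 1, 0],
                 [0, 0, 0]])
truth = np.array([[1, 0, 0],
                  [1, 0, 0]])
score = dice_coefficient(pred, truth)
```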
Procedia PDF Downloads 182
1818 An Online Adaptive Thresholding Method to Classify Google Trends Data Anomalies for Investor Sentiment Analysis
Authors: Duygu Dere, Mert Ergeneci, Kaan Gokcesu
Abstract:
Google Trends data has gained increasing popularity in applications of behavioral finance, decision science, and risk management. Because of Google's wide range of use, the Trends statistics provide significant information about investor sentiment and intention, which can serve as decisive factors in corporate and risk management. However, an anomaly, i.e., a significant increase or decrease in a certain query, cannot be detected by state-of-the-art computational applications because of the random baseline noise of the Trends data, which is modelled as additive white Gaussian noise (AWGN). Since the baseline noise power changes gradually over time, an adaptive thresholding method is required to track and learn the baseline noise for a correct classification. To this end, we introduce an online method to classify meaningful deviations in Google Trends data. Through extensive experiments, we demonstrate that our method can successfully classify various anomalies for plenty of different data.
Keywords: adaptive data processing, behavioral finance, convex optimization, online learning, soft minimum thresholding
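To make the idea concrete, a much-simplified online baseline tracker under the AWGN assumption (not the authors' soft-minimum-thresholding method; the forgetting factor and threshold multiple are illustrative assumptions) could look like:

```python
import random

class OnlineAnomalyDetector:
    """Tracks the baseline mean and noise power with exponential
    forgetting and flags samples further than k standard deviations
    from the running mean. A simplified stand-in for the paper's
    adaptive thresholding method."""

    def __init__(self, alpha=0.05, k=4.0, mean=0.0, var=1.0):
        self.alpha, self.k = alpha, k
        self.mean, self.var = mean, var

    def update(self, x):
        deviation = abs(x - self.mean)
        is_anomaly = deviation > self.k * self.var ** 0.5
        if not is_anomaly:  # learn the baseline only from normal samples
            self.mean += self.alpha * (x - self.mean)
            self.var += self.alpha * ((x - self.mean) ** 2 - self.var)
        return is_anomaly

# Feed Gaussian baseline noise, then a sudden jump in query volume.
random.seed(0)
detector = OnlineAnomalyDetector()
flags = [detector.update(random.gauss(0.0, 1.0)) for _ in range(500)]
spike_flagged = detector.update(50.0)
```

Because the mean and variance estimates keep adapting, the threshold follows gradual drifts in the baseline noise power while still flagging abrupt deviations.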
Procedia PDF Downloads 167
1817 Cost Sensitive Feature Selection in Decision-Theoretic Rough Set Models for Customer Churn Prediction: The Case of Telecommunication Sector Customers
Authors: Emel Kızılkaya Aydogan, Mihrimah Ozmen, Yılmaz Delice
Abstract:
The telecommunications sector in the global market is undergoing change and continuous development. In this sector, churn analysis techniques are commonly used for analysing why some customers terminate their service subscriptions prematurely. Customer churn is of the utmost significance in this sector since it causes important business losses, and many companies carry out various studies in order to prevent losses while increasing customer loyalty. Although a large quantity of accumulated data is available in this sector, its usefulness is limited by data quality and relevance. In this paper, a cost-sensitive feature selection framework is developed with the aim of obtaining the feature reducts needed to predict customer churn. The framework is a cost-based, optional pre-processing stage that removes redundant features for churn management. The cost-based feature selection algorithm is applied in a telecommunication company in Turkey, and the results obtained with it are reported.
Keywords: churn prediction, data mining, decision-theoretic rough set, feature selection
Procedia PDF Downloads 446
1816 Surface Quality Improvement of Abrasive Waterjet Cutting for Spacecraft Structure
Authors: Tarek M. Ahmed, Ahmed S. El Mesalamy, Amro M. Youssef, Tawfik T. El Midany
Abstract:
Abrasive waterjet (AWJ) machining is considered one of the most powerful cutting processes; it can be used for cutting heat-sensitive, hard, and reflective materials. Aluminum 2024 is a high-strength alloy which is widely used in the aerospace and aviation industries. This paper aims to improve the AWJ cutting of this aluminum alloy and to investigate the effect of the AWJ control parameters on surface geometry quality. Design of experiments (DoE) is used to establish an experimental matrix, and statistical modeling is used to express the relation between the cutting parameters (pressure, speed, and standoff distance between the nozzle and the cut surface) and the responses (taper angle and surface roughness). The results reveal a tangible improvement in productivity by using AWJ processing. The kerf taper angle can be improved by decreasing the standoff distance and the speed and by increasing the water pressure, while decreasing the cutting speed, pressure, and standoff distance improves the surface roughness within the operating window of the cutting parameters.
Keywords: abrasive waterjet machining, machining of aluminum alloy, non-traditional cutting, statistical modeling
Procedia PDF Downloads 250
1815 SIF Computation of Cracked Plate by FEM
Authors: Sari Elkahina, Zergoug Mourad, Benachenhou Kamel
Abstract:
The main purpose of this paper is to compare computations of the stress intensity factor (SIF) for a cracked thin plate of the aluminum alloys 7075-T6 and 2024-T3, used in aeronautical structures, under uniaxial loading. The evaluation is based on the finite element method with a virtual power principle, through two techniques: displacement extrapolation and the G-θ method. The first consists of extrapolating the nodal displacements near the crack tip using a refined triangular mesh with T3 and T6 special elements, while the second consists of determining the energy release rate G through the G-θ method by differentiating the potential energy, which corresponds numerically to post-processing the elastic solution of a cracked solid via a contour integral computed with Gauss points. The SIF results obtained from the extrapolation and G-θ methods are compared to an analytical solution in a particular case. To illustrate the influence of the kind of mesh and of the size and position of the integration contour, simulations are presented and analyzed.
Keywords: crack tip, SIF, finite element method, concentration technique, displacement extrapolation, aluminum alloy 7075-T6 and 2024-T3, energy release rate G, G-θ method, Gauss point numerical integration
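The displacement-extrapolation technique can be sketched as follows, using the standard plane-strain crack-face displacement field (the material values and synthetic nodal data are illustrative assumptions, not the paper's FEM results):

```python
import numpy as np

def k1_from_crack_face_displacements(r, u_y, E, nu, plane_strain=True):
    """Mode-I SIF by displacement extrapolation: from the near-tip
    crack-face opening u_y = (kappa+1)/(2*mu) * K_I * sqrt(r/(2*pi)),
    evaluate K(r) = 2*mu/(kappa+1) * sqrt(2*pi/r) * u_y(r) at nodes
    behind the tip, fit K(r) linearly, and extrapolate to r -> 0."""
    mu = E / (2.0 * (1.0 + nu))
    kappa = 3.0 - 4.0 * nu if plane_strain else (3.0 - nu) / (1.0 + nu)
    r, u_y = np.asarray(r, float), np.asarray(u_y, float)
    K = 2.0 * mu / (kappa + 1.0) * np.sqrt(2.0 * np.pi / r) * u_y
    slope, intercept = np.polyfit(r, K, 1)
    return intercept             # extrapolated K_I at r = 0

# Synthetic check: displacements generated from a known K_I are recovered.
E, nu, K_true = 71.7e9, 0.33, 25e6     # Pa, -, Pa*sqrt(m) (assumed values)
mu = E / (2.0 * (1.0 + nu))
kappa = 3.0 - 4.0 * nu
r = np.linspace(1e-4, 1e-3, 8)         # nodal distances behind the tip
u_y = (kappa + 1.0) / (2.0 * mu) * K_true * np.sqrt(r / (2.0 * np.pi))
K_est = k1_from_crack_face_displacements(r, u_y, E, nu)
```

In a real FEM post-processing step, the u_y values would come from the quarter-point or refined nodes behind the crack tip rather than from the analytical field.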
Procedia PDF Downloads 337
1814 Computing Continuous Skyline Queries without Discriminating between Static and Dynamic Attributes
Authors: Ibrahim Gomaa, Hoda M. O. Mokhtar
Abstract:
Although most existing skyline query algorithms have focused on querying static points in static databases, the demand for continuous skyline queries has increased with the expanding number of sensors, wireless communications, and mobile applications. Unlike traditional skyline queries, which consider only static attributes, continuous skyline queries include dynamic attributes as well as static ones. However, as skyline query computation is based on checking the domination of skyline points over all dimensions, both the static and dynamic attributes must be considered without separation. In this paper, we present an efficient algorithm for computing continuous skyline queries without discriminating between static and dynamic attributes. In brief, our algorithm proceeds as follows. First, it excludes the points that cannot be in the initial skyline result; this pruning phase reduces the required number of comparisons. Second, the association between the spatial positions of data points is examined; this phase gives an idea of where changes in the result might occur and consequently enables us to efficiently update the skyline result (continuous update) rather than computing the skyline from scratch. Finally, an experimental evaluation is provided which demonstrates the accuracy, performance, and efficiency of our algorithm over other existing approaches.
Keywords: continuous query processing, dynamic database, moving object, skyline queries
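The dominance check that underlies any skyline computation can be stated in a few lines; this naive quadratic sketch (minimization over all dimensions, toy data assumed) is the baseline that the paper's pruning and continuous-update phases improve on:

```python
def dominates(p, q):
    """p dominates q if p is no worse in every dimension (here: smaller
    is better) and strictly better in at least one."""
    return (all(a <= b for a, b in zip(p, q))
            and any(a < b for a, b in zip(p, q)))

def skyline(points):
    """Naive O(n^2) skyline: keep the points no other point dominates."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Example: hotels as (price, distance); (1, 1) dominates all the others.
hotels = [(1, 2), (2, 1), (3, 3), (1, 1)]
result = skyline(hotels)
```

In the continuous setting, some coordinates of each point change over time, so the result must be re-validated incrementally rather than recomputed from scratch.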
Procedia PDF Downloads 210
1813 Enhancing the Recruitment Process through Machine Learning: An Automated CV Screening System
Authors: Kaoutar Ben Azzou, Hanaa Talei
Abstract:
Human resources is an important department in every organization, as it manages the life cycle of employees from recruitment and training to retirement or termination of contracts. The recruitment process starts with a job opening, followed by a selection of the best-fit candidates from all applicants. Matching the best profile to a job position requires manually looking through many CVs, which takes hours of work and can sometimes lead to choosing a profile that is not the best. The work presented in this paper aims at reducing the workload of HR personnel by automating the preliminary stages of the candidate screening process, thereby fostering a more streamlined recruitment workflow. It introduces an automated system designed to help with the recruitment process by scanning candidates' CVs, extracting pertinent features, and employing machine learning algorithms to decide the most fitting job profile for each candidate. Our work employs natural language processing (NLP) techniques to identify and extract key features, such as education, work experience, and skills, from the unstructured text extracted from a CV. Subsequently, the system utilizes these features to match candidates with job profiles, leveraging the power of classification algorithms.
Keywords: automated recruitment, candidate screening, machine learning, human resources management
Procedia PDF Downloads 56
1812 Words of Peace in the Speeches of the Egyptian President, Abdulfattah El-Sisi: A Corpus-Based Study
Authors: Mohamed S. Negm, Waleed S. Mandour
Abstract:
The present study aims primarily at investigating words of peace (lexemes of peace) in the formal speeches of the Egyptian president Abdulfattah El-Sisi over a two-year span, from 2018 to 2019. This paper attempts not only to shed light on the contextual use of the antonyms war and peace, but also to underpin the qualitative reading with quantitative analysis through current methods of corpus linguistics. As such, the researchers have deployed a corpus-based approach in collecting, encoding, and processing 30 presidential speeches over the stated period (23,411 words and 25,541 tokens in total). Further, semantic fields and collocational networks are identified and compared statistically. The results show a significant propensity for adopting peace, including its relevant collocation network, textually and therefore ideationally, at the expense of the concept of war, which in most cases surfaces euphemistically through the noun conflict. The president has not justified the action of war with an honorable cause or a valid reason. Such results indicate a positive sociopolitical mindset on the part of the Egyptian president and, moreover, reveal national and international fair dealing on arising issues.
Keywords: CADS, collocation network, corpus linguistics, critical discourse analysis
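The collocation counts behind such a network can be gathered with a simple fixed-window pass over the tokenized corpus (a generic sketch; the toy sentence below is not from the actual corpus):

```python
from collections import Counter

def collocates(tokens, node, window=2):
    """Count tokens co-occurring with `node` within +/- `window`
    positions, the raw frequencies a collocation network is built from."""
    counts = Counter()
    for i, token in enumerate(tokens):
        if token != node:
            continue
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                counts[tokens[j]] += 1
    return counts

tokens = "we seek peace and we build lasting peace together".split()
peace_collocates = collocates(tokens, "peace", window=1)
```

Association measures such as MI or log-likelihood would then be computed from these window counts and the overall corpus frequencies.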
Procedia PDF Downloads 155
1811 Brain Computer Interface Implementation for Affective Computing Sensing: Classifiers Comparison
Authors: Ramón Aparicio-García, Gustavo Juárez Gracia, Jesús Álvarez Cedillo
Abstract:
One research line of computer science involves the study of human-computer interaction (HCI), which seeks to recognize and interpret user intent through the storage and subsequent analysis of the electrical signals of the brain, for use in the control of electronic devices. In parallel, affective computing research applies human emotions to the HCI process, helping to reduce user frustration. This paper shows the results obtained during the hardware and software development of a brain-computer interface (BCI) capable of recognizing human emotions through the association of patterns in the electrical activity of the brain. The hardware involves the sensing stage and analog-to-digital conversion. The interface software involves algorithms for pre-processing of the signal, for analysis in time and frequency, and for the classification of patterns associated with the electrical brain activity. The methods used for the analysis and classification of the signal were tested separately, using a publicly accessible database, and a comparison among classifiers was carried out in order to determine the best-performing one.
Keywords: affective computing, interface, brain, intelligent interaction
Procedia PDF Downloads 388
1810 Improved Super-Resolution Using Deep Denoising Convolutional Neural Network
Authors: Pawan Kumar Mishra, Ganesh Singh Bisht
Abstract:
Super-resolution is a computer-vision technique for constructing a high-resolution image from a single low-resolution image. It is used to increase the frequency content, recover lost details, and remove the down-sampling artifacts and noise introduced by the camera during image acquisition. High-resolution images and videos are a desired part of all image-processing tasks and their analysis in most digital-imaging applications. The goal of super-resolution is to combine the non-redundant information in one or more low-resolution frames to generate a high-resolution image. Many methods have been proposed in which multiple low-resolution images of the same scene, differing by some transformation, are used; this family is called multi-image super-resolution. Another family of methods, single-image super-resolution, tries to learn the redundancy present in an image and reconstruct the lost information from a single low-resolution input. Deep learning is currently one of the state-of-the-art approaches to reconstructing high-resolution images. In this research, we propose Deep Denoising Super Resolution (DDSR), a deep neural network that effectively reconstructs a high-resolution image from a low-resolution one.
Keywords: resolution, deep-learning, neural network, de-blurring
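Super-resolution results such as these are conventionally evaluated with the peak signal-to-noise ratio (PSNR) between the reconstruction and the ground-truth high-resolution image. A minimal sketch of that metric follows; the two tiny "images" are illustrative, not data from the paper.

```python
import math

def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio between two equal-size grayscale images
    (given as lists of rows) -- the standard fidelity metric in SR work."""
    mse, n = 0.0, 0
    for r_row, t_row in zip(ref, test):
        for r, t in zip(r_row, t_row):
            mse += (r - t) ** 2
            n += 1
    mse /= n
    return float("inf") if mse == 0 else 10 * math.log10(peak * peak / mse)

a = [[10, 20], [30, 40]]   # ground truth
b = [[10, 20], [30, 44]]   # reconstruction, one pixel off by 4 -> MSE = 4
print(round(psnr(a, b), 2))  # -> 42.11
```

Higher PSNR means a more faithful reconstruction; identical images give infinite PSNR.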
Procedia PDF Downloads 517
1809 Prioritizing the Factors Effective on Decreasing the Rate of Accidents on Freeways in Iran between 2013-2015
Authors: Mansour Hadji Hosseinlou, Alireza Mahdavi
Abstract:
Transportation is one of any society's needs; it develops as a society advances economically and socially, and today it stands as a symbol of civilization. Yet for all its usefulness, it also causes serious harm and injury. The expansion of communication systems and the construction of new roads have increased the rate of accidents, and in practice this rising rate has eroded the advantages of transportation. Traffic accidents are a leading cause of death and of serious financial and bodily harm, and their significant social, economic, and cultural consequences seriously threaten societies. Iran's ground transportation system is among the most accident-prone in the world, and the resulting mortality and financial losses impose heavy costs on the country. We therefore compiled a data collection from the recorded statistics of accidents that occurred on freeways from 2013 to 2015. These statistics are held in the relevant databases, chiefly those of the police and the road transportation authority. The data were separated and arranged in tables, and after preparation, processing, and prioritization of the contributing factors, the resulting collection is presented to departments, managers, and researchers to help them propose practical solutions.
Keywords: freeways’ accidents, human causes, death, tiredness, drowsiness
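The prioritization step described above amounts to ranking contributing factors by how often they appear in the accident records. A minimal sketch of that frequency-based ranking follows; the records shown are hypothetical, not the study's statistics.

```python
from collections import Counter

# Hypothetical accident records, each listing its contributing factors
# (illustrative data only -- not the 2013-2015 freeway statistics).
records = [
    ["drowsiness", "speeding"],
    ["drowsiness"],
    ["tire failure", "speeding"],
    ["drowsiness", "distraction"],
]

factor_counts = Counter(f for rec in records for f in rec)
priority = [factor for factor, _ in factor_counts.most_common()]
print(priority[0])  # the most frequent factor ranks first
```

A fuller analysis would weight the ranking by accident severity as well as frequency.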
Procedia PDF Downloads 193
1808 Religiosity and Social Factors on Alcohol Use among South African University Students
Authors: Godswill Nwabuisi Osuafor, Sonto Maria Maputle
Abstract:
Background: Numerous studies have found that religiosity and social factors modulate alcohol use among university students. However, empirical studies examining the protective effects of religiosity and other social factors on alcohol use and abuse in South African universities are scarce. The aim of this study was therefore to assess the protective effects of religiosity and the roles of social factors in alcohol use among university students. Methodology: A survey on alcohol use among 416 university students was conducted in 2014 using a structured questionnaire. Data were sourced on religiosity and contextual variables. Students were classified as practicing intrinsic or extrinsic religiosity based on their responses to the religiosity measures. Descriptive, chi-square, and binary logistic analyses were used in processing the data. Results: Alcohol use was associated with religiosity, religion, sex, family history of alcohol use, and experimenting with alcohol. Reported alcohol abuse was significantly predicted by sex, family history of alcohol use, and experimenting with alcohol. Religiosity mediated lower alcohol use, whereas a family history of alcohol use and experimenting with alcohol promoted alcohol use and abuse. Conclusion: Families, religious groups, and societal factors may be the specific niches for intervention on alcohol use among university students.
Keywords: religiosity, alcohol use, protective factors, university students
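Association analyses like the chi-square and logistic models above are often summarized with an odds ratio from a 2x2 table of exposure versus outcome. A minimal sketch follows; the counts are hypothetical, not the paper's data.

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
         exposed:   a cases, b non-cases
         unexposed: c cases, d non-cases
    OR > 1 means the exposure is associated with the outcome."""
    return (a * d) / (b * c)

# Hypothetical counts: alcohol use among students with vs. without a
# family history of alcohol use (illustrative only).
or_family = odds_ratio(60, 40, 30, 70)
print(round(or_family, 2))  # -> 3.5
```

A binary logistic regression generalizes this to several predictors at once, reporting one adjusted odds ratio per factor.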
Procedia PDF Downloads 397
1807 Thermal and Mechanical Properties of Powder Injection Molded Alumina Nano-Powder
Authors: Mostafa Rezaee Saraji, Ali Keshavarz Panahi
Abstract:
In this work, the processing steps for producing alumina parts by powder injection molding (PIM) with a nano-powder were investigated, and the thermal conductivity and flexural strength of the samples were determined as functions of sintering temperature and holding time. In the first step, a feedstock with 58 vol.% alumina nano-powder (average particle size 100 nm) was prepared by the Extrumixing method to obtain appropriate homogeneity. This feedstock was injection molded into a two-cavity mold of rectangular shape. After the injection molding step, thermal and solvent debinding methods were used to debind the molded samples, which were then sintered at different temperatures and holding times. The results show that the flexural strength and thermal conductivity of the samples increased with sintering temperature and holding time; at a sintering temperature of 1600 ºC and a holding time of 5 h, the flexural strength and thermal conductivity of the sintered samples reached maximum values of 488 MPa and 40.8 W/mK, respectively.
Keywords: alumina nano-powder, thermal conductivity, flexural strength, powder injection molding
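Flexural strength of rectangular sintered bars like these is typically obtained from a three-point bend test via sigma = 3FL / (2bd^2). A minimal sketch of that calculation follows; the load and specimen dimensions are assumed for illustration (the abstract reports only the resulting 488 MPa).

```python
def flexural_strength(force_n, span_mm, width_mm, thickness_mm):
    """Three-point-bend flexural strength, sigma = 3FL / (2 b d^2),
    in MPa when force is in N and lengths are in mm (N/mm^2 = MPa)."""
    return 3 * force_n * span_mm / (2 * width_mm * thickness_mm ** 2)

# Hypothetical test: 390 N failure load on a 30 mm span,
# 4 mm x 3 mm cross-section (illustrative values only).
print(round(flexural_strength(390, 30, 4, 3), 1))  # -> 487.5 MPa
```

With these assumed dimensions the computed value lands near the paper's reported 488 MPa maximum, showing the order of magnitude involved.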
Procedia PDF Downloads 329
1806 Insulation and Architectural Design to Have Sustainable Buildings in Iran
Authors: Ali Bayati, Jamileh Azarnoush
Abstract:
Nowadays, as the world's population grows, the consumption of fossil fuels has increased dramatically, and many believe that most atmospheric pollution comes from burning them. Managing the flow of natural resources into cities is one of the great challenges of resource-consumption management. Reducing civil energy consumption in megacities plays a key role in addressing serious problems such as air pollution, greenhouse-gas emissions, global warming, and damage to the ozone layer. In the construction industry, we should use materials that require the least energy to produce and transport, and that demand the least energy and expense to recycle. Here, the type of material, the way it is processed, the use of regional materials, and adaptation to the environment are critical; in addition, insulation should be considered and applied with the long term in mind. Accordingly, this article investigates new ways to reduce environmental pollution and save more energy: using materials that are not harmful to the environment, fully insulating buildings, designing sustainable and diversified buildings, adopting suitable urban design, and using solar energy more efficiently to reduce energy consumption.
Keywords: building design, construction masonry, insulation, sustainable construction
Procedia PDF Downloads 540
1805 Investigating the Vehicle-Bicyclists Conflicts using LIDAR Sensor Technology at Signalized Intersections
Authors: Alireza Ansariyar, Mansoureh Jeihani
Abstract:
Light Detection and Ranging (LiDAR) sensors can record traffic data including the number of passing vehicles and bicyclists, their speeds, and the number of conflicts between the two road-user groups. To collect real-time traffic data and investigate the safety of different road users, a LiDAR sensor was installed at the Cold Spring Ln – Hillen Rd intersection in Baltimore City. The frequency and severity of the collected real-time conflicts were analyzed, and the results show that 122 conflicts were recorded over the 10-month interval from May 2022 to February 2023. Using an innovative image-processing algorithm, a new safety Measure of Effectiveness (MOE) was proposed to recognize the critical zones for bicyclists entering each zone. Considering the trajectories of the conflicts, the analysis demonstrates that conflicts in the northern approach (zone N) are more frequent and severe, and that severe vehicle-bike conflicts are more likely in sunny weather.
Keywords: LiDAR sensor, post encroachment time threshold (PET), vehicle-bike conflicts, measure of effectiveness (MOE), weather condition
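The post-encroachment-time (PET) threshold named in the keywords is the usual surrogate-safety measure behind conflict counts like these: PET is the gap between the first road user leaving the conflict point and the second arriving, and an interaction is flagged as a conflict when PET falls below a threshold. A minimal sketch follows; the timestamps and the 5-second threshold are illustrative assumptions, not values from the paper.

```python
def post_encroachment_time(t_first_exit, t_second_entry):
    """PET (seconds): arrival time of the second road user at the conflict
    point minus the departure time of the first. Smaller PET = more severe."""
    return t_second_entry - t_first_exit

def classify(pet, threshold=5.0):
    """Flag an interaction as a conflict when PET is below the threshold
    (the 5 s threshold here is an assumption for illustration)."""
    return "conflict" if pet < threshold else "no conflict"

# A vehicle clears the crossing at t=12.4 s; a bicyclist reaches it at t=14.1 s.
print(classify(post_encroachment_time(12.4, 14.1)))  # PET = 1.7 s -> "conflict"
```

Severity grading typically bins PET further (e.g. smaller PET values mapped to more severe conflict classes).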
Procedia PDF Downloads 236
1804 The Effect of the Internal Organization Communications' Effectiveness through Employee's Performance of Faculty of Management Science, Suan Sunandha Rajabhat University
Authors: Malaiphan Pansap, Surasit Vithayarat
Abstract:
The purpose of this study was to examine the relationship between the effectiveness of internal organizational communication and employee performance in the Faculty of Management Science, Suan Sunandha Rajabhat University, and to explore solutions to communication problems within the organization. A questionnaire was used to collect information from 136 staff members and instructors, and the data were analyzed with statistical software using frequency, percentage, mean, and standard deviation. The results show that the aspects of organizational communication affecting employee performance are senders who lack the speaking and writing skills needed to prepare audiences to receive a message, and messages about which the organization does not always keep employees informed. The employees believe that good organizational communication has a positive impact on the development of the organization: because they feel involved and part of the organization, they cooperate in working toward its goals, move in the same direction, and reach those goals quickly.
Keywords: employee’s performance, faculty of management science, internal organization communications’ effectiveness, management accounting, Suan Sunandha Rajabhat University
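The descriptive statistics named above (mean and standard deviation per questionnaire item) are straightforward to compute. A minimal sketch follows; the Likert-scale responses shown are hypothetical, not the study's data.

```python
import statistics

# Hypothetical 5-point Likert responses to one questionnaire item
# (illustrative only -- not the survey of 136 respondents).
responses = [4, 5, 3, 4, 4, 2, 5, 3]

print(round(statistics.mean(responses), 2))   # item mean -> 3.75
print(round(statistics.stdev(responses), 2))  # sample standard deviation -> 1.04
```

Frequencies and percentages per response level would complete the descriptive summary reported in the paper.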
Procedia PDF Downloads 239
1803 Early Recognition and Grading of Cataract Using a Combined Log Gabor/Discrete Wavelet Transform with ANN and SVM
Authors: Hadeer R. M. Tawfik, Rania A. K. Birry, Amani A. Saad
Abstract:
The eyes are considered the most sensitive and important organs of the human body, so any eye disorder affects the patient in all aspects of life. Cataract is one such disorder, and it leads to blindness if not treated correctly and quickly. This paper demonstrates a model for automatic detection, classification, and grading of cataracts based on image-processing techniques and artificial intelligence. The proposed system is developed to ease the cataract diagnosis process for both ophthalmologists and patients. The wavelet transform combined with the 2D Log Gabor wavelet transform was used for feature extraction on a dataset of 120 eye images, followed by a classification step that assigned each image to one of three classes: normal, early, or advanced stage. The two classifiers, a support vector machine (SVM) and an artificial neural network (ANN), were compared on the same dataset of 120 eye images, and it was concluded that SVM gave better results than ANN: the SVM achieved 96.8% accuracy, whereas the ANN achieved 92.3%.
Keywords: cataract, classification, detection, feature extraction, grading, log-gabor, neural networks, support vector machines, wavelet
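The SVM-vs-ANN comparison above reduces to computing each classifier's accuracy over the same held-out labels. A minimal sketch of that comparison follows; the labels and predictions are hypothetical (the paper's 96.8% and 92.3% came from its real set of 120 images).

```python
def accuracy(y_true, y_pred):
    """Fraction of eye images assigned the correct grade."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical grades over a small hold-out set
# (0 = normal, 1 = early, 2 = advanced); illustrative only.
y_true  = [0, 0, 1, 1, 2, 2, 0, 1, 2, 2]
svm_out = [0, 0, 1, 1, 2, 2, 0, 1, 2, 1]
ann_out = [0, 1, 1, 1, 2, 0, 0, 1, 2, 1]

print(accuracy(y_true, svm_out) > accuracy(y_true, ann_out))
```

A per-class confusion matrix would additionally show which cataract stages each classifier confuses, which matters clinically for the early-stage class.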
Procedia PDF Downloads 332