Search results for: encrypted traffic classification
2054 Rangeland Monitoring by Computerized Technologies
Abstract:
Every piece of rangeland has a different set of physical and biological characteristics. This requires the manager to synthesize various information through regular monitoring in order to define change trends and make the right decisions for sustainable management. Range managers therefore need to use computerized technologies to monitor rangeland and select the best management practices. There are four examples of computerized technologies that can benefit sustainable management: (1) Photographic method for cover measurement: The method was tested in different vegetation communities in semi-humid and arid regions. Interpretation of pictures of quadrats was done using ArcView software. Data analysis was done in SPSS software using a paired t-test. Based on the results, the photographic method can generally be used to measure ground cover in most vegetation communities. (2) GPS application for matching ground samples with satellite pixels: In the two provinces of Tehran and Markazi, six reference points were selected, and at each point eight GPS models were tested. After selection of a suitable method, the coordinates of plots along four transects at each of six rangeland sites in Markazi province were recorded. The best time for GPS application was in the morning hours, the Etrex Vista had less error than the other models, and a significant relation between GPS model, time, and location and the accuracy of the estimated coordinates was found. (3) Application of satellite data for rangeland monitoring: Focusing on the long-term variation of vegetation parameters such as vegetation cover and production is essential. Our study in grasslands and shrublands showed significant correlations between quantitative vegetation characteristics and satellite data, so it is possible to monitor rangeland vegetation using digital data for sustainable utilization.
(4) Rangeland suitability classification with GIS: Range suitability assessment can facilitate sustainable management planning. The outputs of three sub-models (sensitivity to erosion, water suitability, and forage production) were entered into the final range suitability classification model. GIS facilitated the classification of range suitability and produced suitability maps for sheep grazing. Generally, digital computers assist range managers to interpret, modify, calibrate, and integrate information for correct management.
Keywords: computer, GPS, GIS, remote sensing, photographic method, monitoring, rangeland ecosystem, management, suitability, sheep grazing
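As a rough illustration of the paired t-test used above to compare the photographic method against conventional quadrat cover readings (the study itself used SPSS), the following Python sketch runs the same test on hypothetical percent-cover values; the data and the scipy implementation are assumptions, not the authors' material:

```python
# Paired t-test comparing ground-cover estimates from two methods on the
# same plots. Cover values below are hypothetical; the abstract reports
# this analysis in SPSS, scipy is used here only for illustration.
from scipy import stats

quadrat_cover = [42.0, 55.5, 38.2, 61.0, 47.3, 52.8]   # % cover, field method
photo_cover   = [43.1, 54.0, 39.0, 60.2, 48.0, 51.9]   # % cover, photo method

t_stat, p_value = stats.ttest_rel(quadrat_cover, photo_cover)
# A p-value above 0.05 indicates no significant difference between methods,
# which is the evidence used to accept the photographic method.
print(p_value > 0.05)
```

A non-significant result, as in the study, supports substituting the faster photographic method for manual quadrat readings.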
Procedia PDF Downloads 369
2053 Stabilization of Spent Engine Oil Contaminated Lateritic Soil Admixed with Cement Kiln Dust for Use as Road Construction Materials
Authors: Johnson Rotimi Oluremi, A. Adedayo Adegbola, A. Samson Adediran, O. Solomon Oladapo
Abstract:
Spent engine oil contains heavy metals and polycyclic aromatic hydrocarbons which contribute to chronic health hazards, poor soil aeration, immobilisation of nutrients and lowering of pH in soil. It affects the geotechnical properties of lateritic soil, thereby constituting geotechnical and foundation problems. This study is therefore based on the stabilization of spent engine oil (SEO) contaminated lateritic soil using cement kiln dust (CKD) as a means of restoring it to its pristine state. Geotechnical tests, which include sieve analysis, Atterberg limits, compaction, California bearing ratio and unconfined compressive strength tests, were carried out on the natural, SEO-contaminated and CKD-stabilized SEO-contaminated lateritic soil samples. The natural soil, classified as A-2-7 (2) by the AASHTO classification and GC according to the Unified Soil Classification System, changed to an A-4 non-plastic soil due to SEO contamination and remained unchanged even under the influence of CKD. However, the maximum dry density (MDD) of the SEO-contaminated soil increased while the optimum moisture content (OMC) decreased with the increase in the percentage of CKD. Similarly, the bearing strength of the stabilized SEO-contaminated soil, measured by California Bearing Ratio (CBR), increased with percentage increment in CKD. In conclusion, spent engine oil has a detrimental effect on the geotechnical properties of the lateritic soil sample, but this can be remediated using 10% CKD as a stand-alone admixture in stabilizing spent engine oil contaminated soil.
Keywords: spent engine oil, lateritic soil, cement kiln dust, stabilization, compaction, unconfined compressive strength
Procedia PDF Downloads 391
2052 An Approach for Ensuring Data Flow in Freight Delivery and Management Systems
Authors: Aurelija Burinskienė, Dalė Dzemydienė, Arūnas Miliauskas
Abstract:
This research aims at developing an approach for more effective freight delivery and transportation process management. Road congestion and the identification of its causes are important, as are the recognition and management of context information. Measuring many parameters during the transportation period and properly controlling driver work have become a problem. The number of vehicles passing a given point per time unit can be evaluated in some situations. The collected data are mainly used to establish new trips. The flow of data is more complex in urban areas. Here, the movement of freight is reported in detail, including information at street level. When traffic density is extremely high in congestion cases and the traffic speed is very low, data transmission reaches its peak. Different data sets are generated, depending on the type of freight delivery network. There are three types of networks: long-distance delivery networks, last-mile delivery networks and mode-based delivery networks; the last one includes different modes, in particular railways and other networks. When freight delivery is switched from one of the above-stated network types to another, more data could be included for reporting purposes, and vice versa. In this case, a significant amount of these data is used for control operations, and the problem requires an integrated methodological approach. The paper presents an approach for providing e-services for drivers that includes the assessment of the multi-component infrastructure needed for the delivery of freight according to the network type. The construction of such a methodology is required to evaluate data flow conditions and overloads, and to minimize the time gaps in data reporting.
The results obtained show the possibilities of the proposed methodological approach to support management and decision-making processes, with functionality for incorporating networking specifics, by helping to minimize overloads in data reporting.
Keywords: transportation networks, freight delivery, data flow, monitoring, e-services
Procedia PDF Downloads 129
2051 [Keynote Talk]: sEMG Interface Design for Locomotion Identification
Authors: Rohit Gupta, Ravinder Agarwal
Abstract:
Surface electromyographic (sEMG) signals have the potential to identify human activities and intention. This potential is further exploited to control artificial limbs using the sEMG signal from the residual limbs of amputees. The paper deals with the development of a multichannel, cost-efficient sEMG signal interface for research applications, along with the evaluation of a proposed class-dependent statistical approach to feature selection. The sEMG signal acquisition interface was developed using the ADS1298 of Texas Instruments, a front-end interface integrated circuit for ECG applications. Further, the sEMG signal was recorded from two lower limb muscles for three locomotions, namely Plane Walk (PW), Stair Ascending (SA) and Stair Descending (SD). A class-dependent statistical approach is proposed for feature selection, and its performance is compared with 12 preexisting feature vectors. To make the study more extensive, the performance of five different types of classifiers is compared. The outcome of the current piece of work proves the suitability of the proposed feature selection algorithm for locomotion recognition compared to other existing feature vectors. The SVM classifier is found to be the best-performing classifier among those compared, with an average recognition accuracy of 97.40%. Feature vector selection emerges as the most dominant factor affecting classification performance, as it holds 51.51% of the total variance in classification accuracy. The results demonstrate the potential of the developed sEMG signal acquisition interface along with the proposed feature selection algorithm.
Keywords: classifiers, feature selection, locomotion, sEMG
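The general shape of a class-dependent statistical feature selection step followed by SVM classification can be sketched as below. The Fisher-score ranking, the synthetic "sEMG feature" data, and the feature count are illustrative assumptions standing in for the authors' exact algorithm:

```python
# Sketch: rank features by a class-dependent statistic (Fisher score =
# between-class variance / within-class variance), keep the top few, and
# train an SVM on the reduced feature set. Data are synthetic, with three
# locomotion classes standing in for PW, SA, and SD.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(30, 8)) for c in (0.0, 1.0, 2.0)])
y = np.repeat([0, 1, 2], 30)

overall_mean = X.mean(axis=0)
num = sum((X[y == c].mean(axis=0) - overall_mean) ** 2 * (y == c).sum() for c in (0, 1, 2))
den = sum(((X[y == c] - X[y == c].mean(axis=0)) ** 2).sum(axis=0) for c in (0, 1, 2))
scores = num / den                           # one Fisher score per feature
top = np.argsort(scores)[::-1][:4]           # keep the 4 most discriminative features

clf = SVC(kernel="rbf").fit(X[:, top], y)
print(clf.score(X[:, top], y) > 0.9)
```

On well-separated synthetic classes the reduced feature set is sufficient for near-perfect recognition, mirroring the role feature selection plays in the reported results.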
Procedia PDF Downloads 294
2050 Breaking the Barrier of Service Hostility: A Lean Approach to Achieve Operational Excellence
Authors: Mofizul Islam Awwal
Abstract:
Due to globalization, industries are growing rapidly throughout the world, leading to many manufacturing organizations. Recently, however, service industries have begun to emerge in large numbers in almost all parts of the world, including some developing countries. In this context, organizations need a strong competitive advantage over their rivals to achieve their strategic business goals. Manufacturing industries are adopting many methods and techniques in order to achieve such a competitive edge. Over the last decades, manufacturing industries have successfully practiced the lean concept to optimize their production lines. Due to its huge success in the manufacturing context, lean has made its way into the service industry. Very little attention has been paid to services in the area of operations management, and service industries are far behind manufacturing industries in terms of operations improvement. Transferring the lean concept from the production floor to the service back and front office will be a demanding job, but one that can obviously yield improvement. Service processes are not as visible as production processes and can be very complex. The lack of research in this area has made things quite difficult for service industries, as there are no standardized frameworks for successfully implementing the lean concept in a service organization. The purpose of this research paper is to capture the present scenario of the service industry in terms of lean implementation. A thorough analysis of past literature will be done on the applicability and understanding of lean in service structures. Research papers will be classified, and critical factors will be unveiled for implementing lean in the service industry to achieve operational excellence.
Keywords: lean service, lean literature classification, lean implementation, service industry, service excellence
Procedia PDF Downloads 377
2049 Impacts of Aquaculture Farms on the Mangroves Forests of Sundarbans, India (2010-2018): Temporal Changes of NDVI
Authors: Sandeep Thakur, Ismail Mondal, Phani Bhusan Ghosh, Papita Das, Tarun Kumar De
Abstract:
The Sundarbans Reserve Forest of India has been undergoing major transformations in the recent past owing to population pressure and related changes. This has brought about major changes in the spatial landscape of the region, especially in the western parts. This study attempts to assess the impacts of land cover changes on the mangrove habitats. Time-series Landsat imageries were used to analyze the Normalized Difference Vegetation Index (NDVI) patterns over the western parts of the Indian Sundarbans forest in order to assess the health of the mangroves in the region. The images were subjected to Land Use Land Cover (LULC) classification using sub-pixel classification techniques in ERDAS Imagine software, and the changes were mapped. The spatial proliferation of aquaculture farms during the study period was also mapped. A multivariate regression analysis was carried out between the obtained NDVI values and the LULC classes. Similarly, the observed meteorological data sets (time-series rainfall and minimum and maximum temperature) were also statistically correlated for regression. The study demonstrated the application of NDVI in assessing the environmental status of mangroves, as the relationship between changes in the environmental variables and remote sensing based indices facilitates an efficient evaluation of environmental variables, which can be used in coastal zone monitoring and development processes.
Keywords: aquaculture farms, LULC, mangrove, NDVI
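The NDVI computation at the core of the study, and the kind of regression run against it, can be sketched as follows; the band values and cover figures are hypothetical, not the study's Landsat data:

```python
# NDVI per pixel from red and near-infrared reflectance, followed by a
# simple linear regression of ground-observed vegetation cover on mean
# NDVI for sample plots. All numbers here are illustrative.
import numpy as np

red = np.array([[0.10, 0.12], [0.30, 0.28]])
nir = np.array([[0.55, 0.60], [0.35, 0.33]])

ndvi = (nir - red) / (nir + red)             # in [-1, 1]; higher = denser vegetation

# Regress hypothetical % vegetation cover on plot-mean NDVI
mean_ndvi = np.array([0.15, 0.35, 0.55, 0.70])
cover_pct = np.array([12.0, 33.0, 58.0, 71.0])
slope, intercept = np.polyfit(mean_ndvi, cover_pct, 1)
print(slope > 0)
```

A positive slope is the expected sign of a healthy NDVI-cover relationship; a weakening of that relationship over time is the kind of degradation signal the study looks for near aquaculture farms.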
Procedia PDF Downloads 184
2048 Comparison of Injuries and Accidents Globally and in Finland
Authors: R. Pääkkönen, L. Korpinen
Abstract:
We tried to determine statistically the biggest risks for accidents and injuries in Finland compared to other countries. We have a very high incidence of domestic falls and accidental poisoning compared to other European countries. On the other hand, we have a relatively low number of accidents in traffic or at work, both globally and on a European scale, because we have worked hard to diminish these forms of accidents. In Finland, there is work to be done to improve attitudes and actions against domestic accidents.
Keywords: injuries, accident, comparison, Finland
Procedia PDF Downloads 229
2047 TRAC: A Software Based New Track Circuit for Traffic Regulation
Authors: Jérôme de Reffye, Marc Antoni
Abstract:
Following the development of the ERTMS system, we think it is interesting to develop another software-based track circuit system which would fit secondary railway lines with an easy-to-implement design and a low sensitivity to rail-wheel impedance variations. We called this track circuit 'Track Railway by Automatic Circuits.' To be internationally implemented, this system must not have any mechanical component and must be compatible with existing track circuit systems. For example, the system is independent from the French 'Joints Isolants Collés' that isolate track sections from one another, and it is equally independent from the axle counters used in Germany ('Counting Axles', in French 'compteur d’essieux'). This track circuit is fully interoperable. Such universality is obtained by replacing the mechanical train detection system with a space-time filtering of the train position. The various track sections are defined by the frequency of a continuous signal. The set of frequencies related to the track sections is a set of orthogonal functions in a Hilbert space. Thus the failure probability of track section separation is precisely calculated on the basis of the signal-to-noise ratio. The SNR is a function of the level of traction current conducted by the rails. This is the reason why we developed a very powerful algorithm to reject noise and jamming and obtain an SNR compatible with the precision required for the track circuit and the SIL 4 level. The SIL 4 level is thus reachable by an adjustment of the set of orthogonal functions. Our major contributions to railway signalling science are: i) Train space localization is precisely defined by a calibration system. The operation bypasses the GSM-R radio system of the ERTMS system. Moreover, the track circuit is naturally protected against radio-type jammers. After the calibration operation, the track circuit is autonomous.
ii) A mathematical topology adapted to train space localization by following the train through a linear time filtering of the received signal. Track sections are numerically defined and can be modified with a software update. The system was numerically simulated, and the results exceeded our expectations. We achieved a precision of one meter. Rail-ground and rail-wheel impedance sensitivity analyses gave excellent results. Results are now complete and ready to be published. This work was initiated as a research project of the French Railways, developed by the Pi-Ramses Company under SNCF contract, and required five years to obtain the results. This track circuit is already at Level 3 of the ERTMS system, and it will be much cheaper to implement and operate. The traffic regulation is based on variable-length track sections. As the traffic grows, the maximum speed is reduced and the track section lengths decrease. This is possible if the elementary track section is correctly defined for the minimum speed and if every track section is able to emit at variable frequencies.
Keywords: track section, track circuits, space-time crossing, adaptive track section, automatic railway signalling
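The frequency-based section separation idea described above can be sketched numerically: each track section carries a sinusoid of a distinct frequency, the set forming an orthogonal family over the analysis window, so correlating the received signal against each basis function identifies the active section even in noise. The frequencies, window, and noise level below are illustrative assumptions, not the TRAC system's parameters:

```python
# Sketch: matched-filter correlation over a set of orthogonal sinusoids.
# Integer-cycle frequencies over a 1 s window are mutually orthogonal, so
# only the active section's basis function yields a large correlation.
import numpy as np

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
section_freqs = [50.0, 60.0, 70.0, 80.0]     # one carrier per track section

rng = np.random.default_rng(1)
received = np.sin(2 * np.pi * 70.0 * t) + rng.normal(0, 0.5, t.size)  # section at 70 Hz active

energies = [abs(np.dot(received, np.sin(2 * np.pi * f * t))) for f in section_freqs]
print(int(np.argmax(energies)))              # index of the detected section
```

The correct correlation scales with the window length while the noise contribution only grows as its square root, which is the mechanism behind the SNR-based failure probability calculation the abstract mentions.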
Procedia PDF Downloads 333
2046 The Use of Space Syntax in Urban Transportation Planning and Evaluation: Limits and Potentials
Authors: Chuan Yang, Jing Bie, Yueh-Lung Lin, Zhong Wang
Abstract:
Transportation planning is an integrative academic discipline combining research and practice, with the aim of improving mobility and accessibility at both the strategic policy-making level and the operational dimension of practical planning. Transportation planning can build the linkage between traffic and social development goals, for instance economic benefits and environmental sustainability. Transportation planning analysis and evaluation tend to apply empirical quantitative approaches under the guidance of fundamental principles such as efficiency, equity, safety, and sustainability. Space syntax theory has been applied to the spatial distribution of pedestrian movement and vehicle flow analysis; however, little has been written about its application in transportation planning. The correlated relationships between the variables of space syntax analysis and empirical observations have shown that urban configuration has a significant effect on urban dynamics, for instance land value, building density, traffic, and crime. This research aims to explore the potential of applying the space syntax methodology to evaluate urban transportation planning by studying the effects of urban configuration on cities' transportation performance. Through a literature review, this paper discusses the effects that urban configurations with different degrees of integration and accessibility have on three elementary components of transportation planning - transportation efficiency, transportation safety, and economic agglomeration development - via intensifying and stabilising the natural movement generated by the street network. The potential and limits of space syntax theory for studying the performance of urban transportation and transportation planning are then discussed.
In practical terms, this research will help future work explore the effects of urban design on transportation performance and identify which patterns of urban street networks allow for the most efficient and safe transportation performance with the highest economic benefits.
Keywords: transportation planning, space syntax, economic agglomeration, transportation efficiency, transportation safety
Procedia PDF Downloads 198
2045 A Robust Spatial Feature Extraction Method for Facial Expression Recognition
Authors: H. G. C. P. Dinesh, G. Tharshini, M. P. B. Ekanayake, G. M. R. I. Godaliyadda
Abstract:
This paper presents a new spatial feature extraction method based on principal component analysis (PCA) and Fisher discriminant analysis (FDA) for facial expression recognition. It not only extracts reliable features for classification but also reduces the feature space dimensions of pattern samples. In this method, first, each gray-scale image is considered in its entirety as the measurement matrix. Then, the principal components (PCs) of the row vectors of this matrix and the variance of these row vectors along the PCs are estimated. This method therefore ensures the preservation of the spatial information of the facial image. Afterwards, by incorporating the spectral information of the eigen-filters derived from the PCs, a feature vector is constructed for a given image. Finally, FDA is used to define a set of bases in a reduced-dimension subspace such that optimal clustering is achieved. The FDA method defines an inter-class scatter matrix and an intra-class scatter matrix to enhance the compactness of each cluster while maximizing the distance between cluster marginal points. To match the test image with the training set, a cosine-similarity-based Bayesian classification was used. The proposed method was tested on the Cohn-Kanade database and the JAFFE database. It was observed that the proposed method, which incorporates spatial information to construct an optimal feature space, outperforms the standard PCA and FDA based methods.
Keywords: facial expression recognition, principal component analysis (PCA), Fisher discriminant analysis (FDA), eigen-filter, cosine similarity, Bayesian classifier, f-measure
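The final matching step, classifying a projected test vector by cosine similarity to class representatives, can be sketched as follows. Plain PCA stands in here for the paper's PCA-plus-FDA pipeline, and the synthetic data and two-class setup are assumptions for illustration:

```python
# Sketch: project samples into a reduced PCA subspace, then classify a
# test vector by cosine similarity to the projected class centroids.
# (The paper combines PCA and FDA; only PCA is shown here.)
import numpy as np

rng = np.random.default_rng(0)
class_a = rng.normal(0.0, 0.3, size=(20, 10))
class_b = rng.normal(2.0, 0.3, size=(20, 10))
X = np.vstack([class_a, class_b])

# PCA via SVD of the centered data; keep 3 principal components
mean = X.mean(axis=0)
_, _, vt = np.linalg.svd(X - mean, full_matrices=False)
proj = lambda v: (v - mean) @ vt[:3].T

centroids = [proj(class_a).mean(axis=0), proj(class_b).mean(axis=0)]
cos = lambda u, v: u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

test = rng.normal(2.0, 0.3, size=10)          # a sample drawn from class B
pred = int(np.argmax([cos(proj(test), c) for c in centroids]))
print(pred)
```

The cosine measure compares direction rather than magnitude in the reduced subspace, which is why the paper pairs it with a Bayesian decision rule rather than a raw distance.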
Procedia PDF Downloads 427
2044 The Outcome of Using Machine Learning in Medical Imaging
Authors: Adel Edwar Waheeb Louka
Abstract:
Purpose: AI-driven solutions are at the forefront of many pathology and medical imaging methods. Using algorithms designed to better the experience of medical professionals within their respective fields, the efficiency and accuracy of diagnosis can improve. In particular, X-rays are a fast and relatively inexpensive test that can diagnose diseases. In recent years, however, X-rays have not been widely used to detect and diagnose COVID-19. The underuse of X-rays is mainly due to low diagnostic accuracy and confounding with pneumonia, another respiratory disease. However, research in this field has suggested that artificial neural networks can successfully diagnose COVID-19 with high accuracy.
Models and Data: The dataset used is the COVID-19 Radiography Database. This dataset includes images and masks of chest X-rays under the labels of COVID-19, normal, and pneumonia. The classification model developed uses an autoencoder and a pre-trained convolutional neural network (DenseNet201) to provide transfer learning to the model. The model then uses a deep neural network to finalize the feature extraction and predict the diagnosis for the input image. This model was trained on 4035 images and validated on 807 images separate from the ones used for training. The images used to train the classification model include an important feature: the pictures are cropped beforehand to eliminate distractions when training the model. The image segmentation model uses an improved U-Net architecture. This model is used to extract the lung mask from the chest X-ray image. It is trained on 8577 images and validated on a validation split of 20%. These models are evaluated using the external dataset for validation, and their accuracy, precision, recall, f1-score, IOU, and loss are calculated.
Results: The classification model achieved an accuracy of 97.65% and a loss of 0.1234 when differentiating COVID-19-infected, pneumonia-infected, and normal lung X-rays.
The segmentation model achieved an accuracy of 97.31% and an IOU of 0.928.
Conclusion: The models proposed can detect COVID-19, pneumonia, and normal lungs with high accuracy and derive the lung mask from a chest X-ray with similarly high accuracy. The hope is for these models to elevate the experience of medical professionals and provide insight into the future of the methods used.
Keywords: artificial intelligence, convolutional neural networks, deep learning, image processing, machine learning
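The two headline metrics reported above, classification accuracy and intersection-over-union (IoU) for the segmentation masks, can be computed as in this sketch; the labels and masks are toy data, not the study's:

```python
# Accuracy for a three-class diagnosis task and IoU for binary lung masks.
# All predictions and masks below are illustrative toy values.
import numpy as np

# Accuracy: fraction of correct labels (0=COVID-19, 1=pneumonia, 2=normal)
y_true = np.array([0, 0, 1, 2, 2, 1, 0, 2])
y_pred = np.array([0, 0, 1, 2, 1, 1, 0, 2])
accuracy = (y_true == y_pred).mean()

# IoU: overlap of predicted and reference binary masks
mask_true = np.array([[1, 1, 0], [1, 0, 0]], dtype=bool)
mask_pred = np.array([[1, 1, 0], [0, 0, 0]], dtype=bool)
iou = (mask_true & mask_pred).sum() / (mask_true | mask_pred).sum()

print(accuracy, round(float(iou), 3))
```

IoU penalizes both missed and spurious mask pixels in a single ratio, which is why it is preferred over plain pixel accuracy for segmentation.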
Procedia PDF Downloads 74
2043 'I Broke the Line Back to the Ancient Ones': Rethinking Intersectional Theory through Wounded Histories in Once Were Warriors (1994) and Whale Rider (2002)
Authors: Kerry Mackereth
Abstract:
Kimberlé Crenshaw’s theory of intersectionality has become immensely influential in the fields of women’s and gender studies. However, intersectionality’s widespread use among feminist scholars and activists has been accompanied by critiques of its reliance upon subject categorization. These critiques are of particular import when connected to Wendy Brown’s characterization of identity politics as static 'wounded attachments'. Together, these critiques show how the gridlock model proposed by intersectionality’s primary metaphor, the traffic accident at the intersection, is useful for identifying discrimination but not for remembering historical injustices or imagining feminist and anti-racist resistance. Through the lens of New Zealand Maori film, focusing upon Once Were Warriors (1994) and Whale Rider (2002), this article examines how wounded histories need not be passively reproduced by contemporaneously oppressed groups. Instead, the metaphor of the traffic intersection should be complemented by the metaphor of the wound. Against Brown’s characterization of wounded attachments as negative, static identities, Gloria Anzaldúa’s account of the borderland between the United States and Mexico as 'una herida abierta', an open wound, offers an alternative reading of the wound. Through Anzaldúa’s and Hortense Spillers’ political thought, the wound is reconceptualized as not only a site of suffering but also a regenerative space. The coexistence of deterioration and regeneration at the site of the wound underpins the narrative arc of both Once Were Warriors and Whale Rider. In both films, the respective child protagonists attempt to reconcile the pain of wounded histories with the imagination of cultural regeneration.
The metaphor of the wound thus serves as an alternative theoretical resource for mapping experiences of oppression, one that enriches feminist theory by balancing the remembrance of historical grievance with the forging of hopeful political projects.
Keywords: gender theory, historical grievance, intersectionality, New Zealand film, postcolonialism
Procedia PDF Downloads 254
2042 Open Source Knowledge Management Approach to Manage and Disseminate Distributed Content in a Global Enterprise
Authors: Rahul Thakur, Onkar Chandel
Abstract:
Red Hat is the world leader in providing open source software and solutions. A global enterprise like Red Hat has unique issues in connecting employees with content because of distributed offices, multiple teams spread across geographies, multiple languages, and different cultures. Employees of a global company create content that is distributed across departments, teams, regions, and countries. This makes finding the best content difficult, since owners keep iterating on the existing content. When employees are unable to find the content, they end up creating it once again, in the process duplicating existing material and effort. Employees may also fail to find the relevant content and spend time reviewing obsolete, duplicate, or irrelevant content. On average, a person spends 15 minutes per day in failed searches, which may result in missed business opportunities, employee frustration, and substandard deliverables. The Red Hat Knowledge Management Office (KMO) applied an 'open source strategy' to solve the above problems. Under the open source strategy, decisions are taken collectively; the strategy aims at accomplishing common goals with the help of communities. The objectives of this initiative were to save employees' time, give them authentic content, improve their content search experience, avoid duplicate content creation, provide context-based search, improve analytics, improve content management workflows, automate content classification, and automate content upload. This session will describe the open source strategy, its applicability to content management, the challenges, the recommended solutions, and the outcome.
Keywords: content classification, content management, knowledge management, open source
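One concrete piece of the duplicate-content problem described above, flagging documents that are the same text under different paths, can be sketched with content fingerprinting. The file paths, snippets, and normalization rule are illustrative assumptions, not Red Hat's implementation:

```python
# Sketch: fingerprint each document by hashing its normalized text, so a
# second copy of the same content is flagged against the first, pointing
# the writer at the existing document instead of a fresh duplicate.
import hashlib

docs = {
    "emea/guide.md": "How to file a support ticket.\n",
    "apac/howto.md": "how to   file a support ticket.",
    "na/release.md": "Release notes for version 2.",
}

def fingerprint(text: str) -> str:
    normalized = " ".join(text.lower().split())   # case/whitespace-insensitive
    return hashlib.sha256(normalized.encode()).hexdigest()

seen, duplicates = {}, []
for path, text in docs.items():
    fp = fingerprint(text)
    if fp in seen:
        duplicates.append((path, seen[fp]))       # (copy, original)
    else:
        seen[fp] = path
print(duplicates)
```

Exact-hash matching only catches verbatim copies; near-duplicate detection would need similarity measures, which is where the automated classification mentioned in the abstract comes in.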
Procedia PDF Downloads 211
2041 Deep Learning for Qualitative and Quantitative Grain Quality Analysis Using Hyperspectral Imaging
Authors: Ole-Christian Galbo Engstrøm, Erik Schou Dreier, Birthe Møller Jespersen, Kim Steenstrup Pedersen
Abstract:
Grain quality analysis is a multi-parameterized problem that includes a variety of qualitative and quantitative parameters such as grain type classification, damage type classification, and nutrient regression. Currently, these parameters require human inspection, a multitude of instruments employing a variety of sensor technologies and predictive model types, or destructive and slow chemical analysis. This paper investigates the feasibility of applying near-infrared hyperspectral imaging (NIR-HSI) to grain quality analysis. For this study, two datasets of NIR hyperspectral images in the wavelength range of 900 nm - 1700 nm have been used. Both datasets contain images of sparsely and densely packed grain kernels. The first dataset contains ~87,000 image crops of bulk wheat samples from 63 harvests where the protein value has been determined by the FOSS Infratec NOVA, the gold industry standard for protein content estimation in bulk samples of cereal grain. The second dataset consists of ~28,000 image crops of bulk grain kernels from seven different wheat varieties and a single rye variety. In the first dataset, protein regression is the problem to solve, while variety classification is the problem in the second dataset. Deep convolutional neural networks (CNNs) have the potential to utilize spatio-spectral correlations within a hyperspectral image to simultaneously estimate the qualitative and quantitative parameters. CNNs can autonomously derive meaningful representations of the input data, reducing the need for the advanced preprocessing techniques required for classical chemometric model types such as artificial neural networks (ANNs) and partial least-squares regression (PLS-R). A comparison between different CNN architectures utilizing 2D and 3D convolution is conducted, and these results are compared to the performance of ANNs and PLS-R. Additionally, a variety of preprocessing techniques from image analysis and chemometrics are tested.
These include centering, scaling, standard normal variate (SNV), Savitzky-Golay (SG) filtering, and detrending. The results indicate that the combination of NIR-HSI and CNNs has the potential to be the foundation for an automatic system unifying qualitative and quantitative grain quality analysis within a single sensor technology and predictive model type.
Keywords: deep learning, grain analysis, hyperspectral imaging, preprocessing techniques
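Two of the chemometric preprocessing steps named above, SNV and Savitzky-Golay filtering, can be sketched on a single spectrum as below; the synthetic spectrum and the filter window are illustrative assumptions:

```python
# Sketch: standard normal variate (per-spectrum centering and scaling,
# which removes multiplicative scatter effects) followed by Savitzky-Golay
# smoothing, applied to a synthetic NIR spectrum over 900-1700 nm.
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)
wavelengths = np.linspace(900, 1700, 200)                 # nm, as in the datasets
spectrum = np.exp(-((wavelengths - 1200) / 150) ** 2) + rng.normal(0, 0.02, 200)

# SNV: subtract the spectrum's own mean, divide by its own std
snv = (spectrum - spectrum.mean()) / spectrum.std()

# SG: local polynomial smoothing (window of 11 points, order 2)
smoothed = savgol_filter(snv, window_length=11, polyorder=2)

print(abs(snv.mean()) < 1e-9, abs(snv.std() - 1.0) < 1e-9)
```

After SNV every spectrum has zero mean and unit variance, so model inputs differ only in shape, not in overall intensity; SG smoothing then suppresses high-frequency sensor noise while preserving band positions.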
Procedia PDF Downloads 100
2040 CRYPTO COPYCAT: A Fashion Centric Blockchain Framework for Eliminating Fashion Infringement
Authors: Magdi Elmessiry, Adel Elmessiry
Abstract:
The fashion industry represents a significant portion of the global gross domestic product; however, it is plagued by cheap imitators that infringe on trademarks, destroying the fashion industry's hard work and investment. While the copycats are eventually found and stopped, the damage has already been done: sales are missed, and direct and indirect jobs are lost. The infringer thrives on two main facts: the time it takes to discover them and the lack of tracking technologies that can help the consumer distinguish them. Blockchain is a new emerging technology that provides a distributed, encrypted, immutable and fault-resistant ledger. Blockchain presents a ripe technology to resolve the infringement epidemic facing the fashion industry. The significance of the study is that a new approach leveraging state-of-the-art blockchain technology coupled with artificial intelligence is used to create a framework addressing the fashion infringement problem. It transforms the current focus on legal enforcement, which is difficult at best, to consumer awareness, which is far more effective. The framework, Crypto CopyCat, creates an immutable digital asset representing the actual product to empower the customer with a near real-time query system. This combination emphasizes the consumer's awareness and appreciation of the product's authenticity, while providing real-time feedback to the producer regarding the fake replicas. The main finding of this study is that implementing this approach can delay fake product penetration of the original product's market, thus allowing the original product the time to take advantage of the market. The shift in fake adoption results in reduced returns, which impedes the copycat market and moves the emphasis to original product innovation.
Keywords: fashion, infringement, blockchain, artificial intelligence, textiles supply chain
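The core property an immutable digital asset record relies on, that each entry binds to the hash of the previous one so any tampering invalidates the chain, can be sketched as below. This is a minimal illustration of hash chaining under assumed record fields, not the Crypto CopyCat design itself:

```python
# Sketch: a hash-chained ledger of product records. Each block stores the
# hash of the previous block; changing any record breaks verification.
import hashlib, json

def add_block(chain, product_record):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"record": product_record, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain):
    for i, block in enumerate(chain):
        body = {"record": block["record"], "prev": block["prev"]}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != block["hash"]:
            return False
        if i and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
add_block(chain, {"sku": "JKT-001", "maker": "OriginalBrand"})   # hypothetical records
add_block(chain, {"sku": "JKT-002", "maker": "OriginalBrand"})
print(verify(chain))
chain[0]["record"]["maker"] = "CopyCatCo"    # tampering invalidates the chain
print(verify(chain))
```

In a real blockchain the chain is also replicated across nodes, so a tamperer would have to rewrite every copy, which is what makes the consumer's authenticity query trustworthy.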
Procedia PDF Downloads 262
2039 User-Awareness from Eye Line Tracing During Specification Writing to Improve Specification Quality
Authors: Yoshinori Wakatake
Abstract:
Many defects found after the release of software packages are caused by the omission of necessary test items from test specifications. Poor test specifications are detected by manual review, which imposes a high human load. The prevention of omissions depends on the end-user awareness of test specification writers: if test specifications were written while envisioning the behavior of end-users, the number of omitted test items would be greatly reduced. This paper draws attention to the fact that writers who can achieve this differ from those who cannot, not only in the richness of their descriptions but also in their gaze information. It proposes a method to estimate the degree of user-awareness of writers through the analysis of their gaze information while writing test specifications. We conduct an experiment to obtain the gaze information of writers of test specifications, and the specifications are automatically classified using this information. In this method, a random forest model is constructed for the classification, and the classification is highly accurate. By examining the explanatory variables that turn out to be important, we identify the behavioral features that distinguish high-quality test specifications from others; these are confirmed to be pupil diameter and the number and duration of blinks. The paper also investigates the test specifications automatically classified with gaze information to discuss features of the writing styles at each quality level. The proposed method enables us to automatically classify test specifications. It also prevents test item omissions, because it reveals the writing features that high-quality test specifications should satisfy.
Keywords: blink, eye tracking, gaze information, pupil diameter, quality improvement, specification document, user-awareness
Procedia PDF Downloads 65
2038 Statistical Feature Extraction Method for Wood Species Recognition System
Authors: Mohd Iz'aan Paiz Bin Zamri, Anis Salwa Mohd Khairuddin, Norrima Mokhtar, Rubiyah Yusof
Abstract:
Effective statistical feature extraction and classification are important in image-based automatic inspection and analysis. An automatic wood species recognition system is designed to perform wood inspection at customs checkpoints to avoid the mislabeling of timber, which results in a loss of income to the timber industry. The system focuses on analyzing the statistical properties of pores in the wood images. This paper proposes a fuzzy-based feature extractor which mimics the experts' knowledge of wood texture to extract the properties of pore distribution from the wood surface texture. The proposed feature extractor consists of two steps, namely pore extraction and fuzzy pore management; the total number of statistical features extracted from each wood image is 38. A backpropagation neural network is then used to classify the wood species based on the statistical features. A comprehensive set of experiments on a database composed of 5200 macroscopic images from 52 tropical wood species was used to evaluate the performance of the proposed feature extractor. The advantage of the proposed feature extraction technique is that it mimics the experts' interpretation of wood texture, which allows human involvement when analyzing the texture. Experimental results show the efficiency of the proposed method.
Keywords: classification, feature extraction, fuzzy, inspection system, image analysis, macroscopic images
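The pore-extraction step can be sketched as thresholding followed by connected-component statistics. This is only an illustrative sketch: the fixed threshold and synthetic image below stand in for the paper's fuzzy pore-management stage, whose details are not specified here.

```python
# Sketch of pore extraction from a grey-level wood-surface image.
# The threshold, image, and chosen statistics are illustrative assumptions,
# not the paper's actual fuzzy rules or 38-feature set.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
img = rng.uniform(0.4, 0.9, (100, 100))   # toy grey-level wood surface
img[20:24, 30:34] = 0.10                  # two dark "pores"
img[60:65, 70:74] = 0.15

pores = img < 0.3                         # dark regions = candidate pores
labels, n_pores = ndimage.label(pores)    # connected components
sizes = ndimage.sum(pores, labels, range(1, n_pores + 1))

# A few of the kinds of statistical features such a system could feed
# to a backpropagation classifier: count, mean/std of pore size, density.
features = [n_pores, sizes.mean(), sizes.std(), pores.mean()]
print(n_pores, features)                  # detects the 2 synthetic pores
```

In practice the hard part is the threshold itself, which is why the paper replaces it with fuzzy rules encoding expert knowledge of pore appearance.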
Procedia PDF Downloads 427
2037 Parking Service Effectiveness at Commercial Malls
Authors: Ahmad AlAbdullah, Ali AlQallaf, Mahdi Hussain, Mohammed AlAttar, Salman Ashknani, Magdy Helal
Abstract:
We study the effectiveness of the parking service provided at Kuwaiti commercial malls and explore potential problems and feasible improvements. Commercial malls are important to Kuwaitis as entertainment and shopping centers due to the lack of other alternatives. The difficulty, and the relatively long times wasted, in finding a parking spot at the mall are real annoyances. We applied queuing analysis to one of the major malls, which offers paid parking (1040 parking spots) in addition to free parking. Patrons of the mall usually complained of the traffic jams and delays at the entrance to the paid parking (the average delay to park exceeds 15 min for about 62% of the patrons, while the average time spent in the mall is about 2.6 hours). However, the analysis showed acceptable service levels at the check-in gates of the parking garage. A detailed review of vehicle movement at the gateways indicated that arriving and departing cars had to share parts of the gateway to the garage, which caused the traffic jams and delays. A simple comparison indicated that the largest commercial mall in Kuwait does not suffer such parking issues, while other smaller yet important malls do, including the one we studied. This suggests that the well-designed inlets and outlets of that gigantic mall permit smooth parking even though its parking is totally free and the mall is the first choice of most people for entertainment and shopping. A simulation model is being developed for further analysis and verification; simulation can overcome the mathematical difficulty of using non-Poisson queuing models. The simulation model is used to explore potential changes to the parking garage entrance layout, and, with the inclusion of driver behavior inside the parking garage, effectiveness indicators can be derived to address the economic feasibility of extending the parking capacity and increasing service levels. Outcomes of the study are planned to be generalized, as appropriate, to other commercial malls in Kuwait.
Keywords: commercial malls, parking service, queuing analysis, simulation modeling
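Queuing analysis of the kind described can be sketched with the standard M/M/c (Erlang C) formulas for the check-in gates. The arrival and service rates below are illustrative placeholders, not figures from the study:

```python
# M/M/c queue metrics (Erlang C) for a bank of identical check-in gates.
# Rates are made-up examples, not the studied mall's measured traffic.
import math

def erlang_c(arrival_rate, service_rate, servers):
    """Probability that an arriving car must wait (Erlang C formula)."""
    a = arrival_rate / service_rate              # offered load in Erlangs
    rho = a / servers                            # gate utilisation
    if rho >= 1:
        raise ValueError("unstable queue: utilisation >= 1")
    summation = sum(a**k / math.factorial(k) for k in range(servers))
    top = a**servers / (math.factorial(servers) * (1 - rho))
    return top / (summation + top)

def mean_wait(arrival_rate, service_rate, servers):
    """Mean time spent queueing (Wq) for an M/M/c queue."""
    p_wait = erlang_c(arrival_rate, service_rate, servers)
    return p_wait / (servers * service_rate - arrival_rate)

# Hypothetical example: 500 cars/hour arriving at 3 gates,
# each gate processing 200 cars/hour.
print(mean_wait(500, 200, 3))    # mean queueing delay, in hours
```

A result like this one (well under a minute of queueing at the gates) is consistent with the study's finding that the gates themselves show acceptable service levels, pointing instead at the shared gateway layout as the bottleneck.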
Procedia PDF Downloads 340
2036 Classification for Obstructive Sleep Apnea Syndrome Based on Random Forest
Authors: Cheng-Yu Tsai, Wen-Te Liu, Shin-Mei Hsu, Yin-Tzu Lin, Chi Wu
Abstract:
Background: Obstructive sleep apnea syndrome (OSAS) is a common respiratory disorder during sleep, and body parameters have been identified as highly predictive of OSAS severity. However, the effects of body parameters on OSAS severity remain unclear. Objective: The objective of this study is to establish a prediction model for OSAS using body parameters and to investigate their effects on OSAS. Methodologies: Severity was quantified by polysomnography as the mean hourly number of dips in oxygen saturation greater than 3% during examination in a hospital in New Taipei City (Taiwan). Four levels of OSAS severity were classified by the apnea-hypopnea index (AHI) according to the American Academy of Sleep Medicine (AASM) guideline. Body parameters, including neck circumference, waist size, and body mass index (BMI), were obtained from a questionnaire. The subjects were then divided into two groups: a training group used to build a random forest (RF) predictor and a test group used to evaluate classification accuracy. Results: 3330 subjects who had undergone polysomnography for OSAS severity evaluation were recruited. An RF of 1000 trees correctly classified 79.94% of test cases. When further evaluated on the test cohort, the RF identified waist size and BMI as the most important factors in OSAS. Conclusion: It is possible to prescreen patients using body parameters, which can pre-evaluate health risks.
Keywords: apnea and hypopnea index, body parameters, obstructive sleep apnea syndrome, random forest
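As a rough illustration of this pipeline (not of the study's data), a random forest can be trained on the three body parameters and queried for feature importances. Everything below is invented for the sketch: the sample size, the parameter distributions, and the synthetic four-class severity label.

```python
# Sketch of the train/test RF workflow on synthetic body-parameter data.
# Distributions and the severity label are assumptions, not the real cohort.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.normal(38, 4, n),    # neck circumference (cm), hypothetical
    rng.normal(95, 15, n),   # waist size (cm), hypothetical
    rng.normal(27, 5, n),    # BMI, hypothetical
])
# Toy severity score correlated with waist and BMI, cut into 4 classes
# standing in for the AHI-based severity levels.
score = 0.02 * X[:, 1] + 0.05 * X[:, 2] + rng.normal(0, 0.5, n)
y = np.digitize(score, np.quantile(score, [0.25, 0.5, 0.75]))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestClassifier(n_estimators=1000, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", rf.score(X_te, y_te))
print("importances (neck, waist, BMI):", rf.feature_importances_)
```

On synthetic data built this way the importances lean toward waist and BMI, mirroring the direction of the study's finding, though the numbers themselves mean nothing outside the sketch.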
Procedia PDF Downloads 155
2035 Semantic Indexing Improvement for Textual Documents: Contribution of Classification by Fuzzy Association Rules
Authors: Mohsen Maraoui
Abstract:
With the aim of improving natural language processing applications such as information retrieval, machine translation, and lexical disambiguation, we focus on a statistical approach to semantic indexing for multilingual text documents based on conceptual network formalism. We propose to use this formalism as an indexing language to represent the descriptive concepts and their weighting; these concepts represent the content of the document. Our contribution is based on two steps. In the first step, we propose the extraction of index terms using the multilingual lexical resource EuroWordNet (EWN). In the second step, we pass from the representation of index terms to the representation of index concepts through conceptual network formalism. This network is generated using the EWN resource and passes through a classification step based on an association rules model, in an attempt to discover the non-taxonomic, or contextual, relations between the concepts of a document. These are latent relations buried in the text and carried by the semantic context of the co-occurrence of concepts in the document. Our proposed indexing approach can be applied to text documents in various languages because it is based on a linguistic method adapted to the language through a multilingual thesaurus. Next, we apply the same statistical process regardless of the language in order to extract the significant concepts and their associated weights. We show that the proposed indexing approach provides encouraging results.
Keywords: concept extraction, conceptual network formalism, fuzzy association rules, multilingual thesaurus, semantic indexing
Procedia PDF Downloads 141
2034 Machine Learning Classification of Fused Sentinel-1 and Sentinel-2 Image Data Towards Mapping Fruit Plantations in Highly Heterogenous Landscapes
Authors: Yingisani Chabalala, Elhadi Adam, Khalid Adem Ali
Abstract:
Mapping smallholder fruit plantations using optical data is challenging due to morphological landscape heterogeneity and to crop types having overlapping spectral signatures. Furthermore, cloud cover limits the use of optical sensing, especially in subtropical climates where it is persistent. This research assessed the effectiveness of Sentinel-1 (S1) and Sentinel-2 (S2) data for mapping fruit trees and co-existing land-use types, using support vector machine (SVM) and random forest (RF) classifiers independently. These classifiers were also applied to fused data from the two sensors. Feature ranks were extracted using the RF mean decrease accuracy (MDA) and forward variable selection (FVS) to identify optimal spectral windows for classifying fruit trees. Based on RF MDA and FVS, the SVM classifier achieved relatively high classification accuracy, with overall accuracy (OA) = 91.6% and kappa coefficient = 0.91, when applied to the fused satellite data. Applying SVM to S1, S2, the S2 selected variables, and the S1S2 fusion independently produced OA = 27.64%, kappa coefficient = 0.13; OA = 87%, kappa coefficient = 0.87; OA = 69.33%, kappa coefficient = 0.69; and OA = 87.01%, kappa coefficient = 0.87, respectively. Results also indicated that the optimal spectral bands for fruit tree mapping are green (B3) and SWIR_2 (B10) for S2, and the vertical-horizontal (VH) polarization band for S1. Including the textural metrics from the VV channel improved discrimination of crops and co-existing land-cover types. The fusion approach proved robust and well suited for accurate smallholder fruit plantation mapping.
Keywords: smallholder agriculture, fruit trees, data fusion, precision agriculture
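The mean-decrease-accuracy (MDA) ranking used to pick the optimal bands can be approximated with permutation importance on a held-out set. The band stack below is synthetic, and which column stands for which band is purely an assumption for illustration:

```python
# Sketch of RF mean-decrease-accuracy (MDA) band ranking via permutation
# importance. The 4-band stack and labels are synthetic; columns 0-2 are
# imagined as S2 green (B3), S2 SWIR, and S1 VH, column 3 as pure noise.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 600
X = rng.normal(size=(n, 4))
y = (X[:, 0] + 0.8 * X[:, 2] > 0).astype(int)   # classes driven by bands 0 and 2

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
rf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X_tr, y_tr)
mda = permutation_importance(rf, X_te, y_te, n_repeats=10, random_state=1)
ranking = np.argsort(mda.importances_mean)[::-1]
print("bands ranked by MDA:", ranking)
```

Permuting a band that the classifier relies on drops held-out accuracy, so the informative bands (0 and 2 in this toy setup) rank first, which is the logic behind using MDA to select spectral windows.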
Procedia PDF Downloads 57
2033 Floristic Diversity, Composition and Environmental Correlates on the Arid, Coralline Islands of the Farasan Archipelago, Red SEA, Saudi Arabia
Authors: Khalid Al Mutairi, Mashhor Mansor, Magdy El-Bana, Asyraf Mansor, Saud AL-Rowaily
Abstract:
Urban expansion and the associated increase in anthropogenic pressures have led to a great loss of the Red Sea's biodiversity. Floristic composition, diversity, and environmental controls were investigated for 210 relevés on twenty coral islands of Farasan in the Red Sea, Saudi Arabia. Multivariate statistical analyses for classification (cluster analysis) and ordination (detrended correspondence analysis, DCA, and redundancy analysis, RDA) were employed to identify vegetation types and their relation to the underlying environmental gradients. A total of 191 flowering plants belonging to 53 families and 129 genera were recorded. Geophytes and chamaephytes were the main life forms in the saline habitats, whereas therophytes and hemicryptophytes dominated the sandy formations and coral rocks. The cluster analysis and DCA ordination identified twelve vegetation groups linked to five main habitats, each with a definite floristic composition and environmental characteristics. The constrained RDA with Monte Carlo permutation tests revealed that elevation and soil salinity were the main environmental factors explaining the vegetation distributions. These results indicate that the flora of the study archipelago represents a phytogeographical link between the African and Saharo-Arabian landscape functional elements. These findings should guide conservation and management efforts to maintain species diversity, which is threatened by anthropogenic activities and by invasion of the exotic tree Prosopis juliflora (Sw.) DC.
Keywords: biodiversity, classification, conservation, ordination, Red Sea
Procedia PDF Downloads 345
2032 Assessment of Urban Heat Island through Remote Sensing in Nagpur Urban Area Using Landsat 7 ETM+ Satellite Images
Authors: Meenal Surawar, Rajashree Kotharkar
Abstract:
The urban heat island (UHI) is a prominent urban environmental concern, found to be more pronounced in developing cities. To study the UHI effect in the Indian context, the Nagpur urban area is explored in this paper using Landsat 7 ETM+ satellite images through remote sensing and GIS techniques. The paper studies the effect of the LU/LC pattern on daytime land surface temperature (LST) variation, which contributes to UHI formation within the Nagpur urban area. Supervised LU/LC classification was carried out to study urban change detection using ENVI 5. Change detection was studied by computing the Normalized Difference Vegetation Index (NDVI) to understand the proportion of vegetative cover with respect to the built-up ratio. Spectral radiance from the thermal band of the satellite images was processed to calibrate LST. Representative areas were selected for observation of point LST on the basis of the urban built-up and vegetation classification. Across the entire Nagpur urban area, as building density increases and vegetation cover decreases, LST increases, causing the UHI effect. UHI intensity gradually increased by 0.7°C from 2000 to 2006; however, a drastic increase of 1.8°C was observed during the period 2006 to 2013. Within the Nagpur urban area, the UHI effect formed due to the increase in building density and the decrease in vegetative cover.
Keywords: land use/land cover, land surface temperature, remote sensing, urban heat island
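The two core computations here, NDVI from the red and near-infrared bands and at-sensor brightness temperature from the calibrated thermal-band radiance, can be sketched as follows. The K1/K2 constants are the published Landsat 7 ETM+ band 6 calibration values; the input reflectances and radiance are made-up examples.

```python
# NDVI and thermal-band brightness temperature for Landsat 7 ETM+.
# Input values are illustrative, not pixels from the Nagpur scenes.
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red); ETM+ bands 4 and 3."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + 1e-12)   # epsilon avoids 0/0

def brightness_temp(radiance, k1=666.09, k2=1282.71):
    """At-sensor brightness temperature (K) from ETM+ band 6 radiance,
    T = K2 / ln(K1 / L + 1), with the published K1, K2 for Landsat 7."""
    return k2 / np.log(k1 / np.asarray(radiance, float) + 1.0)

veg = ndvi(nir=[0.45, 0.30], red=[0.10, 0.25])   # dense vs sparse cover
t = brightness_temp([10.0])                      # radiance in W/(m^2 sr um)
print(veg, t - 273.15)                           # NDVI values, temp in deg C
```

Brightness temperature is only the first step toward LST; a full calibration additionally corrects for surface emissivity, which is where the LU/LC classification feeds back into the temperature retrieval.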
Procedia PDF Downloads 283
2031 A User Interface for Easiest Way Image Encryption with Chaos
Authors: D. López-Mancilla, J. M. Roblero-Villa
Abstract:
Since 1990, research on chaotic dynamics has received considerable attention, particularly in light of potential applications of this phenomenon in secure communications. Data encryption using chaotic systems was reported in the 1990s as a new approach to signal encoding that differs from the conventional methods that use numerical algorithms as the encryption key. Algorithms for image encryption have received much attention because of the need to secure image transmission in real time over the internet and wireless networks. Known algorithms for image encryption, like the Data Encryption Standard (DES), have the drawback of low efficiency when the image is large. Encryption based on chaos offers a new and efficient way to achieve fast and highly secure image encryption. In this work, a user interface for image encryption and a novel, simple way to encrypt images using chaos are presented. The main idea is to reshape any image into an n-dimensional vector and combine it with a vector extracted from a chaotic system, in such a way that the image vector is hidden within the chaotic vector; once this is done, an array with the original dimensions of the image is reformed. A statistical analysis of the encryption security is performed, and an optimization stage is used to improve security while allowing the image to be accurately recovered. The user interface uses the algorithms designed for image encryption, allowing the user to read an image from the hard drive or another external device and to encrypt it in one of three modes, given by three different chaotic systems the user can choose. Once the image is encrypted, the security analysis can be observed and the result saved to the hard disk. The main results of this study show that this simple encryption method, using the optimization stage, achieves encryption security competitive with the complicated encryption methods used in other works. In addition, the user interface allows encrypting images with chaos and submitting them through any public communication channel, including the internet.
Keywords: image encryption, chaos, secure communications, user interface
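A minimal numerical sketch of the reshape-and-combine idea, using the logistic map as a stand-in chaotic system (the paper's three chaotic systems and its optimization stage are not specified here) and XOR as the combination step:

```python
# Toy chaos-based image cipher: flatten the image, XOR it with a keystream
# generated by iterating the logistic map, then restore the original shape.
# The logistic map and XOR are illustrative choices, not the paper's method.
import numpy as np

def logistic_keystream(n, x0=0.3579, r=3.9999, burn=1000):
    """Generate n pseudo-random bytes from the logistic map x -> r*x*(1-x)."""
    x = x0
    for _ in range(burn):                 # discard the transient
        x = r * x * (1 - x)
    out = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1 - x)
        out[i] = int(x * 256) % 256       # quantize the orbit to a byte
    return out

def encrypt(img, x0=0.3579):
    """Reshape to a 1-D vector, XOR with the chaotic vector, reshape back.
    Since XOR is its own inverse, decryption is the same call."""
    flat = img.reshape(-1)
    key = logistic_keystream(flat.size, x0)
    return (flat ^ key).reshape(img.shape)

img = np.arange(64, dtype=np.uint8).reshape(8, 8)   # toy 8x8 image
cipher = encrypt(img)
restored = encrypt(cipher)                # same key parameter recovers it
print(np.array_equal(restored, img))      # True
```

The initial condition `x0` plays the role of the secret key: because chaotic orbits diverge exponentially under tiny perturbations of `x0`, a slightly wrong key yields an entirely different keystream.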
Procedia PDF Downloads 493
2030 Radiographic Predictors of Mandibular Third Molar Extraction Difficulties under General Anaesthetic
Authors: Carolyn Whyte, Tina Halai, Sonita Koshal
Abstract:
Aim: There are many methods available to assess the potential difficulty of third molar surgery. This study investigated various factors to assess whether they had a bearing on the difficulties encountered. Study design: A retrospective study was completed of 62 single mandibular third molar teeth removed under day-case general anaesthesia between May 2016 and August 2016 by 3 consultant oral surgeons. Method: Data were collected by examining the OPG radiograph of each tooth and recording depth of impaction, angulation, bony impaction, point of application in relation to the second molar, root morphology, Pell and Gregory classification, and Winter's lines. This was completed by one assessor and verified by another. Information on medical history, anxiety, ethnicity, and age was recorded, and case notes and surgical entries were examined for any difficulties encountered. Results: There were 5 cases with surgical difficulties, which included fracture of root apices (3, left in situ), prolonged bleeding (1), and post-operative numbness lasting more than 6 months (1). Four of the 5 cases had Pell and Gregory classification (B), in which the occlusal plane of the impacted tooth is between the occlusal plane and the cervical line of the adjacent tooth. 80% of cases had the point of application in either the coronal or the apical one third (1/3) in relation to the second molar. However, there was variability in all other aspects of the assessment of predicted removal difficulty. Conclusions: The cases with difficulties all had at least one predictor of potential complexity, but these varied case by case.
Keywords: impaction, mandibular third molar, radiographic assessment, surgical removal
Procedia PDF Downloads 182
2029 A Proposed Algorithm for Obtaining the Map of Subscribers’ Density Distribution for a Mobile Wireless Communication Network
Authors: C. Temaneh-Nyah, F. A. Phiri, D. Karegeya
Abstract:
This paper presents an algorithm for obtaining the map of subscriber density distribution for a mobile wireless communication network, based on actual subscriber traffic data obtained from the base stations. Such a map is useful in the statistical characterization of the mobile wireless network.
Keywords: electromagnetic compatibility, statistical analysis, simulation of communication network, subscriber density
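A minimal sketch of turning per-record base-station traffic into a density map: the field names and numeric ranges below are hypothetical, and gridding with a weighted 2-D histogram stands in for whatever aggregation the proposed algorithm actually uses.

```python
# Build a subscriber-density map by gridding geolocated traffic records.
# Coordinates and traffic values are synthetic stand-ins for base-station data.
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical per-record fields: longitude, latitude, traffic in Erlangs
lon = rng.uniform(17.05, 17.10, 500)
lat = rng.uniform(-22.60, -22.55, 500)
erlangs = rng.exponential(0.02, 500)

# Grid the service area and accumulate traffic per cell: the resulting
# 2-D array is the density map to be characterized statistically.
density, lon_edges, lat_edges = np.histogram2d(
    lon, lat, bins=20, weights=erlangs)
print(density.shape, round(density.sum(), 3))
```

The per-cell values can then feed the statistical characterization the paper mentions, e.g. fitting a distribution to cell traffic or locating hotspots.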
Procedia PDF Downloads 311
2028 Demand for Care in Primary Health Care in the Governorate of Ariana: Results of a Survey in Ariana Primary Health Care and Comparison with the Last 30 Years
Authors: Chelly Souhir, Harizi Chahida, Hachaichi Aicha, Aissaoui Sihem, Chahed Mohamed Kouni
Abstract:
Introduction: In Tunisia, few studies have attempted to describe the demand for primary care in a standardized and systematic way. The purpose of this study is to describe the main reasons for the demand for care in primary health care, through a survey of the PHC centres of the Ariana governorate, and to identify evolutionary trends compared with the last 30 years, as reported by studies of the same type. Materials and methods: This is a cross-sectional descriptive study of consultants in the first line of the governorate of Ariana and their use of care, recorded during 2 days of the same week in May 2016 in each of these PHC centres. The same data collection sheet was used in all CSBs. Information was coded according to the International Classification of Primary Care (ICPC), and the data were entered and analyzed with the EPI Info 7 software. Results: Our study found that the most common ICPC chapters were respiratory (42%) and digestive (13.2%). In 1996 they were respiratory (43.5%) and circulatory (7.8%); in 2000, again respiratory (39.6%) and circulatory (10.9%). In 2002, respiratory (43%) and digestive (10.1%) motives were the most frequent. According to the ICPC, the most frequent pathologies in our study were acute angina (19%) and acute bronchitis and bronchiolitis (8%). In 1996 they were tonsillitis (21.6%) and acute bronchitis (7.2%); for Ben Abdelaziz in 2000, tonsillitis (14.5%) followed by acute bronchitis (8.3%). In 2002, acute angina (15.7%) and acute bronchitis and bronchiolitis (11.2%) were the most common. Conclusion: Acute angina and tonsillitis are the most common in all studies conducted in Tunisia.
Keywords: acute angina, classification of primary care, primary health care, tonsillitis, Tunisia
Procedia PDF Downloads 533
2027 Geospatial Techniques and VHR Imagery Use for Identification and Classification of Slums in Gujrat City, Pakistan
Authors: Muhammad Ameer Nawaz Akram
Abstract:
The 21st century has revealed that more people worldwide now live in urban settlements than in rural zones. The evolution of numerous cities in emerging and newly developed countries is accompanied by the rise of slums. The precise definition of a slum varies from country to country, but the universal consensus is that slums are dilapidated settlements facing severe poverty and lacking access to sanitation, water, electricity, decent living conditions, and land tenure. Slum settlements vary in unique patterns within and among countries and cities. The core objective of this study is the spatial identification and classification of slums in Gujrat city, Pakistan, from very high-resolution GeoEye-1 (0.41 m) satellite imagery. Slums were first identified using GPS for sample-site identification and ground-truthing; through this process, 425 slums were identified. Object-oriented analysis (OOA) was then applied to classify slums on the digital image. Spatial analysis software, e.g., ArcGIS 10.3, Erdas Imagine 9.3, and ENVI 5.1, was used for processing the data and performing the analysis. Results show that OOA provides up to 90% accuracy for the identification of slums. The Jalal Cheema and Allah Ho colonies are severely affected by slum settlements, and the rate of criminal activity is higher there than in other areas. Slums in urban areas are increasing with the passage of time and will become a hazardous problem in the coming future, so executive bodies now need to make effective policies and move towards the amelioration of the city.
Keywords: slums, GPS, satellite imagery, object oriented analysis, zonal change detection
Procedia PDF Downloads 136
2026 Computer Aided Diagnosis Bringing Changes in Breast Cancer Detection
Authors: Devadrita Dey Sarkar
Abstract:
Regardless of the many technological advances of the past decade, increased training and experience, and the obvious benefits of uniform standards, the false-negative rate in screening mammography remains unacceptably high. This abstract presents a computer-aided neural network classification of regions of suspicion (ROS) on digitized mammograms, employing features extracted by a new technique based on independent component analysis. CAD is a concept established by taking into account the roles of physicians and computers equally, whereas automated computer diagnosis is a concept based on computer algorithms only. With CAD, the performance of computers does not have to be comparable to or better than that of physicians, but it needs to be complementary. In fact, a large number of CAD systems have been employed to assist physicians in the early detection of breast cancers on mammograms. A CAD scheme that makes use of lateral breast images has the potential to improve the overall performance in the detection of breast lumps. Because breast lumps can be detected reliably by computer on lateral breast mammograms, radiologists' accuracy in detecting breast lumps would be improved by the use of CAD, and thus early diagnosis of breast cancer would become possible. In the future, many CAD schemes could be assembled as packages and implemented as part of a PACS. For example, the package for breast CAD may include the computerized detection of breast nodules as well as the computerized classification of benign and malignant nodules. To assist in differential diagnosis, it would be possible to search for and retrieve images (or lesions) with these CAD systems, which would be a reliable and useful method for quantifying the similarity of a pair of images for visual comparison by radiologists.
Keywords: CAD (computer-aided diagnosis), lesions, neural network, ROS (region of suspicion)
Procedia PDF Downloads 457
2025 Dynamic Modelling and Assessment for Urban Growth and Transport in Riyadh City, Saudi Arabia
Authors: Majid Aldalbahi
Abstract:
In 2009, over 3.4 billion people in the world resided in urban areas as a result of rapid urban growth, a figure estimated to increase to 6.5 billion by 2050. This urban growth phenomenon has raised challenges for many countries in both the developing and the developed world. Urban growth is a complicated process involving spatiotemporal changes in all socio-economic and physical components at different scales. The socio-economic components of urban growth relate to urban population growth and economic growth, while the physical components relate to spatial expansion, land cover change, and land use change, which are the focus of this research. The interactions between these components are complex and non-linear. Several factors and forces drive these interactions, including transportation and communication, internal and international migration, high natural growth rates of urban populations, and public policies. Urban growth has positive and negative consequences. The positive effects relate to planned and orderly growth, while the negative effects relate to unplanned and scattered growth, called sprawl. Although urban growth is considered necessary for sustainable urbanization, uncontrolled and rapid growth causes various problems, including the consumption of precious rural land resources at the urban fringe, landscape alteration, traffic congestion, infrastructure pressure, and neighborhood conflicts. Traditional urban planning approaches in fast-growing cities cannot accommodate the negative consequences of rapid urban growth. Microsimulation programs and modelling techniques are effective means of providing new urban development, management, and planning methods and approaches. This paper aims to use these techniques to understand and analyse these complex interactions for the case study of Riyadh, a fast-growing city in Saudi Arabia.
Keywords: policy implications, urban planning, traffic congestion, urban growth, Saudi Arabia, Riyadh
Procedia PDF Downloads 485