Search results for: multi criteria inventory classification models
14065 The Design of a Vehicle Traffic Flow Prediction Model for a Gauteng Freeway Based on an Ensemble of Multi-Layer Perceptron
Authors: Tebogo Emma Makaba, Barnabas Ndlovu Gatsheni
Abstract:
The cities of Johannesburg and Pretoria, both located in the Gauteng province, are separated by a distance of 58 km. The traffic queues on the Ben Schoeman freeway which connects these two cities can stretch for almost 1.5 km. Vehicle traffic congestion impacts negatively on business and on commuters’ quality of life. The goal of this paper is to identify variables that influence the flow of traffic and to design a vehicle traffic prediction model, which will predict the traffic flow pattern in advance. The model will enable motorists to make appropriate travel decisions ahead of time. The data used was collected by Mikro’s Traffic Monitoring (MTM). A Multi-Layer Perceptron (MLP) was used individually to construct the model, and the MLP was also combined with the Bagging ensemble method to train the data. The cross-validation method was used for evaluating the models. The results obtained from the techniques were compared using predictive performance and prediction costs. The cost was computed using a combination of the loss matrix and the confusion matrix. The predictive models designed show that the status of the traffic flow on the freeway can be predicted using the following parameters: travel time, average speed, traffic volume and day of month. The implication of this work is that commuters will be able to spend less time travelling on the route and more time with their families. The logistics industry will save more than twice what it is currently spending.
Keywords: bagging ensemble methods, confusion matrix, multi-layer perceptron, vehicle traffic flow
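As a minimal sketch of the approach described here — an MLP base learner wrapped in a bagging ensemble, evaluated by cross-validation, with a cost computed from the confusion matrix and a loss matrix — the following assumes scikit-learn; the synthetic data, loss values and hyperparameters are illustrative assumptions, not those of the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_predict
from sklearn.neural_network import MLPClassifier

# Stand-in data: rows = observations (travel time, average speed,
# traffic volume, day of month), labels = congested / free-flowing.
X, y = make_classification(n_samples=500, n_features=4, random_state=0)

# MLP base learner wrapped in a bagging ensemble.
model = BaggingClassifier(MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000),
                          n_estimators=10, random_state=0)

# Cross-validated predictions, mirroring the paper's evaluation method.
y_pred = cross_val_predict(model, X, y, cv=10)
cm = confusion_matrix(y, y_pred)

# Prediction cost: element-wise product of the confusion and loss matrices.
loss = np.array([[0.0, 1.0],    # assumed cost of a false congestion alarm
                 [5.0, 0.0]])   # assumed cost of missing real congestion
print("confusion matrix:\n", cm)
print("total prediction cost:", float((cm * loss).sum()))
```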
Procedia PDF Downloads 344
14064 Monitoring of Cannabis Cultivation with High-Resolution Images
Authors: Levent Basayigit, Sinan Demir, Burhan Kara, Yusuf Ucar
Abstract:
Cannabis is mostly used for drug production. In some countries, an excessive amount of illegal cannabis is cultivated and sold. Most illegal cannabis cultivation occurs on lands far from settlements. In farmlands, it is cultivated among other crops, surrounded by tall plants like corn and sunflower, or grown with tall crops as a mixed culture. The common method for determining illegal cultivation areas is to investigate information obtained from people, which is not sufficient for remote areas. For this reason, more effective methods are needed. Remote sensing is one of the most important technologies for monitoring plant growth on the land. The aim of this study was to monitor cannabis cultivation areas using satellite imagery and to develop an applicable method for monitoring cannabis cultivation. For this purpose, cannabis was grown in plots, either alone or surrounded by corn and sunflower. The morphological characteristics of cannabis were recorded twice a month during the vegetation period. A spectral signature library was created with a spectroradiometer. The parcels were monitored with high-resolution satellite imagery, and through processing of the imagery, the cannabis cultivation areas were classified. To separate the cannabis plots from the other plants, the multiresolution segmentation algorithm was found to be the most successful for classification. The WorldView Improved Vegetative Index (WV-VI) classification was the most accurate method for monitoring plant density. As a result, an object-based classification method and vegetation indices were sufficient for monitoring cannabis cultivation in multi-temporal WorldView images.
Keywords: Cannabis, drug, remote sensing, object-based classification
Procedia PDF Downloads 272
14063 4D Modelling of Low Visibility Underwater Archaeological Excavations Using Multi-Source Photogrammetry in the Bulgarian Black Sea
Authors: Rodrigo Pacheco-Ruiz, Jonathan Adams, Felix Pedrotti
Abstract:
This paper introduces the applicability of underwater photogrammetric survey within challenging conditions as the main tool to enhance and enrich the process of documenting archaeological excavation through the creation of 4D models. Photogrammetry has been attempted on underwater archaeological sites since at least the 1970s, and today the production of traditional 3D models is becoming common practice within the discipline. Underwater photogrammetry is more often implemented to record exposed underwater archaeological remains and less so as a dynamic interpretative tool. Therefore, it tends to be applied in bright environments when underwater visibility is > 1 m, reducing its implementation on most submerged archaeological sites in more turbid conditions. Recent years have seen significant development of better digital photographic sensors and improvements in optical technology, ideal for darker environments. Such developments, in tandem with powerful processing and computing systems, have allowed underwater photogrammetry to be used by this research as a standard recording and interpretative tool. Using multi-source photogrammetry (five GoPro Hero5 Black cameras), this paper presents the accumulation of daily (4D) underwater surveys carried out at the Early Bronze Age (3,300 BC) to Late Ottoman (17th Century AD) archaeological site of Ropotamo in the Bulgarian Black Sea under challenging conditions (< 0.5 m visibility). It proves that underwater photogrammetry can and should be used as one of the main recording methods even in low light and poor underwater conditions as a way to better understand the complexity of the underwater archaeological record.
Keywords: 4D modelling, Black Sea Maritime Archaeology Project, multi-source photogrammetry, low visibility underwater survey
Procedia PDF Downloads 238
14062 Performance Evaluation of Contemporary Classifiers for Automatic Detection of Epileptic EEG
Authors: K. E. Ch. Vidyasagar, M. Moghavvemi, T. S. S. T. Prabhat
Abstract:
Epilepsy is a global problem, and with seizures eluding even the smartest of diagnoses, automatic detection using the electroencephalogram (EEG) would have a huge impact on diagnosis of the disorder. Among a multitude of methods for automatic epilepsy detection, the best method should be identified on the basis of classification accuracy; this paper evaluates and compares the candidates. Accuracy depends on the classifier, and thus this paper discusses classifiers like quadratic discriminant analysis (QDA), classification and regression tree (CART), support vector machine (SVM), naive Bayes classifier (NBC), linear discriminant analysis (LDA), K-nearest neighbor (KNN) and artificial neural networks (ANN). Results show that the ANN is the most accurate of all the above classifiers, with 97.7% accuracy, 97.25% specificity and 98.28% sensitivity. It is followed closely by the SVM, with a 1% variation in results. These results should help researchers choose the best classifier for detection of epilepsy.
Keywords: classification, seizure, KNN, SVM, LDA, ANN, epilepsy
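The comparison performed in this paper follows a standard pattern: train each candidate classifier on the same EEG feature set and rank them by cross-validated accuracy (specificity and sensitivity follow the same recipe via the confusion matrix). A minimal scikit-learn sketch on stand-in data, with assumed default hyperparameters:

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Stand-in EEG feature matrix (rows = signal segments, columns = features).
X, y = make_classification(n_samples=400, n_features=20, random_state=0)

classifiers = {
    "QDA":  QuadraticDiscriminantAnalysis(),
    "CART": DecisionTreeClassifier(random_state=0),
    "SVM":  SVC(),
    "NBC":  GaussianNB(),
    "LDA":  LinearDiscriminantAnalysis(),
    "KNN":  KNeighborsClassifier(),
    "ANN":  MLPClassifier(max_iter=2000, random_state=0),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=10)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```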
Procedia PDF Downloads 524
14061 3D Receiver Operator Characteristic Histogram
Authors: Xiaoli Zhang, Xiongfei Li, Yuncong Feng
Abstract:
ROC curves, a widely used evaluation tool in the machine learning field, express the tradeoff between the true positive rate and the false positive rate. However, they are blamed for ignoring some vital information in the evaluation process, such as the amount of information about the target that each instance carries and the predicted score given by each classification model to each instance. Hence, in this paper, a new classification performance evaluation method is proposed by extending Receiver Operator Characteristic (ROC) curves to 3D space, which is denoted as the 3D ROC Histogram. In the histogram, the …
Keywords: classification, performance evaluation, receiver operating characteristic histogram, hardness prediction
Procedia PDF Downloads 315
14060 PhilSHORE: Development of a WebGIS-Based Marine Spatial Planning Tool for Tidal Current Energy Resource Assessment and Site Suitability Analysis
Authors: Ma. Rosario Concepcion O. Ang, Luis Caezar Ian K. Panganiban, Charmyne B. Mamador, Oliver Dan G. De Luna, Michael D. Bausas, Joselito P. Cruz
Abstract:
PhilSHORE is a multi-site, multi-device and multi-criteria decision support tool designed to support the development of tidal current energy in the Philippines. Its platform is based on Geographic Information Systems (GIS), which allow for the collection, storage, processing, analysis and display of geospatial data. Combining GIS tools with open source web development applications, PhilSHORE becomes a webGIS-based marine spatial planning tool. To date, PhilSHORE displays output maps and graphs of power and energy density, site suitability and site-device analysis. It gives stakeholders and the public easy access to the results of tidal current energy resource assessments and site suitability analyses. Results of the initial development show that PhilSHORE is a promising decision support tool for ORE project developments.
Keywords: GIS, site suitability analysis, tidal current energy resource assessment, webGIS
Procedia PDF Downloads 527
14059 A Combination of Independent Component Analysis, Relative Wavelet Energy and Support Vector Machine for Mental State Classification
Authors: Nguyen The Hoang Anh, Tran Huy Hoang, Vu Tat Thang, T. T. Quyen Bui
Abstract:
Mental state classification is an important step in realizing a control system based on electroencephalography (EEG) signals, which could benefit many paralyzed people, including those with locked-in syndrome or Amyotrophic Lateral Sclerosis. Considering that EEG signals are nonstationary and often contaminated by various types of artifacts, classifying thoughts into correct mental states is not a trivial problem. In this work, our contribution is that we present and realize a novel model which integrates different techniques — independent component analysis (ICA), relative wavelet energy, and support vector machine (SVM) — for this task. We applied our model to classify thoughts in two types of experiments, with either two or three mental states. The experimental results show that the presented model outperforms other models using Artificial Neural Networks, K-Nearest Neighbors, etc.
Keywords: EEG, ICA, SVM, wavelet
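A pipeline of the shape described — ICA to unmix artifact-laden channels, relative wavelet energy as the feature, an SVM as the classifier — can be sketched as below with scikit-learn and PyWavelets. The wavelet, decomposition level, kernel and synthetic data are assumptions, not the paper's exact settings.

```python
import numpy as np
import pywt
from sklearn.decomposition import FastICA
from sklearn.svm import SVC

def relative_wavelet_energy(signal, wavelet="db4", level=4):
    """Energy of each decomposition level divided by the total energy."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    return energies / energies.sum()

# Stand-in EEG: 200 trials x 8 channels x 256 samples, two mental states.
rng = np.random.default_rng(0)
trials = rng.standard_normal((200, 8, 256))
labels = rng.integers(0, 2, 200)

features = []
for trial in trials:
    # ICA to unmix channels (artifact components would be rejected here).
    sources = FastICA(n_components=8, random_state=0).fit_transform(trial.T).T
    # Relative wavelet energy per independent component, concatenated.
    features.append(np.concatenate([relative_wavelet_energy(s) for s in sources]))

clf = SVC(kernel="rbf").fit(features[:150], labels[:150])
print("held-out accuracy:", clf.score(features[150:], labels[150:]))
```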
Procedia PDF Downloads 384
14058 User Acceptance Criteria for Digital Libraries
Authors: Yu-Ming Wang, Jia-Hong Jian
Abstract:
The Internet and digital publication technologies have brought dramatic impacts on how people collect, organize, disseminate, access, store, and use information. More and more governments, schools, and organizations have spent huge funds to develop digital libraries. A digital library can be regarded as a web extension of a traditional physical library. People can search diverse publications, find the position of knowledge resources, and borrow or buy publications through digital libraries. People can gain knowledge, and students or employees can finish their reports, by using digital libraries. Since considerable funds and energy have been invested in implementing digital libraries, it is important to understand the evaluative criteria from the users’ viewpoint in order to enhance user acceptance. This study develops a list of user acceptance criteria for digital libraries. An initial criteria list was developed based on some previously validated instruments related to digital libraries. Data were collected from user experiences of digital libraries. Exploratory factor analysis and confirmatory factor analysis were adopted to purify the criteria list, and the reliabilities and validities were tested. After validating the criteria list, a user survey was conducted to collect the comparative importance of the criteria. The analytic hierarchy process (AHP) method was utilized to derive the importance of each criterion. The results of this study contribute to an understanding of the criteria, and their relative importance, that users apply when evaluating digital libraries.
Keywords: digital library, user acceptance, analytic hierarchy process, factor analysis
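The AHP step used at the end — deriving criterion weights from pairwise comparisons — works as in the following sketch. The comparison matrix is invented for illustration; only the method (principal eigenvector plus a consistency check) mirrors the paper.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three criteria
# (A[i, j] = how much more important criterion i is than criterion j).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# The principal eigenvector gives the priority (weight) vector.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio CR = CI / RI, with RI = 0.58 for a 3x3 matrix (Saaty).
lam_max = eigvals.real[k]
ci = (lam_max - len(A)) / (len(A) - 1)
cr = ci / 0.58
print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))
```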
Procedia PDF Downloads 254
14057 Combined Odd Pair Autoregressive Coefficients for Epileptic EEG Signals Classification by Radial Basis Function Neural Network
Authors: Boukari Nassim
Abstract:
This paper describes the use of odd pair autoregressive coefficients (Yule-Walker and Burg) for the feature extraction of electroencephalogram (EEG) signals. For classification, the radial basis function neural network (RBFNN) is employed. The RBFNN is described by its architecture and its characteristics; the RBF is defined by the spread, which is modified to improve the classification results. Five types of EEG signals are used in this work: Set A and Set B for normal signals, Set C and Set D for interictal signals, and Set E for the ictal signal (from the University of Bonn database). At the output, eight two-class problems are considered (A-C, A-D, A-E, B-C, B-D, B-E, C-E, D-E); the best accuracy, calculated at 99%, is obtained for the combined odd pair autoregressive coefficients. Our method is very effective for the diagnosis of epileptic EEG signals.
Keywords: epilepsy, EEG signals classification, combined odd pair autoregressive coefficients, radial basis function neural network
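A rough sketch of this pipeline, under stated assumptions: odd-indexed Yule-Walker and Burg AR coefficients extracted with statsmodels, fed to a minimal RBF network whose spread is the tunable parameter. The data is synthetic, the RBF net is a generic least-squares implementation, and taking every second coefficient is one plausible reading of "odd pair".

```python
import numpy as np
from sklearn.cluster import KMeans
from statsmodels.regression.linear_model import burg, yule_walker

def odd_ar_features(signal, order=8):
    """Odd-indexed Yule-Walker and Burg AR coefficients, concatenated."""
    rho_yw, _ = yule_walker(signal, order=order)
    rho_bg, _ = burg(signal, order=order)
    return np.concatenate([rho_yw[::2], rho_bg[::2]])

class RBFNet:
    """Minimal RBF network: K-means centers, Gaussian hidden layer,
    least-squares output weights. The spread is the tunable parameter."""
    def __init__(self, n_centers=10, spread=1.0):
        self.n_centers, self.spread = n_centers, spread
    def _hidden(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centers[None], axis=2)
        return np.exp(-(d ** 2) / (2 * self.spread ** 2))
    def fit(self, X, y):
        self.centers = KMeans(self.n_centers, n_init=10,
                              random_state=0).fit(X).cluster_centers_
        self.w, *_ = np.linalg.lstsq(self._hidden(X), y, rcond=None)
        return self
    def predict(self, X):
        return (self._hidden(X) @ self.w > 0.5).astype(int)

rng = np.random.default_rng(0)
X = np.array([odd_ar_features(s) for s in rng.standard_normal((100, 512))])
y = rng.integers(0, 2, 100)   # stand-in labels: 0 = normal set, 1 = ictal set
model = RBFNet(spread=0.5).fit(X[:80], y[:80])
print("accuracy:", (model.predict(X[80:]) == y[80:]).mean())
```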
Procedia PDF Downloads 346
14056 Time Series Regression with Meta-Clusters
Authors: Monika Chuchro
Abstract:
This paper presents a preliminary attempt to apply classification of time series using meta-clusters in order to improve the quality of regression models. Here, clustering was performed as a method to obtain subgroups of time series data with a normal distribution from wastewater treatment plant inflow data, which is composed of several groups differing by mean value. Two simple algorithms, K-means and EM, were chosen as clustering methods, and the Rand index was used to measure similarity. After simple meta-clustering, a regression model was fitted for each subgroup; the final model was a sum of the subgroup models. The quality of the obtained model was compared with that of a regression model built on the same explanatory variables but with no clustering of the data. Results were compared by the coefficient of determination (R2), the mean absolute percentage error (MAPE) measure of prediction accuracy, and comparison on a linear chart. Preliminary results allow us to foresee the potential of the presented technique.
Keywords: clustering, data analysis, data mining, predictive models
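The two-step scheme — cluster the data into subgroups, then fit one regression per subgroup and combine them into the final model — can be illustrated as below. Clustering here is done on the response values of synthetic two-regime data; this is a simplified stand-in for the paper's K-means/EM meta-clustering, not its exact procedure.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_percentage_error

# Stand-in inflow series: two regimes differing by mean value.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (300, 3))                       # explanatory variables
regime = rng.integers(0, 2, 300)
y = 10 + 5 * regime + X @ np.array([2.0, -1.0, 0.5]) + rng.normal(0, 0.3, 300)

# Step 1: cluster observations into subgroups.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    y.reshape(-1, 1))

# Step 2: fit one regression model per subgroup; the final model is their
# union (each observation is predicted by its own subgroup's model).
pred = np.empty_like(y)
for c in np.unique(labels):
    m = labels == c
    pred[m] = LinearRegression().fit(X[m], y[m]).predict(X[m])

baseline = LinearRegression().fit(X, y).predict(X)
print("MAPE with clustering:   ", mean_absolute_percentage_error(y, pred))
print("MAPE without clustering:", mean_absolute_percentage_error(y, baseline))
```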
Procedia PDF Downloads 466
14055 An Optimization Modelling to Evaluate Flights Scheduling at Tourist Airports
Authors: Dimitrios J. Dimitriou
Abstract:
Airports serving a tourist destination are an essential counterpart of the tourist demand supply chain, and their productivity is related to the region’s attractiveness and is enhanced by the air transport business. In this paper, the evaluation framework for the scheduled flights between two tourist airports is taken into consideration. By adopting a systemic approach, the arrivals at an airport whose connectivity heavily depends on the departures from another major airport are reviewed. The methodological framework, based on inventory control theory, and the numerical example promote the use of the modelling formulation. The results would be essential for comparison and for application to other similar cases.
Keywords: airport connectivity, inventory control, optimization, optimum allocation
Procedia PDF Downloads 335
14054 Accuracy Analysis of the American Society of Anesthesiologists Classification Using ChatGPT
Authors: Jae Ni Jang, Young Uk Kim
Abstract:
Background: Chat Generative Pre-training Transformer-3 (ChatGPT; San Francisco, California, Open Artificial Intelligence) is an artificial intelligence chatbot based on a large language model designed to generate human-like text. As the usage of ChatGPT is increasing among less knowledgeable patients, medical students, and anesthesia and pain medicine residents or trainees, we aimed to evaluate the accuracy of ChatGPT-3 responses to questions about the American Society of Anesthesiologists (ASA) classification based on patients’ underlying diseases and to assess the quality of the generated responses. Methods: A total of 47 questions were submitted to ChatGPT using textual prompts. The questions were designed for ChatGPT-3 to provide answers regarding ASA classification in response to common underlying diseases frequently observed in adult patients. In addition, we created 18 questions regarding the ASA classification for pediatric patients and pregnant women. The accuracy of ChatGPT’s responses was evaluated by cross-referencing with Miller’s Anesthesia, Morgan & Mikhail’s Clinical Anesthesiology, and the American Society of Anesthesiologists’ ASA Physical Status Classification System (2020). Results: Out of the 47 questions pertaining to adults, ChatGPT-3 provided correct answers for only 23, resulting in an accuracy rate of 48.9%. Furthermore, the responses provided by ChatGPT-3 regarding children and pregnant women were mostly inaccurate, as indicated by a 28% accuracy rate (5 out of 18). Conclusions: ChatGPT provided correct responses to questions relevant to the daily clinical routine of anesthesiologists in approximately half of the cases, while the remaining responses contained errors. Therefore, caution is advised when using ChatGPT to retrieve anesthesia-related information. Although ChatGPT may not yet be suitable for clinical settings, we anticipate significant improvements in ChatGPT and other large language models in the near future. Regular assessments of ChatGPT's ASA classification accuracy are essential due to the evolving nature of ChatGPT as an artificial intelligence entity. This is especially important because ChatGPT has a clinically unacceptable rate of error and hallucination, particularly for pediatric patients and pregnant women. The methodology established in this study may be used to continue evaluating ChatGPT.
Keywords: American Society of Anesthesiologists, artificial intelligence, Chat Generative Pre-training Transformer-3, ChatGPT
Procedia PDF Downloads 50
14053 Use of Segmentation and Color Adjustment for Skin Tone Classification in Dermatological Images
Authors: Fernando Duarte
Abstract:
This work aims to evaluate the use of classical image processing methodologies for skin tone classification in dermatological images. Skin tone is an important attribute when considering several factors in skin cancer diagnosis. Currently, there is a lack of clear methodologies for classifying skin tone based only on the dermatological image. In this work, a recently released dataset with labels for skin tone was used as the reference for evaluating classical methodologies for segmentation and color space adjustment for skin tone classification in dermatological images. It was noticed that even though the classical methodologies can work well for segmentation and color adjustment, classifying skin tone without proper control of the acquisition of the sample images ended up being very unreliable.
Keywords: segmentation, classification, color space, skin tone, Fitzpatrick
Procedia PDF Downloads 37
14052 Deep Learning Approach to Trademark Design Code Identification
Authors: Girish J. Showkatramani, Arthi M. Krishna, Sashi Nareddi, Naresh Nula, Aaron Pepe, Glen Brown, Greg Gabel, Chris Doninger
Abstract:
Trademark examination and approval is a complex process that involves analysis and review of the design components of marks, such as the visual representation, as well as the textual data associated with marks, such as the mark’s description. Currently, the process of identifying marks with similar visual representations is done manually at the United States Patent and Trademark Office (USPTO) and takes a considerable amount of time. Moreover, the accuracy of these searches depends heavily on the experts determining the trademark design codes used to catalog the visual design elements in the mark. In this study, we explore several methods to automate trademark design code classification. Based on recent successes of convolutional neural networks in image classification, we have used several different convolutional neural networks such as Google’s Inception v3, Inception-ResNet-v2, and Xception. The study also looks into other techniques to augment the results from CNNs, such as using the Open Source Computer Vision Library (OpenCV) to pre-process the images. This paper reports the results of the various models trained on a year of annotated trademark images.
Keywords: trademark design code, convolutional neural networks, trademark image classification, trademark image search, Inception-ResNet-v2
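Transfer learning with one of the named backbones can be sketched as follows with Keras, assuming a frozen ImageNet-pretrained Inception v3 and a new classification head; the number of design-code classes and the multi-label sigmoid head are assumptions for illustration (a mark can carry several design codes).

```python
from tensorflow.keras import layers, models
from tensorflow.keras.applications import InceptionV3

NUM_DESIGN_CODES = 100   # assumed number of design-code classes

# Pre-trained Inception v3 backbone, frozen; only the new head is trained
# on the annotated trademark images.
base = InceptionV3(weights="imagenet", include_top=False,
                   input_shape=(299, 299, 3), pooling="avg")
base.trainable = False

# Multi-label head: one sigmoid output per design code (assumption).
model = models.Sequential([
    base,
    layers.Dense(256, activation="relu"),
    layers.Dense(NUM_DESIGN_CODES, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```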
Procedia PDF Downloads 233
14051 Study of the Difference Between the Mohr-Coulomb and the Barton-Bandis Joint Constitutive Models: A Case Study from an Iron Open Pit Mine, Canada
Authors: Abbas Kamalibandpey, Alain Beland, Joseph Mukendi Kabuya
Abstract:
Since a rock mass is a discontinuum medium, its behaviour is governed by discontinuities such as faults, joint sets, lithologic contacts, and bedding planes. Thus, rock slope stability analysis in jointed rock masses is largely dependent upon the discontinuities’ constitutive equations. This paper studies the difference between the Mohr-Coulomb (MC) and the Barton-Bandis (BB) joint constitutive numerical models for lithological contacts and joint sets. For the rock in these models, the generalized Hoek-Brown criterion has been considered. The joint roughness coefficient (JRC) and the joint wall compressive strength (JCS) are vital parameters in the BB model. The numerical models are applied to rock slope stability analysis in the Mont-Wright (MW) mine. The Mont-Wright mine is owned and operated by ArcelorMittal Mining Canada (AMMC), one of the largest iron-ore open pit operations in Canada. In this regard, one of the high walls of the mine was selected to undergo slope stability analysis with the RS2D software (finite element method). Three piezometers have been installed in this zone to record pore water pressure, and the zone is monitored by radar. In this zone, the AMP-IF and QRMS-IF contacts and very persistent and altered joint sets in IF control the rock slope behaviour. The height of the slope is more than 250 m, and the slope consists of different lithologies such as AMP, IF, GN, QRMS, and QR. To apply the BB model, the joint sets and geological contacts were scanned by Maptek, and their JRC was calculated by different methods. The numerical studies reveal that the JRC of the geological contacts, AMP-IF and QRMS-IF, and of the joint sets in IF had a significant influence on the safety factor. After evaluating the results of the rock slope stability analysis and the radar data, the BB constitutive equation for discontinuities showed results acceptably close to the real conditions in the mine. It should be noted that the difference in safety factors between the MC and BB joint constitutive models in some cases is more than 30%.
Keywords: Barton-Bandis criterion, Hoek-Brown and Mohr-Coulomb criteria, open pit, slope stability
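The difference between the two joint models comes down to their shear strength equations: Mohr-Coulomb is linear, tau = c + sigma_n * tan(phi), while Barton-Bandis makes the friction angle stress-dependent through JRC and JCS, tau = sigma_n * tan(phi_r + JRC * log10(JCS / sigma_n)). The sketch below compares the two over a range of normal stresses; all parameter values are illustrative assumptions, not the Mont-Wright data.

```python
import numpy as np

def mohr_coulomb(sigma_n, c=0.1, phi_deg=35.0):
    """Linear MC shear strength: tau = c + sigma_n * tan(phi)."""
    return c + sigma_n * np.tan(np.radians(phi_deg))

def barton_bandis(sigma_n, phi_r_deg=28.0, jrc=8.0, jcs=50.0):
    """Nonlinear BB shear strength:
    tau = sigma_n * tan(phi_r + JRC * log10(JCS / sigma_n))."""
    angle = phi_r_deg + jrc * np.log10(jcs / sigma_n)
    return sigma_n * np.tan(np.radians(angle))

# Compare over a range of normal stresses (MPa); assumed values only.
sigma_n = np.linspace(0.1, 5.0, 6)
for s, mc, bb in zip(sigma_n, mohr_coulomb(sigma_n), barton_bandis(sigma_n)):
    print(f"sigma_n={s:4.2f}  tau_MC={mc:5.2f}  tau_BB={bb:5.2f}  "
          f"diff={100 * (mc - bb) / mc:5.1f}%")
```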
Procedia PDF Downloads 109
14050 Credit Risk Evaluation Using Genetic Programming
Authors: Ines Gasmi, Salima Smiti, Makram Soui, Khaled Ghedira
Abstract:
Credit risk is considered one of the most important issues for financial institutions and provokes great losses for banks. To this end, numerous methods for credit risk evaluation have been proposed. Many evaluation methods are black box models that cannot adequately reveal information hidden in the data. However, several works have focused on building transparent rule-based models. For credit risk assessment, the generated rules must be not only highly accurate but also highly interpretable. In this paper, we aim to build a credit risk evaluation model that is both accurate and transparent and that proposes a set of classification rules. In fact, we treat credit risk evaluation as an optimization problem solved by a genetic programming (GP) algorithm, where the goal is to maximize the accuracy of the generated rules. We evaluate our proposed approach on the German and Australian credit datasets. We compared our findings with some existing works; the results show that the proposed GP outperforms the other models.
Keywords: credit risk assessment, rule generation, genetic programming, feature selection
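Evolving classification rules with genetic programming, with accuracy as the fitness to maximize, can be sketched with the third-party gplearn library as below. This is an assumed setup on synthetic data, not the paper's implementation; gplearn's SymbolicClassifier evolves a symbolic expression whose printed form serves as the transparent rule.

```python
from gplearn.genetic import SymbolicClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Stand-in credit data (the paper uses the German/Australian datasets).
X, y = make_classification(n_samples=600, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# GP evolves symbolic expressions; the fitness rewards accuracy, while
# the parsimony coefficient penalizes bloated, hard-to-read rules.
gp = SymbolicClassifier(population_size=500, generations=20,
                        parsimony_coefficient=0.01, random_state=0)
gp.fit(X_tr, y_tr)

print("evolved rule:", gp._program)   # interpretable symbolic expression
print("test accuracy:", gp.score(X_te, y_te))
```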
Procedia PDF Downloads 355
14049 Hate Speech Detection Using Deep Learning and Machine Learning Models
Authors: Nabil Shawkat, Jamil Saquer
Abstract:
Social media has accelerated our ability to engage with others and eliminated many communication barriers. On the other hand, the widespread use of social media has resulted in an increase in online hate speech. This has drastic impacts on vulnerable individuals and societies. Therefore, it is critical to detect hate speech to prevent innocent users and vulnerable communities from becoming victims of hate speech. We investigate the performance of different deep learning and machine learning algorithms on three different datasets. Our results show that the BERT model gives the best performance among all the models by achieving an F1-score of 90.6% on one of the datasets and F1-scores of 89.7% and 88.2% on the other two datasets.
Keywords: hate speech, machine learning, deep learning, abusive words, social media, text classification
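Benchmarking models on a hate-speech dataset with the F1-score can be illustrated with a classical machine learning baseline like the one below (BERT fine-tuning follows the same train/evaluate pattern). The toy corpus and pipeline are stand-ins, not the study's datasets or exact models.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Tiny stand-in corpus; real work would load the three labeled datasets.
texts = ["I hate you and your kind", "have a wonderful day",
         "you people are disgusting", "great game last night",
         "get out of this country", "lovely weather today"] * 50
labels = [1, 0, 1, 0, 1, 0] * 50   # 1 = hate speech, 0 = benign

X_tr, X_te, y_tr, y_te = train_test_split(texts, labels, random_state=0)

# TF-IDF features + logistic regression: a standard ML baseline that
# BERT-style models are compared against.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(X_tr, y_tr)
print("F1-score:", f1_score(y_te, clf.predict(X_te)))
```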
Procedia PDF Downloads 139
14048 Periurban Landscape as an Opportunity Field to Solve Ecological Urban Conflicts
Authors: Cristina Galiana Carballo, Ibon Doval Martínez
Abstract:
Urban boundaries often result in a controversial limit between countryside and city in Europe. This territory is normally defined by very limited land uses and an abundance of open space. The dimension and dynamics of peri-urbanization in recent decades have increased this land stock, which has had an impact on several factors: economic costs (maintenance, transport), ecological disturbances of the territory, and changes in inhabitants’ behaviour. In an increasingly urbanised world with a growing urban population, cities also face challenges such as climate change. In this context, new near-future corrective trends, including circular economies for local food supply or decentralised waste management, became key strategies towards more sustainable urban models. Those new solutions need to be planned and implemented considering the potential conflict with current land uses. The city of Vitoria-Gasteiz (Basque Country, Spain) has tripled land consumption per inhabitant in 10 years, resulting in a vast extension of low-density urban fabric confronting rural land and threatening agricultural uses, landscape and urban sustainability. Urban planning allows managing and optimally allocating uses based on soil vocation and socio-ecosystem needs, while peri-urban space arises as an opportunity for developing different uses which fit neither within the compact city nor in open agricultural lands, such as medium-size agrocomposting systems or biomass plants. Therefore, a qualitative multi-criteria methodology has been developed for the city of Vitoria-Gasteiz to assess the spatial definition of peri-urban land. Climate change and the circular economy were identified as frameworks within which to determine future land use, soil vocation and urban planning requirements, which eventually become estimations of the required local food and renewable energy supply, along with the implementation of alternative waste management systems. On this basis, an urban planning proposal has been developed that overcomes the urban/non-urban dichotomy in Vitoria-Gasteiz. The proposal aims to enhance the rural system and improve urban sustainability performance through the normative recognition of an agricultural peri-urban belt.
Keywords: landscape ecology, land-use management, periurban, urban planning
Procedia PDF Downloads 163
14047 Identification of Success Criteria for Measuring Large Infrastructure Projects Performance in Malaysia
Authors: M. A. N. Masrom, M. H. I. A. Rahim, G. K. Chen, S. Mohamed
Abstract:
Large infrastructure projects are a significant category in the development of the Malaysian construction industry. This type of project has been recognized as highly complex, with numerous construction risks, large costs, highly technical requirements and diverse resources. Moreover, the development of large infrastructure such as highways, railways, Mass Rapid Transit (MRT) and airports also requires large investments from the public and private sectors. To accomplish the development successfully, several challenges have to be determined prior to project commencement. To date, a comprehensive assessment of key success criteria, particularly for large infrastructure in a developing country such as Malaysia, is still not systematically defined and therefore needs further investigation. This paper aims to explore the potential success criteria that would be useful in gauging the overall performance of large infrastructure implementation, particularly in a developing country. Previous success criteria studies were used to develop a conceptual framework that is possibly suitable for measuring large infrastructure performance. The findings show that success criteria for infrastructure project implementation can be grouped according to several key elements, which appears significant for helping participants prioritize project challenges more systematically.
Keywords: success criteria, performance, large infrastructure, Malaysia
Procedia PDF Downloads 411
14046 Comparison of the Factor of Safety and Strength Reduction Factor Values from Slope Stability Analysis of a Large Open Pit
Authors: James Killian, Sarah Cox
Abstract:
The use of stability criteria within geotechnical engineering is the way the results of analyses are conveyed and sensitivities and risk assessments are performed. Historically, the primary stability criterion for slope design has been the Factor of Safety (FOS) coming from a limit equilibrium calculation. Increasingly, the value derived from Strength Reduction Factor (SRF) analysis is being used as the criterion for stability analysis. The purpose of this work was to study in detail the relationship between SRF values produced by a numerical modeling technique and the traditional FOS values produced by Limit Equilibrium Method (LEM) analyses. This study utilized a model of a 3000-foot-high slope with a 45-degree slope angle, assuming a perfectly plastic Mohr-Coulomb constitutive model with high cohesion and friction angle values typical of a large hard rock mine slope. A number of variables affecting the values of the SRF in a numerical analysis were tested, including zone size, in-situ stress, tensile strength, and dilation angle. This paper demonstrates that in most cases, SRF values are lower than the corresponding LEM FOS values. Modeled zone size has the greatest effect on the estimated SRF value, which can vary as much as 15% to the downside compared to the FOS. For consistency when using the SRF as a stability criterion, the authors suggest that numerical model zone sizes should not be smaller than about 1% of the overall slope height nor greater than 2%. Future work could include investigations of the effect of anisotropic strength assumptions or advanced constitutive models.
Keywords: FOS, SRF, LEM, comparison
Procedia PDF Downloads 312
14045 The Relevance of Environmental, Social, and Governance in Sustainable Supplier Selection
Authors: Christoph Koester
Abstract:
Supplier selection is one of the key issues in supply chain management, with a growing emphasis on sustainability driven by increasing stakeholder expectations and proactivity. In addition, new regulations, such as the German Supply Chain Act, have fostered the inclusion of sustainability criteria, including governance, in the selection process. In order to provide a systematic approach to selecting the most suitable sustainable suppliers, this study quantifies the importance of, and prioritizes, the relevant selection criteria across 17 German industries using the Fuzzy Analytical Hierarchy Process. Results show that economic criteria are still the most important in the selection decision, averaging a global weight of 51%. However, the environmental, social, and governance (ESG) criteria combined are, on average, almost equally important, with global weights of 22%, 16%, and 11%, respectively. While the type of industry influences criteria weights, other factors, such as the type of purchasing or demographic factors, appear to have little impact.
Keywords: ESG, fuzzy analytical hierarchy process, sustainable supplier selection, sustainability
Procedia PDF Downloads 87
14044 Body Image Dissatisfaction and Personal Behavioral Control in Obese Patients Who Are Attending Treatment
Authors: Mariela Gonzalez, Zoraide Lugli, Eleonora Vivas, Rosana Guzmán
Abstract:
The objective was to determine the predictive capacity of perceived self-efficacy for weight control, locus of weight control, and weight self-management skills for body image dissatisfaction in obese people attending treatment. A cross-sectional study was conducted in the city of Maracay, Venezuela, with 243 obese patients attending treatment, 173 women and 70 men, with ages ranging between 18 and 57 years. The body mass index of the sample ranged between 29.39 and 44.14. The following instruments were used: the Body Shape Questionnaire (BSQ), the inventory of body weight self-regulation, the inventory of self-efficacy in the regulation of body weight, and the inventory of the locus of weight control. Calculating descriptive and central tendency statistics, correlation coefficients and multiple regressions, it was found that low perceived self-efficacy in weight control and a high external locus of control predict dissatisfaction with body image in obese patients attending treatment. The findings are a first approximation accounting for the importance of personal control variables in the study of the psychological distress of the overweight individual.
Keywords: dissatisfaction with body image, obese people, personal control, psychological variables
Procedia PDF Downloads 434
14043 Countering the Bullwhip Effect by Absorbing It Downstream in the Supply Chain
Authors: Geng Cui, Naoto Imura, Katsuhiro Nishinari, Takahiro Ezaki
Abstract:
The bullwhip effect, which refers to the amplification of demand variance as one moves up the supply chain, has been observed in various industries and extensively studied through analytic approaches. Existing methods to mitigate the bullwhip effect, such as decentralized demand information, vendor-managed inventory, and the Collaborative Planning, Forecasting, and Replenishment System, rely on the willingness and ability of supply chain participants to share their information. However, in practice, information sharing is often difficult to realize due to privacy concerns. The purpose of this study is to explore new ways to mitigate the bullwhip effect without the need for information sharing. This paper proposes a 'bullwhip absorption strategy' (BAS) to alleviate the bullwhip effect by absorbing it downstream in the supply chain. To achieve this, a two-stage supply chain system was employed, consisting of a single retailer and a single manufacturer. In each time period, the retailer receives an order generated according to an autoregressive process. Upon receiving the order, the retailer depletes the ordered amount, forecasts future demand based on past records, and places an order with the manufacturer using the order-up-to replenishment policy. The manufacturer follows a similar process. In essence, the mechanism of the model is similar to that of the beer game. The BAS is implemented at the retailer's level to counteract the bullwhip effect. This strategy requires the retailer to reduce the uncertainty in its orders, thereby absorbing the bullwhip effect downstream in the supply chain. The advantage of the BAS is that upstream participants can benefit from a reduced bullwhip effect. Although the retailer may incur additional costs, if the gain in the upstream segment can compensate for the retailer's loss, the entire supply chain will be better off. Two indicators, order variance and inventory variance, were used to quantify the bullwhip effect in relation to the strength of absorption. It was found that implementing the BAS at the retailer's level results in a reduction in both the retailer's and the manufacturer's order variances. However, when examining the impact on inventory variances, a trade-off relationship was observed. The manufacturer's inventory variance monotonically decreases with an increase in absorption strength, while the retailer's inventory variance does not always decrease as the absorption strength grows. This is especially true when the autoregression coefficient has a high value, causing the retailer's inventory variance to become a monotonically increasing function of the absorption strength. Finally, numerical simulations were conducted for verification, and the results were consistent with our theoretical analysis.
Keywords: bullwhip effect, supply chain management, inventory management, demand forecasting, order-to-up policy
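The mechanism described — autoregressive demand, forecasting from past records, and an order-up-to policy at the retailer — can be reproduced in a small simulation like the one below, which measures the variance amplification from demand to retailer orders. All parameter values are illustrative assumptions, not the paper's, and the lead time is simplified away.

```python
import numpy as np

rng = np.random.default_rng(0)
T, L = 5000, 2                    # periods, replenishment lead time (assumed)
rho, mu, sigma = 0.7, 20.0, 2.0   # AR(1) demand parameters (assumed)

# AR(1) customer demand: d_t = mu + rho * (d_{t-1} - mu) + noise.
d = np.empty(T)
d[0] = mu
for t in range(1, T):
    d[t] = mu + rho * (d[t - 1] - mu) + rng.normal(0, sigma)

# Retailer: moving-average forecast + order-up-to replenishment policy.
window, inventory = 10, 100.0
orders = []
for t in range(window, T):
    forecast = d[t - window:t].mean()
    target = forecast * (L + 1)            # order-up-to level
    inventory -= d[t]
    order = max(target - inventory, 0.0)   # no negative orders
    inventory += order                     # lead time ignored for brevity
    orders.append(order)

orders = np.array(orders)
print("demand variance:", d.var().round(2))
print("order variance: ", orders.var().round(2))
print("bullwhip ratio: ", (orders.var() / d.var()).round(2))
```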
Procedia PDF Downloads 76
14042 Geoinformation Technology of Agricultural Monitoring Using Multi-Temporal Satellite Imagery
Authors: Olena Kavats, Dmitry Khramov, Kateryna Sergieieva, Vladimir Vasyliev, Iurii Kavats
Abstract:
Geoinformation technologies for space-based agromonitoring are a means of operational decision-making support for managing the agricultural sector of the economy. Existing technologies use satellite images in the optical range of the electromagnetic spectrum. Time series of optical images often contain gaps due to the presence of clouds and haze. A geoinformation technology has been created that makes it possible to fill gaps in time series of optical images (Sentinel-2, Landsat-8, PROBA-V, MODIS) with radar survey data (Sentinel-1) and to use information about the agrometeorological conditions of the growing season for individual monitoring years. The technology performs crop classification and mapping for the spring-summer (winter and spring crops) and autumn-winter (winter crops) periods of vegetation, monitors the dynamics of seasonal changes in crop state, and forecasts crop yield. Crop classification is based on supervised classification algorithms and takes into account the peculiarities of crop growth at different vegetation stages (dates of sowing, emergence, active vegetation, and harvesting) and agricultural land state characteristics (row spacing, seedling density, etc.). A catalog of samples of the main agricultural crops (Ukraine) was created, and crop spectral signatures were calculated with the preliminary removal of row spacing, cloud cover, and cloud shadows in order to construct time series of crop growth characteristics. The obtained data are used in grain crop growth tracking and in timely detection of growth trend deviations from reference samples of a given crop for a selected date. Statistical models of crop yield forecasting are created in the form of linear and nonlinear interconnections between crop yield indicators and crop state characteristics (temperature, precipitation, vegetation indices, etc.). Predicted values of grain crop yield are evaluated with an accuracy of up to 95%. The developed technology was used for monitoring agricultural areas in a number of regions of Great Britain and Ukraine using the EOS Crop Monitoring Platform (https://crop-monitoring.eos.com). The obtained results allow us to conclude that the joint use of Sentinel-1 and Sentinel-2 images improves the separation of winter crops (rapeseed, wheat, barley) in the early stages of vegetation (October-December). They also make it possible to successfully separate the soybean, corn, and sunflower sowing areas, which are quite similar in their spectral characteristics.
Keywords: geoinformation technology, crop classification, crop yield prediction, agricultural monitoring, EOS Crop Monitoring Platform
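Crop-state characteristics in such pipelines are typically vegetation indices computed by band arithmetic. The sketch below computes NDVI from red and near-infrared bands with numpy; the data is synthetic, and NDVI stands in for the various indices mentioned (the exact band combinations used by the technology are not given in the abstract).

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index, computed per pixel."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / (nir + red + eps)

# Synthetic 4x4 reflectance tiles standing in for satellite bands.
rng = np.random.default_rng(0)
red = rng.uniform(0.05, 0.25, (4, 4))   # vegetation absorbs red
nir = rng.uniform(0.30, 0.60, (4, 4))   # vegetation reflects NIR

index = ndvi(nir, red)
print(index.round(2))

# A simple threshold separates dense vegetation from bare soil; per-crop
# time series of such indices feed the classification and yield models.
print("dense vegetation pixels:", int((index > 0.5).sum()))
```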
Procedia PDF Downloads 458
14041 Multi-Wavelength Q-Switched Erbium-Doped Fiber Laser with Photonic Crystal Fiber and Multi-Walled Carbon Nanotubes
Authors: Zian Cheak Tiu, Harith Ahmad, Sulaiman Wadi Harun
Abstract:
A simple multi-wavelength passively Q-switched Erbium-doped fiber laser (EDFL) is demonstrated using a low-cost multi-walled carbon nanotube (MWCNT) based saturable absorber (SA), which is prepared using polyvinyl alcohol (PVA) as a host polymer. The multi-wavelength operation is achieved based on the nonlinear polarization rotation (NPR) effect by incorporating a 50 m long photonic crystal fiber (PCF) in the ring cavity. The EDFL produces a stable multi-wavelength comb spectrum of more than 14 lines with a fixed spacing of 0.48 nm. The laser also demonstrates a stable pulse train, with the repetition rate increasing from 14.9 kHz to 25.4 kHz as the pump power increases from the threshold power of 69.0 mW to the maximum pump power of 133.8 mW. The minimum pulse width of 4.4 µs was obtained at the maximum pump power of 133.8 mW, while the highest pulse energy of 0.74 nJ was obtained at a pump power of 69.0 mW.
Keywords: multi-wavelength Q-switched, multi-walled carbon nanotube, photonic crystal fiber
Procedia PDF Downloads 535
14040 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads
Authors: Gaurav Kumar Sinha
Abstract:
In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments. It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.
Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies
Procedia PDF Downloads 69
14039 Algorithms Utilizing Wavelets to Solve Various Partial Differential Equations
Authors: K. P. Mredula, D. C. Vakaskar
Abstract:
The article traces the development and evolution of various algorithms for solving partial differential equations using the combination of wavelets with a few already-explored solution procedures. It surveys a decade of work and remarks on the modifications involved in implementing wavelet multi-resolution together with the finite difference approach, the finite element method and the finite volume method in dealing with a variety of partial differential equations, in areas like plasma physics, astrophysics, shallow water models, the modified Burgers equations used in optical fibers, biology, fluid dynamics, chemical kinetics, etc.
Keywords: multi-resolution, Haar wavelet, partial differential equation, numerical methods
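The multi-resolution idea at the heart of these algorithms can be shown with a Haar decomposition, as in the sketch below using PyWavelets; the sampled function is an arbitrary stand-in for a PDE solution profile.

```python
import numpy as np
import pywt

# A coarse solution grid for u(x), standing in for a PDE solution sample.
x = np.linspace(0, 1, 64)
u = np.sin(2 * np.pi * x) + 0.25 * np.sin(16 * np.pi * x)

# Haar multi-resolution analysis: coarse approximation + detail levels.
coeffs = pywt.wavedec(u, "haar", level=3)
approx, details = coeffs[0], coeffs[1:]
print("approximation length:", len(approx))
print("detail lengths:", [len(d) for d in details])

# Wavelet-based PDE solvers refine resolution where detail energy is
# large (sharp gradients) and coarsen where it is small.
energy = [float(np.sum(d ** 2)) for d in details]
print("detail energy per level:", np.round(energy, 3))
```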
Procedia PDF Downloads 299
14038 Data Mining Model for Predicting the Status of HIV Patients during Drug Regimen Change
Authors: Ermias A. Tegegn, Million Meshesha
Abstract:
Human Immunodeficiency Virus and Acquired Immunodeficiency Syndrome (HIV/AIDS) is a major cause of death in most African countries. Ethiopia is one of the most seriously affected countries in sub-Saharan Africa. Previously in Ethiopia, having HIV/AIDS was almost equivalent to a death sentence. With the introduction of Antiretroviral Therapy (ART), HIV/AIDS has become a chronic but manageable disease. The study focused on a data mining technique to predict the future living status of HIV/AIDS patients at the time of drug regimen change, when patients develop toxicity to the ART drug combination they are currently taking. The data are taken from the University of Gondar Hospital ART program database. A hybrid methodology is followed to explore the application of data mining to the ART program dataset. Data cleaning, handling of missing values and data transformation were used for preprocessing the data. The WEKA 3.7.9 data mining tools, classification algorithms, and expertise are utilized as means to address the research problem. By using four different classification algorithms (i.e., J48 classifier, PART rule induction, Naïve Bayes and neural network) and by adjusting their parameters, thirty-two models were built on the pre-processed University of Gondar ART program dataset. The performances of the models were evaluated using the standard metrics of accuracy, precision, recall, and F-measure. The most effective model for predicting the status of HIV patients with drug regimen substitution is the pruned J48 decision tree, with a classification accuracy of 98.01%. This study extracts interesting attributes such as ever taking Cotrim, ever taking TbRx, CD4 count, age, weight, and gender so as to predict the status of drug regimen substitution. The outcome of this study can be used as an assistant tool for clinicians to help them make more appropriate drug regimen substitutions. Future research directions are suggested to come up with an applicable system in the area of the study.
Keywords: HIV drug regimen, data mining, hybrid methodology, predictive model
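The winning model, a pruned J48 decision tree, is WEKA's implementation of C4.5. A rough scikit-learn analogue (CART rather than C4.5, so only an approximation) is sketched below; the attribute names echo those in the abstract, but the data is synthetic and the pruning is approximated with depth and leaf-size limits rather than J48's confidence-based post-pruning.

```python
import numpy as np
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the ART program dataset.
rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "ever_taking_cotrim": rng.integers(0, 2, n),
    "ever_taking_tbrx":   rng.integers(0, 2, n),
    "cd4_count":          rng.integers(50, 800, n),
    "age":                rng.integers(18, 70, n),
    "weight":             rng.uniform(40, 95, n),
    "gender":             rng.integers(0, 2, n),
})
# Stand-in outcome: living status at the time of regimen change.
y = (df["cd4_count"] + 10 * df["ever_taking_cotrim"] > 300).astype(int)

tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=10, random_state=0)
scores = cross_val_score(tree, df, y, cv=10)
print("10-fold accuracy: %.3f" % scores.mean())
```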
Procedia PDF Downloads 142
14037 Selection of Strategic Suppliers for Partnership: A Model with a Two-Stage Approach
Authors: Safak Isik, Ozalp Vayvay
Abstract:
Strategic partnerships with suppliers play a vital role in the long-term, value-based supply chain. This strategic collaboration remains one of the top priorities of many business organizations seeking to create additional value, benefiting mainly from suppliers’ specialization, capacity and innovative power, securing supply, and better managing costs and quality. However, many organizations encounter difficulties in initiating, developing and managing such partnerships, and many attempts result in failure. One of the reasons for such failure is the incompatibility of the members of the partnership, in other words, wrong supplier selection, which emphasizes the significance of the selection process since it is the beginning stage. An effective selection process for strategic suppliers is critical to the success of the partnership. Although there are several supplier selection studies in the literature, only a few of them relate to strategic supplier selection for long-term partnership. The purpose of this study is to propose a conceptual model for the selection of strategic partnership suppliers. A two-stage approach is used in the proposed model, incorporating segmentation first and selection second. In the first stage, considering the fact that not all suppliers are strategically equal, and instead of working with a long list of potential suppliers, Kraljic’s purchasing portfolio matrix can be used for segmentation. This supplier segmentation is the process of categorizing suppliers based on a defined set of criteria in order to identify types of suppliers and determine potential suppliers for strategic partnership. In the second stage, from the pool of potential suppliers defined in the first phase, a comprehensive evaluation and selection can be performed to finally identify strategic suppliers, considering various tangible and intangible criteria. Since a long-term relationship with strategic suppliers is anticipated, the criteria should consider both the current and the future status of the supplier. Based on an extensive literature review, strategic, operational and organizational criteria have been determined and elaborated. The result of the selection can also be used to determine suppliers who are not yet ready for a partnership but who could be developed for one. Since the model is based on multiple criteria in both stages, it provides a framework for further utilization of Multi-Criteria Decision Making (MCDM) techniques. The model may also be applied to a wide range of industries and involve managerial features of business organizations.
Keywords: Kraljic’s matrix, purchasing portfolio, strategic supplier selection, supplier collaboration, supplier partnership, supplier segmentation
Procedia PDF Downloads 239
14036 Automatic Classification Using Dynamic Fuzzy C-Means Algorithm and Mathematical Morphology: Application to 3D MRI Images
Authors: Abdelkhalek Bakkari
Abstract:
Image segmentation is a critical step in image processing and pattern recognition. In this paper, we propose a new, robust automatic image classification method based on a dynamic fuzzy c-means algorithm and mathematical morphology. The proposed segmentation algorithm (DFCM_MM) has been applied to MR perfusion images. The obtained results show the validity and robustness of the proposed approach.
Keywords: segmentation, classification, dynamic, fuzzy c-means, MR image
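The core of any fuzzy c-means variant is the alternating update of the membership matrix and the cluster centers. A minimal, plain-numpy version of the standard FCM iteration is sketched below on synthetic intensity data; the dynamic variant and the morphological post-processing of the paper are not reproduced.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Standard FCM: alternate membership and center updates until stable."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))   # memberships, rows sum to 1
    centers = None
    for _ in range(max_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
        # u_ik = d_ik^(-2/(m-1)) / sum_j d_ij^(-2/(m-1))
        inv = dist ** (-2.0 / (m - 1))
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            return centers, U_new
        U = U_new
    return centers, U

# Synthetic 1-D "voxel intensities" from three tissue-like clusters.
rng = np.random.default_rng(1)
X = np.concatenate([rng.normal(mu, 0.5, 200) for mu in (2, 5, 9)])[:, None]
centers, U = fuzzy_c_means(X, c=3)
print("cluster centers:", np.sort(centers.ravel()).round(2))
print("hard labels of first 5 voxels:", U.argmax(axis=1)[:5])
```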
Procedia PDF Downloads 481