Search results for: ABC-VED inventory classification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2911

1171 A Network-Theoretical Perspective on Music Analysis

Authors: Alberto Alcalá-Alvarez, Pablo Padilla-Longoria

Abstract:

The present paper describes a framework for constructing mathematical networks that encode relevant musical information from a music score for structural analysis. These graphs encapsulate statistical information about musical elements such as notes, chords, rhythms, and intervals, and the relations among them, and so help in visualizing and understanding important stylistic features of a music fragment. To build such networks, musical data is parsed from a digital symbolic music file. This data undergoes different analytical procedures from graph theory, such as measuring node centrality, community detection, and entropy calculation. The resulting networks reflect important structural characteristics of the fragment in question: predominant elements, the connectivity between them, and the complexity of the information it contains. Music pieces in different styles are analyzed, and the results are contrasted with the outcomes of traditional analysis in order to show the consistency and potential utility of this method for music analysis.
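The transition networks described above can be sketched in plain Python; the note sequence, and the choice of weighted degree centrality and pitch entropy as the two measures, are illustrative assumptions rather than the paper's exact procedure:

```python
import math
from collections import defaultdict

# Hypothetical note sequence parsed from a digital symbolic score (illustrative).
notes = ["C", "E", "C", "G", "C", "E", "C", "F", "C"]

# Directed transition network: nodes are pitches, edge weights count successive pairs.
edges = defaultdict(int)
for a, b in zip(notes, notes[1:]):
    edges[(a, b)] += 1

# Weighted degree centrality: total weight of edges touching the node.
def degree_centrality(node):
    return sum(w for (a, b), w in edges.items() if node in (a, b))

central = max({n for pair in edges for n in pair}, key=degree_centrality)

# Shannon entropy of the pitch distribution, one complexity measure the paper mentions.
counts = defaultdict(int)
for n in notes:
    counts[n] += 1
entropy = -sum((c / len(notes)) * math.log2(c / len(notes)) for c in counts.values())

print(central, round(entropy, 2))  # the most central pitch and the pitch entropy
```

With the toy sequence, the tonic-like pitch C dominates the network, matching the intuition that predominant elements surface as central nodes.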

Keywords: computational musicology, mathematical music modelling, music analysis, style classification

Procedia PDF Downloads 104
1170 Automatic Detection and Classification of Diabetic Retinopathy Using Retinal Fundus Images

Authors: A. Biran, P. Sobhe Bidari, A. Almazroe, V. Lakshminarayanan, K. Raahemifar

Abstract:

Diabetic retinopathy (DR) is a severe retinal disease caused by diabetes mellitus; it leads to blindness when it progresses to the proliferative level. Early indications of DR are the appearance of microaneurysms, hemorrhages, and hard exudates. In this paper, an automatic algorithm for the detection of DR is proposed. The algorithm is based on a combination of several image processing techniques, including the Circular Hough Transform (CHT), Contrast Limited Adaptive Histogram Equalization (CLAHE), Gabor filtering, and thresholding. A Support Vector Machine (SVM) classifier is then used to classify retinal images as normal or abnormal, the latter covering non-proliferative and proliferative DR. The proposed method has been tested in MATLAB on images selected from the Structured Analysis of the Retina (STARE) database. The method detects DR reliably: the sensitivity, specificity, and accuracy of this approach are 90%, 87.5%, and 91.4%, respectively.
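The thresholding stage of such a pipeline can be illustrated as follows; the 5x5 patch and the cut-off value of 200 are invented for the example (a real pipeline would derive the threshold after CLAHE enhancement), not taken from the study:

```python
# Hypothetical 5x5 grayscale fundus patch (values 0-255);
# unusually bright pixels are candidates for hard exudates.
patch = [
    [10,  12,  11,  13, 10],
    [12, 240, 245,  14, 11],
    [11, 250, 248,  12, 10],
    [13,  12,  11, 230, 12],
    [10,  11,  12,  13, 11],
]

THRESHOLD = 200  # assumed cut-off; illustrative only

# Collect coordinates of pixels at or above the threshold.
bright = [(r, c) for r, row in enumerate(patch)
          for c, v in enumerate(row) if v >= THRESHOLD]

print(len(bright))  # number of candidate exudate pixels
```

The resulting candidate pixels would then feed the feature vector that the SVM classifies.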

Keywords: diabetic retinopathy, fundus images, STARE, Gabor filter, support vector machine

Procedia PDF Downloads 294
1169 Stock Prediction and Portfolio Optimization Thesis

Authors: Deniz Peksen

Abstract:

This thesis aims to predict the trend of a stock's closing price and to maximize a portfolio by utilizing those predictions. In this context, the study defines a stock portfolio strategy from models built with logistic regression, gradient boosting, and random forests. Predicting the trend of a stock price has recently gained a significant role in making buy and sell decisions and in generating returns with investment strategies based on machine learning. There are plenty of studies in the literature on predicting stock prices in capital markets using machine learning methods, but most of them focus on closing prices rather than the direction of the price trend. Our study differs from the literature in its target definition: it is a classification problem focused on the market trend over the next 20 trading days. To predict the trend direction, fourteen years of data were used for training, the following three years for validation, and the last three years for testing: training data span 2002-06-18 to 2016-12-30, validation data 2017-01-02 to 2019-12-31, and testing data 2020-01-02 to 2022-03-17. We set the hold-stock portfolio, the best-stock portfolio, and the USD-TRY exchange rate as benchmarks to outperform, and compared the return of our machine-learning-based portfolio on the test data against all three. We assessed model performance with ROC-AUC scores and lift charts, and used a grid search to fine-tune the hyperparameters of the logistic regression, gradient boosting, and random forest models. As a result of the empirical study, the uptrends and downtrends of five stocks could not be predicted by the models; when these predictions were used to define buy and sell decisions for a model-based portfolio, the portfolio failed on the test dataset.
It was found that model-based buy and sell decisions generated a stock portfolio strategy whose returns cannot outperform non-model portfolio strategies on the test dataset. Any effort to predict a trend formulated on the stock price proved a challenge. Our results agree with the random walk theory, which holds that stock prices and price changes are unpredictable. Although we built several good models on the validation dataset, every model iteration failed on the test dataset. We implemented random forests, gradient boosting, and logistic regression, and found that the more complex models provided no advantage or additional performance over logistic regression; more complexity did not lead to better performance, so a complex model is not the answer to the stock prediction problem. Our approach was to predict the trend instead of the price, which converted the problem into classification; however, this labeling approach does not solve the stock prediction problem, nor does it refute the random walk theory for stock prices.
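The trend-labeling and strictly chronological split described above can be sketched as follows; the prices are invented, and the horizon is shortened from the thesis's 20 trading days to 3 to keep the toy example readable:

```python
# Illustrative closing prices (invented); the label marks whether the price
# is higher HORIZON days ahead (1 = uptrend, 0 = not).
prices = [100, 102, 99, 105, 103, 98, 101, 97, 100, 104, 96, 99]
HORIZON = 3  # the thesis uses 20 trading days

labels = [int(prices[i + HORIZON] > prices[i]) for i in range(len(prices) - HORIZON)]

# Chronological split, as in the thesis: train first, then validation, then test.
# Shuffling is deliberately avoided to prevent look-ahead leakage.
n = len(labels)
train = labels[: n // 2]
valid = labels[n // 2 : 3 * n // 4]
test = labels[3 * n // 4 :]

print(labels)
```

The classifier (logistic regression, gradient boosting, or random forest) would then be fitted on features computed up to day `i` to predict `labels[i]`.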

Keywords: stock prediction, portfolio optimization, data science, machine learning

Procedia PDF Downloads 81
1168 Evaluation of Groundwater Suitability for Irrigation Purposes: A Case Study for an Arid Region

Authors: Mustafa M. Bob, Norhan Rahman, Abdalla Elamin, Saud Taher

Abstract:

The objective of this study was to assess the suitability of the groundwater of Madinah city for irrigation purposes. Of the twenty-three wells drilled in different locations in the city for the purposes of this study, twenty were sampled for water quality analyses. The United States Department of Agriculture (USDA) classification of irrigation water, which is based on the sodium hazard (measured by the sodium adsorption ratio, SAR) and the salinity hazard, was used for the suitability assessment. In addition, the residual sodium carbonate (RSC) was calculated for all samples and also used for the assessment. Results showed that all groundwater samples are within the acceptable quality range for irrigation based on RSC values. When SAR and salinity hazard were assessed, all groundwater samples (except one) fell in the acceptable SAR range but lay in either the high or very high salinity zone, which indicates that care should be taken regarding the type of soil and crops in the study area.
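The two suitability indices follow standard formulas: SAR = Na / sqrt((Ca + Mg)/2) and RSC = (CO3 + HCO3) - (Ca + Mg), with all concentrations in meq/L. A minimal sketch, with sample concentrations that are illustrative rather than measurements from the Madinah wells:

```python
import math

def sar(na, ca, mg):
    """Sodium adsorption ratio; inputs in meq/L."""
    return na / math.sqrt((ca + mg) / 2)

def rsc(co3, hco3, ca, mg):
    """Residual sodium carbonate; inputs in meq/L."""
    return (co3 + hco3) - (ca + mg)

# Hypothetical well sample (meq/L); values are illustrative, not from the study.
print(round(sar(8.0, 4.0, 4.0), 2), round(rsc(0.5, 3.0, 4.0, 4.0), 2))
```

A negative RSC, as here, indicates no residual carbonate hazard, which is consistent with the study's finding that all samples were acceptable on the RSC criterion.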

Keywords: irrigation suitability, TDS, salinity, SAR

Procedia PDF Downloads 372
1167 Time Series Regression with Meta-Clusters

Authors: Monika Chuchro

Abstract:

This paper presents a preliminary attempt to apply classification of time series using meta-clusters in order to improve the quality of regression models. Here, clustering was used to obtain subgroups of time series data with normal distributions from inflow data for a wastewater treatment plant, which is composed of several groups differing in mean value. Two simple algorithms, k-means and EM, were chosen as clustering methods, and the Rand index was used to measure similarity. After this simple meta-clustering, a regression model was fitted for each subgroup, and the final model was the sum of the subgroup models. The quality of the obtained model was compared with that of a regression model built on the same explanatory variables but without clustering the data. Results were compared by the coefficient of determination (R2), the mean absolute percentage error (MAPE) as a measure of prediction accuracy, and comparison on a linear chart. The preliminary results suggest the potential of the presented technique.
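A minimal sketch of the clustering step: a 1-D k-means with k = 2 on invented inflow values, standing in for the paper's k-means/EM meta-clustering of inflow data:

```python
# Invented daily inflow values with two regimes differing in mean (e.g. dry vs. wet).
inflow = [10, 11, 9, 12, 40, 42, 38, 41]

# 1-D k-means, k = 2: assign each point to the nearer center, then update centers.
centers = [min(inflow), max(inflow)]
for _ in range(10):
    groups = {0: [], 1: []}
    for x in inflow:
        nearest = 0 if abs(x - centers[0]) <= abs(x - centers[1]) else 1
        groups[nearest].append(x)
    centers = [sum(g) / len(g) for g in groups.values()]

print([round(c, 2) for c in centers])
```

Each recovered subgroup would then get its own regression model, and the final prediction is the sum of the subgroup models, as described in the abstract.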

Keywords: clustering, data analysis, data mining, predictive models

Procedia PDF Downloads 466
1166 Automated Detection of Women Dehumanization in English Text

Authors: Maha Wiss, Wael Khreich

Abstract:

Animals, objects, foods, plants, and other non-human terms are commonly used as sources of metaphors to describe females in both formal and slang language. Comparing women to non-human items not only reflects cultural views that conceptualize women as subordinate or lower than human, but also conveys this degradation to listeners. Moreover, the dehumanizing representation of females in language normalizes derogation and even encourages sexism and aggressiveness against women. Although dehumanization has been a popular research topic for decades, to our knowledge no studies have linked women's dehumanizing language to the machine learning field. We therefore introduce our research as one of the first attempts to create a tool for the automated detection of dehumanizing depictions of females in English texts. We also present the first labeled dataset on this topic, which is used to train supervised machine learning algorithms and build an accurate classification model. The importance of this work is that it accomplishes the first step toward mitigating dehumanizing language against females.
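As a toy illustration of the detection task only, here is a keyword-lexicon flagger; the term list is invented, and the paper's actual tool is a supervised classifier trained on the labeled dataset rather than a fixed lexicon:

```python
# Illustrative (invented) lexicon of dehumanizing metaphor terms.
DEHUMANIZING_TERMS = {"vixen", "cow", "shrew", "dish"}

def flag(text):
    """Return True if the text contains a term from the lexicon."""
    tokens = (t.strip(".,!?").lower() for t in text.split())
    return any(t in DEHUMANIZING_TERMS for t in tokens)

print(flag("She is such a shrew."), flag("She chaired the meeting."))
```

A lexicon misses context (metaphorical vs. literal use), which is precisely why the paper trains supervised models on labeled examples instead.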

Keywords: gender bias, machine learning, NLP, women dehumanization

Procedia PDF Downloads 80
1165 Credit Risk Evaluation Using Genetic Programming

Authors: Ines Gasmi, Salima Smiti, Makram Soui, Khaled Ghedira

Abstract:

Credit risk is considered one of the most important issues for financial institutions, as it can cause great losses for banks. To this end, numerous methods for credit risk evaluation have been proposed. Many of them are black-box models that cannot adequately reveal the information hidden in the data, although several works have focused on building transparent rule-based models. For credit risk assessment, the generated rules must be not only highly accurate but also highly interpretable. In this paper, we aim to build a credit risk evaluation model that is both accurate and transparent and that proposes a set of classification rules. We treat credit risk evaluation as an optimization problem addressed with a genetic programming (GP) algorithm, where the goal is to maximize the accuracy of the generated rules. We evaluate our proposed approach on the German and Australian credit datasets and compare our findings with existing works; the results show that the proposed GP outperforms the other models.
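A heavily simplified stand-in for the GP search, an elitist mutation hill-climb over two-threshold rules on invented applicant data, illustrates the fitness-maximization idea (real GP evolves tree-structured rule expressions with crossover as well as mutation):

```python
import random

# Invented applicants: (income, debt) with label 1 = good credit.
data = [((50, 10), 1), ((60, 5), 1), ((20, 30), 0),
        ((25, 25), 0), ((55, 8), 1), ((15, 40), 0)]

def fitness(rule):
    """Accuracy of the rule 'good credit iff income >= t_income and debt <= t_debt'."""
    t_income, t_debt = rule
    preds = [1 if inc >= t_income and debt <= t_debt else 0 for (inc, debt), _ in data]
    return sum(p == y for p, (_, y) in zip(preds, data)) / len(data)

random.seed(0)
pop = [(random.uniform(0, 70), random.uniform(0, 50)) for _ in range(20)]
for _ in range(30):
    best = max(pop, key=fitness)  # elitism: always keep the best rule found
    pop = [best] + [(best[0] + random.gauss(0, 5), best[1] + random.gauss(0, 5))
                    for _ in range(19)]

best_rule = max(pop, key=fitness)
print(round(fitness(best_rule), 2))
```

The evolved rule is directly readable ("approve if income above X and debt below Y"), which is the interpretability property the paper is after.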

Keywords: credit risk assessment, rule generation, genetic programming, feature selection

Procedia PDF Downloads 355
1164 Vector-Based Analysis in Cognitive Linguistics

Authors: Chuluundorj Begz

Abstract:

This paper presents a dynamic, psycho-cognitive approach to the study of human verbal thinking on the basis of typologically different languages (Mongolian, English, and Russian). Topological equivalence in verbal communication serves as a basis for the universality of mental structures and, therefore, of deep structures. The mechanism of verbal thinking consists, at the deep level, of basic concepts, rules for integration and classification, and neural networks of vocabulary. In the neurocognitive study of language, neural architecture and the neuropsychological mechanism of verbal cognition are the basis of vector-based modeling. Verbal perception and the interpretation of the infinite set of meanings and propositions in the mental continuum can be modeled by applying tensor methods. Euclidean and non-Euclidean spaces are applied to describe the human semantic vocabulary and higher-order structures.

Keywords: Euclidean spaces, isomorphism and homomorphism, mental lexicon, mental mapping, semantic memory, verbal cognition, vector space

Procedia PDF Downloads 520
1163 Performance Evaluation of Various Segmentation Techniques on MRI of Brain Tissue

Authors: U. V. Suryawanshi, S. S. Chowhan, U. V. Kulkarni

Abstract:

The accuracy of segmentation methods is of great importance in brain image analysis, and tissue classification in magnetic resonance imaging (MRI) of the brain is an important issue in the analysis of several brain dementias. This paper portrays the performance of segmentation techniques used on brain MRI, for which a large variety of algorithms has been developed. The objective is to segment MR images of the human brain using fuzzy c-means (FCM), kernel-based fuzzy c-means (KFCM), spatial fuzzy c-means (SFCM), and improved fuzzy c-means (IFCM). The review covers imaging modalities, MRI, methods for noise reduction, and segmentation approaches. All methods are applied to MRI brain images degraded by salt-and-pepper noise; the results demonstrate that the IFCM algorithm is more robust to noise than the standard FCM algorithm. We conclude with a discussion of the trend of future research in brain segmentation and of modifications to IFCM for better results.
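One update step of standard FCM (fuzzifier m = 2) on 1-D intensities can be sketched as follows; the intensity values and initial centers are illustrative, and real implementations iterate until the memberships converge:

```python
# Invented normalized voxel intensities forming two tissue classes.
intensities = [0.1, 0.15, 0.2, 0.8, 0.85, 0.9]
centers = [0.2, 0.8]  # illustrative initial cluster centers

def membership(x, j):
    """FCM membership u_ij = 1 / sum_k (d_ij / d_ik)^2 for fuzzifier m = 2."""
    d_j = abs(x - centers[j]) or 1e-12  # guard against zero distance
    return 1.0 / sum((d_j / (abs(x - c) or 1e-12)) ** 2 for c in centers)

u = [[membership(x, j) for j in range(2)] for x in intensities]

# Center update: means weighted by u^m (here m = 2).
new_centers = [
    sum((u[i][j] ** 2) * x for i, x in enumerate(intensities))
    / sum(u[i][j] ** 2 for i in range(len(intensities)))
    for j in range(2)
]
print([round(c, 2) for c in new_centers])
```

KFCM, SFCM, and IFCM modify exactly this membership/update pair (kernel distances, spatial neighborhood terms), which is why they degrade more gracefully under salt-and-pepper noise.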

Keywords: image segmentation, preprocessing, MRI, FCM, KFCM, SFCM, IFCM

Procedia PDF Downloads 334
1162 Open-Source YOLO CV For Detection of Dust on Solar PV Surface

Authors: Jeewan Rai, Kinzang, Yeshi Jigme Choden

Abstract:

The accumulation of dust on solar panels reduces their overall efficiency and the amount of energy they produce. While various techniques exist for detecting dust in order to schedule cleaning, many of them rely on MATLAB image processing tools and other licensed software, which can be financially burdensome. This study investigates the efficiency of a free, open-source computer vision library using the YOLO algorithm. The proposed approach was tested on images of solar panels with varying dust levels in an experimental setup. The findings illustrate the effectiveness of the YOLO-based image classification method and of the overall dust detection approach, which achieved an accuracy of 90% in distinguishing between clean and dusty panels. This open-source solution provides a cost-effective and accessible alternative to commercial image processing tools for optimizing solar panel maintenance and enhancing energy production.
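The reported 90% accuracy corresponds to an evaluation of this shape; the labels below are invented to reproduce the ratio and are not the study's data:

```python
# Ground-truth and predicted labels for ten test images (invented, 9/10 correct).
truth = ["dusty", "clean", "dusty", "dusty", "clean",
         "clean", "dusty", "clean", "dusty", "clean"]
preds = ["dusty", "clean", "dusty", "clean", "clean",
         "clean", "dusty", "clean", "dusty", "clean"]

# Classification accuracy: fraction of images labeled correctly.
accuracy = sum(t == p for t, p in zip(truth, preds)) / len(truth)
print(accuracy)
```

In the study, `preds` would come from running the YOLO model on each panel image; the evaluation itself needs no licensed tooling.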

Keywords: YOLO, openCV, dust detection, solar panels, computer vision, image processing

Procedia PDF Downloads 36
1161 Hierarchical Control Structure to Control the Power Distribution System Components in Building Systems

Authors: Hamed Sarbazy, Zohre Gholipour Haftkhani, Ali Safari, Pejman Hosseiniun

Abstract:

Scientific and industrial progress in the past two decades has made energy distribution systems based on power electronics an enabling technology in various industries, including building management systems. Grading and standardizing modular power electronics systems, and using them within a distributed control system, is a strategy for overcoming the limitations of such systems. The purpose of this paper is to investigate strategies for the scheduling and control structure of standard modules in power electronic systems. The paper introduces the classical control methods and discusses their disadvantages, then explains hierarchical control as a mechanism for the distributed control of standardized modules. The different levels of control and the communication between these levels are fully introduced, and the standardization of the software control structure of the distribution system is discussed. Finally, as an example, the control structure is presented for a DC distribution system.

Keywords: application management, hardware management, power electronics, building blocks

Procedia PDF Downloads 521
1160 Emotional Analysis for Text Search Queries on Internet

Authors: Gemma García López

Abstract:

The goal of this study is to analyze whether search queries carried out in search engines such as Google can offer emotional information about the user who performs them. Knowing the user's emotional state can be key to achieving maximum personalization of content and to detecting worrying behaviors. To this end, two studies were carried out using tools with advanced natural language processing techniques. The first study determines whether a query can be classified as positive, negative, or neutral, while the second extracts emotional content from words and applies the categorical and dimensional models of emotion representation. In addition, we use search queries in Spanish and English to establish similarities and differences between the two languages. The results revealed that text search queries performed by users on the Internet can be classified emotionally. This allows us to better understand the user's emotional state at the time of the search, which could enable adapting the technology and personalizing responses to different emotional states.
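The first study's positive/negative/neutral classification can be caricatured with a lexicon-based sketch; the word lists are invented, and the study itself uses full NLP tooling rather than simple keyword counting:

```python
# Invented polarity lexicons; real systems use trained sentiment models.
POSITIVE = {"best", "happy", "great", "love"}
NEGATIVE = {"worst", "sad", "alone", "hate"}

def polarity(query):
    """Classify a query as positive, negative, or neutral by lexicon hits."""
    tokens = query.lower().split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(polarity("best holiday destinations"),
      polarity("why am i always sad"),
      polarity("weather madrid"))
```

Queries like the second one are exactly the "worrying behavior" signal the abstract mentions; a dimensional model would add valence/arousal scores on top of the category.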

Keywords: emotion classification, text search queries, emotional analysis, sentiment analysis in text, natural language processing

Procedia PDF Downloads 142
1159 Net Zero Energy Schools: The Starting Block for the Canadian Energy Neutral K-12 Schools

Authors: Hamed Hakim, Roderic Archambault, Charles J. Kibert, Maryam Mirhadi Fard

Abstract:

Changes in the patterns of life in the late 20th and early 21st centuries have created new challenges for educational systems. Greening the physical environment of school buildings has emerged as a response to some of those challenges and has led to the design of energy-efficient K-12 school buildings. With advances in knowledge and technology, the successful construction of net zero energy schools such as the Lady Bird Johnson Middle School demonstrates a cutting-edge generation of sustainable schools and solves the former challenge of attaining energy self-sufficient educational facilities. There are approximately twenty net zero energy K-12 schools in the U.S., of which about six are located in climate zones 5 and 6 of the ASHRAE climate zone classification. This paper describes and analyzes the current status of energy-efficient and NZE schools in Canada. An attempt is made to study existing U.S. energy-neutral strategies in the zones closest to the Canadian climate (zones 5 and 6) and to identify best practices for Canadian schools.

Keywords: Canada K-12 schools, green school, energy efficient, net-zero energy schools

Procedia PDF Downloads 407
1158 Corporate Governance and Corporate Sustainability: Evidence from a Developing Country

Authors: Edmund Gyimah

Abstract:

Using data from 146 annual reports of listed firms in Ghana for the period 2013-2020, this study presents indicative findings that inspire practical actions and future research. Firms that prepared and presented stand-alone sustainability reports were excluded so that coverage of corporate sustainability disclosures centred on annual reports. Corporate sustainability disclosures on firms' websites were also excluded, given the tendency for updates that cannot easily be traced. Since the commencement of the G4 Guidelines in 2013, the corporate sustainability disclosures in the annual reports have been below average for all dimensions of sustainability as well as for general sustainability disclosures. A few traditional elements of board composition, such as board size and board independence, could affect the corporate sustainability disclosures in annual reports, as could the age of the firm, firm size, and the firm's industry classification. Sustainability disclosures are greater in sustainability reports than in annual reports; however, firms without sustainability reports should include a considerable amount of sustainability disclosure in their annual reports. Given the importance of sustainability, this study also suggests that firms establish sustainability committees, which could make a difference in disclosing sufficient sustainability information even when firms do not present it in stand-alone reports.

Keywords: disclosures, sustainability, board, reports

Procedia PDF Downloads 188
1157 Tree Species Classification Using Effective Features of Polarimetric SAR and Hyperspectral Images

Authors: Milad Vahidi, Mahmod R. Sahebi, Mehrnoosh Omati, Reza Mohammadi

Abstract:

Forest management organizations need information to perform their work effectively, and remote sensing is an effective method for acquiring information about the Earth. Two remote sensing datasets were used to classify forested regions. First, all extractable features were derived from the hyperspectral and PolSAR images. The optical features were spectral indices related to chemical and water content, structural indices, effective bands, and absorption features; the PolSAR features were the original data, target decomposition components, and SAR discriminator features. Second, particle swarm optimization (PSO) and genetic algorithms (GA) were applied to select optimal features. A support vector machine (SVM) classifier was then used to classify the image. The results showed that the combination of PSO and SVM had higher overall accuracy than the other cases, providing an overall accuracy of about 90.56%. The effective features were the spectral indices, the bands in the shortwave infrared (SWIR) and visible ranges, and certain PolSAR features.
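Wrapper-style feature selection of this kind can be sketched in miniature, with exhaustive subset search standing in for PSO/GA and a leave-one-out nearest-neighbour classifier standing in for the SVM; the data are invented (feature 0 informative, feature 1 noise):

```python
from itertools import combinations

# Invented samples: (feature vector, class). Feature 0 separates the classes.
data = [((0.1, 5.0), "A"), ((0.2, 1.0), "A"),
        ((0.9, 4.8), "B"), ((0.8, 1.2), "B")]

def accuracy(feature_idx):
    """Leave-one-out 1-NN accuracy on the selected features."""
    correct = 0
    for i, (x, y) in enumerate(data):
        rest = [d for j, d in enumerate(data) if j != i]
        nearest = min(rest, key=lambda d: sum((x[f] - d[0][f]) ** 2 for f in feature_idx))
        correct += nearest[1] == y
    return correct / len(data)

# Exhaustive search over all non-empty subsets (PSO/GA replace this at scale).
subsets = [c for r in (1, 2) for c in combinations(range(2), r)]
best = max(subsets, key=accuracy)
print(best)
```

The search correctly discards the noise feature, the same effect PSO achieved at full scale in the paper, where exhaustive search over hundreds of spectral and PolSAR features would be infeasible.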

Keywords: hyperspectral, PolSAR, feature selection, SVM

Procedia PDF Downloads 419
1156 Calculate Product Carbon Footprint through the Internet of Things from Network Science

Authors: Jing Zhang

Abstract:

Reducing mankind's carbon footprint and becoming more sustainable is one of the major challenges of our era. The Internet of Things (IoT) mainly addresses three kinds of connection: Things to Things (T2T), Human to Things (H2T), and Human to Human (H2H). Borrowing this classification from the IoT, the carbon footprints of industries can also be divided in these three ways; therefore, monitoring the routes along which products are generated and circulated may help calculate a product's carbon footprint. This paper does not consider any technique used by the IoT itself, but uses its ideas to look at the connections between products. Carbon footprints are like a gene or mark of a product, carried from the raw materials to the final product and never leaving it. The contribution of this paper is to combine the characteristics of the IoT with the methodology of network science to find a way to calculate a product's carbon footprint, complementing life cycle assessment (LCA), the traditional and principal tool for calculating the carbon footprint of products.
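The network-science idea, accumulating the emissions added along a product's route, can be sketched as a weighted path sum; the supply-chain graph and the kgCO2e figures are illustrative assumptions:

```python
# Supply chain as a weighted directed graph; each edge weight is the
# emissions (kgCO2e) added at that step of the route (invented figures).
edges = {
    ("mine", "smelter"): 4.0,
    ("smelter", "factory"): 2.5,
    ("factory", "warehouse"): 0.8,
    ("warehouse", "store"): 0.4,
}

def footprint(path):
    """Total emissions accumulated along a product's route through the network."""
    return sum(edges[(a, b)] for a, b in zip(path, path[1:]))

route = ["mine", "smelter", "factory", "warehouse", "store"]
print(round(footprint(route), 2))
```

IoT tracking would supply the actual route each unit takes, so the footprint becomes a property of the observed path rather than a generic LCA average.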

Keywords: product carbon footprint, Internet of Things, network science, life cycle assessment

Procedia PDF Downloads 116
1155 Privacy-Preserving Model for Social Network Sites to Prevent Unwanted Information Diffusion

Authors: Sanaz Kavianpour, Zuraini Ismail, Bharanidharan Shanmugam

Abstract:

Social network sites (SNSs) serve as an invaluable platform to transfer information across large numbers of individuals. A substantial component of communicating and managing information is identifying which individuals will influence others in propagating it, and whether dissemination will occur in the absence of social signals about that information. Classifying the final audience of social data is difficult, as it is not entirely possible to control the social contexts through which information passes among individuals. Hence, undesirable diffusion of information to an unauthorized individual on an SNS can threaten individuals' privacy. This paper highlights information diffusion in SNSs and emphasizes the most significant privacy issues for their users. Its goal is to propose a privacy-preserving model that safeguards individuals' data by controlling the availability of data and improving privacy, providing access to appropriate third parties without compromising the advantages of information sharing through SNSs.
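One building block of such a model (the keywords name an anonymization algorithm) is checking the k-anonymity of generalized records before releasing them to a third party; the records below are invented:

```python
from collections import Counter

# Records reduced to quasi-identifiers after generalization
# (ages bucketed into ranges, city kept as-is); values are invented.
records = [("20-29", "Oslo"), ("20-29", "Oslo"),
           ("30-39", "Bergen"), ("30-39", "Bergen"), ("30-39", "Bergen")]

def k_anonymity(rows):
    """Smallest equivalence-class size over the quasi-identifier tuples.
    Each record is indistinguishable from at least k-1 others."""
    return min(Counter(rows).values())

print(k_anonymity(records))
```

If the resulting k falls below the policy threshold, the release is withheld or the quasi-identifiers are generalized further, which is how availability can be controlled without blocking sharing entirely.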

Keywords: anonymization algorithm, classification algorithm, information diffusion, privacy, social network sites

Procedia PDF Downloads 321
1154 Ideal Posture in Regulating Legal Regulations in Indonesia

Authors: M. Jeffri Arlinandes Chandra, Puwaningdyah Murti Wahyuni, Dewi Mutiara

Abstract:

Indonesia is a state of law in accordance with Article 1 paragraph 3 of the Constitution of the Republic of Indonesia (the 1945 Constitution), namely, 'the State of Indonesia is a state of law.' The consequence of the rule of law is that the law is the main governing authority, that is, the basis for any action taken by the state. The types of regulations and the procedures for forming legislation in Indonesia are contained in Law Number 12 of 2011 concerning the Formation of Legislation. Various attempts have been made to produce quality regulations in both the formal and the material hierarchy, such as synchronization and harmonization in the formation of laws and regulations so that there is no conflict between equal and hierarchically ordered laws; the fact, however, is that many conflicting regulations are still found. This can be seen clearly in the many laws and regulations challenged before judicial institutions such as the Constitutional Court (MK) and the Supreme Court (MA). Therefore, a formulation is needed for the governance of the formation of laws and regulations so as to minimize lawsuits before the courts, so that a positive law can be realized that can be used today and in the future (ius constituendum). The research method used in this research is a combination of normative research (library research) supported by empirical data from field research, so that concepts can be formulated and current challenges answered. First, the structuring of laws and regulations in Indonesia must start from an inventory of laws and regulations, classified by type of legislation, subject matter, year of enactment, and so on, so that the regulations relating to the formation of laws and regulations can be clearly traced.
Second, laws and regulations that do not exist in the state registration system must be identified and revoked. Third, a periodic evaluation system must be carried out at every level of the hierarchy of laws and regulations. These steps will form an ideal model of laws and regulations in Indonesia, in both content and material, so that the instruments can be codified and clearly inventoried and accessed by the wider community as a concrete manifestation of the principle that all people know the law (presumptio iures de iure).

Keywords: legislation, review, evaluation, reconstruction

Procedia PDF Downloads 150
1153 Methods for Distinction of Cattle Using Supervised Learning

Authors: Radoslav Židek, Veronika Šidlová, Radovan Kasarda, Birgit Fuerst-Waltl

Abstract:

Machine learning encompasses topics dealing with the creation and evaluation of algorithms that facilitate pattern recognition, classification, and prediction based on models derived from existing data. The data can present identification patterns, which are used to classify it into groups; the result of the analysis is a pattern that can then be used to identify a data set without needing the input data used to create the pattern. Important requirements in this process are careful data preparation, validation of the model used, and its suitable interpretation. For breeders, it is important to know the origin of animals from the point of view of genetic diversity, and in the case of missing pedigree information, other methods can be used to trace an animal's origin. The genetic diversity written in genetic data holds relatively useful information for identifying animals originating from individual countries. We conclude that the application of data mining to molecular genetic data using supervised learning is an appropriate tool for hypothesis testing and for identifying an individual.
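A toy version of the classification step, a 1-nearest-neighbour rule over invented SNP allele-count vectors, illustrates how origin can be predicted without pedigree data; the study itself trains proper supervised models on real genotypes:

```python
# Invented SNP genotype vectors (0/1/2 allele counts) labeled by country of origin.
train = [
    ([0, 2, 1, 0], "AT"), ([0, 2, 2, 0], "AT"), ([1, 2, 1, 0], "AT"),
    ([2, 0, 0, 2], "SK"), ([2, 0, 1, 2], "SK"), ([2, 1, 0, 2], "SK"),
]

def classify(genotype):
    """Assign the label of the nearest training animal (squared allele distance)."""
    nearest = min(train, key=lambda t: sum((a - b) ** 2 for a, b in zip(genotype, t[0])))
    return nearest[1]

print(classify([0, 2, 1, 1]), classify([2, 0, 0, 1]))
```

The learned pattern (here, the training set itself) identifies new animals without the original pedigree, which is exactly the traceability use case the abstract describes.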

Keywords: genetic data, Pinzgau cattle, supervised learning, machine learning

Procedia PDF Downloads 552
1152 Study of Physico-Chemical Properties of a Silty Soil

Authors: Moulay Smaïne Ghembaza, Mokhtar Dadouch, Nour-Said Ikhlef

Abstract:

Soil treatment makes usable a soil that does not have the characteristics required in a given context. We limit ourselves in this work to the field of road earthworks, where we chose to develop a local material from the region of Sidi Bel Abbes (Algeria). This material has poor characteristics that do not meet the standards used in road geotechnics. To remedy this, we first tried to improve the standard Proctor characteristics of the material by mechanical treatment, increasing the compaction energy. Then, with a chemical treatment adding several cement dosages, our results show that this material, classified A1h, exhibits an increase in maximum dry density and a reduction in the compaction water content. A comparative study of the optimal properties of the material between the two modes of treatment is made. On the other hand, after treatment, the plasticity index and the methylene blue value decrease, and the material changes class: soil class CL turns into the composite class CL-ML (silt of low plasticity). This observation allows the material to be used as backfill or subgrade.

Keywords: treatment of soil, cement, subgrade, Atteberg limits, classification, optimum proctor properties

Procedia PDF Downloads 473
1151 Traffic Light Detection Using Image Segmentation

Authors: Vaishnavi Shivde, Shrishti Sinha, Trapti Mishra

Abstract:

Traffic light detection from a moving vehicle is an important technology both for driver safety assistance functions and for autonomous driving in the city. This paper proposes a deep-learning-based traffic light recognition method that consists of a pixel-wise image segmentation technique and a fully convolutional network, the UNET architecture. A method for detecting the position and recognizing the state of traffic lights in video sequences is presented and evaluated using a Traffic Light Dataset that contains masked traffic light image data. The first stage is detection, accomplished through image processing (image segmentation) techniques such as image cropping, color transformation, and segmentation of possible traffic lights. The second stage is recognition, that is, identifying the color, and hence the state, of the traffic light, which is achieved using a convolutional neural network (the UNET architecture).
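The recognition stage, identifying the light's color from a cropped region, can be caricatured with a mean-RGB rule; the pixel values and ratio thresholds are invented, and the paper itself uses a UNET-based network for this step:

```python
# Classify the state of a cropped light from mean RGB (toy stand-in for the CNN stage).
def light_state(pixels):
    n = len(pixels)
    r, g, b = (sum(p[i] for p in pixels) / n for i in range(3))
    # Ratio thresholds (1.5x) are illustrative assumptions.
    if r > 1.5 * g and r > 1.5 * b:
        return "red"
    if g > 1.5 * r and g > 1.5 * b:
        return "green"
    return "yellow" if r > b and g > b else "unknown"

red_crop = [(200, 40, 30), (210, 50, 35), (190, 45, 40)]      # invented crop
green_crop = [(30, 200, 40), (35, 210, 50), (40, 190, 45)]    # invented crop
print(light_state(red_crop), light_state(green_crop))
```

Hand-tuned color rules break under glare and varied illumination, which is the motivation for learning the recognition stage with a CNN instead.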

Keywords: traffic light detection, image segmentation, machine learning, classification, convolutional neural networks

Procedia PDF Downloads 176
1150 Nutrient in River Ecosystems Follows Human Activities More Than Climate Warming

Authors: Mohammed Abdulridha Hamdan

Abstract:

To face the water crisis, the role of human activities in the nutrient concentrations of aquatic ecosystems needs more investigation, to complement the extensive studies that have examined these impacts on the water quality of different aquatic ecosystems. We hypothesized that human activities in the catchments of the Tigris river may change nutrient concentrations in the water along the river. The results showed that phosphate concentrations differed significantly among the studied sites owing to the distribution of human activities, while nitrate concentrations did not; neither phosphate nor nitrate concentrations were affected by water temperature. We conclude that human activities on the surrounding landscapes can be a more essential source of nutrients for aquatic ecosystems than ongoing climate warming. Despite the role of warming in driving nutrient availability in aquatic ecosystems, our findings suggest taking the different activities in the surrounding catchments into account in studies concerned with the trophic status classification of aquatic ecosystems.

Keywords: nitrate, phosphate, anthropogenic, warming

Procedia PDF Downloads 82
1149 Integrating Dependent Material Planning Cycle into Building Information Management: A Building Information Management-Based Material Management Automation Framework

Authors: Faris Elghaish, Sepehr Abrishami, Mark Gaterell, Richard Wise

Abstract:

Collaboration and integration between all building information management (BIM) processes and tasks are necessary to ensure that all project objectives can be delivered. A literature review was used to explore state-of-the-art BIM technologies for managing construction materials, as well as the challenges faced by the construction process under traditional methods. This paper therefore aims to articulate a framework that integrates traditional material planning methods, such as ABC analysis (the Pareto principle) for analysing and categorising project materials, together with independent material planning methods such as Economic Order Quantity (EOQ) and Fixed Order Point (FOP), into BIM 4D and 5D capabilities, in order to embed a dependent material planning cycle into BIM that relies on the constructability method. Moreover, we build a model that connects the material planning outputs with BIM 4D and 5D data to ensure that all project information is accurately presented through integrated and complementary BIM reporting formats. Furthermore, the paper presents a method for integrating the risk management output with the material management process so that all critical materials are monitored and managed across all project stages. The paper includes browsers proposed for embedding in any 4D BIM platform in order to predict the EOQ and FOP and alert the user during the construction stage. This enables the planner to check the status of materials on site and to receive an alert when a new order must be placed. Managing all project information in a single context in this way avoids missing information at the early design stage. Subsequently, the planner can build a more reliable 4D schedule by allocating the categorised materials with the required EOQ and checking the optimum locations for inventory and temporary construction facilities.
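The planning quantities named above follow standard textbook formulas: EOQ = sqrt(2DS/H), a fixed order point of demand over the lead time plus safety stock, and Pareto-based ABC classes. A minimal sketch, where the item names, demand figures, and cut-off percentages are illustrative assumptions rather than values from the paper:

```python
import math

def eoq(annual_demand, order_cost, holding_cost):
    """Economic Order Quantity: sqrt(2DS/H)."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

def fixed_order_point(daily_demand, lead_time_days, safety_stock=0):
    """Reorder when stock falls to demand-over-lead-time plus safety stock."""
    return daily_demand * lead_time_days + safety_stock

def abc_classify(usage_values, a_cut=0.80, b_cut=0.95):
    """Pareto (ABC) classes from {item: annual usage value}.

    The conventional 80%/95% cumulative-share cut-offs are defaults
    and adjustable; the top item is always class A.
    """
    total = sum(usage_values.values())
    classes, cumulative = {}, 0.0
    for item, value in sorted(usage_values.items(), key=lambda kv: -kv[1]):
        cumulative += value
        share = cumulative / total
        if share <= a_cut or not classes:
            classes[item] = "A"
        elif share <= b_cut:
            classes[item] = "B"
        else:
            classes[item] = "C"
    return classes

print(abc_classify({"steel": 70, "cement": 20, "nails": 10}))
```

A 4D BIM plug-in of the kind proposed would evaluate `fixed_order_point` against current site stock each schedule update and raise the alert when stock reaches it.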

Keywords: building information management, BIM, economic order quantity, EOQ, fixed order point, FOP, BIM 4D, BIM 5D

Procedia PDF Downloads 173
1148 Multi-scale Geographic Object-Based Image Analysis (GEOBIA) Approach to Segment Very High Resolution Images for Extraction of New Degraded Zones: Application to the Region of Mécheria in the South-West of Algeria

Authors: Bensaid A., Mostephaoui T., Nedjai R.

Abstract:

A considerable area of Algerian land is threatened by the phenomenon of wind erosion. For a long time, wind erosion and its associated harmful effects on the natural environment have posed a serious threat, especially in the arid regions of the country. In recent years, as a result of the increasingly irrational exploitation of natural resources (fodder) and extensive land clearing, wind erosion has accentuated markedly. The extent of degradation in the arid region of the Algerian Mécheria department has generated a new situation characterized by the reduction of vegetation cover, the decrease of land productivity, and sand encroachment on urban development zones. In this study, we investigate the potential of remote sensing and geographic information systems for detecting the spatial dynamics of the ancient dune cordons, based on the numerical processing of PlanetScope PSB.SB sensor images acquired on September 29, 2021. As a second step, we explore the use of a multi-scale geographic object-based image analysis (GEOBIA) approach to segment high-spatial-resolution images acquired over heterogeneous surfaces that vary according to human influence on the environment. We used the fractal net evolution approach (FNEA) algorithm to segment the images (Baatz & Schäpe, 2000). Multispectral data, a digital terrain model layer, ground truth data, a normalized difference vegetation index (NDVI) layer, and a first-order texture (entropy) layer were used to segment the multispectral images at three segmentation scales, with an emphasis on accurately delineating the boundaries and components of the sand accumulation areas (dunes, dune fields, nebkas, and barchans). It is important to note that each auxiliary layer contributed to improving the segmentation at different scales. The silted areas were then classified with a nearest neighbor approach applied to the imagery over the Naâma area.
The classification of silted areas was successfully achieved over all study areas with an accuracy greater than 85%, although the results suggest that, overall, a higher degree of landscape heterogeneity may have a negative effect on segmentation and classification. Some areas suffered from the greatest over-segmentation and the lowest mapping accuracy (Kappa: 0.79), which was partially attributed to confusing a greater proportion of mixed siltation classes from both sandy areas and bare-ground patches. This research demonstrates a technique based on very high-resolution images for mapping silted and degraded areas using GEOBIA, which can be applied to the study of other lands in the steppe areas of the northern countries of the African continent.
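The NDVI and first-order entropy layers used as segmentation inputs are simple per-pixel and per-window computations. A minimal sketch, with purely illustrative band values (NDVI = (NIR - Red)/(NIR + Red); entropy is the Shannon entropy of grey levels in a moving window):

```python
import math
from collections import Counter

def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

def window_entropy(values):
    """First-order Shannon entropy (bits) of grey levels in a window."""
    counts = Counter(values)
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

print(ndvi(0.6, 0.2))              # vegetated pixel: strongly positive
print(window_entropy([1, 1, 2, 2]))  # mixed window: non-zero texture
```

In a GEOBIA workflow these values are computed band-wise over the whole scene and stacked with the multispectral data before FNEA segmentation; low NDVI combined with low entropy is typical of homogeneous silted surfaces.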

Keywords: land development, GIS, sand dunes, segmentation, remote sensing

Procedia PDF Downloads 109
1147 A Hybrid System of Hidden Markov Models and Recurrent Neural Networks for Learning Deterministic Finite State Automata

Authors: Pavan K. Rallabandi, Kailash C. Patidar

Abstract:

In this paper, we present a learning algorithm based on a hybrid architecture combining two of the most popular sequence recognition models, Recurrent Neural Networks (RNNs) and Hidden Markov Models (HMMs). To improve sequence or pattern recognition/classification performance through a hybrid neural-symbolic approach, a gradient descent learning algorithm is developed using Real-Time Recurrent Learning of a recurrent neural network to process the knowledge represented in trained Hidden Markov Models. The hybrid algorithm is implemented with automata theory as a test bed, and its performance is demonstrated and evaluated on learning deterministic finite state automata.
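The HMM half of such a hybrid rests on standard probability recursions. A minimal sketch of the forward algorithm, with a toy two-state model whose numbers are purely illustrative (the RNN/RTRL component and the gradient step are omitted):

```python
def forward(obs, states, start_p, trans_p, emit_p):
    """Probability of an observation sequence under an HMM (forward algorithm)."""
    # alpha[s] = P(observations so far, current state == s)
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: emit_p[s][o] * sum(alpha[p] * trans_p[p][s] for p in states)
                 for s in states}
    return sum(alpha.values())

# Toy two-state model over a binary alphabet (all numbers illustrative).
states = ("A", "B")
start_p = {"A": 0.6, "B": 0.4}
trans_p = {"A": {"A": 0.7, "B": 0.3}, "B": {"A": 0.4, "B": 0.6}}
emit_p = {"A": {0: 0.5, 1: 0.5}, "B": {0: 0.1, 1: 0.9}}

print(forward([1, 0, 1], states, start_p, trans_p, emit_p))
```

In the hybrid setting described above, probabilities like these, taken from a trained HMM, supply the symbolic knowledge that the recurrent network is then trained to reproduce and refine.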

Keywords: hybrid systems, hidden markov models, recurrent neural networks, deterministic finite state automata

Procedia PDF Downloads 392
1146 Patent Protection for AI Innovations in Pharmaceutical Products

Authors: Nerella Srinivas

Abstract:

This study explores the significance of patent protection for artificial intelligence (AI) innovations in the pharmaceutical sector, emphasizing applications in drug discovery, personalized medicine, and clinical trial optimization. The challenges of patenting AI-driven inventions are outlined, focusing on the classification of algorithms as abstract ideas, meeting the non-obviousness standard, and issues around defining inventorship. The methodology includes examining case studies and existing patents, with an emphasis on how companies like Benevolent AI and Insilico Medicine have successfully secured patent rights. The findings demonstrate that a strategic approach to patent protection is essential, with particular attention to showcasing AI's technical contributions to pharmaceutical advancements. In conclusion, the study underscores the critical role of understanding patent law and innovation strategies in leveraging intellectual property rights in the rapidly advancing field of AI-driven pharmaceuticals.

Keywords: artificial intelligence, pharmaceutical industry, patent protection, drug discovery, personalized medicine, clinical trials, intellectual property, non-obviousness

Procedia PDF Downloads 15
1145 Using Data Mining Techniques to Evaluate the Different Factors Affecting the Academic Performance of Students at the Faculty of Information Technology in Hashemite University in Jordan

Authors: Feras Hanandeh, Majdi Shannag

Abstract:

This research studies the different factors that could affect the accumulative average of students at the Faculty of Information Technology at the Hashemite University. The paper examines student information, background, and academic records, and how this information relates to students earning high grades. The student information used in the study is extracted from the students' academic records. Data mining tools and techniques are used to decide which attribute(s) affect the students' accumulative average. The results show that the most important factor affecting the students' accumulative average is the student's Acceptance Type. We also built a decision tree model and rules to determine how a student can achieve high grades in their courses. The overall accuracy of the model is 44%, which is an acceptable rate.
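Decision tree induction of the kind used here typically selects the splitting attribute by entropy reduction (information gain). A minimal sketch of that criterion; the record fields below, including the acceptance-type attribute, are hypothetical illustrations, not the study's actual data:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, attr, target):
    """Entropy reduction from splitting rows (list of dicts) on attr."""
    base = entropy([r[target] for r in rows])
    weighted = 0.0
    for value in {r[attr] for r in rows}:
        subset = [r[target] for r in rows if r[attr] == value]
        weighted += len(subset) / len(rows) * entropy(subset)
    return base - weighted

# Hypothetical records: does acceptance type separate the grade outcome?
rows = [
    {"accept_type": "regular",  "grade": "high"},
    {"accept_type": "regular",  "grade": "high"},
    {"accept_type": "parallel", "grade": "low"},
    {"accept_type": "parallel", "grade": "low"},
]
print(information_gain(rows, "accept_type", "grade"))
```

The tree builder splits on the attribute with the highest gain at each node, which is how an attribute like Acceptance Type would surface as the root of the learned tree.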

Keywords: data mining, classification, extracting rules, decision tree

Procedia PDF Downloads 417
1144 Examining Effects of Electronic Market Functions on Decrease in Product Unit Cost and Response Time to Customer

Authors: Maziyar Nouraee

Abstract:

Electronic markets have contributed remarkably to business transactions in recent decades. Many organizations consider traditional ways of trading uneconomical and therefore trade only through electronic markets. There are different categorizations of electronic market functions. In one classification, the functions of electronic markets are divided into three classes: information, transactions, and value added. In the present paper, the effects of the three classes on two major elements of supply chain management are measured: decrease in product unit cost and reduction in response time to the customer. The results show that, among the nine minor elements related to the three classes of electronic market functions, six factors influence the reduction of product unit cost and three factors influence the reduction of response time to the customer.

Keywords: electronic commerce, electronic market, B2B trade, supply chain management

Procedia PDF Downloads 392
1143 Extracorporeal Shock Wave Therapy versus Functional Electrical Stimulation on Spasticity, Function and Gait Parameters in Hemiplegic Cerebral Palsy

Authors: Mohamed A. Eid, Sobhy M. Aly

Abstract:

Background: About 75% of children with spastic hemiplegic cerebral palsy walk independently, but most still show abnormal gait patterns because of contractures across the joints and muscle spasticity. Objective: The purpose of this study was to investigate and compare the effects of extracorporeal shock wave therapy (ESWT) versus functional electrical stimulation (FES) on spasticity, function, and gait parameters in children with hemiplegic cerebral palsy (CP). Methods: A randomized controlled trial was conducted on 45 children with hemiplegic CP aged 6 to 9 years. They were assigned randomly, using opaque envelopes, to three groups. The physical therapy (PT) group consisted of 15 children who received the conventional physical therapy program (CPTP) in addition to an ankle-foot orthosis (AFO). The ESWT group consisted of 15 children who received the CPTP and AFO in addition to ESWT. The FES group also consisted of 15 children, who received the CPTP and AFO in addition to FES. All groups received the treatment program 3 days/week for 12 weeks. Spasticity was evaluated using the Modified Ashworth Scale (MAS), function using the Pediatric Evaluation of Disability Inventory (PEDI), and gait parameters using 3-D gait analysis, at baseline and after 12 weeks of the treatment program. Results: Within groups, significant improvements in spasticity, function, and gait (P ˂ 0.05) were observed in both the ESWT and FES groups after treatment. Between groups, the ESWT group showed significant improvements in all measured variables compared with the FES and PT groups (P ˂ 0.05) after treatment. Conclusion: ESWT produced greater improvement than FES in decreasing spasticity and improving function and gait in children with hemiplegic CP. Therefore, ESWT should be included as an adjunctive therapy in the rehabilitation program for these children.

Keywords: cerebral palsy, extracorporeal shock wave therapy, functional electrical stimulation, function, gait, spasticity

Procedia PDF Downloads 130
1142 Ontology-Driven Generation of Radiation Protection Procedures

Authors: Chamseddine Barki, Salam Labidi, Hanen Boussi Rahmouni

Abstract:

In this article, we present the principle and a suitable methodology for the design of a medical ontology that captures radiological and dosimetric knowledge applied in diagnostic radiology and radiation therapy. Our ontology, which we have named «Onto.Rap», addresses radiation protection in medical and radiology centers by providing standardized regulatory oversight. Thanks to its added value in knowledge sharing, reuse, and ease of maintenance, this ontology helps solve many problems, among them the confusion between radiological procedures that a practitioner might face while performing a patient's radiological exam, and the difficulty of interpreting the applicable patient radioprotection standards. Here the ontology, thanks to its concept simplification and expressiveness, can ensure an efficient classification of radiological procedures. It also provides an explicit representation of the relations between the different components of the studied concept. In fact, an ontology-based radioprotection expert system, when used in a radiological center, could implement systematic radioprotection best practices during patient exams and a regulatory compliance auditing service afterwards.

Keywords: knowledge, ontology, radiation protection, radiology

Procedia PDF Downloads 315