Search results for: maximal data sets

24140 The Dialectic of Law and Politics in Georg Wilhelm Friedrich Hegel

Authors: Djehich Mohamed Yousri

Abstract:

This paper addresses the dialectic of law and politics in Hegel's philosophy of the state. It starts from the concept of law in its general sense, as the set of rules and legislation that human beings establish to be applied within society; law is one of the primary and necessary conditions for the functioning and organization of social life, since it defines the rights and duties of every individual belonging to the state. This concept is then brought into relation with central concepts of political philosophy such as the state, freedom, and the people. The most prominent result reached through our analysis of the research problem concerns the relationship between law and politics in Hegel's philosophical system: on the one hand, the state is rational only to the extent that it resorts to the law and works under it; on the other, the law realizes its essence and effectiveness only when it is drawn from the customs, traditions, and culture of the people, so that it does not conflict with the ideal goal of its existence, which is to achieve freedom and protect it from every possible threat. The law of the state does not at all mean a reduction of the people's freedom, so there is no conflict between law and freedom.

Keywords: hegel, the law, country, freedom, citizen

Procedia PDF Downloads 66
24139 Design and Development of a Platform for Analyzing Spatio-Temporal Data from Wireless Sensor Networks

Authors: Walid Fantazi

Abstract:

Advances in sensor technology (such as microelectromechanical systems (MEMS), wireless communications, embedded systems, distributed processing, and wireless sensor applications) have contributed to a broad range of wireless sensor network (WSN) applications capable of collecting a large amount of spatiotemporal data in real time. These systems require real-time processing to manage storage and to query the data as they arrive. To cover these needs, we propose in this paper a snapshot spatiotemporal data model based on object-oriented concepts. This model allows efficient storage and reduces data redundancy, which makes it easier to execute spatiotemporal queries and saves analysis time. Further, to ensure the robustness of the system as well as to eliminate congestion in main memory, we propose an in-RAM spatiotemporal indexing technique called Captree*. As a result, we offer an RIA (Rich Internet Application)-based SOA application architecture which allows remote monitoring and control.
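The paper does not publish the internals of Captree*, so the following is only a minimal Python sketch of the general idea it describes: an in-memory (RAM) spatiotemporal index that buckets sensor readings by time slot and coarse spatial grid cell so that window queries avoid a full scan. The class name, bucket granularity, and fields are illustrative assumptions, not the authors' implementation.

```python
from collections import defaultdict

class SnapshotIndex:
    """Toy in-RAM spatiotemporal index: readings are bucketed by
    (time slot, spatial grid cell); queries touch only matching buckets."""

    def __init__(self, cell_size=10.0, slot_seconds=60):
        self.cell_size = cell_size
        self.slot_seconds = slot_seconds
        self.buckets = defaultdict(list)   # (slot, cx, cy) -> [(t, x, y, value)]

    def _key(self, t, x, y):
        return (int(t // self.slot_seconds),
                int(x // self.cell_size),
                int(y // self.cell_size))

    def insert(self, t, x, y, value):
        self.buckets[self._key(t, x, y)].append((t, x, y, value))

    def query(self, t0, t1, x0, x1, y0, y1):
        """Return readings inside the given time window and bounding box."""
        hits = []
        for s in range(int(t0 // self.slot_seconds), int(t1 // self.slot_seconds) + 1):
            for cx in range(int(x0 // self.cell_size), int(x1 // self.cell_size) + 1):
                for cy in range(int(y0 // self.cell_size), int(y1 // self.cell_size) + 1):
                    for (t, x, y, v) in self.buckets.get((s, cx, cy), []):
                        if t0 <= t <= t1 and x0 <= x <= x1 and y0 <= y <= y1:
                            hits.append((t, x, y, v))
        return hits

idx = SnapshotIndex()
idx.insert(t=30, x=12.5, y=3.0, value=21.7)   # e.g. a temperature reading
print(idx.query(t0=0, t1=60, x0=10, x1=20, y0=0, y1=10))
```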

Keywords: WSN, indexing data, SOA, RIA, geographic information system

Procedia PDF Downloads 240
24138 The Study of Platelet-Rich Plasma (PRP) on Wounds of OLETF Rats Using Expression of MMP-2 and MMP-9 mRNA

Authors: Ho Seong Shin

Abstract:

Introduction: Research on wound healing has shown that platelet-rich plasma (PRP) is effective in normal tissue regeneration. Nonetheless, there is no evidence that PRP applied to diabetic wounds normalizes the diabetic wound-healing process. In this study, we analyzed matrix metalloproteinase-2 (MMP-2) and matrix metalloproteinase-9 (MMP-9) expression to assess the effect of PRP on diabetic wounds, using reverse transcription-polymerase chain reaction (RT-PCR) of MMP-2 and MMP-9 mRNA. Materials and Methods: Platelet-rich plasma (PRP) was prepared from the blood of 6 rats. The whole 120 mL was immediately added to an anticoagulant, citrate phosphate dextrose (CPD) buffer (0.15 mg CPD/mL), in a ratio of 1 mL of CPD buffer to 5 mL of blood. The blood was then centrifuged at 220 g for 20 minutes. The supernatant was saved to produce fibrin glue. The precipitate containing PRP was used for a second centrifugation at 480 g for 20 minutes. The pellet from the second centrifugation was saved and diluted with supernatant until the platelet concentration became 900,000/μL. Twenty male, 4-week-old OLETF rats underwent operation; each rat had two wounds created, one on the left and one on the right side. Each wound on the left side was treated with PRP gel, and the wound on the right side was treated with physiologic saline gauze. Results: RT-PCR analysis: the levels of MMP-2 mRNA in PRP-applied tissues were positively related to post-wounding days, whereas MMP-2 mRNA expression in saline-applied tissues remained unchanged at day 5 after treatment. MMP-9 mRNA was undetectable in saline-applied tissues except at day 3 after treatment. In PRP-applied tissues, MMP-9 mRNA expression was detected, with maximal expression seen on the third day. The levels of MMP-9 mRNA in PRP-applied tissues showed a higher optical density than those in saline-applied tissues.

Keywords: diabetes, MMP-2, MMP-9, OLETF, PRP, wound healing

Procedia PDF Downloads 262
24137 Antioxidant Activity of the Methanolic Extract and Antimicrobial Activity of the Essential Oil of Rosmarinus officinalis L. Grown in Algeria

Authors: Nassim Belkacem, Amina Azzam, Dalila Haouchine, Kahina Bennacer, Samira Soufit

Abstract:

Objective: To evaluate the antioxidant activity of the methanolic extract along with the antimicrobial activity of the essential oil of the aerial parts of Rosmarinus officinalis L. collected in the region of Bejaia (north-central Algeria). Materials and methods: The polyphenol and flavonoid contents of the methanolic extract were measured. The antioxidant activity was evaluated using two methods: the ABTS method and the DPPH assay. The antimicrobial activity was studied by the agar diffusion method against five bacterial strains (three Gram-positive and two Gram-negative strains) and one fungus. Results: The total polyphenol and flavonoid contents were about 43.8 mg gallic acid equivalent per gram (GA Eq/g) and 7.04 mg quercetin equivalent per gram (Q Eq/g), respectively. In the ABTS assay, the rosemary extract showed an inhibition of 98.02% at a concentration of 500 µg/ml, with a half-maximal inhibitory concentration (IC50) of 194.92 µg/ml. The DPPH assay showed that the rosemary extract has an inhibition of 94.67%, with an IC50 value of 17.87 µg/ml, compared with about 6.03 µg/ml for butylhydroxyanisole (BHA) and about 1.24 µg/ml for ascorbic acid. The yield in essential oil of rosemary obtained by hydrodistillation was 1.42%. Based on the determination of the inhibition diameters, different antimicrobial activities of the essential oil were revealed against the six tested microbes. Escherichia coli from the University Hospital (UH), Staphylococcus aureus (UH), and Pseudomonas aeruginosa ATCC have a minimum inhibitory concentration (MIC) of 62.5 µl/ml, whereas Bacillus sp. (UH) and Staphylococcus aureus ATCC have an MIC of 125 µl/ml. The inhibition zone against Candida sp. was about 24 mm. The aromatograms showed that the essential oil of rosemary exerts an antifungal activity greater than its antibacterial activity.

Keywords: Rosmarinus officinalis L., maceration, essential oil, antioxidant, antimicrobial activity

Procedia PDF Downloads 502
24136 Optical Fiber Data Throughput in a Quantum Communication System

Authors: Arash Kosari, Ali Araghi

Abstract:

A mathematical model for an optical-fiber communication channel is developed, resulting in an expression that calculates the throughput and loss of the corresponding link. The data are assumed to be transmitted using separate photons with different polarizations. The derived model also shows the dependency of the data throughput on the length of the channel and on the depolarization factor. It is observed that the absorption of photons affects the throughput more strongly than depolarization does. In addition, the probability of depolarization and the absorption of the radiated photons are obtained.
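The abstract reports the model's conclusions but not its closed form; as a rough illustration only, the sketch below assumes exponential photon absorption over the fiber length (Beer-Lambert attenuation) and a length-dependent depolarization probability, and counts a photon towards the throughput only if it is neither absorbed nor depolarized. The coefficients are placeholders, not values from the paper.

```python
import math

def link_throughput(rate_photons_s, length_km,
                    absorption_db_per_km=0.2, depol_prob_per_km=0.01):
    """Toy model: photons survive absorption with prob exp(-alpha*L) and keep
    their polarization with prob (1 - p)^L; both losses reduce throughput."""
    alpha = absorption_db_per_km * math.log(10) / 10.0     # dB/km -> 1/km
    p_survive_absorption = math.exp(-alpha * length_km)
    p_keep_polarization = (1.0 - depol_prob_per_km) ** length_km
    return rate_photons_s * p_survive_absorption * p_keep_polarization

for L in (10, 50, 100):
    print(L, "km ->", round(link_throughput(1e6, L), 1), "usable photons/s")
```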

Keywords: absorption, data throughput, depolarization, optical fiber

Procedia PDF Downloads 277
24135 Event Driven Dynamic Clustering and Data Aggregation in Wireless Sensor Network

Authors: Ashok V. Sutagundar, Sunilkumar S. Manvi

Abstract:

Energy, delay, and bandwidth are the prime issues of wireless sensor networks (WSNs). Energy usage optimization and efficient bandwidth utilization are important issues in WSN. Event-triggered data aggregation facilitates such optimal tasks for the event-affected area in a WSN. Reliable delivery of critical information to the sink node is also a major challenge of WSN. To tackle these issues, we propose an event-driven dynamic clustering and data aggregation scheme for WSN that enhances the lifetime of the network by minimizing redundant data transmission. The proposed scheme operates as follows: (1) Whenever an event is triggered, the event-triggered node selects the cluster head. (2) The cluster head gathers data from the sensor nodes within the cluster. (3) The cluster head identifies and classifies the events out of the collected data using a Bayesian classifier. (4) Aggregation of data is done using a statistical method. (5) The cluster head discovers the paths to the sink node using residual energy, path distance, and bandwidth. (6) If the aggregated data is critical, the cluster head sends the aggregated data over multiple paths for reliable data communication. (7) Otherwise, the aggregated data is transmitted towards the sink node over the single path having the most bandwidth and residual energy. The performance of the scheme is validated for various WSN scenarios to evaluate the effectiveness of the proposed approach in terms of aggregation time, cluster formation time, and energy consumed for aggregation.
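As a concrete (and much simplified) illustration of steps 1-7, the sketch below selects a cluster head by residual energy, aggregates cluster readings with a statistical summary (the mean), flags critical events with a placeholder threshold rule standing in for the Bayesian classifier, and routes over multiple paths only when the event is critical. The node fields, the path score, and the criticality rule are illustrative assumptions, not the authors' protocol.

```python
import statistics

nodes = [  # (node id, residual energy, sensed value)
    ("n1", 0.80, 47.0), ("n2", 0.55, 49.5), ("n3", 0.92, 101.3),
]
paths = [  # candidate routes to the sink: route, bandwidth, min residual energy
    {"route": ["n3", "a", "sink"], "bandwidth": 2.0, "energy": 0.6},
    {"route": ["n3", "b", "c", "sink"], "bandwidth": 1.2, "energy": 0.8},
]

# 1) event-triggered cluster head selection: highest residual energy
head = max(nodes, key=lambda n: n[1])

# 2-4) gather, classify (placeholder rule instead of a Bayesian classifier),
#      and aggregate with a statistical method (the mean here)
values = [v for _, _, v in nodes]
aggregate = statistics.mean(values)
critical = aggregate > 80.0          # stand-in for the classifier's decision

# 5) rank paths by a simple score combining bandwidth and residual energy
ranked = sorted(paths, key=lambda p: p["bandwidth"] * p["energy"], reverse=True)

# 6-7) critical data goes over multiple paths, otherwise over the best single path
chosen = ranked if critical else ranked[:1]
print("head:", head[0], "aggregate:", round(aggregate, 1), "critical:", critical)
for p in chosen:
    print("send via", " -> ".join(p["route"]))
```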

Keywords: wireless sensor network, dynamic clustering, data aggregation, wireless communication

Procedia PDF Downloads 435
24134 Offshore Outsourcing: Global Data Privacy Controls and International Compliance Issues

Authors: Michelle J. Miller

Abstract:

In recent years, two emerging issues that impact the global employment and business market have arisen that the legal community must review more closely: offshore outsourcing and data privacy. These two issues intersect because employment opportunities are shifting due to offshore outsourcing, and in some states, like the United States, anti-outsourcing legislation has been passed or proposed to retain jobs within the country. In addition, the legal requirement to retain the privacy of data as a global employer extends to employees and third-party service providers, including services outsourced to offshore locations. For this reason, this paper reviews the intersection of these two issues with a specific focus on data privacy.

Keywords: outsourcing, data privacy, international compliance, multinational corporations

Procedia PDF Downloads 397
24133 Weighted Data Replication Strategy for Data Grid Considering Economic Approach

Authors: N. Mansouri, A. Asadi

Abstract:

A Data Grid is a geographically distributed environment that deals with data-intensive applications in scientific and enterprise computing. Data replication is a common method used to achieve efficient and fault-tolerant data access in Grids. In this paper, a dynamic data replication strategy called Enhanced Latest Access Largest Weight (ELALW) is proposed. This strategy is an enhanced version of the Latest Access Largest Weight strategy. However, replication should be used wisely because the storage capacity of each Grid site is limited; thus, it is important to design an effective strategy for the replica replacement task. ELALW replaces replicas based on the expected number of future requests, the size of the replica, and the number of copies of the file. It also improves access latency by selecting the best replica when various sites hold replicas. The proposed replica selection picks the best replica location from among the many replicas based on the response time, which can be determined by considering the data transfer time, the storage access latency, the replica requests waiting in the storage queue, and the distance between nodes. Simulation results using OptorSim show that our replication strategy achieves better overall performance than other strategies in terms of job execution time, effective network usage, and storage resource usage.
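The exact ELALW weighting is defined in the paper; the sketch below only illustrates the two decisions it describes, with made-up scoring functions: evicting the replica with the lowest value when storage is full (fewer expected future requests, larger size, and more existing copies make a replica cheaper to drop) and selecting the source site with the smallest estimated response time (transfer time plus storage latency plus queueing plus distance).

```python
def replacement_value(expected_future_requests, size_mb, copies_in_grid):
    """Higher value = keep. Illustrative weighting, not the ELALW formula."""
    return expected_future_requests / (size_mb * copies_in_grid)

def response_time(size_mb, bandwidth_mb_s, storage_latency_s, queued_requests, hops):
    """Estimated time to fetch a replica from a candidate site."""
    return size_mb / bandwidth_mb_s + storage_latency_s * (1 + queued_requests) + 0.01 * hops

# replica replacement: evict the least valuable replica from a full site
local_replicas = {
    "fileA": replacement_value(expected_future_requests=12, size_mb=500, copies_in_grid=4),
    "fileB": replacement_value(expected_future_requests=2,  size_mb=800, copies_in_grid=6),
}
evict = min(local_replicas, key=local_replicas.get)

# replica selection: pick the site with the smallest estimated response time
candidates = {
    "site1": response_time(500, bandwidth_mb_s=40, storage_latency_s=0.2, queued_requests=3, hops=2),
    "site2": response_time(500, bandwidth_mb_s=25, storage_latency_s=0.1, queued_requests=0, hops=5),
}
best_site = min(candidates, key=candidates.get)
print("evict:", evict, "| fetch from:", best_site)
```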

Keywords: data grid, data replication, simulation, replica selection, replica placement

Procedia PDF Downloads 248
24132 Evaluation of Satellite and Radar Rainfall Product over Seyhan Plain

Authors: Kazım Kaba, Erdem Erdi, M. Akif Erdoğan, H. Mustafa Kandırmaz

Abstract:

Rainfall is a crucial data source for many different disciplines, such as agriculture, hydrology, and climate. Therefore, the rain rate should be well known, both spatially and temporally, for any area. Rainfall has traditionally been measured with rain gauges at meteorological ground stations for many years. At present, rainfall products are also acquired from radar and satellite images with temporal and spatial continuity. In this study, we investigated the accuracy of these rainfall data against rain-gauge data. For this purpose, we used the Adana-Hatay radar hourly total precipitation product (RN1) and the Meteosat convective rainfall rate (CRR) product over the Seyhan plain. We calculated daily rainfall values from the RN1 and CRR hourly precipitation products. We used the data of rainy days from four stations located within range of the radar, from October 2013 to November 2015. We examined the two rainfall products over the Seyhan plain and found the correlation between the rain-gauge data and the two raster rainfall products to be low.
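A minimal sketch of the evaluation step described above, assuming the hourly RN1/CRR products have already been extracted at a station location into simple time-indexed series (the values below are invented, and pandas is used purely for illustration): hourly values are summed to daily totals and compared with the gauge data through the Pearson correlation.

```python
import pandas as pd

# hypothetical hourly series at one station (mm/h); real data would come from
# the RN1 radar product, the Meteosat CRR product, and the rain gauge
hours = pd.date_range("2014-01-10", periods=96, freq="h")
rn1 = pd.Series([0.0] * 24 + [0.2] * 24 + [0.5] * 24 + [0.1] * 24, index=hours)
crr = pd.Series([0.1] * 24 + [0.1] * 24 + [0.6] * 24 + [0.0] * 24, index=hours)
gauge_daily = pd.Series([0.5, 5.0, 11.0, 3.0],
                        index=pd.date_range("2014-01-10", periods=4, freq="D"))

# aggregate the hourly products to daily totals and correlate with the gauge
rn1_daily = rn1.resample("D").sum()
crr_daily = crr.resample("D").sum()
print("RN1 vs gauge r =", rn1_daily.corr(gauge_daily))
print("CRR vs gauge r =", crr_daily.corr(gauge_daily))
```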

Keywords: meteosat, radar, rainfall, rain-gauge, Turkey

Procedia PDF Downloads 309
24131 Spatial Data Mining by Decision Trees

Authors: Sihem Oujdi, Hafida Belbachir

Abstract:

Existing data mining methods cannot be applied directly to spatial data because spatial specificities, such as spatial relationships, must be taken into account. This paper focuses on classification with decision trees, one of the data mining techniques. We propose an extension of the C4.5 algorithm for spatial data, based on two different approaches: join materialization and querying the different tables on the fly. Similar works have been done on these two main approaches; the first, join materialization, favors processing time at the expense of memory space, whereas the second, querying the tables on the fly, saves memory space at the expense of processing time. The modified C4.5 algorithm requires three input tables: a target table, a neighbor table, and a spatial join index that contains the possible spatial relationships among the objects in the target table and those in the neighbor table. The proposed algorithms are then applied to spatial data in the accidentology domain. A comparative study of our approach with other works on classification by spatial decision trees is detailed.
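To make the three-table input concrete, here is a small sketch (with invented columns) of the join-materialization approach: the target table, the neighbor table, and a spatial join index holding the pre-computed relationships are flattened into a single table on which an ordinary entropy-based decision tree, standing in for the extended C4.5, can be trained.

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# target table: the objects to classify (accident sites, say)
target = pd.DataFrame({"tid": [1, 2, 3],
                       "speed_limit": [50, 90, 50],
                       "severity": ["low", "high", "low"]})   # class label
# neighbor table: nearby spatial objects
neighbor = pd.DataFrame({"nid": [10, 11], "kind": ["school", "junction"]})
# spatial join index: pre-computed spatial relationships between the two tables
sji = pd.DataFrame({"tid": [1, 2, 2, 3], "nid": [10, 10, 11, 11],
                    "relation": ["near", "far", "near", "near"]})

# join materialization: flatten the three tables into one training table
flat = sji.merge(target, on="tid").merge(neighbor, on="nid")
X = pd.get_dummies(flat[["speed_limit", "kind", "relation"]])
y = flat["severity"]
clf = DecisionTreeClassifier(criterion="entropy").fit(X, y)   # C4.5-like split criterion
print(clf.predict(X.iloc[[0]]))
```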

Keywords: C4.5 algorithm, decision trees, S-CART, spatial data mining

Procedia PDF Downloads 601
24130 Data-Driven Dynamic Overbooking Model for Tour Operators

Authors: Kannapha Amaruchkul

Abstract:

We formulate a dynamic overbooking model for a tour operator, in which most reservations contain at least two people. The cancellation rate and the timing of the cancellation may depend on the group size. We propose two overbooking policies, namely economic-based and service-based. In an economic-based policy, we minimize the expected oversold and underused cost, whereas in a service-based policy, we ensure that the probability of an oversold situation does not exceed a pre-specified threshold. To illustrate the applicability of our approach, we use tour package data from 2016-2018 from a tour operator in Thailand to build a data-driven robust optimization model, and we test the proposed overbooking policy in 2019. We also compare the data-driven approach to the conventional approach of fitting the data to a probability distribution.
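As an illustration of the service-based policy only, the sketch below chooses the largest overbooking level for which a simulated probability of an oversold situation stays below the threshold, using an assumed per-booking cancellation probability and a hypothetical historical sample of group sizes; the paper's data-driven robust optimization model is not reproduced here.

```python
import random

random.seed(0)
CAPACITY = 40          # seats on the tour
P_CANCEL = 0.12        # per-booking cancellation probability (assumed)
GROUP_SIZES = [2, 2, 3, 2, 4, 2, 2, 5]   # historical group-size sample (assumed)
THRESHOLD = 0.05       # allowed probability of an oversold situation

def oversold_probability(extra_seats, trials=20_000):
    """Simulate accepting bookings up to CAPACITY + extra_seats seats."""
    oversold = 0
    for _ in range(trials):
        booked, shows = 0, 0
        while booked < CAPACITY + extra_seats:
            g = random.choice(GROUP_SIZES)
            booked += g
            if random.random() > P_CANCEL:   # whole group shows up or cancels
                shows += g
        if shows > CAPACITY:
            oversold += 1
    return oversold / trials

# service-based policy: largest overbooking level meeting the service threshold
level = 0
while oversold_probability(level + 1) <= THRESHOLD:
    level += 1
print("accept up to", CAPACITY + level, "seats of bookings")
```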

Keywords: applied stochastic model, data-driven robust optimization, overbooking, revenue management, tour operator

Procedia PDF Downloads 117
24129 Modeling and Statistical Analysis of a Soap Production Mix in Bejoy Manufacturing Industry, Anambra State, Nigeria

Authors: Okolie Chukwulozie Paul, Iwenofu Chinwe Onyedika, Sinebe Jude Ebieladoh, M. C. Nwosu

Abstract:

This research work is based on the statistical analysis of processing data. The aim is to analyze the data statistically and to generate a design model for the production mix of soap manufacturing products at the Bejoy manufacturing company, Nkpologwu, Aguata Local Government Area, Anambra State, Nigeria. The statistical analysis presents the summary statistics and the correlation of the data. T-tests, partial correlation, and bivariate correlation were used to understand what the data portray. The design model developed was used to model the production yield, and the correlation of the variables shows an R² of 98.7%. The results confirm that the data are fit for further analysis and modeling, as shown by the correlation and the R-squared value.

Keywords: general linear model, correlation, variables, Pearson, significance, T-test, soap, production mix, statistics

Procedia PDF Downloads 428
24128 Helping the Development of Public Policies with Knowledge of Criminal Data

Authors: Diego De Castro Rodrigues, Marcelo B. Nery, Sergio Adorno

Abstract:

The project aims to develop a framework for social data analysis, particularly by mobilizing criminal records and applying descriptive computational techniques, such as associative algorithms and the extraction of decision-tree rules, among others. The methods and instruments discussed in this work will enable the discovery of patterns, providing a guided means to identify similarities between recurring situations in the social sphere using descriptive techniques and data visualization. The study area has been defined as the city of São Paulo, with the structuring of social data as the central idea and a particular focus on the quality of the information. Given this, a set of tools will be validated, including the use of a database and tools for visualizing the results. Among the main deliverables, alongside the products and articles developed, are the discoveries made during the research phase. The effectiveness and utility of the results will depend on studies involving real data, validated both by domain experts and by identifying and comparing the patterns found in this study with other phenomena described in the literature. The intention is to contribute to evidence-based understanding and decision-making in the social field.

Keywords: social data analysis, criminal records, computational techniques, data mining, big data

Procedia PDF Downloads 68
24127 System Survivability in Networks in the Context of Defense/Attack Strategies: The Large Scale

Authors: Asma Ben Yaghlane, Mohamed Naceur Azaiez, Mehdi Mrad

Abstract:

We investigate large-scale networks in the context of network survivability under attack. We use appropriate techniques to evaluate both the attacker-based and the defender-based network survivability. The attacker is unaware of the links operated by the defender. Each attacked link has some pre-specified probability of being disconnected. The defender's choice is made so as to maximize the chance of successfully sending the flow to the destination node. The attacker, however, will select the cut-set with the highest chance of being disabled in order to partition the network. Moreover, we extend the problem to the case of selecting the best p paths to operate by the defender and the best k cut-sets to target by the attacker, for arbitrary integers p, k > 1. We investigate some variations of the problem and suggest polynomial-time solutions.
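The sketch below illustrates the basic quantities under a simplifying independence assumption (not the authors' exact model): each attacked link is disconnected independently with its given probability, so a chosen path succeeds only if none of its links is disabled, while a targeted cut-set partitions the network only if all of its links are disabled; the defender then picks the path with the highest survival probability and the attacker the cut-set with the highest disabling probability. The link probabilities and candidate sets are invented.

```python
from math import prod

# probability that each link is disconnected if attacked (assumed values)
p_disconnect = {"e1": 0.3, "e2": 0.2, "e3": 0.5, "e4": 0.4}

candidate_paths = [["e1", "e2"], ["e3", "e4"]]          # defender's options
candidate_cutsets = [["e1", "e3"], ["e2", "e4"]]        # attacker's options

def path_survival(path):
    # the flow arrives only if no link on the path is disconnected
    return prod(1 - p_disconnect[e] for e in path)

def cutset_success(cutset):
    # the network is partitioned only if every link in the cut-set is disabled
    return prod(p_disconnect[e] for e in cutset)

best_path = max(candidate_paths, key=path_survival)
best_cutset = max(candidate_cutsets, key=cutset_success)
print("defender operates", best_path, "survival =", round(path_survival(best_path), 3))
print("attacker targets", best_cutset, "success =", round(cutset_success(best_cutset), 3))
```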

Keywords: defense/attack strategies, large scale, networks, partitioning a network

Procedia PDF Downloads 263
24126 Effect of 8 Weeks of Intervention on Physical Fitness, Hepatokines, and Insulin Resistance in Obese Subjects

Authors: Adela Penesova, Zofia Radikova, Boris Bajer, Andrea Havranova, Miroslav Vlcek

Abstract:

Background: The aim of our study was to evaluate the effect of an intensified lifestyle intervention on insulin resistance (HOMA-IR), alanine aminotransferase (ALT), aspartate aminotransferase (AST), and fibroblast growth factor (FGF) 21 after 8 weeks of lifestyle intervention. Methods: A group of 43 obese patients (13 M/30 F; 43.0±12.4 years; body mass index (BMI) 31.2±6.3 kg/m²) participated in a weight loss intervention program (NCT02325804) following an 8-week hypocaloric diet (-30% energy expenditure) and physical activity of 150 minutes/week. Insulin sensitivity was evaluated according to the homeostasis model assessment of insulin resistance (HOMA-IR), and insulin sensitivity indices according to Matsuda and Cederholm were calculated (ISImat and ISIced). Plasma ALT, AST, fetuin-A, FGF 21, and physical fitness were measured. Results: The average reduction of body weight was 6.8±4.9 kg (0-15 kg; p=0.0006), accompanied by a significant reduction of fat mass (p=0.03) and waist circumference (p=0.02). Insulin sensitivity improved (HOMA-IR 2.71±3.90 vs. 1.24±0.83, p=0.01; ISImat 6.64±4.38 vs. 8.93±5.36, p ≤ 0.001). Total cholesterol, LDL cholesterol, and triglycerides decreased (p=0.05, p=0.04, p=0.04, respectively). Physical fitness, measured as VO2 max (maximal oxygen uptake), significantly improved after the intervention (p ≤ 0.001). ALT decreased significantly (pre 0.44±0.26 vs. post 0.33±0.18 µkat/l, p=0.004), whereas AST did not (pre 0.40±0.15 vs. post 0.35±0.09 µkat/l, p=0.07). The hepatokine fetuin-A significantly decreased after the intervention (43.1±10.8 vs. 32.6±8.6 ng/ml, p < 0.001), whereas FGF 21 levels only tended to decrease (146±152 vs. 132±164 pg/ml, p=0.07). Conclusion: An 8-week diet and physical activity intervention program in obese but otherwise healthy subjects led to an improvement in insulin resistance parameters and liver marker profiles, as well as increased physical fitness. This study was supported by grants APVV 15-0228 and VEGA 2/0161/16.
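For reference, the HOMA-IR values reported above follow the standard homeostasis model assessment formula computed from fasting measurements (the Matsuda and Cederholm indices additionally use glucose and insulin values sampled during an oral glucose tolerance test):

```latex
\mathrm{HOMA\text{-}IR} = \frac{\text{fasting insulin}\ [\mu\mathrm{U/mL}] \times \text{fasting glucose}\ [\mathrm{mmol/L}]}{22.5}
```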

Keywords: obesity, diet, exercise, insulin sensitivity

Procedia PDF Downloads 185
24125 Optimization of Real Time Measured Data Transmission, Given the Amount of Data Transmitted

Authors: Michal Kopcek, Tomas Skulavik, Michal Kebisek, Gabriela Krizanova

Abstract:

The operation of nuclear power plants involves continuous monitoring of the environment in their area. This monitoring is performed using a complex data acquisition system, which collects status information about the system itself and the values of many important physical variables, e.g., temperature, humidity, dose rate, etc. This paper describes a proposal for, and optimization of, the communication that takes place in a teledosimetric system between the central control server, which is responsible for data processing and storage, and the decentralized measuring stations, which measure the physical variables. Analyses of the ongoing communication were performed, and consequently the system architecture and communication were optimized.

Keywords: communication protocol, transmission optimization, data acquisition, system architecture

Procedia PDF Downloads 504
24124 Facial Recognition on the Basis of Facial Fragments

Authors: Tetyana Baydyk, Ernst Kussul, Sandra Bonilla Meza

Abstract:

There are many articles that attempt to establish the role of different facial fragments in face recognition. Various approaches are used to estimate this role. Frequently, authors calculate the entropy corresponding to the fragment; this approach can only give an approximate estimate. In this paper, we propose to use a more direct measure of the importance of different fragments for face recognition: we select a recognition method and a face database and experimentally investigate the recognition rate using different fragments of faces. We present two such experiments in the paper. We selected the PCNC neural classifier as the face recognition method and parts of the LFW (Labeled Faces in the Wild) face database as the training and testing sets. The recognition rate of the best experiment is comparable with the recognition rate obtained using the whole face.

Keywords: face recognition, labeled faces in the wild (LFW) database, random local descriptor (RLD), random features

Procedia PDF Downloads 341
24123 The Death of God: Between Nietzsche and Muhammad Iqbal

Authors: Maruf Hasan

Abstract:

This article investigates how Muhammad Iqbal integrated key aspects of Nietzsche's philosophy into Islamic spirituality and, at the same time, how Iqbal developed a Tawhidic paradigm that differed from Nietzsche's atheism. A literary study of Nietzsche's and Iqbal's texts, as well as of their influence on Western and Muslim discourse, is used in this study as part of a qualitative research methodology. The result is that Iqbal accepted most of Nietzsche's ideas while affirming Islamic Aqida. The article concludes that, so long as a Tawhidic paradigm is upheld, there are no restrictions on the integration of knowledge. Iqbal's writings serve as a template for the fusion of Western and Islamic thought, presenting a solution to the apparent incompatibility of these two sets of beliefs. This research emphasizes the value of engaging with other perspectives in order to deepen our awareness of the world and foster respect and understanding among cultures and religions.

Keywords: Nietzsche, Muhammad Iqbal, post-Islamism, post-Islamism interpretative approach

Procedia PDF Downloads 105
24122 Proposed Alternative System for Existing Traffic Signal System

Authors: Alluri Swaroopa, L. V. N. Prasad

Abstract:

Along with fast urbanization around the world, traffic control has become a big issue in urban construction. Having an efficient and reliable traffic control system is crucial to macro-traffic control. Traffic signals are used to manage conflicting requirements by allocating different sets of mutually compatible traffic movements during distinct time intervals. Many approaches have been proposed to solve this discrete stochastic problem. Recognizing the need to minimize right-of-way impacts while efficiently handling the anticipated high traffic volumes, the proposed alternative system offers an effective design. This model allows for increased traffic capacity and reduces delays by eliminating a step in maneuvering through the freeway interchange. The concept proposed in this paper involves the construction of bridges and ramps at the intersection of four roads to control vehicular congestion and to prevent traffic breakdown.

Keywords: bridges, junctions, ramps, urban traffic control

Procedia PDF Downloads 538
24121 The Duty of Application and Connection Providers Regarding the Supply of Internet Protocol by Court Order in Brazil to Determine Authorship of Acts Practiced on the Internet

Authors: João Pedro Albino, Ana Cláudia Pires Ferreira de Lima

Abstract:

Humanity has undergone a transformation from the physical to the virtual world, generating an enormous amount of data on the world wide web, known as big data. Many facts that occur in the physical world or in the digital world are proven through records made on the internet, such as digital photographs, posts on social media, contract acceptances on digital platforms, email, banking, and messaging applications, among others. These data recorded on the internet have been used as evidence in judicial proceedings. The identification of internet users is essential for the security of legal relationships. This research was carried out on scientific articles and materials from courses and lectures, with an analysis of Brazilian legislation and some judicial decisions on requests for static data from logs and Internet Protocol (IP) addresses held by application and connection providers. In this article, we address the determination of the authorship of data processing on the internet by obtaining the IP address, and the appropriate judicial procedure for this purpose under Brazilian law.

Keywords: IP address, digital forensics, big data, data analytics, information and communication technology

Procedia PDF Downloads 112
24120 Comparison of the Seismic Response of Planar Regular and Irregular Steel Frames

Authors: Robespierre Chavez, Eden Bojorquez, Alfredo Reyes-Salazar

Abstract:

This study compares the seismic response of regular and vertically irregular steel frames determined by nonlinear time-history analysis using several sets of earthquake records, divided into two categories: the first category contains 20 stiff-soil ground motion records obtained from the NGA database, and the second contains 30 soft-soil ground motions recorded in the Lake Zone of Mexico City and exhibiting a dominant period (Ts) of two seconds. Both the regular and the irregular steel frames were designed according to the Mexico City Seismic Design Provisions (MCSDP). The effects of irregularity along the height on the maximum interstory drifts are estimated.

Keywords: irregular steel frames, maximum interstory drifts, seismic response, seismic records

Procedia PDF Downloads 317
24119 Sourcing and Compiling a Maltese Traffic Dataset MalTra

Authors: Gabriele Borg, Alexei De Bono, Charlie Abela

Abstract:

There is a constant rise in the availability of high volumes of data gathered from multiple sources, resulting in an abundance of unprocessed information that can be used to monitor patterns and trends in user behaviour. Similarly, year after year, Malta is experiencing ongoing population growth and an increase in mobility demand. This research takes advantage of data which is continuously being sourced and converts it into useful information related to the traffic problem on Maltese roads. The scope of this paper is to provide a methodology for creating a custom dataset (MalTra - Malta Traffic) compiled from multiple participants at various locations across the island, in order to identify the most common routes taken and expose the main areas of activity. Such use of big data appears in various technologies referred to as intelligent transportation systems (ITSs), and it has been concluded that there is significant potential in utilising such sources of data on a nationwide scale.

Keywords: Big Data, vehicular traffic, traffic management, mobile data patterns

Procedia PDF Downloads 96
24118 Comparative Study of Accuracy of Land Cover/Land Use Mapping Using Medium Resolution Satellite Imagery: A Case Study

Authors: M. C. Paliwal, A. K. Jain, S. K. Katiyar

Abstract:

Accuracy assessment is very important for the classification of satellite imagery. In order to determine the accuracy of a classified image, the assumed-true data are usually derived from ground-truth data collected using the Global Positioning System. The data derived from the satellite imagery and the ground-truth data are then compared to find the accuracy, and error matrices are prepared. Overall and individual accuracies are calculated using different methods. The study illustrates advanced classification and accuracy assessment of land use/land cover mapping using satellite imagery. IRS-1C LISS-IV data were used for the classification of the satellite imagery. The satellite image was classified using the software into fourteen classes, namely water bodies, agricultural fields, forest land, urban settlement, barren land, unclassified area, etc. Classification of the satellite imagery and calculation of accuracy were done using ERDAS Imagine software to find the best method. This study is based on data collected for the Bhopal city boundary in Madhya Pradesh State, India.
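To make the accuracy-assessment step concrete, the sketch below builds an error (confusion) matrix from hypothetical reference and classified labels and derives the overall, producer's, and user's accuracies that such studies typically report; it illustrates the calculations only, not the ERDAS Imagine workflow or the study's actual data.

```python
from collections import Counter

# hypothetical per-sample labels: ground truth (GPS-verified) vs. classified image
classes = ["water", "agriculture", "forest", "urban"]
reference  = ["water", "water", "forest", "urban", "agriculture", "forest", "urban", "forest"]
classified = ["water", "forest", "forest", "urban", "agriculture", "forest", "agriculture", "forest"]

# error matrix: rows = classified class, columns = reference class
matrix = Counter(zip(classified, reference))
overall = sum(matrix[(c, c)] for c in classes) / len(reference)

for c in classes:
    ref_total = sum(matrix[(k, c)] for k in classes)       # column total
    cls_total = sum(matrix[(c, k)] for k in classes)       # row total
    producers = matrix[(c, c)] / ref_total if ref_total else float("nan")
    users = matrix[(c, c)] / cls_total if cls_total else float("nan")
    print(f"{c:12s} producer's acc = {producers:.2f}  user's acc = {users:.2f}")
print("overall accuracy =", round(overall, 2))
```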

Keywords: resolution, accuracy assessment, land use mapping, satellite imagery, ground truth data, error matrices

Procedia PDF Downloads 492
24117 Comparison of Different Artificial Intelligence-Based Protein Secondary Structure Prediction Methods

Authors: Jamerson Felipe Pereira Lima, Jeane Cecília Bezerra de Melo

Abstract:

The difficulty and cost of obtaining protein tertiary structure information through experimental methods, such as X-ray crystallography or NMR spectroscopy, have driven the development of computational methods for this purpose. One such approach is the prediction of the three-dimensional structure from the residue chain; however, this has been shown to be an NP-hard problem, due to the complexity of the process, as expressed by the Levinthal paradox. An alternative is the prediction of intermediary structures, such as the secondary structure of the protein. Artificial intelligence methods, such as Bayesian statistics, artificial neural networks (ANN), and support vector machines (SVM), among others, have been used to predict protein secondary structure. Due to their good results, artificial neural networks have become a standard method for predicting protein secondary structure. Recently published methods that use this technique generally achieve a Q3 accuracy between 75% and 83%, whereas the theoretical accuracy limit for protein secondary structure prediction is 88%. Alternatively, to achieve better results, prediction methods based on support vector machines have been developed. The statistical evaluation of methods that use different AI techniques, such as ANNs and SVMs, is not a trivial problem, since different training sets, validation techniques, and other variables can influence the behavior of a prediction method. In this study, we propose a prediction method based on artificial neural networks, which is then compared with a selected SVM method. The chosen SVM protein secondary structure prediction method is the one proposed by Huang in his work Extracting Physicochemical Features to Predict Protein Secondary Structure (2013). The developed ANN method follows the same training and testing process that Huang used to validate his method, comprising the CB513 protein data set and three-fold cross-validation, so that the comparative analysis can be made by directly comparing the statistical results of the two methods.
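Q3, the accuracy figure quoted above, is simply the fraction of residues whose predicted state (helix H, strand E, or coil C) matches the observed one; the sketch below computes it and shows the shape of a three-fold cross-validation loop over a protein set, using a trivial stand-in predictor and invented sequences rather than the ANN or SVM models compared in the paper.

```python
def q3(predicted, observed):
    """Fraction of residues assigned the correct H/E/C state."""
    assert len(predicted) == len(observed)
    return sum(p == o for p, o in zip(predicted, observed)) / len(observed)

# toy dataset: (residue sequence, observed secondary structure) pairs
proteins = [("MKTAYIAKQR", "CHHHHHHCCC"), ("GSSGSSG", "CCCCCCC"),
            ("VLSPADKTNV", "CEEEECCCCC"), ("AEIKHYQFNV", "CHHHCCEEEC"),
            ("LLILTLVVVT", "EEEEEEEEEC"), ("DDDDKGSMAS", "CCCCCCCCCC")]

def predict(sequence):
    return "C" * len(sequence)          # placeholder model: always predict coil

# three-fold cross-validation over the protein set (as with CB513 in the paper)
folds = [proteins[i::3] for i in range(3)]
for i, test_fold in enumerate(folds):
    # a real model would be trained on the other two folds here
    scores = [q3(predict(seq), ss) for seq, ss in test_fold]
    print(f"fold {i}: mean Q3 = {sum(scores) / len(scores):.2f}")
```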

Keywords: artificial neural networks, protein secondary structure, protein structure prediction, support vector machines

Procedia PDF Downloads 602
24116 Effect of Genuine Missing Data Imputation on Prediction of Urinary Incontinence

Authors: Suzan Arslanturk, Mohammad-Reza Siadat, Theophilus Ogunyemi, Ananias Diokno

Abstract:

Missing data is a common challenge in statistical analyses of most clinical survey datasets. A variety of methods have been developed to enable the analysis of survey data in the presence of missing values, and imputation is the most commonly used among them. However, in order to minimize the bias introduced by imputation, one must choose the right imputation technique and apply it to the correct type of missing data. In this paper, we identify different types of missing values: missing data due to skip patterns (SPMD), undetermined missing data (UMD), and genuine missing data (GMD), and we apply rough set imputation only to the GMD portion of the missing data. We used rough set imputation to evaluate the effect of such imputation on prediction by generating several simulation datasets based on an existing epidemiological dataset (MESA). To measure how well each dataset lends itself to the prediction model (logistic regression), we used p-values from the Wald test. To evaluate the accuracy of the prediction, we considered the width of the 95% confidence interval for the probability of incontinence. Both the imputed and the non-imputed simulation datasets were fit to the prediction model, and both turned out to be significant (p-value < 0.05). However, the Wald score shows a better fit for the imputed compared to the non-imputed datasets (28.7 vs. 23.4). The average confidence interval width decreased by 10.4% when the imputed dataset was used, meaning higher precision. The results show that using the rough set method for missing data imputation on GMD data improves the predictive capability of the logistic regression. Further studies are required to generalize this conclusion to other clinical survey datasets.
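The sketch below mirrors only the evaluation steps described above (a Wald statistic for the coefficient of interest and the average 95% confidence-interval width of the predicted probability) on a synthetic dataset; simple mean imputation stands in for the rough set imputation, and the data-generating values are invented, so the numbers carry no clinical meaning.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 800
age = rng.normal(60, 10, n)
parity = rng.normal(2, 1, n)
linpred = -8 + 0.1 * age + 0.3 * parity
y = (rng.random(n) < 1 / (1 + np.exp(-linpred))).astype(float)

# introduce missingness in one covariate (stand-in for genuine missing data)
parity_missing = parity.copy()
parity_missing[rng.random(n) < 0.3] = np.nan

def fit_and_evaluate(age_v, parity_v, y_v):
    X = sm.add_constant(np.column_stack([age_v, parity_v]))
    res = sm.Logit(y_v, X).fit(disp=0)
    wald = (res.params / res.bse) ** 2                     # Wald chi-square per coefficient
    # 95% CI width of the predicted probability via the delta method on the logit scale
    eta = X @ res.params
    se_eta = np.sqrt(np.einsum("ij,jk,ik->i", X, res.cov_params(), X))
    lo = 1 / (1 + np.exp(-(eta - 1.96 * se_eta)))
    hi = 1 / (1 + np.exp(-(eta + 1.96 * se_eta)))
    return float(wald[2]), float(np.mean(hi - lo))

# complete-case (non-imputed) analysis
keep = ~np.isnan(parity_missing)
print("complete case :", fit_and_evaluate(age[keep], parity_missing[keep], y[keep]))

# mean imputation standing in for rough set imputation of the GMD values
imputed = np.where(np.isnan(parity_missing), np.nanmean(parity_missing), parity_missing)
print("imputed       :", fit_and_evaluate(age, imputed, y))
```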

Keywords: rough set, imputation, clinical survey data simulation, genuine missing data, predictive index

Procedia PDF Downloads 151
24115 Database Management System for Orphanages to Help Track of Orphans

Authors: Srivatsav Sanjay Sridhar, Asvitha Raja, Prathit Kalra, Soni Gupta

Abstract:

A database management system keeps track of details about the people in an organisation. Not many orphanages these days have shifted to a computer- and program-based system; unfortunately, most have only pen-and-paper records, which not only consume space but are also not eco-friendly. It becomes a hassle when one has to view the record of a person, as one has to search through multiple records, which consumes time. This program organises all the data and can pull out any information about anyone whose data has been entered. It is also a safer way of storage, as physical records degrade over time or, worse, are destroyed by natural disasters. In this developing world, it is only sensible to shift all data to an electronic storage system. The program comes with all features, including creating, inserting, searching, and deleting records, as well as printing them.

Keywords: database, orphans, programming, C++

Procedia PDF Downloads 131
24114 Predicting the Impact of Scope Changes on Project Cost and Schedule Using Machine Learning Techniques

Authors: Soheila Sadeghi

Abstract:

In the dynamic landscape of project management, scope changes are an inevitable reality that can significantly impact project performance. These changes, whether initiated by stakeholders, external factors, or internal project dynamics, can lead to cost overruns and schedule delays. Accurately predicting the consequences of these changes is crucial for effective project control and informed decision-making. This study aims to develop predictive models to estimate the impact of scope changes on project cost and schedule using machine learning techniques. The research utilizes a comprehensive dataset containing detailed information on project tasks, including the Work Breakdown Structure (WBS), task type, productivity rate, estimated cost, actual cost, duration, task dependencies, scope change magnitude, and scope change timing. Multiple machine learning models are developed and evaluated to predict the impact of scope changes on project cost and schedule. These models include Linear Regression, Decision Tree, Ridge Regression, Random Forest, Gradient Boosting, and XGBoost. The dataset is split into training and testing sets, and the models are trained using the preprocessed data. Cross-validation techniques are employed to assess the robustness and generalization ability of the models. The performance of the models is evaluated using metrics such as Mean Squared Error (MSE) and R-squared. Residual plots are generated to assess the goodness of fit and identify any patterns or outliers. Hyperparameter tuning is performed to optimize the XGBoost model and improve its predictive accuracy. The feature importance analysis reveals the relative significance of different project attributes in predicting the impact on cost and schedule. Key factors such as productivity rate, scope change magnitude, task dependencies, estimated cost, actual cost, duration, and specific WBS elements are identified as influential predictors. The study highlights the importance of considering both cost and schedule implications when managing scope changes. The developed predictive models provide project managers with a data-driven tool to proactively assess the potential impact of scope changes on project cost and schedule. By leveraging these insights, project managers can make informed decisions, optimize resource allocation, and develop effective mitigation strategies. The findings of this research contribute to improved project planning, risk management, and overall project success.
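A compact sketch of the modeling pipeline described above, trained on a small synthetic stand-in for the project dataset (the real feature set, WBS elements, and hyperparameter tuning are not reproduced): several regressors are compared with cross-validation on MSE and R², and feature importances are read from the tree ensemble. XGBoost is replaced here by scikit-learn's GradientBoostingRegressor to keep the sketch dependency-light.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.model_selection import cross_validate

rng = np.random.default_rng(42)
n = 300
data = pd.DataFrame({
    "productivity_rate": rng.uniform(0.5, 1.5, n),
    "estimated_cost": rng.uniform(10, 500, n),
    "duration": rng.uniform(5, 120, n),
    "task_dependencies": rng.integers(0, 8, n),
    "scope_change_magnitude": rng.uniform(0, 0.5, n),
    "scope_change_timing": rng.uniform(0, 1, n),   # fraction of schedule elapsed
})
# synthetic target: cost impact grows with magnitude, late timing, and dependencies
data["cost_impact"] = (data["estimated_cost"] * data["scope_change_magnitude"]
                       * (1 + data["scope_change_timing"])
                       * (1 + 0.1 * data["task_dependencies"])
                       + rng.normal(0, 5, n))

X, y = data.drop(columns="cost_impact"), data["cost_impact"]
models = {"linear": LinearRegression(), "ridge": Ridge(),
          "tree": DecisionTreeRegressor(random_state=0),
          "forest": RandomForestRegressor(random_state=0),
          "boosting": GradientBoostingRegressor(random_state=0)}

for name, model in models.items():
    cv = cross_validate(model, X, y, cv=5,
                        scoring=("neg_mean_squared_error", "r2"))
    print(f"{name:8s} MSE={-cv['test_neg_mean_squared_error'].mean():8.1f} "
          f"R2={cv['test_r2'].mean():.3f}")

# feature importance from the boosting model, fitted on all data
importances = models["boosting"].fit(X, y).feature_importances_
print(pd.Series(importances, index=X.columns).sort_values(ascending=False))
```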

Keywords: cost impact, machine learning, predictive modeling, schedule impact, scope changes

Procedia PDF Downloads 19
24113 A Fuzzy Soft Recruitment Model (FSRM) for Faculty Selection

Authors: G. S. Thakur

Abstract:

This paper presents a Fuzzy Soft Recruitment Model (FSRM) for faculty selection in MHRD technical institutions. The selection criteria are based on the 4-tier flexible structure of these institutions. The Advisory Committee on Faculty Recruitment (ACoFAR) suggested nine criteria for faculty in the proposed FSRM. The fuzzy soft model is proposed in consultation with ACoFAR, based on these selection criteria. Fuzzy soft distance similarity measures are applied to find the best faculty from the applicant pool.
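The paper's own fuzzy soft distance and similarity measures are not reproduced here; the sketch below only illustrates the general mechanism with one common choice, a normalized Hamming-type distance between an applicant's fuzzy membership grades over the nine criteria and an ideal candidate profile, turned into a similarity score used to rank the applicant pool. The criteria names and grades are invented.

```python
CRITERIA = ["qualification", "experience", "research", "teaching", "publications",
            "projects", "awards", "interview", "demo_lecture"]

ideal = dict.fromkeys(CRITERIA, 1.0)    # ideal candidate: full membership everywhere

applicants = {
    "A": dict(zip(CRITERIA, [0.9, 0.7, 0.8, 0.9, 0.6, 0.5, 0.4, 0.8, 0.9])),
    "B": dict(zip(CRITERIA, [0.8, 0.9, 0.6, 0.7, 0.9, 0.7, 0.6, 0.7, 0.6])),
    "C": dict(zip(CRITERIA, [0.6, 0.5, 0.9, 0.6, 0.8, 0.9, 0.3, 0.6, 0.7])),
}

def similarity(profile, reference):
    """Similarity = 1 - normalized Hamming distance between membership grades."""
    distance = sum(abs(profile[c] - reference[c]) for c in CRITERIA) / len(CRITERIA)
    return 1.0 - distance

ranking = sorted(applicants, key=lambda a: similarity(applicants[a], ideal), reverse=True)
for a in ranking:
    print(a, round(similarity(applicants[a], ideal), 3))
```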

Keywords: fuzzy soft set, fuzzy sets, fuzzy soft distance, fuzzy soft similarity measures, ACoFAR

Procedia PDF Downloads 329
24112 Euthanasia in Dementia Cases: An Interview Study of Dutch Physicians' Experiences

Authors: J. E. Appel, R. N. Bouwmeester, L. Crombach, K. Georgieva, N. O’Shea, T. I. van Rijssel, L. Wingens

Abstract:

The Netherlands has a unique and progressive euthanasia law. Even people with advanced neurodegenerative diseases, like dementia, can request euthanasia when an advance euthanasia directive (AED) has been written. Although the law sets some guidelines, many complexities occur in practice. Doctors, especially, experience difficult situations, as they have to decide whether euthanasia is justified. Research suggests that this leads to an emotional burden for them, due to feelings of isolation, fear of prosecution, and pressure from the patient, family, or society. The existing literature, however, has failed to address the problems arising in dementia cases in particular, as well as possible sources of support. In order to investigate these issues, semi-structured in-depth interviews with 20 Dutch general practitioners and elderly care physicians will be conducted. Results are expected to be obtained by the end of December 2017.

Keywords: dementia, euthanasia, general practitioners, elderly care physicians, palliative care

Procedia PDF Downloads 201
24111 Using Implicit Data to Improve E-Learning Systems

Authors: Slah Alsaleh

Abstract:

In recent years, with the popularity of the internet and technology, e-learning has become a major part of the majority of education systems. One of the advantages that e-learning systems provide is the large amount of information available about students' behavior while communicating with the e-learning system. Such information is very rich, and it can be used to improve the capability and efficiency of e-learning systems. This paper discusses how e-learning can benefit from implicit data in different ways, including creating homogeneous groups of students, evaluating students' learning, creating behavior profiles for students, and identifying students through their behavior.
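As one concrete example of the first use mentioned above (creating homogeneous groups of students from implicit behavior data), here is a minimal sketch that clusters students by simple interaction features with k-means; the features and values are invented, and the other uses (evaluation, profiling, identification) would build on the same kind of feature matrix.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# implicit behavior features logged by the e-learning system (hypothetical values):
# logins per week, minutes per session, forum posts, quiz attempts
students = ["s1", "s2", "s3", "s4", "s5", "s6"]
behavior = np.array([
    [5, 40, 3, 2], [6, 35, 4, 3], [1, 10, 0, 1],
    [2, 12, 0, 1], [7, 55, 6, 4], [1, 8, 1, 0],
], dtype=float)

# standardize the features, then group students into homogeneous clusters
X = StandardScaler().fit_transform(behavior)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
for student, group in zip(students, labels):
    print(student, "-> group", group)
```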

Keywords: e-learning, implicit data, user behavior, data mining

Procedia PDF Downloads 296