Search results for: network data envelopment analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 43689

41649 Circular Tool and Dynamic Approach to Grow the Entrepreneurship of Macroeconomic Metabolism

Authors: Maria Areias, Diogo Simões, Ana Figueiredo, Anishur Rahman, Filipa Figueiredo, João Nunes

Abstract:

It is expected that close to 7 billion people will live in urban areas by 2050. To improve the sustainability of territories and their transition towards a circular economy, it is necessary to understand their metabolism and to promote and guide the entrepreneurial response. The study of a macroeconomic metabolism involves quantifying the inputs, outputs and storage of energy, water, materials and wastes for an urban region; this quantification and analysis represent an opportunity for the promotion of green entrepreneurship. There are several methods to assess the environmental impacts of an urban territory, such as human and environmental risk assessment (HERA), life cycle assessment (LCA), ecological footprint assessment (EF), material flow analysis (MFA), physical input-output tables (PIOT), ecological network analysis (ENA) and multicriteria decision analysis (MCDA), among others. However, no consensus exists about which of these assessment methods is best suited to analyze the sustainability of such complex systems. Taking into account the weaknesses and needs identified, the CiiM - Circular Innovation Inter-Municipality project aims to define a uniform and globally accepted methodology, through the integration of various methodologies and dynamic approaches, to increase the efficiency of macroeconomic metabolisms and to promote entrepreneurship in a circular economy. The pilot territory considered in the CiiM project has a total area of 969,428 ha and a total of 897,256 inhabitants (about 41% of the population of the Center Region). The main economic activities in the pilot territory, which contribute to a gross domestic product of 14.4 billion euros, are: social support activities for the elderly; construction of buildings; road transport of goods; retailing in supermarkets and hypermarkets; mass production of other garments; inpatient health facilities; and the manufacture of other components and accessories for motor vehicles. The region's business network consists mostly of micro and small companies (similar to the Central Region of Portugal), with a total of 53,708 companies identified in the CIM Region of Coimbra (39 large companies), 28,146 in the CIM Viseu Dão Lafões (22 large companies) and 24,953 in CIM Beiras and Serra da Estrela (13 large companies). The database was built from data available at the National Institute of Statistics (INE), the General Directorate of Energy and Geology (DGEG), Eurostat, Pordata, the Strategy and Planning Office (GEP), the Portuguese Environment Agency (APA), the Commission for Coordination and Regional Development (CCDR) and the Inter-municipal Communities (CIM), as well as dedicated databases. In addition to the collection of statistical data, it was necessary to identify and characterize the different stakeholder groups in the pilot territory that are relevant to the different metabolism components under analysis. The CiiM project also adds the potential of a Geographic Information System (GIS), so that geospatial results of the territorial metabolisms (rural and urban) of the pilot region can be obtained. This platform will be a powerful tool for visualizing the flows of products and services that occur within the region and will support the stakeholders in improving their circular performance and identifying new business ideas and symbiotic partnerships.

Keywords: circular economy tools, life cycle assessment, macroeconomic metabolism, multicriteria decision analysis, decision support tools, circular entrepreneurship, industrial and regional symbiosis

Procedia PDF Downloads 96
41648 Dynamic Analysis and Vibration Response of Thermoplastic Rolling Elements in a Rotor Bearing System

Authors: Nesrine Gaaliche

Abstract:

This study provides a finite element (FE) dynamic model for analyzing the vibration response of a rolling bearing system. The vibration responses of polypropylene bearings with and without defects are studied using FE analysis and compared to experimental data. The viscoelastic behavior of the thermoplastic is investigated to evaluate the influence of material flexibility and damping viscosity. The vibrations are detected using 3D dynamic analysis. Peak vibrations are more noticeable for an inner ring defect than for an outer ring defect, according to the test data. The performance of thermoplastic bearings is compared to that of metal parts using vibration signals. Both the test and numerical results show that polypropylene bearings exhibit less vibration than their steel counterparts. Unlike bearings made from metal, polypropylene bearings absorb vibrations and tolerate shaft misalignments. Following validation against the overall vibration spectrum data, Von Mises stresses inside the rings are assessed under high loads. The simulation findings show that stress is significantly high under the balls. For the test cases, the computational findings correspond closely to the experimental results.

Keywords: viscoelastic, FE analysis, polypropylene, bearings

Procedia PDF Downloads 100
41647 Infrastructural Investment and Economic Growth in Indian States: A Panel Data Analysis

Authors: Jonardan Koner, Basabi Bhattacharya, Avinash Purandare

Abstract:

The study investigates the impact of infrastructural investment on economic development in Indian states. It uses panel data analysis to measure the impact of infrastructural investment on real Gross Domestic Product in Indian states, incorporating a unit root test, a cointegration test, pooled ordinary least squares, the fixed effect approach, the random effect approach, and the Hausman test. The study analyzes annual panel data from 1991 to 2012 and concludes that infrastructural investment has a desirable impact on economic development in Indian states. Finally, the study reveals that infrastructural investment significantly explains the variation in the economic indicator.
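
As a concrete illustration of this estimation pipeline, the sketch below runs pooled OLS and a within-transformation fixed-effects estimator on synthetic state-level data and compares the slopes with a Hausman-style statistic. It is not the authors' code: the variables (infra, gdp), the panel dimensions, and the simplified Hausman comparison (against pooled OLS rather than a full random-effects fit) are assumptions for illustration only.

```python
# Illustrative sketch: pooled OLS vs. within-transformation fixed effects
# on a synthetic state-level panel. Variable names are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
states, years = 15, 22                      # e.g. an annual panel, 1991-2012
state_effect = rng.normal(0, 1, states)     # unobserved state heterogeneity
rows = []
for i in range(states):
    for t in range(years):
        infra = rng.uniform(1, 10) + 0.5 * state_effect[i]  # correlated regressor
        gdp = 2.0 + 0.8 * infra + state_effect[i] + rng.normal(0, 0.5)
        rows.append((i, 1991 + t, infra, gdp))
df = pd.DataFrame(rows, columns=["state", "year", "infra", "gdp"])

def ols(y, X):
    """Coefficients and their covariance for y = X b + e."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    return b, sigma2 * np.linalg.inv(X.T @ X)

# Pooled OLS (ignores state heterogeneity)
X = np.column_stack([np.ones(len(df)), df["infra"]])
b_pooled, cov_pooled = ols(df["gdp"].to_numpy(), X)

# Fixed effects: demean within each state (within transformation)
g = df.groupby("state")
y_w = (df["gdp"] - g["gdp"].transform("mean")).to_numpy()
x_w = (df["infra"] - g["infra"].transform("mean")).to_numpy().reshape(-1, 1)
b_fe, cov_fe = ols(y_w, x_w)

# Hausman-style check: a large gap between the FE and pooled slopes suggests
# the effects are correlated with the regressor, so FE is preferred.
diff = b_fe[0] - b_pooled[1]
stat = diff**2 / abs(cov_fe[0, 0] - cov_pooled[1, 1])
print(f"pooled slope={b_pooled[1]:.3f}  FE slope={b_fe[0]:.3f}  H={stat:.2f}")
```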

Keywords: infrastructural investment, real GDP, unit root test, cointegration test, pooled ordinary least squares, fixed effect approach, random effect approach, Hausman test

Procedia PDF Downloads 397
41646 Impact of Climate Change on Sea Level Rise along the Coastline of Mumbai City, India

Authors: Chakraborty Sudipta, A. R. Kambekar, Sarma Arnab

Abstract:

Sea-level rise is one of the most important impacts of anthropogenically induced climate change, resulting from global warming and the melting of icebergs at the Arctic and Antarctic. This paper reviews investigations carried out by various researchers during the last decade, both on the Indian coast and elsewhere, and aims to assess how consistently different suggested methods predict future sea level rise along the coast of Mumbai. Case studies on the East Coast, the Southern Tip, and the West and South West coasts of India have been reviewed. Coastal Vulnerability Indices of several important international places have been compared and found to match Intergovernmental Panel on Climate Change forecasts. The review covers the application of Geographic Information System mapping and the use of remote sensing technology, with both Multi Spectral Scanner and Thematic Mapper data from Landsat classified through the Iterative Self-Organizing Data Analysis Technique, for arriving at high, moderate and low Coastal Vulnerability Index values for various important coastal cities. Instead of purely data-driven, hindcast-based forecasts of Significant Wave Height, incorporating the additional impact of sea level rise has been suggested. The efficacy and limitations of numerical methods vis-à-vis Artificial Neural Networks have been assessed, and the importance of the Root Mean Square error of numerical results is noted. Among the computerized methods compared, forecast results obtained from MIKE 21 have been judged more reliable than those from the Delft 3D model.

Keywords: climate change, Coastal Vulnerability Index, global warming, sea level rise

Procedia PDF Downloads 130
41645 Analysis of Delivery of Quad Play Services

Authors: Rahul Malhotra, Anurag Sharma

Abstract:

Fiber-based access networks can deliver performance that supports the increasing demand for high-speed connections. One of the technologies that has emerged in recent years is the Passive Optical Network. This paper demonstrates the simultaneous delivery of triple-play services (data, voice, and video) and presents a comparative investigation of the suitability of various data rates. It is shown that as the data rate increases, the number of users that can be accommodated decreases due to the increase in bit error rate.

Keywords: FTTH, quad play service, access networks, data rate

Procedia PDF Downloads 405
41644 A Comparative Study on Deep Learning Models for Pneumonia Detection

Authors: Hichem Sassi

Abstract:

Pneumonia, being a respiratory infection, has garnered global attention due to its rapid transmission and relatively high mortality rates. Timely detection and treatment play a crucial role in significantly reducing mortality associated with pneumonia. Presently, X-ray diagnosis stands out as a reasonably effective method. However, the manual scrutiny of a patient's X-ray chest radiograph by a proficient practitioner usually requires 5 to 15 minutes. In situations where cases are concentrated, this places immense pressure on clinicians for timely diagnosis. Relying solely on the visual acumen of imaging doctors proves to be inefficient, particularly given the low speed of manual analysis. Therefore, the integration of artificial intelligence into the clinical image diagnosis of pneumonia becomes imperative. Additionally, AI recognition is notably rapid, with convolutional neural networks (CNNs) demonstrating superior performance compared to human counterparts in image identification tasks. To conduct our study, we utilized a dataset comprising chest X-ray images obtained from Kaggle, encompassing a total of 5216 training images and 624 test images, categorized into two classes: normal and pneumonia. Employing five mainstream network algorithms, we undertook a comprehensive analysis to classify these diseases within the dataset, subsequently comparing the results. The integration of artificial intelligence, particularly through improved network architectures, stands as a transformative step towards more efficient and accurate clinical diagnoses across various medical domains.
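
A minimal sketch of such a comparative pipeline is given below, assuming PyTorch/torchvision are available and the Kaggle directory layout chest_xray/{train,test}/{NORMAL,PNEUMONIA}. The abstract does not name its five architectures, so two common ImageNet backbones stand in here to show the comparison loop; this is not the author's code.

```python
# Illustrative comparison loop: fine-tune several pretrained CNNs on the
# two-class chest X-ray dataset and report test accuracy for each.
import torch, torch.nn as nn
from torchvision import datasets, models, transforms
from torch.utils.data import DataLoader

tfm = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),   # X-rays are single channel
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_ds = datasets.ImageFolder("chest_xray/train", tfm)
test_ds = datasets.ImageFolder("chest_xray/test", tfm)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)
test_dl = DataLoader(test_ds, batch_size=32)

def make_model(name):
    if name == "resnet18":
        m = models.resnet18(weights="IMAGENET1K_V1")
        m.fc = nn.Linear(m.fc.in_features, 2)      # normal vs. pneumonia head
    else:
        m = models.mobilenet_v2(weights="IMAGENET1K_V1")
        m.classifier[-1] = nn.Linear(m.classifier[-1].in_features, 2)
    return m

device = "cuda" if torch.cuda.is_available() else "cpu"
for name in ["resnet18", "mobilenet_v2"]:
    model = make_model(name).to(device)
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for x, y in train_dl:                          # one epoch for brevity
        x, y = x.to(device), y.to(device)
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
    model.eval()
    correct = 0
    with torch.no_grad():
        for x, y in test_dl:
            pred = model(x.to(device)).argmax(1).cpu()
            correct += (pred == y).sum().item()
    print(f"{name}: test accuracy {correct / len(test_ds):.3f}")
```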

Keywords: deep learning, computer vision, pneumonia, models, comparative study

Procedia PDF Downloads 58
41643 Presentation of HVA Faults in SONELGAZ Underground Network and Methods of Faults Diagnostic and Faults Location

Authors: I. Touaïbia, E. Azzag, O. Narjes

Abstract:

Power supply networks are growing continuously, and their reliability is more important than ever. The complexity of the whole network comprises numerous components that can fail and interrupt the power supply to the end user. Underground distribution systems are normally exposed to permanent faults due to specific construction characteristics, and visual inspection cannot be performed in these systems. In order to enhance service restoration, accurate fault location techniques must be applied. This paper describes the different faults that affect the underground distribution system of SONELGAZ (National Society of Electricity and Gas of Algeria) and a cable fault location procedure based on the impulse reflection method (time-domain reflectometry, TDR), which analyzes the cable's response to an electromagnetic impulse and thereby allows cable fault pre-location. The results are obtained from real tests on an underground distribution feeder of the electrical network of the energy distribution company of Souk-Ahras, in order to determine the influence of cable characteristics on the types and frequency of faults.
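
The pre-location principle behind TDR can be illustrated numerically: the fault distance follows from d = v·t/2, where v is the propagation velocity in the cable and t the round-trip delay of the reflected impulse. The sketch below uses synthetic data with hypothetical cable parameters, not the Souk-Ahras measurements.

```python
# Illustrative TDR pre-location on a synthetic trace. All values hypothetical.
import numpy as np

c = 3.0e8          # speed of light, m/s
vop = 0.55         # velocity-of-propagation factor typical of power cables
v = vop * c

fs = 1.0e9                                # 1 GS/s sampling of the response
t = np.arange(0, 20e-6, 1 / fs)
pulse = np.exp(-((t - 0.2e-6) / 50e-9) ** 2)          # injected impulse

fault_distance = 1350.0                               # metres (ground truth)
delay = 2 * fault_distance / v                        # round-trip time
echo = -0.6 * np.exp(-((t - 0.2e-6 - delay) / 50e-9) ** 2)  # negative: short
trace = pulse + echo + 0.01 * np.random.default_rng(1).normal(size=t.size)

# Pre-location: delay between the injected edge and the strongest reflection
i_tx = np.argmax(np.abs(pulse))
i_rx = i_tx + 100 + np.argmax(np.abs(trace[i_tx + 100:]))  # skip outgoing pulse
estimated = v * (t[i_rx] - t[i_tx]) / 2                    # d = v * t / 2
print(f"estimated fault distance ~ {estimated:.0f} m (true {fault_distance:.0f} m)")
```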

Keywords: distribution networks, fault location, TDR, underground cable

Procedia PDF Downloads 525
41642 The Right to Data Portability and Its Influence on the Development of Digital Services

Authors: Roman Bieda

Abstract:

The General Data Protection Regulation (GDPR) will come into force on 25 May 2018, creating a new legal framework for the protection of personal data in the European Union. Article 20 of the GDPR introduces a right to data portability. This right allows data subjects to receive the personal data which they have provided to a data controller in a structured, commonly used and machine-readable format, and to transmit these data to another data controller. The right to data portability, by facilitating the transfer of personal data between IT environments (e.g., applications), will also make it easier to change the provider of a service (e.g., to change a bank or a cloud computing service provider). It will therefore contribute to the development of competition and of the digital market. The aim of this paper is to discuss the right to data portability and its influence on the development of new digital services.

Keywords: data portability, digital market, GDPR, personal data

Procedia PDF Downloads 470
41641 Intrusion Detection and Prevention System (IDPS) in Cloud Computing Using Anomaly-Based and Signature-Based Detection Techniques

Authors: John Onyima, Ikechukwu Ezepue

Abstract:

Virtualization and cloud computing are among the fastest-growing computing innovations in recent times. Organisations all over the world are moving their computing services towards the cloud because of its rapid transformation of organizational infrastructure and its improvement of efficient resource utilization and cost reduction. However, this technology brings new security threats and challenges concerning safety, reliability and data confidentiality. Evidently, no single security technique can guarantee protection against malicious attacks on a cloud computing network; hence an integrated model of an intrusion detection and prevention system is proposed. Anomaly-based and signature-based detection techniques are integrated to enable the network and its hosts to defend themselves with some level of intelligence. The anomaly-based detection was implemented using the local deviation factor graph-based (LDFGB) algorithm, while the signature-based detection was implemented using Snort. Results from these collaborative intrusion detection and prevention techniques show a robust and efficient security architecture for cloud computing networks.
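
The integration idea can be sketched as follows. Since LDFGB is not available in common libraries, scikit-learn's LocalOutlierFactor (another local-density method) stands in here for the anomaly detector, and a toy rule list stands in for Snort signatures; the feature names, rules and thresholds are hypothetical.

```python
# Hybrid IDS sketch: signature rules first, anomaly score as a fallback.
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

SIGNATURES = [                                   # toy stand-ins for Snort rules
    ("port-scan", lambda f: f["dst_ports"] > 100),
    ("syn-flood", lambda f: f["syn_rate"] > 500),
]

def signature_check(flow):
    return [name for name, rule in SIGNATURES if rule(flow)]

# Train the density-based detector on benign traffic features:
# [packet rate, SYN rate, distinct destination ports]
rng = np.random.default_rng(0)
normal = rng.normal([50, 5, 10], [10, 2, 3], size=(500, 3))
lof = LocalOutlierFactor(n_neighbors=20, novelty=True).fit(normal)

def inspect(flow):
    """Hybrid decision: known signatures block, anomalies alert."""
    hits = signature_check(flow)
    if hits:
        return f"BLOCK (signature: {', '.join(hits)})"
    x = np.array([[flow["pkt_rate"], flow["syn_rate"], flow["dst_ports"]]])
    if lof.predict(x)[0] == -1:                  # -1 means outlier
        return "ALERT (anomalous flow)"
    return "ALLOW"

print(inspect({"pkt_rate": 52, "syn_rate": 4, "dst_ports": 9}))     # benign
print(inspect({"pkt_rate": 55, "syn_rate": 700, "dst_ports": 12}))  # signature hit
print(inspect({"pkt_rate": 400, "syn_rate": 40, "dst_ports": 60}))  # anomaly only
```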

Keywords: anomaly-based detection, cloud computing, intrusion detection, intrusion prevention, signature-based detection

Procedia PDF Downloads 297
41640 Preparation of Wireless Networks and Security; Challenges in Efficient Accession of Encrypted Data in Healthcare

Authors: M. Zayoud, S. Oueida, S. Ionescu, P. AbiChar

Abstract:

Background: Wireless sensor networks encompass diversified information technology tools and are widely applied in a range of domains, including military surveillance, weather forecasting, and earthquake forecasting. Strong foundations are continually being developed for wireless sensor networks, yet security issues usually emerge during professional application; essential technological tools therefore need to be assessed for secure aggregation of data. Moreover, such practices have to be incorporated into healthcare, where they serve the mutual interest of all parties. Objective: Aggregation of encrypted data has been assessed through a homomorphic stream cipher to ensure its effectiveness and to provide optimal solutions for the field of healthcare. Methods: An experimental design was employed, which utilized a newly developed cipher along with CPU-constrained devices. Modular additions were also employed to evaluate the nature of the aggregated data. The operation of the homomorphic stream cipher is illustrated with different sensors and modular additions. Results: The homomorphic stream cipher proved to be a simple and secure process that allows efficient aggregation of encrypted data, and its application has led to improvements in healthcare practices. Statistical values can easily be computed from the aggregate using the selected cipher; the variance, mean, and standard deviation of the sensed data were computed with the selected tool. Conclusion: It can be concluded that a homomorphic stream cipher can be an ideal tool for appropriate aggregation of data and can also provide sound solutions for the healthcare sector.
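
A minimal sketch of the kind of additively homomorphic stream cipher the abstract describes (encryption by modular addition of a keystream, in the style of Castelluccia et al.) is shown below; the modulus, key handling and sensor readings are illustrative assumptions, not the authors' cipher.

```python
# Additively homomorphic stream cipher sketch: the aggregator sums
# ciphertexts only, and the sink recovers the sum of the plaintexts.
import hashlib

M = 2**32                       # modulus large enough to hold the aggregate

def keystream(key: bytes, nonce: int) -> int:
    """Per-epoch keystream value derived from a sensor key and a nonce."""
    h = hashlib.sha256(key + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(h[:4], "big")

def encrypt(m: int, key: bytes, nonce: int) -> int:
    return (m + keystream(key, nonce)) % M          # c = m + k mod M

# Three sensors report heart-rate readings for the same epoch (nonce)
readings = {b"sensor-1": 72, b"sensor-2": 88, b"sensor-3": 65}
nonce = 42
ciphertexts = [encrypt(m, k, nonce) for k, m in readings.items()]

# Aggregation happens on ciphertexts only: no decryption en route
aggregate_c = sum(ciphertexts) % M

# The sink knows all sensor keys, so it can strip the summed keystream
key_sum = sum(keystream(k, nonce) for k in readings) % M
aggregate_m = (aggregate_c - key_sum) % M
print(aggregate_m, "==", sum(readings.values()))    # 225 == 225

# Statistics such as the mean follow directly from the aggregate
print("mean:", aggregate_m / len(readings))
```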

Keywords: aggregation, cipher, homomorphic stream, encryption

Procedia PDF Downloads 255
41639 Forecasting Nokoué Lake Water Levels Using Long Short-Term Memory Network

Authors: Namwinwelbere Dabire, Eugene C. Ezin, Adandedji M. Firmin

Abstract:

The prediction of hydrological flows (rainfall-depth or rainfall-discharge) is becoming increasingly important in the management of hydrological risks such as floods. In this study, the Long Short-Term Memory (LSTM) network, a state-of-the-art algorithm dedicated to time series, is applied to predict the daily water level of Nokoué Lake in Benin. This paper aims to provide an effective and reliable method capable of reproducing the future daily water level of Nokoué Lake, which is influenced by a combination of two phenomena: rainfall and river flow (runoff from the Ouémé River, the Sô River, the Porto-Novo lagoon, and the Atlantic Ocean). Performance analysis based on the forecasting horizon indicates that the LSTM can predict the water level of Nokoué Lake up to a forecast horizon of t+10 days. Performance metrics such as Root Mean Square Error (RMSE), coefficient of correlation (R²), Nash-Sutcliffe Efficiency (NSE), and Mean Absolute Error (MAE) agree on a forecast horizon of up to t+3 days, and their values remain stable for horizons of t+1, t+2, and t+3 days. The values of R² and NSE are greater than 0.97 during the training and testing phases in the Nokoué Lake basin. Based on the evaluation indices used to assess the model's performance, a forecast horizon of t+3 days is chosen for predicting future daily water levels.
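
A compact sketch of the training setup is given below, assuming TensorFlow/Keras and using a synthetic seasonal series in place of the Nokoué Lake record. It is univariate for brevity; the study's drivers (rainfall, river inflows) would enter as additional feature channels. The 30-day window length is an assumption.

```python
# LSTM on sliding windows of past daily levels, predicting the level t+3
# days ahead (the horizon the study retains). Synthetic data throughout.
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(0)
days = np.arange(3000)
level = 1.5 + 0.6 * np.sin(2 * np.pi * days / 365) + 0.05 * rng.normal(size=days.size)

window, horizon = 30, 3                 # 30 past days -> level at t+3
X = np.stack([level[i:i + window] for i in range(len(level) - window - horizon)])
y = level[window + horizon:]
X = X[..., None]                        # shape (samples, timesteps, features)

split = int(0.8 * len(X))
model = keras.Sequential([
    keras.layers.LSTM(64, input_shape=(window, 1)),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X[:split], y[:split], epochs=5, batch_size=32, verbose=0)

pred = model.predict(X[split:], verbose=0).ravel()
rmse = float(np.sqrt(np.mean((pred - y[split:]) ** 2)))
# Nash-Sutcliffe Efficiency: 1 - SSE / variance of the observations
nse = 1 - np.sum((y[split:] - pred) ** 2) / np.sum((y[split:] - y[split:].mean()) ** 2)
print(f"RMSE={rmse:.3f} m  NSE={nse:.3f}")
```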

Keywords: forecasting, long short-term memory cell, recurrent artificial neural network, Nokoué lake

Procedia PDF Downloads 60
41638 Modern Information Security Management and Digital Technologies: A Comprehensive Approach to Data Protection

Authors: Mahshid Arabi

Abstract:

With the rapid expansion of digital technologies and the internet, information security has become a critical priority for organizations and individuals. The widespread use of digital tools such as smartphones and internet networks facilitates the storage of vast amounts of data, but simultaneously, vulnerabilities and security threats have significantly increased. The aim of this study is to examine and analyze modern methods of information security management and to develop a comprehensive model to counteract threats and information misuse. This study employs a mixed-methods approach, including both qualitative and quantitative analyses. Initially, a systematic review of previous articles and research in the field of information security was conducted. Then, using the Delphi method, interviews with 30 information security experts were conducted to gather their insights on security challenges and solutions. Based on the results of these interviews, a comprehensive model for information security management was developed. The proposed model includes advanced encryption techniques, machine learning-based intrusion detection systems, and network security protocols. AES and RSA encryption algorithms were used for data protection, and machine learning models such as Random Forest and Neural Networks were utilized for intrusion detection. Statistical analyses were performed using SPSS software. To evaluate the effectiveness of the proposed model, T-Test and ANOVA statistical tests were employed, and results were measured using accuracy, sensitivity, and specificity indicators of the models. Additionally, multiple regression analysis was conducted to examine the impact of various variables on information security. The findings of this study indicate that the comprehensive proposed model reduced cyber-attacks by an average of 85%. Statistical analysis showed that the combined use of encryption techniques and intrusion detection systems significantly improves information security. Based on the obtained results, it is recommended that organizations continuously update their information security systems and use a combination of multiple security methods to protect their data. Additionally, educating employees and raising public awareness about information security can serve as an effective tool in reducing security risks. This research demonstrates that effective and up-to-date information security management requires a comprehensive and coordinated approach, including the development and implementation of advanced techniques and continuous training of human resources.

Keywords: data protection, digital technologies, information security, modern management

Procedia PDF Downloads 25
41637 Game-Theory-Based on Downlink Spectrum Allocation in Two-Tier Networks

Authors: Yu Zhang, Ye Tian, Fang Ye, Yixuan Kang

Abstract:

The capacity of conventional cellular networks has reached its upper bound, which can be addressed by introducing low-cost, easy-to-deploy femtocells. The spectrum interference issue becomes more critical as value-added multimedia services grow in two-tier cellular networks, and spectrum allocation is one of the effective methods of interference mitigation. This paper proposes a game-theory-based OFDMA downlink spectrum allocation scheme aimed at reducing co-channel interference in two-tier femtocell networks. The framework is formulated as a non-cooperative game wherein the femto base stations are the players and the available frequency channels are the strategies. The scheme takes full account of competitive behavior and fairness among stations, and the utility function reflects interference essentially from the standpoint of the channels. This work focuses on co-channel interference and puts forward a negative-logarithm interference function on the distance weight ratio, aimed at suppressing co-channel interference within the same network layer. This scenario is well suited to actual network deployment, and the system possesses high robustness. Under the proposed mechanism, interference exists only when players employ the same channel for data communication, and spectrum allocation is implemented in a distributed fashion. Numerical results show that the signal to interference and noise ratio is noticeably improved by the spectrum allocation scheme and that users' downlink quality of service can be satisfied. Moreover, simulation results show that the average spectrum efficiency of the cellular network is significantly improved.
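
The distributed flavor of such a scheme can be sketched with best-response dynamics: each femto base station repeatedly switches to the channel that minimizes the co-channel interference it sees, with interference weighted by distance. The sketch below uses a generic inverse-distance weighting rather than the paper's exact negative-logarithm utility, and all positions and counts are invented.

```python
# Best-response channel selection among femto base stations (toy model).
import numpy as np

rng = np.random.default_rng(3)
n_bs, n_ch = 12, 4
pos = rng.uniform(0, 100, size=(n_bs, 2))            # femtocell positions (m)
ch = rng.integers(0, n_ch, size=n_bs)                # initial channel choices

def interference(i, c):
    """Co-channel interference seen by station i if it uses channel c."""
    total = 0.0
    for j in range(n_bs):
        if j != i and ch[j] == c:
            d = np.linalg.norm(pos[i] - pos[j])
            total += 1.0 / max(d, 1.0)               # closer stations hurt more
    return total

for _ in range(50):                                   # best-response dynamics
    changed = False
    for i in range(n_bs):
        best = min(range(n_ch), key=lambda c: interference(i, c))
        if best != ch[i]:
            ch[i], changed = best, True
    if not changed:                                   # Nash equilibrium reached
        break

print("channel assignment:", ch)
print("max residual interference:",
      max(interference(i, ch[i]) for i in range(n_bs)))
```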

Keywords: femtocell networks, game theory, interference mitigation, spectrum allocation

Procedia PDF Downloads 155
41636 [Keynote Talk]: Knowledge Codification and Innovation Success within Digital Platforms

Authors: Wissal Ben Arfi, Lubica Hikkerova, Jean-Michel Sahut

Abstract:

This study examines interfirm networks in the digital transformation era and, in particular, how tacit knowledge codification affects innovation success within digital platforms. One of the most important features of digital transformation and innovation process outcomes is the emergence of digital platforms, as interfirm networks, at the heart of open innovation. This research aims to illuminate how digital platforms influence inter-organizational innovation through virtual team interactions and knowledge sharing practices within an interfirm network. Consequently, it contributes to the strategic management literature on new product development (NPD), open innovation, industrial management, and the management of emerging interfirm networks. The empirical findings show, on the one hand, that knowledge conversion may be enhanced especially by socialization, which seems to be the most important phase, as it plays a crucial role in holding the virtual team members together. On the other hand, in the process of socialization, tacit knowledge codification is crucial because it provides the structure needed for the interfirm network actors to interact and act towards common goals, which favors the emergence of open innovation. Finally, our results identify several conditions, necessary but not always sufficient, for interfirm managers involved in NPD and innovation, concerning strategies to shape increasingly interconnected and borderless markets and business collaborations. In the digital transformation era, the need for adaptive and innovative business models as well as new and flexible network forms is becoming more significant than ever. Supported by technological advancements and digital platforms, companies can benefit from increased market opportunities and create new markets for their innovations through alliances and collaborative strategies, as a mode of reducing or eliminating environmental uncertainty and entry barriers. Consequently, an efficient and well-structured interfirm network is essential to create network capabilities, ensure tacit knowledge sharing, enhance organizational learning, and foster open innovation success within digital platforms.

Keywords: interfirm networks, digital platform, virtual teams, open innovation, knowledge sharing

Procedia PDF Downloads 123
41635 Analysis of Road Network Vulnerability Due to Merapi Volcano Eruption

Authors: Imam Muthohar, Budi Hartono, Sigit Priyanto, Hardiansyah Hardiansyah

Abstract:

The eruption of Merapi Volcano in Yogyakarta, Indonesia in 2010 caused many casualties due to minimal preparedness in facing the disaster. Increasing evacuation capacity and moving people to safe places are therefore very important for minimizing casualties. The regional government, through the Regional Disaster Management Agency, has divided the disaster-prone area into three parts: ring 1 at a distance of 10 km, ring 2 at a distance of 15 km, and ring 3 at a distance of 20 km from the center of Mount Merapi. The success of an evacuation is fully supported by the road network infrastructure as the means of rescue in an emergency. This research models the evacuation process triggered by the rise of refugees in ring 1, expanded to ring 2 and finally to ring 3. The model was developed using the SATURN (Simulation and Assignment of Traffic to Urban Road Networks) program, version 11.3.12W, involving 140 centroids, 449 buffer nodes, and 851 links across the Yogyakarta Special Region, and was aimed at a preliminary identification of road networks considered vulnerable to the disaster. Vulnerability was identified from changes in road network performance, in the form of flows and travel times, over ring 1, ring 2, ring 3, Sleman outside the rings, Yogyakarta City, Bantul, Kulon Progo, and Gunung Kidul. The results indicate increased flows and travel times on the road networks in ring 2, ring 3, and Sleman outside the rings; the road network in ring 1 shows an increase when the evacuation is expanded to rings 2 and 3. Meanwhile, the performance of the road networks in Yogyakarta City, Bantul, Kulon Progo, and Gunung Kidul decreases as the evacuation areas are expanded. This preliminary identification determined that the road networks in ring 1, ring 2, ring 3, and Sleman outside the rings are vulnerable during an evacuation following a Mount Merapi eruption, and they therefore deserve a great deal of attention in preparing for disasters that may occur at any time.

Keywords: model, evacuation, SATURN, vulnerability

Procedia PDF Downloads 165
41634 Simulation Study of a Fault at the Switch on the Operation of the Doubly Fed Induction Generator Based on the Wind Turbine

Authors: N. Zerzouri, N. Benalia, N. Bensiali

Abstract:

This work is devoted to an analysis of the operation of a doubly fed induction generator (DFIG) integrated into a wind system. The power transfer between the stator and the network is carried out by acting on the rotor via a bidirectional signal converter. The analysis addresses a fault in the converter due to an interruption of the control of one semiconductor. Simulation results obtained with the MATLAB/Simulink software illustrate the quality of the power generated under the fault condition.

Keywords: doubly fed induction generator (DFIG), wind power generation, back-to-back PWM converter, switching fault

Procedia PDF Downloads 462
41633 High-Fidelity Materials Screening with a Multi-Fidelity Graph Neural Network and Semi-Supervised Learning

Authors: Akeel A. Shah, Tong Zhang

Abstract:

Computational approaches to learning the properties of materials are commonplace, motivated by the need to screen or design materials for a given application, e.g., semiconductors and energy storage. Experimental approaches can be both time consuming and costly. Unfortunately, computational approaches such as ab-initio electronic structure calculations and classical or ab-initio molecular dynamics can themselves be too slow for the rapid evaluation of materials, which often involves thousands to hundreds of thousands of candidates. Machine-learning-assisted approaches have been developed to overcome the time limitations of purely physics-based approaches. These approaches, on the other hand, require large volumes of data for training (hundreds of thousands of examples on many standard data sets such as QM7b). This means that they are limited by how quickly such a large data set of physics-based simulations can be established. At high fidelity, such as configuration interaction, composite methods such as G4, and coupled cluster theory, gathering such a large data set can become infeasible, which can compromise the accuracy of the predictions; many applications require high accuracy, for example band structures and energy levels in semiconductor materials and the energetics of charge transfer in energy storage materials. In order to circumvent this problem, multi-fidelity approaches can be adopted, for example the Δ-ML method, which learns a high-fidelity output from a low-fidelity result such as Hartree-Fock or density functional theory (DFT). The general strategy is to learn a map between the low- and high-fidelity outputs, so that the high-fidelity output is obtained as a simple sum of the physics-based low-fidelity result and the learned correction. Although this requires a low-fidelity calculation, it typically requires far fewer high-fidelity results to learn the correction map; furthermore, the low-fidelity result, such as Hartree-Fock or semi-empirical ZINDO, is typically quick to obtain. For high-fidelity outputs the result can be a speed-up of an order of magnitude or more. In this work, a new multi-fidelity approach is developed, based on a graph convolutional network (GCN) combined with semi-supervised learning. The GCN allows the material or molecule to be represented as a graph, which is known to improve accuracy, as in, for example, SchNet and MEGNet. The graph incorporates information regarding the numbers, types and properties of atoms; the types of bonds; and bond angles. The key to the accuracy of multi-fidelity methods, however, is the incorporation of the low-fidelity output to learn its high-fidelity equivalent, in this case by learning their difference. Semi-supervised learning is employed to allow for different numbers of low- and high-fidelity training points, by using an additional GCN-based low-fidelity map to predict high-fidelity outputs. It is shown on four different data sets that a significant (at least one order of magnitude) increase in accuracy is obtained, using one to two orders of magnitude fewer low- and high-fidelity training points. One of the data sets is developed in this work, pertaining to 1000 simulations of quinone molecules (up to 24 atoms) at 5 different levels of fidelity, furnishing the energy, dipole moment and HOMO/LUMO levels.
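
The Δ-ML idea itself is easy to demonstrate in isolation. In the sketch below a plain scikit-learn regressor stands in for the paper's GCN (graph construction is omitted), the low- and high-fidelity values are synthetic, and only a small subset carries high-fidelity labels; the model learns just the correction.

```python
# Delta-ML sketch: predict high fidelity = cheap low fidelity + learned delta.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n, d = 2000, 10
X = rng.normal(size=(n, d))                      # descriptors (stand-in for graphs)
f_high = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * X[:, 2]
f_low = np.sin(X[:, 0]) + 0.45 * X[:, 1] ** 2          # cheap, systematically off

# Only a small subset has expensive high-fidelity labels
idx_hi = rng.choice(n, size=100, replace=False)
delta_model = RandomForestRegressor(n_estimators=200, random_state=0)
delta_model.fit(X[idx_hi], (f_high - f_low)[idx_hi])    # learn the correction

# Prediction everywhere = low-fidelity value + learned correction
f_pred = f_low + delta_model.predict(X)
mae_delta = np.mean(np.abs(f_pred - f_high))
mae_direct = np.mean(np.abs(f_low - f_high))            # low fidelity alone
print(f"MAE low-fidelity alone: {mae_direct:.4f}  delta-ML: {mae_delta:.4f}")
```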

Keywords: materials screening, computational materials, machine learning, multi-fidelity, graph convolutional network, semi-supervised learning

Procedia PDF Downloads 33
41632 Passengers’ Behavior Analysis under the Public Transport Disruption: An Agent-Based Simulation

Authors: M. Rahimi, F. Corman

Abstract:

This paper studies the travel behavior of passengers during a public transport disruption under different information provision strategies. We develop a within-day multi-agent simulation approach to evaluate the behavior of the agents under comprehensive scenarios covering various levels of information exposure, in both equilibrium and non-equilibrium settings. In particular, we quantify the effects of information strategies in a disruption situation on passengers' satisfaction, the number of involved agents, and the delay caused. An agent-based micro-simulation model (MATSim) is applied to the city of Zürich, Switzerland, for activity-based simulation in a multimodal network. Statistical outcomes are analysed for all the agents who may be involved in the disruption. The agents' movements in the public transport network illustrate their adaptation to the available information about the disruption. The agents' delays and utilities reveal that information significantly affects satisfaction and delay in a public transport disruption. Moreover, while earlier availability of the information leads to smaller consequent delays for the involved agents, it also increases the number of affected agents.

Keywords: agent-based simulation, disruption management, passengers’ behavior simulation, public transport

Procedia PDF Downloads 140
41631 Supply Network Design for Production-Distribution of Fish: A Sustainable Approach Using Mathematical Programming

Authors: Nicolás Clavijo Buriticá, Laura Viviana Triana Sanchez

Abstract:

This research addresses a productive context associated with the aquaculture industry in northern Tolima, Colombia, specifically in the town of Lerida. It deals with strategic aspects of the fish production-distribution chain, especially those related to the supply network design of an association devoted to the cultivation, farming, processing and marketing of fish. The research adopts a particular approach to Supply Chain Management (SCM) that orients management objectives towards system sustainability, known as Sustainable Supply Chain Management (SSCM). The network design of the fish production-distribution system is obtained for the case study by means of two mathematical programming models that maximize the economic benefits of the chain and minimize total supply chain costs, taking into account restrictions to protect the environment and their implications for system productivity. The results of the mathematical models, validated against the productive situation of the partnership under study, called Asopiscinorte, show the variation in the number of open or closed locations in the supply network, which determines the final network configuration. For the case study, the proposed configuration generates an increase of 31.5% in the partial productivity of storage and processing, in addition to possible favorable long-term implications, such as whether or not to serve a given consumer area promptly, whether or not to increase the level of sales in several areas, and meeting, in quantity, time and cost, the work-in-progress and finished-goods requirements of the various actors in the chain.
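
The flavor of such a model can be conveyed by a small facility-location MILP with an environmental cap, written here with PuLP. The site names, costs, capacities, emission factors and the cap are invented placeholders; the authors' two actual models and data are not reproduced.

```python
# Open/close processing sites and assign market demand, minimizing fixed plus
# shipping costs subject to capacity and a total-emissions restriction.
import pulp

sites = {"Lerida": (900, 50), "SiteB": (700, 35), "SiteC": (600, 25)}  # (fixed cost, capacity t)
markets = {"M1": 30, "M2": 25, "M3": 20}                               # demand (t)
ship = {(s, m): 2 + i + j for i, s in enumerate(sites) for j, m in enumerate(markets)}
emission = {"Lerida": 1.0, "SiteB": 1.4, "SiteC": 1.8}                 # tCO2 per t processed
EMISSION_CAP = 110.0

prob = pulp.LpProblem("fish_network", pulp.LpMinimize)
open_ = pulp.LpVariable.dicts("open", sites, cat="Binary")
flow = pulp.LpVariable.dicts("flow", ship, lowBound=0)

# Objective: fixed opening costs + shipping costs
prob += (pulp.lpSum(sites[s][0] * open_[s] for s in sites)
         + pulp.lpSum(ship[k] * flow[k] for k in ship))

for m, dem in markets.items():                       # demand satisfaction
    prob += pulp.lpSum(flow[(s, m)] for s in sites) == dem
for s, (_, cap) in sites.items():                    # capacity only if open
    prob += pulp.lpSum(flow[(s, m)] for m in markets) <= cap * open_[s]
# Environmental restriction of the SSCM approach: cap total emissions
prob += pulp.lpSum(emission[s] * flow[(s, m)]
                   for s in sites for m in markets) <= EMISSION_CAP

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("status:", pulp.LpStatus[prob.status])
for s in sites:
    print(s, "open" if open_[s].value() == 1 else "closed",
          [round(flow[(s, m)].value(), 1) for m in markets])
```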

Keywords: sustainable supply chain, mathematical programming, aquaculture industry, supply chain design, supply chain configuration

Procedia PDF Downloads 534
41630 Mobile Learning: Toward Better Understanding of Compression Techniques

Authors: Farouk Lawan Gambo

Abstract:

Data compression shrinks files into fewer bits than their original representation. It is particularly advantageous on the internet because the smaller a file, the faster it can be transferred; however, most of the concepts in data compression are abstract in nature, making them difficult to digest for some students (engineers in particular). To determine the best approach to learning data compression techniques, this paper first studies the learning preferences of engineering students, who tend to have strong active, sensing, visual and sequential learning preferences, and then studies the advantages that mobile learning offers: learning at the point of interest, efficiency, connectivity, and many more. A survey was carried out with a reasonable number of students, selected through random sampling, to see whether taking the learning preferences and the advantages of mobile learning into account gives a promising improvement over the traditional way of learning. Evidence from the data analysis, performed in MS Excel with attention to error-free findings, shows a significant difference in the students after using learning content provided on a smartphone. The findings, presented in bar charts and pie charts, also indicate that mobile learning is a promising mode of learning.

Keywords: data analysis, compression techniques, learning content, traditional learning approach

Procedia PDF Downloads 344
41629 Long Short-Term Memory Based Model for Modeling Nicotine Consumption Using an Electronic Cigarette and Internet of Things Devices

Authors: Hamdi Amroun, Yacine Benziani, Mehdi Ammi

Abstract:

In this paper, we determine whether an accurate prediction of nicotine concentration can be obtained by using a network of smart objects and an e-cigarette. The approach consists of, first, the recognition of factors influencing smoking cessation, such as physical activity and participants' behaviors (using both a smartphone and a smartwatch), and then the prediction of the configuration of the e-cigarette (in terms of nicotine concentration, power, and resistance). The study uses a network of commonly connected objects, namely a smartwatch, a smartphone, and an e-cigarette, carried by the participants during an uncontrolled experiment. The data obtained from the sensors of the three devices were used to train a Long Short-Term Memory (LSTM) network. Results show that our LSTM-based model allows predicting the configuration of the e-cigarette in terms of nicotine concentration, power, and resistance with root mean square error percentages of 12.9%, 9.15%, and 11.84%, respectively. This study can help to better control nicotine consumption and offer an intelligent configuration of the e-cigarette to users.

Keywords: IoT, activity recognition, automatic classification, unconstrained environment

Procedia PDF Downloads 219
41628 Analysis of Exponential Nonuniform Transmission Line Parameters

Authors: Mounir Belattar

Abstract:

This paper presents an analysis of the voltage waves that propagate along a lossless exponential nonuniform line, whose parameters are assumed to be varying functions of the distance x along the line from the source end. The approach is based on the cascaded two-port network representation, deriving the ABCD transmission parameters with the Picard-Carson method, a powerful method for obtaining power series solutions for distributed networks because it makes poles and zeros easy to calculate and solves differential equations such as the telegrapher's equations by an iterative sequence. The impedance, admittance, voltage and current along the line are expanded as Taylor series in x/l, where l is the total length of the line, to obtain the main transmission line characteristics, such as the voltage response and the transmission and reflection coefficients, represented by scattering parameters in the frequency domain.
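
The cascading idea can be checked numerically without the series machinery: approximate the exponential line by many short uniform slices and multiply their ABCD matrices. The sketch below does this for a hypothetical taper and parameter values; it is a discretized stand-in for, not an implementation of, the Picard-Carson expansion.

```python
# Cascade of lumped two-port slices approximating a lossless exponential line.
import numpy as np

f = 10e6                     # 10 MHz operating frequency
w = 2 * np.pi * f
L0, C0 = 250e-9, 100e-12     # H/m, F/m at the source end (Z0 = 50 ohm)
q = 0.1                      # exponential taper: L(x)=L0*e^{qx}, C(x)=C0*e^{-qx}
length, N = 10.0, 2000
dx = length / N

ABCD = np.eye(2, dtype=complex)
for k in range(N):
    x = (k + 0.5) * dx
    L, C = L0 * np.exp(q * x), C0 * np.exp(-q * x)
    Z, Y = 1j * w * L * dx, 1j * w * C * dx     # series Z, shunt Y of one slice
    seg = np.array([[1 + Z * Y, Z],             # lumped L-section two-port
                    [Y, 1.0]], dtype=complex)
    ABCD = ABCD @ seg                            # cascade = matrix product

A, B, Cp, D = ABCD.ravel()
ZL = np.sqrt(L0 / C0) * np.exp(q * length)      # local characteristic impedance
Zin = (A * ZL + B) / (Cp * ZL + D)              # input impedance seen by source
Z0 = np.sqrt(L0 / C0)
gamma_in = (Zin - Z0) / (Zin + Z0)              # input reflection coefficient
print(f"Zin = {Zin.real:.1f}{Zin.imag:+.1f}j ohm, |reflection| = {abs(gamma_in):.3f}")
```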

Keywords: ABCD parameters, characteristic impedance, exponential nonuniform transmission line, Picard-Carson's method, S parameters, Taylor's series

Procedia PDF Downloads 440
41627 Autonomic Threat Avoidance and Self-Healing in Database Management System

Authors: Wajahat Munir, Muhammad Haseeb, Adeel Anjum, Basit Raza, Ahmad Kamran Malik

Abstract:

Databases are key components of software systems. Owing to the exponential growth of data, ensuring that data remain accurate and available is a major concern. The data in databases are vulnerable to internal and external threats, especially when they contain sensitive information, as in medical or military applications. Whenever data are changed with malicious intent, the results of data analysis may lead to disastrous decisions. Autonomic self-healing in computer systems is modeled after the autonomic system of the human body. In order to guarantee the accuracy and availability of data, we propose a technique which, on a priority basis, tries to prevent any malicious transaction from executing and, in case a malicious transaction does affect the system, heals the system in an isolated mode in such a way that system availability is not compromised. Using this autonomic system, the management cost and time of DBAs can be minimized. Finally, we test our model and present the findings.

Keywords: autonomic computing, self-healing, threat avoidance, security

Procedia PDF Downloads 502
41626 ChaQra: A Cellular Unit of the Indian Quantum Network

Authors: Shashank Gupta, Iteash Agarwal, Vijayalaxmi Mogiligidda, Rajesh Kumar Krishnan, Sruthi Chennuri, Deepika Aggarwal, Anwesha Hoodati, Sheroy Cooper, Ranjan, Mohammad Bilal Sheik, Bhavya K. M., Manasa Hegde, M. Naveen Krishna, Amit Kumar Chauhan, Mallikarjun Korrapati, Sumit Singh, J. B. Singh, Sunil Sud, Sunil Gupta, Sidhartha Pant, Sankar, Neha Agrawal, Ashish Ranjan, Piyush Mohapatra, Roopak T., Arsh Ahmad, Nanjunda M., Dilip Singh

Abstract:

Major research interests in quantum key distribution (QKD) are primarily focussed on increasing 1. point-to-point transmission distance (1000 km), 2. secure key rate (Mbps), and 3. security of the quantum layer (device independence). It is great to push the boundaries on these fronts, but these isolated approaches are neither scalable nor cost-effective due to the requirements of specialised hardware and different infrastructure. Current and future QKD networks require addressing different sets of challenges apart from distance, key rate, and quantum security. In this regard, we present ChaQra, a sub-quantum network with the following core features: 1) crypto agility (integration into already deployed telecommunication fibres), 2) software-defined networking (the SDN paradigm for routing between different nodes), 3) reliability (addressing denial of service with hybrid quantum-safe cryptography), 4) upgradability (module upgrades based on scientific and technological advancements), and 5) beyond-QKD services (using the QKD network for distributed computing, multi-party computation, etc.). Our results demonstrate a clear path to creating and accelerating a quantum-secure Indian subcontinent under the National Quantum Mission.

Keywords: quantum network, quantum key distribution, quantum security, quantum information

Procedia PDF Downloads 51
41625 Anomaly Detection in Financial Markets Using Tucker Decomposition

Authors: Salma Krafessi

Abstract:

The financial markets are a multifaceted, intricate environment in which enormous volumes of data are produced every day. To find investment opportunities, possible fraudulent activity, and market oddities, accurate anomaly identification in these data is essential. Conventional methods for detecting anomalies frequently fail to capture the complex organization of financial data. In order to improve the identification of abnormalities in financial time series data, this study presents Tucker Decomposition as a reliable multi-way analysis approach. We start by gathering closing prices for the S&P 500 index across a number of decades. The information is converted to a three-dimensional tensor format, which contains internal characteristics and temporal sequences in a sliding window structure. The tensor is then broken down by Tucker Decomposition into a core tensor and matching factor matrices, allowing latent patterns and relationships in the data to be captured. The reconstruction error from the Tucker Decomposition is a possible sign of abnormalities: by setting a statistical threshold, we are able to identify large deviations that indicate unusual behavior. A thorough examination that contrasts the Tucker-based method with traditional anomaly detection approaches validates our methodology. The outcomes demonstrate the superiority of Tucker Decomposition in identifying intricate and subtle abnormalities that are otherwise missed. This work opens the door for more research into multi-way data analysis approaches across a range of disciplines and emphasizes the value of tensor-based methods in financial analysis.
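
A compact sketch of this pipeline, using the tensorly library on a synthetic price series (the S&P 500 data are not bundled here), is given below; the window size, Tucker ranks and the 3-sigma threshold are assumptions.

```python
# Sliding-window tensor -> Tucker decomposition -> reconstruction-error flags.
import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker

rng = np.random.default_rng(0)
n = 1200
prices = np.cumsum(rng.normal(0.05, 1.0, n)) + 1000
prices[800:805] -= 60                         # injected crash-like anomaly

returns = np.diff(np.log(prices))
window = 20
# Features per day inside each window: return, |return|, 5-day rolling volatility
feats = np.stack([returns, np.abs(returns),
                  np.convolve(np.abs(returns), np.ones(5) / 5, mode="same")], axis=1)
windows = np.stack([feats[i:i + window] for i in range(len(feats) - window)])
tensor = tl.tensor(windows)                   # shape (n_windows, window, n_features)

core, factors = tucker(tensor, rank=[8, 5, 2])
recon = tl.tucker_to_tensor((core, factors))
err = np.linalg.norm(windows - recon, axis=(1, 2))   # per-window error

thresh = err.mean() + 3 * err.std()           # simple 3-sigma threshold
anomalous = np.where(err > thresh)[0]
print("anomalous windows start at days:", anomalous[:10])
```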

Keywords: tucker decomposition, financial markets, financial engineering, artificial intelligence, decomposition models

Procedia PDF Downloads 59
41624 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography

Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya

Abstract:

In today’s era, data security is an important concern and one of the most demanding issues, because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques used for secure communication are cryptography and steganography. Cryptographic algorithms scramble the data so that an intruder will not be able to retrieve them; steganography, however, hides the data in some cover file so that the very presence of communication is concealed. This paper presents the implementation of the Rivest-Shamir-Adleman (RSA) algorithm with image and audio steganography and of the Data Encryption Standard (DES) algorithm with image and audio steganography. The coding for both algorithms has been done using MATLAB, and it is observed that the combined techniques performed better than the individual techniques. The risk of unauthorized access is alleviated to a certain extent by using these techniques, which could be used in banks, intelligence agencies such as RAW, and other settings where highly confidential data are transferred. Finally, a comparison of the two techniques is also given in tabular form.
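
One of the combinations (DES encryption followed by image LSB steganography) can be sketched in a few lines, here with PyCryptodome and NumPy rather than the authors' MATLAB code; the key, message and random cover image are placeholders, and ECB mode is used only to keep the sketch short.

```python
# DES-encrypt a message, hide the ciphertext in pixel LSBs, then recover it.
import numpy as np
from Crypto.Cipher import DES
from Crypto.Util.Padding import pad, unpad

key = b"8bytekey"                                  # DES uses a 64-bit key
cipher = DES.new(key, DES.MODE_ECB)                # ECB only for brevity
secret = b"transfer INR 5,00,000 to account 1234"
ct = cipher.encrypt(pad(secret, DES.block_size))

cover = np.random.default_rng(0).integers(0, 256, size=(64, 64), dtype=np.uint8)

# Embed: 32-bit length header + ciphertext bits into pixel LSBs
bits = np.unpackbits(np.frombuffer(len(ct).to_bytes(4, "big") + ct, dtype=np.uint8))
stego = cover.copy().ravel()
stego[:bits.size] = (stego[:bits.size] & 0xFE) | bits
stego = stego.reshape(cover.shape)

# Extract and decrypt on the receiving side
flat = stego.ravel() & 1
n = int.from_bytes(np.packbits(flat[:32]).tobytes(), "big")
recovered_ct = np.packbits(flat[32:32 + 8 * n]).tobytes()
recovered = unpad(DES.new(key, DES.MODE_ECB).decrypt(recovered_ct), DES.block_size)
print(recovered == secret)                          # True
```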

Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography

Procedia PDF Downloads 284
41623 Exploring the Role of Data Mining in Crime Classification: A Systematic Literature Review

Authors: Faisal Muhibuddin, Ani Dijah Rahajoe

Abstract:

This in-depth exploration, through a systematic literature review, scrutinizes the nuanced role of data mining in the classification of criminal activities. The research focuses on investigating various methodological aspects and recent developments in leveraging data mining techniques to enhance the effectiveness and precision of crime categorization. Commencing with an exposition of the foundational concepts of crime classification and its evolutionary dynamics, this study details the paradigm shift from conventional methods towards approaches supported by data mining, addressing the challenges and complexities inherent in the modern crime landscape. Specifically, the research delves into various data mining techniques, including K-means clustering, Naïve Bayes, K-nearest neighbour, and clustering methods. A comprehensive review of the strengths and limitations of each technique provides insights into their respective contributions to improving crime classification models. The integration of diverse data sources takes centre stage in this research. A detailed analysis explores how the amalgamation of structured data (such as criminal records) and unstructured data (such as social media) can offer a holistic understanding of crime, enriching classification models with more profound insights. Furthermore, the study explores the temporal implications in crime classification, emphasizing the significance of considering temporal factors to comprehend long-term trends and seasonality. The availability of real-time data is also elucidated as a crucial element in enhancing responsiveness and accuracy in crime classification.

Keywords: data mining, classification algorithm, naïve bayes, k-means clustering, k-nearest neighbor, crime, data analysis, systematic literature review

Procedia PDF Downloads 60
41622 A Novel Heuristic for Analysis of Large Datasets by Selecting Wrapper-Based Features

Authors: Bushra Zafar, Usman Qamar

Abstract:

Large sample sizes and high dimensionality undermine the effectiveness of conventional data mining methodologies. Data mining techniques are important tools for extracting useful knowledge from a variety of databases; they provide supervised learning in the form of classification, designing models that describe important data classes, with the structure of the classifier based on the class attribute. Classification efficiency and accuracy are often influenced to a great extent by noisy and undesirable features in real application data sets, and the inherent nature of a data set greatly complicates its quality analysis, leaving quite few practical approaches to use. To our knowledge, we present for the first time an approach for investigating the structure and quality of data sets through a targeted analysis localizing noisy and irrelevant features. Feature selection is a key pre-processing step in machine learning that selects a small subset of the features, reducing the space according to a certain evaluation criterion. The primary objective of this study is to trim down the scope of a given data sample by searching for a small set of important features that may yield good classification performance. For this purpose, a heuristic for wrapper-based feature selection using a genetic algorithm is employed, with an external classifier used for discriminative feature selection; features are selected based on their number of occurrences in the chosen chromosomes. Sample data sets have been used to demonstrate the proposed idea effectively. The proposed method improves the average accuracy across different data sets to about 95%, and experimental results illustrate that the proposed algorithm increases the accuracy of prediction of different diseases.
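
A compact rendition of the wrapper loop is sketched below: a small genetic algorithm whose fitness is the cross-validated accuracy of a KNN classifier on the selected feature subset, run on a public scikit-learn dataset. The GA parameters are illustrative, not the tuned values of the paper, and the occurrence-counting step is omitted.

```python
# GA wrapper feature selection: chromosomes are boolean feature masks and
# fitness is KNN cross-validated accuracy on the masked columns.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)
rng = np.random.default_rng(0)
n_feat, pop_size, gens = X.shape[1], 20, 15

def fitness(mask):
    if not mask.any():
        return 0.0
    knn = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(knn, X[:, mask], y, cv=3).mean()

pop = rng.random((pop_size, n_feat)) < 0.5           # random chromosomes
for _ in range(gens):
    scores = np.array([fitness(ind) for ind in pop])
    order = np.argsort(scores)[::-1]
    parents = pop[order[: pop_size // 2]]            # truncation selection
    children = []
    for _ in range(pop_size - len(parents)):
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, n_feat)                # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(n_feat) < 0.02             # bit-flip mutation
        children.append(child ^ flip)
    pop = np.vstack([parents, np.array(children)])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print(f"{best.sum()} of {n_feat} features kept, CV accuracy {fitness(best):.3f}")
```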

Keywords: data mining, genetic algorithm, KNN algorithms, wrapper based feature selection

Procedia PDF Downloads 314
41621 Using RASCAL Code to Analyze the Postulated UF6 Fire Accident

Authors: J. R. Wang, Y. Chiang, W. S. Hsu, S. H. Chen, J. H. Yang, S. W. Chen, C. Shih, Y. F. Chang, Y. H. Huang, B. R. Shen

Abstract:

In this research, the RASCAL code was used to simulate and analyze a postulated UF6 fire accident which may occur at the Institute of Nuclear Energy Research (INER). There are four main steps in this research. In the first step, the UF6 data of INER were collected. In the second step, the RASCAL analysis methodology and model were established using these data. Third, this RASCAL model was used to perform the simulation and analysis of the postulated UF6 fire accident; three cases were simulated and analyzed in this step. Finally, the analysis results of RASCAL were compared with the hazardous levels of the chemicals. According to the compared results of the three cases, Case 3 poses the greatest danger to human health.

Keywords: RASCAL, UF₆, safety, hydrogen fluoride

Procedia PDF Downloads 213
41620 CNN-Based Compressor Mass Flow Estimator in Industrial Aircraft Vapor Cycle System

Authors: Justin Reverdi, Sixin Zhang, Saïd Aoues, Fabrice Gamboa, Serge Gratton, Thomas Pellegrini

Abstract:

In vapor cycle systems, the mass flow sensor plays a key role for different monitoring and control purposes. However, physical sensors can be inaccurate, heavy, cumbersome, expensive, or highly sensitive to vibrations, which is especially problematic when embedded into an aircraft. The conception of a virtual sensor, based on other standard sensors, is a good alternative. This paper has two main objectives. Firstly, a data-driven model using a convolutional neural network is proposed to estimate the mass flow of the compressor. We show that it significantly outperforms the standard polynomial regression model (thermodynamic maps) in terms of the standard MSE metric and engineer performance metrics. Secondly, a semi-automatic segmentation method is proposed to compute the engineer performance metrics for real datasets, as the standard MSE metric may pose risks in analyzing the dynamic behavior of vapor cycle systems.
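
The virtual-sensor idea can be sketched with a small 1-D convolutional network regressing mass flow from windows of other standard sensor channels. Everything below is synthetic and hypothetical (channel count, window length, architecture); the paper's model and aircraft data are not public.

```python
# 1-D CNN virtual sensor: regress compressor mass flow from sensor windows.
import torch, torch.nn as nn

g = torch.Generator().manual_seed(0)
n, channels, steps = 2000, 4, 64            # 4 sensor channels, 64-step windows
X = torch.randn(n, channels, steps, generator=g)
# Hypothetical ground truth: mass flow as a nonlinear mix of channel averages
m = X.mean(dim=2)
y = (0.8 * m[:, 0] - 0.3 * m[:, 1] + 0.5 * torch.tanh(m[:, 2] * m[:, 3])).unsqueeze(1)

model = nn.Sequential(
    nn.Conv1d(channels, 16, kernel_size=5, padding=2), nn.ReLU(),
    nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
    nn.Linear(32, 1),                        # mass flow estimate
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(200):                     # train on the first 1600 windows
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X[:1600]), y[:1600])
    loss.backward()
    opt.step()

with torch.no_grad():                        # evaluate on held-out windows
    test_mse = nn.functional.mse_loss(model(X[1600:]), y[1600:]).item()
print(f"held-out MSE: {test_mse:.4f}")
```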

Keywords: deep learning, convolutional neural network, vapor cycle system, virtual sensor

Procedia PDF Downloads 53