Search results for: network data mining
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28234

25684 Neural Network Based Approach of Software Maintenance Prediction for Laboratory Information System

Authors: Vuk M. Popovic, Dunja D. Popovic

Abstract:

The software maintenance phase starts once a software project has been developed and delivered; after that, any modification to it corresponds to maintenance. Software maintenance involves modifications to keep a software product usable in a changed or changing environment, to correct discovered faults, and to improve performance or maintainability. Software maintenance and the management of software maintenance are recognized as two of the most important and most expensive processes in the life of a software product. This research bases the prediction of maintenance effort on risk and working-time evaluations and uses them as data sets for training neural networks. The aim of this paper is to provide support to project maintenance managers. They will be able to pass the issues planned for the next software service patch to experts for risk and working-time evaluation, and afterward feed all data to neural networks in order to obtain a software maintenance prediction. This process leads to a more accurate prediction of the working hours needed for the software service patch, which eventually leads to better budget planning for software maintenance projects.
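A minimal sketch of the kind of predictor the abstract describes, assuming scikit-learn's MLPRegressor and invented feature columns (risk score, estimated base time, affected modules); it is not the authors' model or data.

```python
# Hypothetical sketch: predicting maintenance effort (working hours) from
# expert risk and time evaluations with a small neural network.
# Feature names and data are illustrative, not from the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Columns: expert risk score (0-1), estimated base time (h), number of affected modules
X = rng.random((200, 3)) * [1.0, 40.0, 10.0]
y = 5 + 0.8 * X[:, 1] + 12 * X[:, 0] + rng.normal(0, 2, 200)  # synthetic working hours

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print("Predicted hours for new patch issues:", model.predict(X_test[:3]))
```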

Keywords: laboratory information system, maintenance engineering, neural networks, software maintenance, software maintenance costs

Procedia PDF Downloads 358
25683 Lessons from Implementation of a Network-Wide Safety Huddle in Behavioral Health

Authors: Deborah Weidner, Melissa Morgera

Abstract:

The model of care delivery in the Behavioral Health Network (BHN) is integrated across all five regions of Hartford Healthcare and thus spans the entirety of the state of Connecticut, with care provided in seven inpatient settings and over 30 ambulatory outpatient locations. While safety has been a core priority of the BHN in alignment with High Reliability practices, safety initiatives have historically been facilitated locally in each region or within each entity, with interventions implemented locally as opposed to throughout the network. To address this, the BHN introduced a network-wide Safety Huddle during 2022. Launched in January, the BHN Safety Huddle brought together internal stakeholders, including medical and administrative leaders, along with executive institute leadership, quality, and risk management. By bringing leaders together and introducing a network-wide safety huddle into the way we work, the benefit has been an increase in awareness of safety events occurring in behavioral health areas as well as increased systemization of countermeasures to prevent future events. One significant discussion topic presented in huddles has pertained to environmental design and patient access to potentially dangerous items, addressing some of the most relevant factors resulting in harm to patients in inpatient and emergency settings for behavioral health patients. The safety huddle has improved visibility of potential environmental safety risks through the generation of over 15 safety alerts cascaded throughout the BHN and has also spurred a rapid improvement project focused on standardization of patient belonging searches to reduce patient access to potentially dangerous items on inpatient units. Safety events pertaining to potentially dangerous items decreased by 31% as a result of standardized interventions implemented across the network and of increased awareness. A second positive outcome originating from the BHN Safety Huddle was implementation of a recommendation to increase the emergency Narcan® (naloxone) supply on hand in ambulatory settings of the BHN after incidents involving accidental overdose resulted in higher doses of naloxone administration. By increasing the emergency supply of naloxone on hand in all ambulatory and residential settings, colleagues are better prepared to respond in an emergency situation should a patient experience an overdose while on site. Lastly, discussions in the safety huddle spurred a new initiative within the BHN to improve responsiveness to assaultive incidents through a consultation service. This consult service, aligned with one of the network's improvement priorities to reduce harm events related to assaultive incidents, was born out of discussions in the huddle in which it was identified that additional interventions may be needed in providing clinical care to patients who are experiencing multiple and/or frequent safety events.

Keywords: quality, safety, behavioral health, risk management

Procedia PDF Downloads 83
25682 21st Century Teacher Image to Stakeholders of Teacher Education Institutions in the Philippines

Authors: Marilyn U. Balagtas, Maria Ruth M. Regalado, Carmelina E. Barrera, Ramer V. Oxiño, Rosarito T. Suatengco, Josephine E. Tondo

Abstract:

This study presents the perceptions of students and teachers from kindergarten to tertiary level of the image of the 21st century teacher, to provide a basis for designing teacher development programs in Teacher Education Institutions (TEIs) in the Philippines. The highlights of the report are the personal, psychosocial, and professional images of the 21st century teacher in basic education and of teacher educators, based on a survey of 612 internal stakeholders of nine member institutions of the National Network of Normal Schools (3NS). Data were obtained through a validated researcher-made instrument which allowed the generation of both quantitative and qualitative descriptions of the teacher image. Through descriptive statistics, the common images of the teacher were drawn, and these were validated and enriched by information from the qualitative data. The study recommends a repertoire of teacher development programs to build the desired image of the 21st century teacher for a better Philippines.

Keywords: teacher image, 21st century teacher, teacher education, development program

Procedia PDF Downloads 367
25681 Energy Efficient Clustering with Reliable and Load-Balanced Multipath Routing for Wireless Sensor Networks

Authors: Alamgir Naushad, Ghulam Abbas, Shehzad Ali Shah, Ziaul Haq Abbas

Abstract:

Unlike conventional networks, it is particularly challenging to manage resources efficiently in Wireless Sensor Networks (WSNs) due to their inherent characteristics, such as dynamic network topology and limited bandwidth and battery power. To ensure energy efficiency, this paper presents a routing protocol for WSNs, namely, Enhanced Hybrid Multipath Routing (EHMR), which employs hierarchical clustering and proposes a next-hop selection mechanism between nodes according to a maximum residual energy metric together with a minimum hop count. Load-balancing of data traffic over multiple paths is achieved for a better packet delivery ratio and lower latency. Reliability is ensured in terms of a higher data rate and lower end-to-end delay. EHMR also enhances the fast-failure recovery mechanism to recover a failed path. Simulation results demonstrate that EHMR achieves a higher packet delivery ratio, reduced energy consumption per packet delivered, lower end-to-end latency, and a reduced effect of data rate on packet delivery ratio when compared with prominent WSN routing protocols.
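The next-hop rule described above can be sketched as follows; the Neighbour fields and tie-breaking order are illustrative assumptions, not the authors' EHMR code.

```python
# Illustrative sketch (not the authors' EHMR implementation): choosing the next hop
# among candidate neighbours by maximum residual energy, breaking ties with the
# minimum hop count to the sink, as the abstract describes.
from dataclasses import dataclass

@dataclass
class Neighbour:
    node_id: int
    residual_energy: float  # joules remaining
    hop_count: int          # hops to the sink via this neighbour

def select_next_hop(neighbours):
    # Prefer the highest residual energy; among equals, the fewest hops.
    return max(neighbours, key=lambda n: (n.residual_energy, -n.hop_count))

candidates = [Neighbour(4, 0.82, 3), Neighbour(7, 0.82, 2), Neighbour(9, 0.60, 1)]
print(select_next_hop(candidates).node_id)  # -> 7
```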

Keywords: energy efficiency, load-balancing, hierarchical clustering, multipath routing, wireless sensor networks

Procedia PDF Downloads 85
25680 Secure Proxy Signature Based on Factoring and Discrete Logarithm

Authors: H. El-Kamchouchi, Heba Gaber, Fatma Ahmed, Dalia H. El-Kamchouchi

Abstract:

A digital signature is an electronic signature form used by an original signer to sign a specific document. When the original signer is not in the office or is travelling, he or she delegates the signing capability to a proxy signer, and the proxy signer then generates a signature on a message on behalf of the original signer. The two parties must be able to authenticate one another and agree on a secret encryption key in order to communicate securely over an unreliable public network. Authenticated key agreement protocols play an important role in building a secure communications channel between the two parties. In this paper, we present a secure proxy signature scheme built on an efficient and secure authenticated key agreement protocol based on the factoring and discrete logarithm problems.
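For orientation only, the sketch below shows a textbook Diffie-Hellman key agreement over a discrete-log group; the paper's combined factoring/discrete-logarithm proxy-signature scheme is not reproduced, and the parameters are toy-sized.

```python
# Generic illustration only: a textbook Diffie-Hellman exchange over a discrete-log
# group, shown to make the idea of a key agreement concrete. The paper's actual
# scheme is not reproduced here; parameters below are small and insecure.
import secrets

p = 0xFFFFFFFFFFFFFFC5  # small prime for illustration; real schemes use >= 2048-bit groups
g = 5

a = secrets.randbelow(p - 2) + 1          # original signer's ephemeral secret
b = secrets.randbelow(p - 2) + 1          # proxy signer's ephemeral secret
A, B = pow(g, a, p), pow(g, b, p)         # exchanged public values

shared_original = pow(B, a, p)
shared_proxy = pow(A, b, p)
assert shared_original == shared_proxy    # both sides derive the same secret key
```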

Keywords: discrete logarithm, factoring, proxy signature, key agreement

Procedia PDF Downloads 308
25679 A Neural Approach for the Offline Recognition of the Arabic Handwritten Words of the Algerian Departments

Authors: Salim Ouchtati, Jean Sequeira, Mouldi Bedda

Abstract:

In this work, we present an offline system for the recognition of Arabic handwritten words naming the Algerian departments. The study is based mainly on evaluating the performance of neural networks trained with the gradient backpropagation algorithm. The parameters used to form the input vector of the neural network are extracted from the binary images of the handwritten word by several methods: distribution parameters, the centered moments of the different projections, and Barr features. These methods are applied to segments obtained by dividing the binary image of the word into six segments. The classification is achieved by a multilayer perceptron. Detailed experiments are carried out and satisfactory recognition results are reported.
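A simplified sketch of the segment-and-classify pipeline, assuming scikit-learn's MLPClassifier and using only basic projection statistics per segment; the paper's distribution parameters, centred moments and Barr features are not reproduced.

```python
# Simplified sketch of the feature pipeline described above, with illustrative
# projection-based features only.
import numpy as np
from sklearn.neural_network import MLPClassifier

def segment_features(binary_word, n_segments=6):
    """Split a binary word image into vertical segments and compute simple
    horizontal/vertical projection statistics for each segment."""
    segments = np.array_split(binary_word, n_segments, axis=1)
    feats = []
    for seg in segments:
        h_proj = seg.sum(axis=1)  # horizontal projection
        v_proj = seg.sum(axis=0)  # vertical projection
        feats.extend([h_proj.mean(), h_proj.std(), v_proj.mean(), v_proj.std()])
    return np.array(feats)

# Toy data: random "binary images" standing in for handwritten department names
rng = np.random.default_rng(1)
X = np.array([segment_features(rng.integers(0, 2, (32, 96))) for _ in range(120)])
y = rng.integers(0, 4, 120)  # four illustrative department classes

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1500, random_state=1)
clf.fit(X, y)
```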

Keywords: handwritten word recognition, neural networks, image processing, pattern recognition, feature extraction

Procedia PDF Downloads 513
25678 Opening up Government Datasets for Big Data Analysis to Support Policy Decisions

Authors: K. Hardy, A. Maurushat

Abstract:

Policy makers are increasingly looking to make evidence-based decisions. Evidence-based decisions have historically relied on the rigorous methodologies of empirical studies by research institutes, as well as on less reliable immediate surveys and polls, often with limited sample sizes. As we move into the era of Big Data analytics, policy makers are looking to different methodologies to deliver reliable empirics in real time. The question is no longer why people have behaved a certain way for the last 10 years, but why they are behaving that way now, whether that behaviour is undesirable, and how change can be promoted immediately. Big data analytics relies heavily on government data that has been released into the public domain. The open data movement promises greater productivity and more efficient delivery of services; however, Australian government agencies remain reluctant to release their data to the general public. This paper considers the barriers to releasing government data as open data, and how these barriers might be overcome.

Keywords: big data, open data, productivity, data governance

Procedia PDF Downloads 371
25677 ATC in Competitive Electricity Market Using TCSC

Authors: S. K. Gupta, Richa Bansal

Abstract:

In a deregulated power system structure, power producers and customers share a common transmission network for wheeling power from the point of generation to the point of consumption. All parties in this open access environment may try to purchase energy from the cheapest source for greater profit margins, which may lead to overloading and congestion of certain corridors of the transmission network. This may result in violation of line flow, voltage and stability limits and thereby undermine system security. Utilities therefore need to determine their Available Transfer Capability (ATC) adequately to ensure that system reliability is maintained while serving a wide range of bilateral and multilateral transactions. This paper presents a power transfer distribution factor approach based on AC load flow for the determination and enhancement of ATC. The study has been carried out for the IEEE 24-bus Reliability Test System.
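The per-line ATC bound implied by the PTDF approach can be illustrated as below; the limits, base flows and PTDF values are made up for the example.

```python
# Illustrative calculation only (values are made up): the available transfer
# capability of a transaction is bounded, line by line, by the remaining thermal
# margin divided by that line's power transfer distribution factor (PTDF).
import numpy as np

line_limit = np.array([200.0, 150.0, 180.0])   # MW thermal limits
base_flow = np.array([120.0, 90.0, 160.0])     # MW flows at the base case
ptdf = np.array([0.45, 0.10, 0.30])            # sensitivity of each line to the transaction

margins = (line_limit - base_flow) / ptdf      # extra transfer each line can tolerate
atc = margins[ptdf > 1e-6].min()               # the binding line sets the ATC
print(f"ATC = {atc:.1f} MW")                   # -> limited by line 3: 20/0.30 = 66.7 MW
```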

Keywords: available transfer capability, FACTS devices, power transfer distribution factors, electric

Procedia PDF Downloads 497
25676 Double Encrypted Data Communication Using Cryptography and Steganography

Authors: Adine Barett, Jermel Watson, Anteneh Girma, Kacem Thabet

Abstract:

In information security, secure communication of data across networks has always been a problem at the forefront. Transfer of information across networks is susceptible to being exploited by attackers engaging in malicious activity. In this paper, we leverage steganography and cryptography to create a layered security solution to protect the information being transmitted. The first layer of security leverages cryptographic techniques to scramble the information so that it cannot be deciphered even if the steganography-based layer is compromised. The second layer of security relies on steganography to disguise the encrypted information so that it cannot be seen. We consider three cipher methods in the cryptography layer, namely, the Playfair cipher, the Blowfish cipher, and the Hill cipher. The encrypted message is then passed to the least significant bit (LSB) steganography algorithm for embedding. Both approaches are combined efficiently to help secure information in transit over a network. This multi-layered encryption is a solution that will benefit cloud platforms, social media platforms and networks that regularly transfer private information, such as banks and insurance companies.
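A minimal sketch of the steganography layer described above: LSB embedding of an already-encrypted byte string into a cover image. The cipher layer is represented by an opaque placeholder rather than an actual Playfair, Blowfish or Hill implementation.

```python
# Minimal sketch of the second layer: hiding an already-encrypted message in the
# least significant bits of a cover image. `ciphertext` stands in for the output
# of the cryptography layer.
import numpy as np

def lsb_embed(image, ciphertext):
    bits = np.unpackbits(np.frombuffer(ciphertext, dtype=np.uint8))
    flat = image.flatten()
    if bits.size > flat.size:
        raise ValueError("cover image too small for this payload")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits  # overwrite each pixel's LSB
    return flat.reshape(image.shape)

def lsb_extract(stego, n_bytes):
    bits = (stego.flatten()[: n_bytes * 8] & 1).astype(np.uint8)
    return np.packbits(bits).tobytes()

cover = np.random.default_rng(2).integers(0, 256, (64, 64), dtype=np.uint8)
ciphertext = b"\x8a\x17\x42\xfe"              # stand-in for the encrypted message
stego = lsb_embed(cover, ciphertext)
assert lsb_extract(stego, len(ciphertext)) == ciphertext
```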

Keywords: cryptography, steganography, layered security, cipher, encryption

Procedia PDF Downloads 85
25675 Enhanced Retrieval-Augmented Generation (RAG) Method with Knowledge Graph and Graph Neural Network (GNN) for Automated QA Systems

Authors: Zhihao Zheng, Zhilin Wang, Linxin Liu

Abstract:

In the research of automated knowledge question-answering systems, accuracy and efficiency are critical challenges. This paper proposes a knowledge graph-enhanced Retrieval-Augmented Generation (RAG) method, combined with a Graph Neural Network (GNN) structure, to automatically determine the correctness of knowledge competition questions. First, a domain-specific knowledge graph was constructed from a large corpus of academic journal literature, with key entities and relationships extracted using Natural Language Processing (NLP) techniques. Then, the RAG method's retrieval module was expanded to simultaneously query both text databases and the knowledge graph, leveraging the GNN to further extract structured information from the knowledge graph. During answer generation, contextual information provided by the knowledge graph and GNN is incorporated to improve the accuracy and consistency of the answers. Experimental results demonstrate that the knowledge graph and GNN-enhanced RAG method perform excellently in determining the correctness of questions, achieving an accuracy rate of 95%. Particularly in cases involving ambiguity or requiring contextual information, the structured knowledge provided by the knowledge graph and GNN significantly enhances the RAG method's performance. This approach not only demonstrates significant advantages in improving the accuracy and efficiency of automated knowledge question-answering systems but also offers new directions and ideas for future research and practical applications.
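A conceptual sketch of the expanded retrieval step, assuming networkx for the knowledge graph; the GNN scoring and the generator call are deliberately omitted, and entities, relations and documents are invented.

```python
# Conceptual sketch only: augmenting a RAG prompt with facts pulled from a knowledge
# graph around the entities mentioned in the question. The GNN scoring step of the
# paper is replaced here by a trivial neighbourhood lookup; names are illustrative.
import networkx as nx

kg = nx.Graph()
kg.add_edge("photosynthesis", "chlorophyll", relation="requires")
kg.add_edge("photosynthesis", "oxygen", relation="produces")

def retrieve_context(question, text_index, graph):
    # 1) ordinary text retrieval (placeholder keyword match)
    text_hits = [doc for doc in text_index if any(w in doc for w in question.lower().split())]
    # 2) structured facts from the knowledge graph for entities found in the question
    kg_facts = []
    for entity in graph.nodes:
        if entity in question.lower():
            for nbr in graph.neighbors(entity):
                kg_facts.append(f"{entity} -[{graph.edges[entity, nbr]['relation']}]-> {nbr}")
    return text_hits + kg_facts  # concatenated context handed to the generator

docs = ["photosynthesis converts light energy", "mitochondria produce ATP"]
print(retrieve_context("Does photosynthesis produce oxygen?", docs, kg))
```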

Keywords: knowledge graph, graph neural network, retrieval-augmented generation, NLP

Procedia PDF Downloads 39
25674 An Approach to Analyze Testing of Nano On-Chip Networks

Authors: Farnaz Fotovvatikhah, Javad Akbari

Abstract:

The test time of a test architecture is an important factor that depends on the architecture's delay and test patterns. Here, a new network-on-chip-based architecture for storing test results is presented. In addition, a simple analytical model is proposed to calculate link test time for a built-in self-tester (BIST) and an external tester (Ext) in multiprocessor systems. The results extracted from the model are verified using FPGA implementation and experimental measurements. Systems consisting of 16, 25, and 36 processors are implemented and simulated, and the test time is calculated. In addition, BIST and Ext are compared in terms of test time under different conditions, such as different numbers of test patterns and nodes. Using the model, the maximum test frequency can be calculated and the test structure optimized for high-speed testing.
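Since the abstract does not give the analytical model itself, the sketch below uses a generic assumed form, test time = patterns x cycles per pattern / test clock, to illustrate how a BIST and a slower external tester interface would compare.

```python
# Hedged sketch of a generic per-link test-time estimate (the paper's exact analytical
# model is not given in the abstract). Assumed form: time = patterns * cycles / frequency,
# evaluated for an on-chip BIST and for a slower external tester interface.
def link_test_time(num_patterns, cycles_per_pattern, test_clock_hz):
    return num_patterns * cycles_per_pattern / test_clock_hz

patterns, cycles = 10_000, 64
t_bist = link_test_time(patterns, cycles, 200e6)   # on-chip tester at an assumed 200 MHz
t_ext = link_test_time(patterns, cycles, 25e6)     # external tester limited by a JTAG-style interface
print(f"BIST: {t_bist*1e3:.2f} ms, external: {t_ext*1e3:.2f} ms")
```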

Keywords: test, nano on-chip network, JTAG, modelling

Procedia PDF Downloads 488
25673 Awareness and Utilization of Social Network Tools among Agricultural Science Students in Colleges of Education in Ogun State, Nigeria

Authors: Adebowale Olukayode Efunnowo

Abstract:

This study was carried out to assess the awareness and utilization of Social Network Tools (SNTs) among agricultural science students in Colleges of Education in Ogun State, Nigeria. Simple random sampling techniques were used to select 280 respondents from the study area. Descriptive statistics were used to address the objectives, while Pearson Product Moment Correlation was used to test the hypothesis. The results showed that the majority (71.8%) of the respondents were single, with a mean age of 20 years. Almost all (95.7%) of the respondents were aware of Facebook and 2go as Social Network Tools (SNTs), while 85.0% of the respondents were not aware of Blackplanet, LinkedIn, MyHeritage and Bebo. Many (41.1%) of the respondents held the view that using SNTs can enhance extensive literature surveys, increase internet browsing potential, promote teaching proficiency, and provide updates on research outcomes. However, 51.4% of the respondents perceived SNT usage as something meant for lecturers/adults only, while 16.1% considered it as mainly used by internet fraudsters. Findings revealed that about 50.0% of the respondents browsed Facebook and 2go daily, while more than 80% of the respondents used Blackplanet, MyHeritage, Skyrock, Bebo, LinkedIn and My YearBook only as the need arose. Major constraints to the awareness and utilization of SNTs were the high cost and poor quality of ICT facilities (77.1%), epileptic power supply (75.0%), inadequate telecommunication infrastructure (71.1%), low technical know-how (62.9%) and inadequate computer knowledge (61.1%). The result of the PPMC analysis showed that there was an inverse relationship between constraints and utilization of SNTs at p < 0.05. It can be concluded that constraints affect efficient and effective utilization of SNTs in the study area. It is hereby recommended that the management of colleges of education and agricultural institutes should provide good internet connectivity, computer facilities, and an alternative power supply in order to increase the awareness and utilization of SNTs among students.
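The hypothesis test reported above (PPMC between constraints and utilization) can be illustrated as follows; the scores below are invented, not the survey data.

```python
# Illustration of a Pearson product-moment correlation between a constraint score and
# an SNT-utilization score (invented values, not the study's survey responses).
from scipy.stats import pearsonr

constraint_score  = [4.2, 3.8, 4.5, 2.1, 3.9, 4.7, 2.5, 3.0]
utilization_score = [1.8, 2.2, 1.5, 4.0, 2.0, 1.2, 3.6, 3.1]

r, p_value = pearsonr(constraint_score, utilization_score)
print(f"r = {r:.2f}, p = {p_value:.4f}")  # a negative r at p < 0.05 mirrors the inverse relationship reported
```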

Keywords: awareness, utilization, social network tools, constraints, students

Procedia PDF Downloads 352
25671 A Palmprint Identification System Based on Multi-Layer Perceptron

Authors: David P. Tantua, Abdulkader Helwan

Abstract:

Biometrics has recently been used in human identification systems based on biological traits such as fingerprints and iris scans. Biometrics-based identification systems show great efficiency and accuracy in such human identification applications. However, these types of systems have so far been based on image processing techniques only, which may decrease the efficiency of such applications. Thus, this paper aims to develop a human palmprint identification system using a multi-layer perceptron neural network which has the capability to learn using a backpropagation learning algorithm. The developed system uses images obtained from a public database available on the internet (CASIA). The processing pipeline is as follows: image filtering using a median filter, image adjustment, image skeletonizing, edge detection using the Canny operator to extract features, and removal of unwanted components of the image. The second phase is to feed these processed images into a neural network classifier which adaptively learns and creates a class for each different image. 100 different images are used for training the system. Since this is an identification system, it should be tested with the same images; therefore, the same 100 images are used for testing it, and any image outside the training set should be unrecognized. The experimental results show that the developed system achieves high accuracy (100%) and can be implemented in real-life applications.
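A rough sketch of the preprocessing and classification chain, assuming scikit-image and scikit-learn; the CASIA images, the exact adjustment step and the training regime are simplified stand-ins.

```python
# Rough sketch of the preprocessing chain described above; the CASIA data and the
# paper's exact settings are replaced by toy stand-ins.
import numpy as np
from sklearn.neural_network import MLPClassifier
from skimage.exposure import rescale_intensity
from skimage.feature import canny
from skimage.filters import median
from skimage.morphology import skeletonize

def preprocess(gray_palm):
    smoothed = median(gray_palm)                 # remove salt-and-pepper noise
    adjusted = rescale_intensity(smoothed)       # simple intensity adjustment
    binary = adjusted > adjusted.mean()          # crude binarisation before skeletonizing
    skeleton = skeletonize(binary)
    edges = canny(adjusted / 255.0)              # Canny edge map as the feature image
    return np.concatenate([skeleton.ravel(), edges.ravel()]).astype(np.uint8)

# Toy stand-ins for palmprint images; one class per enrolled person, as in the paper.
rng = np.random.default_rng(3)
images = rng.integers(0, 256, (20, 64, 64), dtype=np.uint8)
X = np.array([preprocess(img) for img in images])
y = np.arange(20)                                # each image is its own identity class
MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000).fit(X, y)
```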

Keywords: biometrics, biological traits, multi-layer perceptron neural network, image skeletonizing, edge detection using canny operator

Procedia PDF Downloads 371
25671 Integrative Omics-Portrayal Disentangles Molecular Heterogeneity and Progression Mechanisms of Cancer

Authors: Binder Hans

Abstract:

Cancer is no longer seen as solely a genetic disease where genetic defects such as mutations and copy number variations affect gene regulation and eventually lead to aberrant cell functioning which can be monitored by transcriptome analysis. It has become obvious that epigenetic alterations represent a further important layer of (de-)regulation of gene activity. For example, aberrant DNA methylation is a hallmark of many cancer types, and methylation patterns have been successfully used to subtype cancer heterogeneity. Hence, unraveling the interplay between different omics levels such as genome, transcriptome and epigenome is inevitable for a mechanistic understanding of the molecular deregulation causing complex diseases such as cancer. This objective requires powerful downstream integrative bioinformatics methods as an essential prerequisite to discover the whole-genome mutational, transcriptome and epigenome landscapes of cancer specimens and to understand cancer genesis, progression and heterogeneity. Basic challenges and tasks arise 'beyond sequencing' because of the large size of the data, their complexity, and the need to search for hidden structures in the data, for knowledge mining to discover biological function, and for systems biology conceptual models to deduce developmental interrelations between different cancer states. These tasks are tightly related to cancer biology as an (epi-)genetic disease giving rise to aberrant genomic regulation under micro-environmental control and clonal evolution, which leads to heterogeneous cellular states. Machine learning algorithms such as self-organizing maps (SOM) represent one interesting option to tackle these bioinformatics tasks. The SOM method enables recognizing complex patterns in large-scale data generated by high-throughput omics technologies. It portrays molecular phenotypes by generating individualized, easy-to-interpret images of the data landscape in combination with comprehensive analysis options. Our image-based, reductionist machine learning methods provide one interesting perspective on how to deal with massive data in the discovery of complex diseases, gliomas, melanomas and colon cancer at the molecular level. As an important new challenge, we address the combined portrayal of different omics data such as genome-wide genomic, transcriptomic and methylomic data. The integrative-omics portrayal approach is based on the joint training of the data, and it provides separate personalized data portraits for each patient and data type which can be analyzed by visual inspection as one option. The new method enables an integrative genome-wide view of the omics data types and the underlying regulatory modes. It is applied to high- and low-grade gliomas and to melanomas, where it disentangles transversal and longitudinal molecular heterogeneity in terms of distinct molecular subtypes and progression paths with prognostic impact.
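A minimal plain-NumPy self-organizing map illustrating the kind of portrayal training mentioned above; real omics matrices are far larger, and the authors' SOM portrayal software adds analysis layers not shown here.

```python
# Minimal self-organizing map sketch in plain NumPy; data are synthetic.
import numpy as np

rng = np.random.default_rng(4)
data = rng.random((500, 20))          # 500 samples x 20 omics features (synthetic)
grid_h, grid_w = 10, 10
weights = rng.random((grid_h, grid_w, data.shape[1]))

for t, x in enumerate(data[rng.permutation(len(data))]):
    lr = 0.5 * np.exp(-t / len(data))                     # decaying learning rate
    sigma = 3.0 * np.exp(-t / len(data))                  # decaying neighbourhood radius
    dists = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(dists.argmin(), dists.shape)   # best-matching unit
    rows, cols = np.indices((grid_h, grid_w))
    grid_dist2 = (rows - bmu[0]) ** 2 + (cols - bmu[1]) ** 2
    h = np.exp(-grid_dist2 / (2 * sigma ** 2))[..., None] # neighbourhood function
    weights += lr * h * (x - weights)                     # pull units toward the sample

print("Trained SOM grid shape:", weights.shape)
```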

Keywords: integrative bioinformatics, machine learning, molecular mechanisms of cancer, gliomas and melanomas

Procedia PDF Downloads 148
25670 UniFi: Universal Filter Model for Image Enhancement

Authors: Aleksei Samarin, Artyom Nazarenko, Valentin Malykh

Abstract:

Image enhancement is becoming more and more popular, especially on mobile devices. Nowadays, it is a common approach to enhance an image using a convolutional neural network (CNN). Such a network should be of significant size; otherwise, the possibility of artifacts occurring grows. The existing large CNNs are computationally expensive, which can be crucial for mobile devices. Another important flaw of such models is that they are poorly interpretable. There is another approach to image enhancement, namely, the use of predefined filters in combination with the prediction of their applicability. We present an approach following this paradigm, which outperforms both existing CNN-based and filter-based approaches in the image enhancement task. It is easily adaptable for mobile devices since it has only 47 thousand parameters. It achieves the best SSIM (0.919) on RANDOM250 (MIT Adobe FiveK) among small models and is three times faster than previous models.
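A conceptual sketch of the predefined-filter idea, with hard-coded applicability weights standing in for the small predictor network; the filter bank and weights are illustrative assumptions.

```python
# Conceptual sketch of the filter-bank idea: a tiny set of predefined enhancement
# filters combined with per-image applicability weights. The real model predicts the
# weights with a small network; here they are hard-coded for illustration.
import numpy as np
from scipy.ndimage import convolve

identity = np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]], float)
sharpen  = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], float)
blur     = np.full((3, 3), 1 / 9)
filter_bank = [identity, sharpen, blur]

def enhance(image, weights):
    # Weighted sum of predefined filter responses; weights would come from a predictor.
    responses = [convolve(image, k, mode="nearest") for k in filter_bank]
    return sum(w * r for w, r in zip(weights, responses))

img = np.random.default_rng(5).random((32, 32))
out = enhance(img, weights=[0.2, 0.7, 0.1])   # mostly sharpening for this image
print(out.shape)
```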

Keywords: universal filter, image enhancement, neural networks, computer vision

Procedia PDF Downloads 101
25669 Bi-objective Network Optimization in Disaster Relief Logistics

Authors: Katharina Eberhardt, Florian Klaus Kaiser, Frank Schultmann

Abstract:

Last-mile distribution is one of the most critical parts of a disaster relief operation. Various uncertainties, such as infrastructure conditions, resource availability, and fluctuating beneficiary demand, render last-mile distribution challenging in disaster relief operations. The need to balance critical performance criteria like response time, meeting demand and cost-effectiveness further complicates the task. The occurrence of disasters cannot be controlled, and the magnitude is often challenging to assess. In summary, these uncertainties create a need for additional flexibility, agility, and preparedness in logistics operations. As a result, strategic planning and efficient network design are critical for an effective and efficient response. Furthermore, the increasing frequency of disasters and the rising cost of logistical operations amplify the need to provide robust and resilient solutions in this area. Therefore, we formulate a scenario-based bi-objective optimization model that integrates pre-positioning, allocation, and distribution of relief supplies extending the general form of a covering location problem. The proposed model aims to minimize underlying logistics costs while maximizing demand coverage. Using a set of disruption scenarios, the model allows decision-makers to identify optimal network solutions to address the risk of disruptions. We provide an empirical case study of the public authorities’ emergency food storage strategy in Germany to illustrate the potential applicability of the model and provide implications for decision-makers in a real-world setting. Also, we conduct a sensitivity analysis focusing on the impact of varying stockpile capacities, single-site outages, and limited transportation capacities on the objective value. The results show that the stockpiling strategy needs to be consistent with the optimal number of depots and inventory based on minimizing costs and maximizing demand satisfaction. The strategy has the potential for optimization, as network coverage is insufficient and relies on very high transportation and personnel capacity levels. As such, the model provides decision support for public authorities to determine an efficient stockpiling strategy and distribution network and provides recommendations for increased resilience. However, certain factors have yet to be considered in this study and should be addressed in future works, such as additional network constraints and heuristic algorithms.

Keywords: humanitarian logistics, bi-objective optimization, pre-positioning, last mile distribution, decision support, disaster relief networks

Procedia PDF Downloads 79
25668 Green Closed-Loop Supply Chain Network Design Considering Different Production Technology Levels and Transportation Modes

Authors: Mahsa Oroojeni Mohammad Javad

Abstract:

Globalization of economic activity and the rapid growth of information technology have resulted in shorter product lifecycles, reduced transport capacity, dynamic and changing customer behaviors, and an increased focus on supply chain design in recent years. The design of the supply chain network is one of the most important supply chain management decisions. These decisions will have a long-term impact on the efficacy and efficiency of the supply chain. In this paper, a two-objective mixed-integer linear programming (MILP) model is developed for designing and optimizing a closed-loop green supply chain network that, to the greatest extent possible, includes real-world assumptions such as a multi-level supply chain, multiple production technologies, and multiple modes of transportation, with the goals of minimizing the total cost of the chain (first objective) and minimizing total emissions (second objective). The ε-constraint method and the CPLEX solver have been used to solve the problem as a single-objective problem and to validate the model. Finally, a sensitivity analysis is applied to study the effect of changes in the real-world parameters on the objective functions. Optimal management suggestions and policies are presented.
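A toy ε-constraint illustration using PuLP rather than the authors' CPLEX model: cost is minimized while total emissions are capped by ε, and sweeping ε traces an approximation of the Pareto front. All data are invented.

```python
# Toy epsilon-constraint sketch with PuLP (not the authors' model): minimize cost
# while total emissions are capped by epsilon; sweeping epsilon traces the trade-off.
import pulp

cost      = {"tech_A": 100, "tech_B": 70, "tech_C": 40}
emissions = {"tech_A": 1.0, "tech_B": 2.5, "tech_C": 6.0}   # per unit produced
capacity  = {"tech_A": 10, "tech_B": 10, "tech_C": 10}      # units each option can cover
required = 20

for epsilon in (40, 70, 100):                               # emission caps to sweep
    prob = pulp.LpProblem("green_cscl_sketch", pulp.LpMinimize)
    use = {t: pulp.LpVariable(f"use_{t}", cat="Binary") for t in cost}
    prob += pulp.lpSum(cost[t] * use[t] for t in cost)                       # objective 1: cost
    prob += pulp.lpSum(capacity[t] * use[t] for t in cost) >= required       # meet demand
    prob += pulp.lpSum(emissions[t] * capacity[t] * use[t] for t in cost) <= epsilon  # objective 2 as constraint
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    chosen = [t for t in cost if use[t].value() == 1]
    print(epsilon, chosen, pulp.value(prob.objective))
```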

Keywords: closed-loop supply chain, multi-level green supply chain, mixed-integer programming, transportation modes

Procedia PDF Downloads 80
25667 Survey on Big Data Stream Classification by Decision Tree

Authors: Mansoureh Ghiasabadi Farahani, Samira Kalantary, Sara Taghi-Pour, Mahboubeh Shamsi

Abstract:

Nowadays, the development of computer technology and its recent applications provide access to new types of data which have not been considered by traditional data analysts. Two particularly interesting characteristics of such data sets are their huge size and streaming nature. Incremental learning techniques have been used extensively to address the data stream classification problem. This paper presents a concise survey of the obstacles and requirements involved in classifying data streams using decision trees. The most important issue is to maintain a balance between accuracy and efficiency; the algorithm should provide good classification performance within a reasonable response time.
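A sketch of the incremental, one-sample-at-a-time learning loop the survey discusses, assuming the river library's Hoeffding tree API; the dataset is a small built-in stream used only for illustration.

```python
# Sketch of incremental decision-tree learning on a stream, assuming river's
# Hoeffding tree API; the one-sample-at-a-time loop keeps memory and latency bounded,
# which is the accuracy/efficiency trade-off the survey highlights.
from river import datasets, metrics, tree

model = tree.HoeffdingTreeClassifier()
accuracy = metrics.Accuracy()

for x, y in datasets.Phishing():          # a small built-in binary-classification stream
    y_pred = model.predict_one(x)         # test-then-train evaluation
    if y_pred is not None:
        accuracy.update(y, y_pred)
    model.learn_one(x, y)

print(accuracy)
```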

Keywords: big data, data streams, classification, decision tree

Procedia PDF Downloads 521
25666 Robust and Dedicated Hybrid Cloud Approach for Secure Authorized Deduplication

Authors: Aishwarya Shekhar, Himanshu Sharma

Abstract:

Data deduplication is one of the important data compression techniques for eliminating duplicate copies of repeating data, and it has been widely used in cloud storage to reduce the amount of storage space and save bandwidth. In this process, duplicate data is expunged, leaving only a single instance of the data to be stored, though an index of every data item is still maintained. Data deduplication is an approach for minimizing the amount of storage space an organization requires to retain its data. In most companies, the storage systems carry identical copies of numerous pieces of data. Deduplication eliminates these additional copies by saving just one copy of the data and replacing the other copies with pointers that lead back to the primary copy. To avoid this duplication of data and to preserve confidentiality in the cloud, we apply the concept of a hybrid cloud. A hybrid cloud is a fusion of at least one public and one private cloud. As a proof of concept, we implement Java code which provides security as well as removes duplicated data from the cloud.
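A simple illustration of single-instance storage by content hashing; the paper's hybrid public/private cloud split and authorization checks are not shown.

```python
# Simple illustration of single-instance storage: identical content hashes to the same
# key, so only one copy is kept and later uploads become pointers.
import hashlib

store = {}    # content hash -> data block (the single stored instance)
index = {}    # file name -> content hash (the "pointer" kept per upload)

def upload(name, data: bytes):
    digest = hashlib.sha256(data).hexdigest()
    if digest not in store:            # first time we see this content: store it
        store[digest] = data
    index[name] = digest               # every duplicate only adds a pointer

upload("report_v1.docx", b"quarterly figures")
upload("report_copy.docx", b"quarterly figures")   # duplicate content
print(len(store), "stored block(s) for", len(index), "uploads")  # -> 1 stored block, 2 uploads
```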

Keywords: confidentiality, deduplication, data compression, hybridity of cloud

Procedia PDF Downloads 383
25665 The Impact on the Network Deflectometry

Authors: Djamel–Eddine Yassine Boutiba

Abstract:

In this work, we present the various deflectometer measurements leading to the dimensioning of reinforcement for existing roadways. It is recalled that the road network in Algeria plays a major role in carrying traffic in major strategic areas, especially along the northern fringe of the country. Heavy traffic passing through the northern fringe (between 25% and 30% heavy vehicles) causes substantial degradation of both the surface layer and the base layer. The on-site work, carried out with equipment of the CTTP laboratory such as the Lacroix deflectograph, allowed us to record a large number of localized deflection measurements on RN19A (Carrefour CW73-Ain-Merane), and the analysis of the results led us to opt for reinforcement along the entire project section. Measurements recorded with the HWD (Heavy Weight Deflectometer), in turn, allowed us to learn about the behavior of the pavement at the edges. In addition, the Alize III software was essential in verifying the increase in the designed thickness.

Keywords: capacity, deflection, Lacroix deflectograph, degradation, HWD

Procedia PDF Downloads 285
25664 LTE Performance Analysis in the City of Bogota Northern Zone for Two Different Mobile Broadband Operators over Qualipoc

Authors: Víctor D. Rodríguez, Edith P. Estupiñán, Juan C. Martínez

Abstract:

The evolution of mobile broadband technologies has made it possible to increase users' download rates for current services. The evaluation of technical parameters at the link level is of vital importance to validate the quality and veracity of the connection, thus avoiding large losses of data, time and productivity. Some of these failures may occur between the eNodeB (Evolved Node B) and the user equipment (UE), so the link between the end device and the base station must be observed. LTE (Long Term Evolution) is considered one of the IP-oriented mobile broadband technologies that work stably for data and for VoIP (Voice over IP) on devices that support it. This research presents a technical analysis of the connection and channeling processes between the UE and the eNodeB using the TAC (Tracking Area Code) variables, together with an analysis of performance variables (throughput, Signal to Interference and Noise Ratio (SINR)). Three measurement scenarios were proposed in the city of Bogotá using QualiPoc, where two operators were evaluated (Operator 1 and Operator 2). Once the data were obtained, an analysis of the variables was performed, determining that the data obtained in the different transmission modes vary depending on the BLER (Block Error Rate), throughput and SNR (Signal-to-Noise Ratio) parameters. For both operators, differences in transmission modes are detected, and this is reflected in the quality of the signal. In addition, because the two operators work on different frequencies, it can be seen that Operator 1, despite having spectrum in Band 7 (2600 MHz) together with Operator 2, is reassigning traffic to a lower band, AWS (1700 MHz); the difference in signal quality with respect to the data connection established by Operator 2, and the difference found in the transmission modes determined by the eNodeB for Operator 1, are remarkable.

Keywords: BLER, LTE, network, QualiPoc, SNR

Procedia PDF Downloads 115
25663 Flame Retardant Study of Methylol Melamine Phosphate-Treated Cotton Fibre

Authors: Nurudeen Afolami Ayeni, Kasali Bello

Abstract:

Methylolmelamines with an increasing degree of methylol substitution and their phosphate derivatives were used to resinate cotton fabric (CF). The resination was carried out at different curing times and curing temperatures. Generally, the results show a reduction in the flame propagation rate of the treated fabrics compared to the untreated cotton fabric. While the flame retardancy of the methylolmelamine-treated fabric can be attributed to the degree of crosslinking of the fibre-resin network, which promotes stability, the methylolmelamine phosphate-treated fabrics show better retardancy due to the intumescent action of the phosphate resin upon decomposition in the resin-fabric network.

Keywords: cotton fabric, flame retardant, methylolmelamine, crosslinking, resination

Procedia PDF Downloads 385
25662 Manganese Contamination Exacerbates Reproductive Stress in a Suicidally-Breeding Marsupial

Authors: Ami Fadhillah Amir Abdul Nasir, Amanda C. Niehaus, Skye F. Cameron, Frank A. Von Hippel, John Postlethwait, Robbie S. Wilson

Abstract:

For suicidal breeders, the physiological stresses and energetic costs of breeding are fatal. Environmental stressors such as pollution should compound these costs, yet suicidal breeding is so rare among mammals that this is unknown. Here, we explored the consequences of metal contamination to the health, aging and performance of endangered, suicidally-breeding northern quolls (Dasyurus hallucatus) living near an active manganese mine on Groote Eylandt, Northern Territory, Australia. We found respirable manganese dust at levels exceeding international recommendations even 20km from mining sites and substantial accumulation of manganese within quolls’ hair, testes, and in two brain regions—the neocortex and cerebellum, responsible for sensory perception and motor function, respectively. Though quolls did not differ in sprint speeds, motor skill, or manoeuvrability, those with higher accumulation of manganese crashed at lower speeds during manoeuvrability tests, indicating a potential effect on sight or cognition. Immune function and telomere length declined over the breeding season, as expected with ageing, but manganese contamination exacerbated immune declines and suppressed cortisol. Unexpectedly, male quolls with higher levels of manganese had longer telomeres, supporting evidence of unusual telomere dynamics among Dasyurids—though whether this affects their lifespan is unknown. We posit that sublethal contamination via pollution, mining, or urbanisation imposes physiological costs on wildlife that may diminish reproductive success or survival.

Keywords: ecotoxicology, heavy metal, manganese, telomere length, cortisol, locomotor

Procedia PDF Downloads 317
25661 A Review of Machine Learning for Big Data

Authors: Devatha Kalyan Kumar, Aravindraj D., Sadathulla A.

Abstract:

Big data are now rapidly expanding in all engineering and science domains and in many others. The potential of large or massive data is undoubtedly significant, and it calls for new ways of thinking and new learning techniques to address the various big data challenges. Machine learning is continuously unleashing its power in a wide range of applications. This paper reviews the latest advances in research on machine learning for big data processing. First, we review the machine learning techniques used in recent studies, such as deep learning, representation learning, transfer learning, active learning, and distributed and parallel learning. We then focus on the challenges of machine learning for big data and their possible solutions.

Keywords: active learning, big data, deep learning, machine learning

Procedia PDF Downloads 446
25660 A Network Economic Analysis of Friendship, Cultural Activity, and Homophily

Authors: Siming Xie

Abstract:

In social networks, the term homophily refers to the tendency of agents with similar characteristics to link with one another, and it is robustly observed across many contexts and dimensions. The starting point of my research is the observation that the "type" of an agent is not a single exogenous variable. Agents, despite their differences in race, religion, and other hard-to-alter characteristics, may share interests and engage in activities that cut across those predetermined lines. This research aims to capture the interactions of homophily effects in a model where agents have two-dimensional characteristics (i.e., race and a personal hobby such as basketball, which one either likes or dislikes) and where there are biases in meeting opportunities and in favor of same-type friendships. A novel feature of my model is that it provides a matching process with biased meeting probabilities on different dimensions, which helps in understanding the structuring process in multidimensional networks without missing layer interdependencies. The main contribution of this study is a welfare-based matching process for agents with multi-dimensional characteristics. In particular, this research shows that biases in meeting opportunities on one dimension lead to the emergence of homophily on the other dimension. The objective of this research is to determine the pattern of homophily in network formation, which sheds light on our understanding of segregation and its remedies. By constructing a two-dimensional matching process, this study explores a method to describe agents' homophilous behavior in a social network with multiple dimensions and constructs a game in which minorities and majorities play different strategies in a society. It also shows that the optimal strategy is determined by the relative group size, and that society suffers more from social segregation if the two racial groups have a similar size. The research also has policy implications: cultivating shared characteristics among agents helps diminish social segregation, but only if the minority group is small enough. This research includes both theoretical models and empirical analysis. After presenting the friendship formation model, the author first uses MATLAB to perform iterative calculations, then derives the corresponding mathematical proofs of the previous results, and finally shows that the model is consistent with empirical evidence from high school friendships. The anonymous data come from the National Longitudinal Study of Adolescent Health (Add Health).
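An illustrative simulation (not the paper's MATLAB model) showing how a meeting bias on the race dimension, combined with hobby-based friendship formation, produces homophily; all parameters are invented.

```python
# Illustrative simulation: agents have two binary traits, meetings are biased toward
# same-race pairs, and friendships form only when hobbies match. The resulting share
# of same-race friendships rises well above the unbiased 0.5 baseline.
import numpy as np

rng = np.random.default_rng(6)
n = 400
race = rng.integers(0, 2, n)
hobby = rng.integers(0, 2, n)
same_race_bias = 0.8            # probability a drawn meeting partner shares the agent's race

edges = []
for i in range(n):
    pool = np.where(race == race[i])[0] if rng.random() < same_race_bias else np.arange(n)
    j = rng.choice(pool)
    if j != i and hobby[i] == hobby[j]:   # friendships require a shared hobby
        edges.append((i, j))

same_race_share = np.mean([race[i] == race[j] for i, j in edges])
print(f"share of same-race friendships: {same_race_share:.2f}")
```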

Keywords: homophily, multidimension, social networks, friendships

Procedia PDF Downloads 170
25659 Cybersecurity Assessment of Decentralized Autonomous Organizations in Smart Cities

Authors: Claire Biasco, Thaier Hayajneh

Abstract:

A smart city is the integration of digital technologies in urban environments to enhance the quality of life. Smart cities capture real-time information from devices, sensors, and network data to analyze and improve city functions such as traffic analysis, public safety, and environmental impacts. Current smart cities face controversy due to their reliance on real-time data tracking and surveillance. Internet of Things (IoT) devices and blockchain technology are converging to reshape smart city infrastructure away from its centralized model. Connecting IoT data to blockchain applications would create a peer-to-peer, decentralized model. Furthermore, blockchain technology powers the ability for IoT device data to shift from the ownership and control of centralized entities to individuals or communities with Decentralized Autonomous Organizations (DAOs). In the context of smart cities, DAOs can govern cyber-physical systems to have a greater influence over how urban services are being provided. This paper will explore how the core components of a smart city now apply to DAOs. We will also analyze different definitions of DAOs to determine their most important aspects in relation to smart cities. Both categorizations will provide a solid foundation to conduct a cybersecurity assessment of DAOs in smart cities. It will identify the benefits and risks of adopting DAOs as they currently operate. The paper will then provide several mitigation methods to combat cybersecurity risks of DAO integrations. Finally, we will give several insights into what challenges will be faced by DAO and blockchain spaces in the coming years before achieving a higher level of maturity.

Keywords: blockchain, IoT, smart city, DAO

Procedia PDF Downloads 121
25658 Strengthening Legal Protection of Personal Data through Technical Protection Regulation in Line with Human Rights

Authors: Tomy Prihananto, Damar Apri Sudarmadi

Abstract:

Indonesia recognizes the right to privacy as a human right. Indonesia provides legal protection for data management activities because the protection of personal data is part of human rights. This paper aims to describe the arrangement of data management in Indonesia. The paper is descriptive research with a qualitative approach, collecting data through a literature study. The result of this paper is a comprehensive arrangement for data, established as a technical requirement of data protection through encryption methods. Arrangements on encryption and on the protection of personal data are mutually reinforcing in the protection of personal data. Indonesia has two important and immediately enacted laws that provide protection for the privacy of information, which is part of human rights.

Keywords: Indonesia, protection, personal data, privacy, human rights, encryption

Procedia PDF Downloads 182
25657 Analysis of Cooperative Hybrid ARQ with Adaptive Modulation and Coding on a Correlated Fading Channel Environment

Authors: Ibrahim Ozkan

Abstract:

In this study, a cross-layer design which combines adaptive modulation and coding (AMC) and hybrid automatic repeat request (HARQ) techniques for a cooperative wireless network is investigated analytically. Previous analyses of such systems in the literature are confined to the case where the fading channel is independent at each retransmission, which can be unrealistic unless the channel is varying very fast. On the other hand, temporal channel correlation can have a significant impact on the performance of HARQ systems. In this study, utilizing a Markov channel model which accounts for the temporal correlation, the performance of non-cooperative and cooperative networks are investigated in terms of packet loss rate and throughput metrics for Chase combining HARQ strategy.
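A simplified sketch of the correlated-fading point above: a two-state Markov (Gilbert-Elliott) channel makes successive HARQ attempts see correlated conditions, with Chase combining abstracted as accumulated SNR crossing a decoding threshold. All numbers are illustrative.

```python
# Simplified sketch: a two-state Markov (Gilbert-Elliott) channel with slow state
# changes produces correlated conditions across HARQ retransmissions; Chase combining
# is abstracted as accumulated SNR reaching a decoding threshold.
import numpy as np

rng = np.random.default_rng(7)
p_good_to_bad, p_bad_to_good = 0.1, 0.3     # slow transitions -> temporal correlation
snr_good, snr_bad, threshold = 3.0, 0.5, 4.0
max_tx = 4

def send_packet(state):
    accumulated_snr = 0.0
    for attempt in range(1, max_tx + 1):
        accumulated_snr += snr_good if state == "good" else snr_bad  # Chase combining gain
        # channel evolves between (re)transmissions according to the Markov chain
        if state == "good":
            state = "bad" if rng.random() < p_good_to_bad else "good"
        else:
            state = "good" if rng.random() < p_bad_to_good else "bad"
        if accumulated_snr >= threshold:
            return True, attempt, state
    return False, max_tx, state

state, delivered, attempts = "good", 0, 0
for _ in range(10_000):
    ok, n_tx, state = send_packet(state)
    delivered += ok
    attempts += n_tx
print(f"packet loss rate: {1 - delivered/10_000:.3f}, throughput proxy: {delivered/attempts:.3f}")
```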

Keywords: cooperative network, adaptive modulation and coding, hybrid ARQ, correlated fading

Procedia PDF Downloads 144
25656 Integrating a Security Operations Centre with an Organization’s Existing Procedures, Policies and Information Technology Systems

Authors: M. Mutemwa

Abstract:

A Cybersecurity Operations Centre (SOC) is a centralized hub for network event monitoring and incident response. SOCs are critical when determining an organization's cybersecurity posture because they can be used to detect, analyze and report on various malicious activities. For most organizations, a SOC is not part of the initial design and implementation of the Information Technology (IT) environment but rather an afterthought. As a result, it is not natively a plug-and-play component; therefore, there are integration challenges when a SOC is introduced into an organization. A SOC is an independent hub that needs to be integrated with the existing procedures, policies and IT systems of an organization, such as the service desk, ticket logging system, reporting, etc. This paper discusses the challenges of integrating a newly developed SOC into an organization's existing IT environment. Firstly, the paper looks at which data sources should be incorporated into the Security Information and Event Management (SIEM) system, such as which host machines, servers, network endpoints, software, applications, web servers, etc., for security posture monitoring; that is, which systems need to be monitored first and the order in which the rest of the systems follow. Secondly, the paper describes how to integrate the organization's ticket logging system with the SOC SIEM, that is, how cybersecurity-related incidents should be logged by both analysts and non-technical employees of an organization, along with the priority matrix for incident types and incident notifications. Thirdly, the paper looks at how to communicate awareness campaigns from the SOC and how to report on incidents that are found inside the SOC. Lastly, the paper looks at how to show value for the large investments that are poured into designing, building and running a SOC.
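A hypothetical sketch of the kind of priority matrix the paper refers to, mapping incident type and asset criticality to priority, notification target and response time; categories and values are invented.

```python
# Hypothetical priority matrix: incident type and affected-asset criticality map to a
# response priority, notification target and response-time target. All entries are
# invented for illustration, not taken from the paper.
PRIORITY_MATRIX = {
    # (incident type, asset criticality): (priority, notify, respond within)
    ("ransomware", "high"):      ("P1", "CISO + incident commander", "15 min"),
    ("ransomware", "low"):       ("P2", "SOC shift lead",            "1 h"),
    ("phishing report", "high"): ("P2", "SOC shift lead",            "1 h"),
    ("phishing report", "low"):  ("P3", "service desk queue",        "next business day"),
    ("policy violation", "low"): ("P4", "service desk queue",        "best effort"),
}

def triage(incident_type, criticality):
    # Fall back to a default priority for unknown combinations.
    return PRIORITY_MATRIX.get((incident_type, criticality), ("P3", "SOC shift lead", "4 h"))

print(triage("ransomware", "high"))
```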

Keywords: cybersecurity operation centre, incident response, priority matrix, procedures and policies

Procedia PDF Downloads 153
25655 An Agent-Based Modelling Simulation Approach to Calculate Processing Delay of GEO Satellite Payload

Authors: V. Vicente E. Mujica, Gustavo Gonzalez

Abstract:

The global coverage of broadband multimedia and internet-based services in terrestrial-satellite networks is of particular interest to satellite providers seeking to enhance services with low latency and high signal quality for diverse users. In particular, the delay of on-board processing is an inherent source of latency in satellite communication that is sometimes neglected in the end-to-end delay of the satellite link. The framework of this paper includes modelling an on-orbit satellite payload using an agent model that can reproduce the properties of processing delays. In essence, a comparison of different spatial interpolation methods is carried out to evaluate physical data obtained by a GEO satellite in order to define a discretization function for determining that delay. Furthermore, the performance of the proposed agent and the development of a delay discretization function are validated together by simulating a hybrid satellite and terrestrial network. Simulation results show high accuracy according to the characteristics of the initial data points of processing delay for Ku bands.
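A sketch of the spatial-interpolation comparison mentioned above, assuming scipy.interpolate.griddata on synthetic delay samples; the satellite telemetry, agent model and discretization function are not reproduced.

```python
# Sketch of a spatial-interpolation comparison on synthetic "processing delay" samples;
# axes and delay values are invented stand-ins for the paper's GEO payload data.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(8)
points = rng.random((200, 2))                                 # e.g. (traffic load, packet size), normalised
delays = 2.0 + 5.0 * points[:, 0] + rng.normal(0, 0.2, 200)   # synthetic processing delay (ms)

grid_x, grid_y = np.mgrid[0:1:50j, 0:1:50j]
for method in ("nearest", "linear", "cubic"):
    surface = griddata(points, delays, (grid_x, grid_y), method=method)
    print(method, "mean interpolated delay:", np.nanmean(surface).round(2))
```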

Keywords: terrestrial-satellite networks, latency, on-orbit satellite payload, simulation

Procedia PDF Downloads 271