Search results for: missing data estimation

24067 Analysis of Spatial and Temporal Data Using Remote Sensing Technology

Authors: Kapil Pandey, Vishnu Goyal

Abstract:

Spatial and temporal data analysis is well known in the field of satellite image processing. When spatial data are combined with time series analysis, they yield significant results in change detection studies. In this paper, GIS and remote sensing techniques have been used to detect change using time series satellite imagery of Uttarakhand state during 1990-2010. Natural vegetation, urban area, forest cover, etc., were chosen as the main landuse classes for the study. Landuse/landcover classes for several years were prepared using satellite images. The maximum likelihood supervised classification technique was adopted in this work; finally, a landuse change index was generated, and graphical models were used to present the changes.
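
As an illustration of the maximum likelihood supervised classification step described above, the following is a minimal Python sketch (not the authors' implementation). It models each landuse class as a multivariate Gaussian over pixel band values, which is what scikit-learn's QuadraticDiscriminantAnalysis implements; the band values, class labels, and change index are synthetic placeholders.

```python
# Minimal sketch of maximum likelihood (per-class Gaussian) classification
# of multispectral pixels, using synthetic data as a stand-in for the
# satellite bands and training polygons used in the study.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical training pixels: 3 spectral bands, 3 landuse classes
# (0 = vegetation, 1 = urban, 2 = forest).
means = np.array([[0.3, 0.5, 0.2], [0.6, 0.6, 0.6], [0.2, 0.4, 0.1]])
X_train = np.vstack([rng.normal(m, 0.05, size=(200, 3)) for m in means])
y_train = np.repeat([0, 1, 2], 200)

# Maximum likelihood classification: each class modelled as a Gaussian
# with its own mean vector and covariance matrix.
clf = QuadraticDiscriminantAnalysis(store_covariance=True)
clf.fit(X_train, y_train)

# Classify "images" of unseen pixels from two hypothetical dates and build
# a simple landuse change index as the fraction of pixels that changed class.
image_1990 = rng.normal(means[y_train], 0.05)
image_2010 = rng.normal(means[(y_train + 1) % 3], 0.05)
labels_1990 = clf.predict(image_1990)
labels_2010 = clf.predict(image_2010)
change_index = np.mean(labels_1990 != labels_2010)
print(f"Fraction of pixels with changed landuse class: {change_index:.2f}")
```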

Keywords: GIS, landuse/landcover, spatial and temporal data, remote sensing

Procedia PDF Downloads 433
24066 Optimal Placement of Phasor Measurement Units (PMU) Using Mixed Integer Programming (MIP) for Complete Observability in Power System Network

Authors: Harshith Gowda K. S, Tejaskumar N, Shubhanga R. B, Gowtham N, Deekshith Gowda H. S

Abstract:

Phasor measurement units (PMUs) play an important role in present-day power systems for state estimation. It is necessary to have complete observability of the power system while minimizing cost. For this purpose, the optimal location of the phasor measurement units in the power system is essential. In a bus system, zero injection buses need to be evaluated to minimize the number of PMUs. In this paper, the optimization problem is formulated using mixed integer programming to obtain the optimal locations of the PMUs with increased observability. The problem is formulated both with and without zero injection buses as constraints. The formulated problem is solved using the CPLEX solver in the GAMS software package. The proposed method is tested on the IEEE 30, IEEE 39, IEEE 57, and IEEE 118 bus systems. The results obtained show that the number of PMUs required is minimal while observability is increased.
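
The core of such a formulation is a binary covering problem: place a PMU at bus i (x_i = 1) so that every bus is observed by a PMU at itself or at an adjacent bus. The sketch below reproduces that basic constraint (without the zero-injection refinement) for a small hypothetical 7-bus topology, using the open-source PuLP/CBC solver rather than the CPLEX/GAMS setup used in the paper.

```python
# Minimal PMU placement sketch: minimise the number of PMUs subject to
# full observability (every bus covered by a PMU on itself or a neighbour).
# The 7-bus adjacency list is hypothetical, not an IEEE test system.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

adjacency = {
    1: [2, 3],
    2: [1, 3, 4],
    3: [1, 2, 6],
    4: [2, 5],
    5: [4, 6, 7],
    6: [3, 5],
    7: [5],
}

prob = LpProblem("pmu_placement", LpMinimize)
x = {i: LpVariable(f"x_{i}", cat=LpBinary) for i in adjacency}

# Objective: minimise the total number of PMUs installed.
prob += lpSum(x.values())

# Observability constraint A.x >= 1 for every bus.
for i, neighbours in adjacency.items():
    prob += x[i] + lpSum(x[j] for j in neighbours) >= 1

prob.solve()
placement = [i for i in adjacency if value(x[i]) > 0.5]
print("PMUs placed at buses:", placement, "| total:", len(placement))
```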

Keywords: PMU, observability, mixed integer programming (MIP), zero injection buses (ZIB)

Procedia PDF Downloads 166
24065 Ancient Egyptian Industry Technology of Canopic Jars, Analytical Study and Conservation Processes of Limestone Canopic Jar

Authors: Abd El Rahman Mohamed

Abstract:

Canopic jars, made by the ancient Egyptians from different materials, were used to preserve the viscera during the mummification process. The canopic jar studied here dates back to the Late Period (712-332 BC) and is held in the Grand Egyptian Museum (GEM), Giza, Egypt. The jar was carved from limestone and covered with a monkey-head lid, with eyes and ears painted in red pigment and outlined in black pigment. The jar contains textile bandages holding mummy viscera with resin, as well as blocks of black resin. The canopic jars were made using the sculpting tools of the ancient Egyptians, such as metal (copper) chisels and hammers, and the mass of the jar was hollowed out from the inside using a tool invented by the ancient Egyptians known as the emptying drill. This study also aims to use analytical techniques to identify the components of the jar, its contents, pigments, and previous restoration materials, and to understand its deterioration. Visual assessment, isolation and identification of fungi, optical microscopy (OM), scanning electron microscopy (SEM), X-ray fluorescence spectroscopy (XRF), X-ray diffraction (XRD), and Fourier transform infrared spectroscopy (FTIR) were used in the study. The jar showed different signs of deterioration, such as dust, dirt, stains, scratches, classifications, missing parts, and breaks; previous conservation materials included iron wire, a completion mortar, and an adhesive used for assembly. The results revealed that the jar was carved from dolomitic limestone, with red hematite pigment, mastic resin, and linen textile bandages. The previous adhesive was animal glue, and gypsum had been used for the earlier completion. The dominant microbial infection on the jar consisted of the fungi Penicillium waksmanii, Nigrospora sphaerica, Actinomycetes sp., and spore-forming Gram-positive bacilli. Conservation procedures were applied with high accuracy to conserve the jar, including mechanical and chemical cleaning, re-assembly, completion, and consolidation.

Keywords: Canopic jar, Consolidation, Mummification, Resin, Viscera.

Procedia PDF Downloads 74
24064 Beyond Classic Program Evaluation and Review Technique: A Generalized Model for Subjective Distributions with Flexible Variance

Authors: Byung Cheol Kim

Abstract:

The Program Evaluation and Review Technique (PERT) is widely used for project management, but it struggles with subjective distributions, particularly due to its assumptions of constant variance and light tails. To overcome these limitations, we propose the Generalized PERT (G-PERT) model, which enhances PERT by incorporating variability in three-point subjective estimates. Our methodology extends the original PERT model to cover the full range of unimodal beta distributions, enabling the model to handle thick-tailed distributions and offering formulas for computing mean and variance. This maintains the simplicity of PERT while providing a more accurate depiction of uncertainty. Our empirical analysis demonstrates that the G-PERT model significantly improves performance, particularly when dealing with heavy-tail subjective distributions. In comparative assessments with alternative models such as triangular and lognormal distributions, G-PERT shows superior accuracy and flexibility. These results suggest that G-PERT offers a more robust solution for project estimation while still retaining the user-friendliness of the classic PERT approach.
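
For reference, the classic PERT baseline that G-PERT generalizes computes the mean as (a + 4m + b)/6 and the standard deviation as (b - a)/6 from a three-point estimate. The sketch below checks those fixed-variance formulas against Monte Carlo draws from a matching beta distribution; the G-PERT formulas themselves are not reproduced here, since the abstract does not state them, and the activity durations are hypothetical.

```python
# Classic PERT estimates from a three-point (optimistic, most likely,
# pessimistic) judgement, checked against Monte Carlo draws from the
# standard PERT-beta distribution. Only the baseline being generalised
# by G-PERT is illustrated here.
import numpy as np

def classic_pert(a, m, b):
    """Classic PERT mean and standard deviation (fixed-variance assumption)."""
    mean = (a + 4.0 * m + b) / 6.0
    sd = (b - a) / 6.0
    return mean, sd

a, m, b = 5.0, 8.0, 20.0          # hypothetical activity duration estimates (days)
mean, sd = classic_pert(a, m, b)
print(f"classic PERT: mean={mean:.2f}, sd={sd:.2f}")

# Monte Carlo check with the standard PERT beta shape parameters
# alpha = 1 + 4(m-a)/(b-a), beta = 1 + 4(b-m)/(b-a); the means agree, while
# the simulated sd differs slightly, illustrating the fixed-variance assumption.
alpha = 1 + 4 * (m - a) / (b - a)
beta = 1 + 4 * (b - m) / (b - a)
rng = np.random.default_rng(1)
samples = a + (b - a) * rng.beta(alpha, beta, size=100_000)
print(f"Monte Carlo : mean={samples.mean():.2f}, sd={samples.std():.2f}")
```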

Keywords: PERT, subjective distribution, project management, flexible variance

Procedia PDF Downloads 21
24063 The Different Improvement of Numerical Magnitude and Spatial Representation of Numbers to Symbolic Approximate Arithmetic: A Training Study of Preschooler

Authors: Yu Liang, Wei Wei

Abstract:

Spatial representation of numbers and numerical magnitude are both important for preschoolers' mathematical ability. The mental number line, a typical index of the spatial representation of numbers, and numerical comparison are both clearly related to arithmetic. However, they seem to rely on different mechanisms and probably influence arithmetic through different mechanisms. In line with this idea, preschool children were trained on two tasks to investigate which one is more important for approximate arithmetic. The training of numerical comparison and of number line estimation both proved to be effective: each improved the ability of approximate arithmetic. When the difficulty of approximate arithmetic was taken into account, performance in the number line training group did not differ significantly across the three levels, whereas the two harder levels reached significance in the numerical comparison training group. Thus, compared with spatial representation ability, symbolic approximate arithmetic relies more on numerical magnitude. Educational implications of the study are discussed.

Keywords: approximate arithmetic, mental number line, numerical magnitude, preschooler

Procedia PDF Downloads 254
24062 An Empirical Investigation of the Challenges of Secure Edge Computing Adoption in Organizations

Authors: Hailye Tekleselassie

Abstract:

Edge computing is a distributed computing paradigm that brings enterprise applications closer to data sources such as IoT devices or local edge servers, but potential incidents can hinder the adoption of such new technologies. This investigation was carried out to study the awareness of information technology and communications organization workers and of computer users who support cloud services. Surveys were used to achieve these objectives, and questions about trust were a key part of the instrument. Problems such as data privacy, integrity, and availability are the factors affecting an organization's acceptance of cloud services.

Keywords: IoT, data, security, edge computing

Procedia PDF Downloads 84
24061 Research and Application of Consultative Committee for Space Data Systems Wireless Communications Standards for Spacecraft

Authors: Cuitao Zhang, Xiongwen He

Abstract:

According to the new requirements of future spacecraft, such as networking, modularization, and the elimination of cables, this paper studies the CCSDS wireless communications standards, focusing on low data-rate wireless communications for spacecraft monitoring and control. The application fields and advantages of wireless communications are analyzed. Wireless communications technology has significant advantages in reducing the weight of the spacecraft, saving time in spacecraft integration, etc. Based on this technology, a scheme for a spacecraft data system is put forward. The corresponding block diagram and key wireless interface design of the spacecraft data system are given. The design proposal for the wireless node and the information flow of the spacecraft are also analyzed. The results show that the wireless communications scheme is reasonable and feasible and can meet future spacecraft demands for networking, modularization, and cable-free integration.

Keywords: Consultative Committee for Space Data Systems (CCSDS) standards, information flow, non-cable, spacecraft, wireless communications

Procedia PDF Downloads 332
24060 Inversion of Electrical Resistivity Data: A Review

Authors: Shrey Sharma, Gunjan Kumar Verma

Abstract:

High-density electrical prospecting has been widely used in groundwater investigation, civil engineering, and environmental surveys. For efficient inversion, the forward modeling routine, sensitivity calculation, and inversion algorithm must all be efficient. This paper attempts to provide a brief summary of past and ongoing developments of the method. It includes reviews of the procedures used for data acquisition, processing, and inversion of electrical resistivity data, based on a compilation of the academic literature. In recent times there has been a significant evolution in field survey designs and data inversion techniques for the resistivity method. In general, 2-D inversion of resistivity data is carried out using the linearized least-squares method with local optimization techniques. Multi-electrode and multi-channel systems have made it possible to conduct large 2-D, 3-D, and even 4-D surveys efficiently and to resolve complex geological structures that were not possible with traditional 1-D surveys. 3-D surveys play an increasingly important role in very complex areas where 2-D models suffer from artifacts due to off-line structures. Continued developments in computation technology, as well as fast data inversion techniques and software, have made it possible to use optimization techniques to obtain model parameters to a higher accuracy. A brief discussion of the limitations of the electrical resistivity method is also presented.
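
The linearized least-squares inversion mentioned above typically iterates a damped Gauss-Newton (Marquardt) update, m <- m + (J'J + lambda*I)^(-1) J'(d_obs - f(m)). The sketch below applies that update to a deliberately simple synthetic forward model, not a real resistivity forward solver, purely to show the structure of the iteration; all model and noise parameters are illustrative.

```python
# Damped (Marquardt) linearized least-squares inversion sketch.
# The forward model f(m) here is a toy nonlinear mapping standing in for a
# 2-D resistivity forward solver; only the update structure is illustrative.
import numpy as np

def forward(m, x):
    # Toy forward model: "apparent resistivity" as a smooth nonlinear function
    # of two model parameters (background resistivity, anomaly strength).
    return m[0] + m[1] * np.exp(-0.5 * x**2)

def jacobian(m, x, eps=1e-6):
    # Finite-difference sensitivity matrix J[i, j] = d f_i / d m_j.
    J = np.zeros((len(x), len(m)))
    for j in range(len(m)):
        dm = np.zeros_like(m); dm[j] = eps
        J[:, j] = (forward(m + dm, x) - forward(m - dm, x)) / (2 * eps)
    return J

x = np.linspace(-3, 3, 25)                      # pseudo electrode spacings
m_true = np.array([100.0, 40.0])                # "true" model (ohm-m)
rng = np.random.default_rng(2)
d_obs = forward(m_true, x) + rng.normal(0, 1.0, x.size)  # noisy observations

m = np.array([50.0, 5.0])                       # starting model
lam = 1.0                                       # damping factor
for _ in range(10):
    r = d_obs - forward(m, x)                   # data residual
    J = jacobian(m, x)
    dm = np.linalg.solve(J.T @ J + lam * np.eye(len(m)), J.T @ r)
    m = m + dm

rms = float(np.sqrt(np.mean((d_obs - forward(m, x))**2)))
print("recovered model:", np.round(m, 1), "| rms misfit:", round(rms, 2))
```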

Keywords: inversion, limitations, optimization, resistivity

Procedia PDF Downloads 367
24059 Exploring the Correlation between Population Distribution and Urban Heat Island under Urban Data: Taking Shenzhen Urban Heat Island as an Example

Authors: Wang Yang

Abstract:

Shenzhen is a modern city shaped by China's reform and opening-up policy, and the development of its urban morphology has been guided by the administration of the Chinese government. The city's planning paradigm is primarily affected by spatial structure and human behavior, and the main urban agglomeration is divided into several groups and centers; under this administrative influence, the underlying laws of urban development tend to be neglected. With the continuous development of the internet, big data technology has been introduced in China, and data mining and data analysis have become important tools in municipal research. Data mining has been used to improve data cleaning, for example of business data, traffic data, and population data. Before data mining was available, government data were collected by traditional means and then analyzed in city-relationship research, delaying the timeliness of urban development studies, especially for the contemporary city, where data are internet-based and updated very quickly. The city's points of interest (POIs) mined in this way serve as the data source affecting city design, while satellite remote sensing is used as a reference object; the city is analyzed in both directions, the administrative paradigm of government is broken, and urban research is restored. Therefore, the use of data mining in urban analysis is very important. Satellite remote sensing data for Shenzhen in July 2018, measured by the MODIS sensor, were used to perform land surface temperature inversion and to analyze the heat island distribution of Shenzhen. This article acquired and classified data on Shenzhen using web crawler technology. Data on the Shenzhen heat island and points of interest were simulated and analyzed on a GIS platform to discover the main features of the distribution of functionally equivalent areas. Shenzhen extends in the east-west direction, and the city's main streets are laid out according to this direction of development; therefore, the functional areas of the city are also distributed east to west. The urban heat island can be mapped according to the functional urban areas, and regional POIs show a corresponding pattern. The research results clearly show that the distribution of the urban heat island and the distribution of urban POIs are in one-to-one correspondence. The urban heat island is primarily influenced by the properties of the underlying surface, setting aside the impact of the urban climate. Using urban POIs as the object of analysis, the distribution of municipal POIs and population aggregation are closely connected, so that the distribution of the population corresponds with the distribution of the urban heat island.
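
The correspondence claimed above can be probed quantitatively by gridding both layers and correlating them cell by cell. The sketch below does this with synthetic grids standing in for the MODIS-derived land surface temperature map and the crawled POI counts; it is an illustration of the comparison, not the paper's workflow.

```python
# Sketch of the POI-density vs. land-surface-temperature comparison: both
# layers are gridded over the city and correlated cell by cell. The grids
# below are synthetic stand-ins for the MODIS LST inversion and crawled POIs.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
n = 50                                    # 50 x 50 analysis grid over the city

# Hypothetical "urban intensity" surface drives both layers.
urban_intensity = rng.random((n, n))
lst = 25 + 10 * urban_intensity + rng.normal(0, 1, (n, n))        # deg C
poi_density = 200 * urban_intensity + rng.poisson(5, (n, n))      # POIs per cell

r, p = pearsonr(lst.ravel(), poi_density.ravel())
print(f"Pearson correlation between LST and POI density: r={r:.2f}, p={p:.1e}")
```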

Keywords: POI, satellite remote sensing, the population distribution, urban heat island thermal map

Procedia PDF Downloads 106
24058 Integration GIS–SCADA Power Systems to Enclosure Air Dispersion Model

Authors: Ibrahim Shaker, Amr El Hossany, Moustafa Osman, Mohamed El Raey

Abstract:

This paper explores an integration model between a GIS-SCADA system and an enclosure quantification model to assess the impact of fail-safe events. There are real demands to identify spatial objects and improve control system performance. The employed methodology predicts electro-mechanical operations and the corresponding response times to variations in environmental incidents. Open processing, as an object systems technology, is presented for integrating the enclosure database with minimal memory size and computation time via connectivity drivers such as ODBC and JDBC during the main stages of the GIS-SCADA connection. The function of the Geographic Information System is to manage power distribution in relation to developing issues. In other words, GIS-SCADA systems integration requires numerical process objects to enable system model calibration and estimation, determination of past events for analysis, and prediction of emergency situations for response training.
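
As a minimal sketch of the ODBC stage of such a connection, the snippet below pulls recent SCADA measurement records into Python with pyodbc so they can be handed to a GIS-side database or dispersion model. The DSN, credentials, table, and column names are hypothetical placeholders, and the date function depends on the backing database dialect; this is not the systems used in the paper.

```python
# Minimal sketch of reading recent SCADA measurements over an ODBC data source.
# DSN, table and column names are hypothetical; the DATEADD call assumes a
# SQL Server-style backend and would differ for other databases.
import pyodbc

conn = pyodbc.connect("DSN=scada_historian;UID=gis_reader;PWD=secret")
cursor = conn.cursor()

cursor.execute(
    "SELECT tag_name, ts, value "
    "FROM measurements "
    "WHERE ts >= DATEADD(minute, -5, CURRENT_TIMESTAMP)"
)
for tag_name, ts, value in cursor.fetchall():
    # Hand each reading to the enclosure/dispersion model, or write it to the
    # GIS-side database, here.
    print(tag_name, ts, value)

conn.close()
```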

Keywords: air dispersion model, environmental management, SCADA systems, GIS system, integration power system

Procedia PDF Downloads 369
24057 A Proposal of Ontology about Brazilian Government Transparency Portal

Authors: Estela Mayra de Moura Vianna, Thiago José Tavares Ávila, Bruno Morais Silva, Diego Henrique Bezerra, Paulo Henrique Gomes Silva, Alan Pedro da Silva

Abstract:

The Brazilian Federal Constitution defines access to information as a crucial right of the citizen, and the Law on Access to Public Information regulates this right. Accordingly, the Fiscal Responsibility Act of 2000, amended in 2009 by the "Law of Transparency", began demanding wider disclosure of public accounts to society, including electronic media for public access. Thus, public entities began to create "Transparency Portals", which aim to gather a diversity of data and information. However, this information, in general, is still published in formats that do not simplify understanding of the data by citizens and that could be made more readily available, especially for audit purposes. In this context, a proposed ontology of the Brazilian Transparency Portal can play a key role in how these data are made available. This study aims to identify and implement, in an ontology, the data model of the Transparency Portal ecosystem, with emphasis on activities that use these data for applications such as audits, press activities, social control of government, and others.
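
A minimal sketch of how part of such an ontology could be expressed in RDF/OWL with rdflib is shown below; the namespace, classes, properties, and example individual are illustrative placeholders, not the ontology actually proposed in the paper.

```python
# Minimal sketch of expressing transparency-portal concepts as an RDF/OWL
# ontology with rdflib. Namespace, classes and properties are illustrative.
from rdflib import Graph, Literal, Namespace, RDF, RDFS, XSD
from rdflib.namespace import OWL

TP = Namespace("http://example.org/transparency#")
g = Graph()
g.bind("tp", TP)

# Classes: public entity, expense, audit.
for cls in (TP.PublicEntity, TP.Expense, TP.Audit):
    g.add((cls, RDF.type, OWL.Class))

# A property linking an expense to the entity that made it.
g.add((TP.paidBy, RDF.type, OWL.ObjectProperty))
g.add((TP.paidBy, RDFS.domain, TP.Expense))
g.add((TP.paidBy, RDFS.range, TP.PublicEntity))

# An example individual, as a portal record might be represented.
g.add((TP.expense001, RDF.type, TP.Expense))
g.add((TP.expense001, TP.paidBy, TP.ministryOfHealth))
g.add((TP.expense001, TP.amount, Literal("15000.00", datatype=XSD.decimal)))

print(g.serialize(format="turtle"))
```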

Keywords: audit, government transparency, ontology, public sector

Procedia PDF Downloads 508
24056 Design and Development of Data Mining Application for Medical Centers in Remote Areas

Authors: Grace Omowunmi Soyebi

Abstract:

Data mining is the extraction of information from a large database, which helps in predicting a trend or behavior and thereby helps management make knowledge-driven decisions. One principal problem of most hospitals in rural areas is their reliance on a paper file management system for keeping records. A lot of time is wasted when a patient visits the hospital, possibly in an emergency, and the nurse or attendant has to search through voluminous files before the patient's file can be retrieved; this delay may cause something unexpected to happen to the patient. This data mining application is to be designed using the Structured Systems Analysis and Design Method, which will help in a well-articulated analysis of the existing file management system, a feasibility study, and proper documentation of the design and implementation of a computerized medical record system. This computerized system will replace the file management system and help to easily retrieve a patient's record with increased data security, provide access to clinical records for decision-making, and reduce the time it takes for a patient to be attended to.

Keywords: data mining, medical record system, systems programming, computing

Procedia PDF Downloads 211
24055 A Comprehensive Framework to Ensure Data Security in Cloud Computing: Analysis, Solutions, and Approaches

Authors: Loh Fu Quan, Fong Zi Heng, Burra Venkata Durga Kumar

Abstract:

Cloud computing has completely transformed the way many businesses operate. Traditionally, the confidential data of a business were stored in computers located on the premises of the business, so a lot of business capital was put towards maintaining computing resources and hiring IT teams to manage them. The advent of cloud computing changes everything. Instead of purchasing and managing their own infrastructure, many businesses have started to shift towards working in the cloud with the help of a cloud service provider (CSP), leading to cost savings. However, this also introduces security risks. This research paper focuses on the security risks that arise during data migration and user authentication in cloud computing. To overcome these problems, this paper provides a comprehensive framework that includes Transport Layer Security (TLS), user authentication, security tokens, and multi-level data encryption. The framework aims to prevent unauthorized access to cloud resources and data leakage, ensuring the confidentiality of sensitive information. It can be used by cloud service providers to strengthen the security of their clouds and instil confidence in their users.
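
Two ingredients of such a framework, an expiring signed authentication token and symmetric encryption of data before it reaches the provider, can be sketched as follows using Python's hmac module and the cryptography package's Fernet. This is a sketch of the general techniques only, not the paper's exact framework; TLS is assumed to be provided by the transport layer, and the secret values are placeholders.

```python
# Sketch of two framework ingredients: an expiring, HMAC-signed authentication
# token and symmetric encryption of data before it is sent to the CSP.
import hmac, hashlib, json, time, base64
from cryptography.fernet import Fernet

SERVER_SECRET = b"replace-with-a-random-server-secret"   # placeholder

def issue_token(user_id, ttl_seconds=900):
    """Issue a token a cloud gateway can verify without a database lookup."""
    payload = base64.urlsafe_b64encode(
        json.dumps({"sub": user_id, "exp": time.time() + ttl_seconds}).encode())
    sig = base64.urlsafe_b64encode(
        hmac.new(SERVER_SECRET, payload, hashlib.sha256).digest())
    return (payload + b"." + sig).decode()

def verify_token(token):
    payload, _, sig = token.encode().partition(b".")
    expected = base64.urlsafe_b64encode(
        hmac.new(SERVER_SECRET, payload, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return None                                       # tampered token
    claims = json.loads(base64.urlsafe_b64decode(payload))
    return claims if claims["exp"] > time.time() else None  # expired?

# Encrypt data client-side before migration so the CSP only stores ciphertext.
data_key = Fernet.generate_key()          # kept by the data owner, not the CSP
ciphertext = Fernet(data_key).encrypt(b"confidential business record")
assert Fernet(data_key).decrypt(ciphertext) == b"confidential business record"

token = issue_token("alice")
print("token claims:", verify_token(token))
```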

Keywords: Cloud computing, Cloud security, Cloud security issues, Cloud security framework

Procedia PDF Downloads 122
24054 D-Epi App: Mobile Application to Control Sodium Valproate Administration in Children with Idiopathic Epilepsy in Indonesia

Authors: Nyimas Annissa Mutiara Andini

Abstract:

There are 325,000 children younger than age 15 in the U.S. who have epilepsy. In Indonesia, 40% of the 3.5 million cases of epilepsy occur in children. The most common type of epilepsy, which affects 6 out of 10 people with the disorder, is called idiopathic epilepsy and has no identifiable cause. One of the most commonly used medications in the treatment of this childhood epilepsy is sodium valproate. Administration of sodium valproate in children is prone to failure. Nearly 60% of pediatric patients were found to be mildly, moderately, or severely non-adherent to therapy during the first six months of treatment. Many parents or caregivers gave far less medication than prescribed, and the treatment-adherence pattern for the majority of patients was established during the first month of treatment. Forty-two percent of patients were almost always given their medications as prescribed, but 13% had very poor adherence even in the early weeks and months of treatment. About 7% of patients initially received the medication correctly 90% of the time, but adherence dropped to around 20% within six months of starting treatment. Over the six months of observation, about four out of 14 doses were missed in any given week. These failures can cause the epilepsy to relapse. Meanwhile, children with currently reported epilepsy are significantly more likely than those never diagnosed to experience depression (8% vs 2%), anxiety (17% vs 3%), attention-deficit/hyperactivity disorder (23% vs 6%), developmental delay (51% vs 3%), autism/autism spectrum disorder (16% vs 1%), and headaches (14% vs 5%) (all P < 0.05). They have a greater risk of limitation in the ability to do things (relative risk: 9.22; 95% CI: 7.56-11.24), of repeating a school grade (relative risk: 2.59; CI: 1.52-4.40), and of potentially having unmet medical and mental health needs. On the other hand, technology can help make our lives easier. One such technology is the mobile application, a software program we can download and access directly using our phones. Indonesians are highly mobile-centric; they use, on average, 6.7 applications over a 30-day period. This paper describes an application that could help control sodium valproate administration in children, which we call the D-Epi app. D-Epi is a downloadable application that helps parents or caregivers through a timer-based alert that warns when it is time to administer sodium valproate. It works not only as a standard alarm but also provides important information about the drug and emergency actions to take for children with epilepsy. This application could help parents and caregivers take care of a child with epilepsy in Indonesia.

Keywords: application, children, D-Epi, epilepsy

Procedia PDF Downloads 282
24053 Using AI for Analysing Political Leaders

Authors: Shuai Zhao, Shalendra D. Sharma, Jin Xu

Abstract:

This research uses advanced machine learning models to examine a number of hypotheses regarding political executives. Specifically, it analyses the impact these powerful leaders have on economic growth, using leader data from the Archigos database covering 1835 to the end of 2015. The data are processed with AutoGluon, an automated machine learning (AutoML) framework developed by Amazon, which can automatically extract features from the data and then use multiple classifiers to train on them. A linear regression model and a classification model are used to establish the relationship between leaders and economic growth (GDP per capita growth) and to clarify the relationship between leaders' characteristics and economic growth from a machine learning perspective. Our work may serve as a model, or signal, for collaboration between the fields of statistics and artificial intelligence (AI) that can light the way for political researchers and economists.
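
A minimal sketch of the AutoGluon tabular workflow described above is given below; the CSV file names and the column names (leader covariates and the GDP-per-capita growth label) are hypothetical, not the authors' actual dataset.

```python
# Minimal sketch of an AutoGluon tabular regression workflow relating leader
# characteristics to growth. File paths and column names are hypothetical.
from autogluon.tabular import TabularDataset, TabularPredictor

train = TabularDataset("leaders_train.csv")   # e.g. Archigos-derived features + growth
test = TabularDataset("leaders_test.csv")

# Regression task: predict GDP per capita growth from leader characteristics.
predictor = TabularPredictor(label="gdp_pc_growth",
                             problem_type="regression").fit(train)

# Compare the models AutoGluon trained and inspect feature importance,
# which is one way leader characteristics can be related to growth.
print(predictor.leaderboard(test))
print(predictor.feature_importance(test))
preds = predictor.predict(test.drop(columns=["gdp_pc_growth"]))
```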

Keywords: comparative politics, political executives, leaders’ characteristics, artificial intelligence

Procedia PDF Downloads 87
24052 Data Quality on Regular Immunization Programme at Birkod District: Somali Region, Ethiopia

Authors: Eyob Seife, Tesfalem Teshome, Bereket Seyoum, Behailu Getachew, Yohans Demis

Abstract:

Developing countries continue to face preventable communicable diseases, including vaccine-preventable diseases. The Expanded Programme on Immunization (EPI) was established by the World Health Organization in 1974 to control these diseases. Health data use is crucial for decision-making, but ensuring data quality remains challenging. This study aimed to assess the accuracy ratio, timeliness, and quality index of regular immunization programme data in the Birkod district of the Somali Region, Ethiopia. Technical, contextual, behavioral, and organizational factors are among the contributors to poor data quality. The study used a quantitative cross-sectional design conducted in September 2022 using WHO-recommended data quality self-assessment tools. The accuracy ratio and timeliness of reports on the regular immunization programme were assessed for two health centers and three health posts in the district over one fiscal year. Moreover, the quality index assessment was conducted at the district level and at health facilities by trained assessors. The study found poor data quality in the accuracy ratio and timeliness of reports at all health units, including zero reports. Over-reporting was observed for most facilities, particularly at the health post level. Health centers showed a relatively better accuracy ratio than health posts. The quality index assessment revealed poor quality at all levels. The study recommends that responsible bodies at different levels improve data quality using various approaches, such as building the capacity of health professionals and strengthening the quality index components. The study highlights the need for attention to data quality in general, and specifically at the health post level, and for improving the quality index at all levels.
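
The accuracy (verification) ratio referred to above is conventionally computed, for example in the WHO data quality self-assessment, as doses recounted from source documents divided by doses reported upward, with values below one indicating over-reporting. The sketch below computes it for a few facilities; all figures are hypothetical, not the study's data.

```python
# Accuracy (verification) ratio sketch: recounted doses from registers/tally
# sheets divided by doses reported upward. Values below 1.0 indicate
# over-reporting. All figures are hypothetical, not the study's data.
facilities = {
    # facility: (recounted_doses, reported_doses)
    "Health centre A": (480, 455),
    "Health centre B": (390, 410),
    "Health post 1":   (110, 180),
    "Health post 2":   (95, 160),
    "Health post 3":   (0, 40),     # zero recounted but doses still reported
}

for name, (recounted, reported) in facilities.items():
    ratio = recounted / reported if reported else float("nan")
    flag = ("over-reporting" if ratio < 1
            else "under-reporting" if ratio > 1 else "exact")
    print(f"{name:15s} accuracy ratio = {ratio:4.2f} ({flag})")
```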

Keywords: Birkod District, data quality, quality index, regular immunization programme, Somali Region-Ethiopia

Procedia PDF Downloads 91
24051 The Results of Longitudinal Water Quality Monitoring of the Brandywine River, Chester County, Pennsylvania by High School Students

Authors: Dina L. DiSantis

Abstract:

Strengthening a sense of responsibility while relating global sustainability concepts such as water quality and pollution to a local water system can be achieved by teaching students to conduct and interpret water quality monitoring tests. When students conduct their own research, they become better stewards of the environment. Providing outdoor learning and place-based opportunities for students helps connect them to the natural world. By conducting stream studies and collecting data, students are able to better understand how the natural environment is a place where everything is connected. Students have been collecting physical, chemical and biological data along the West and East Branches of the Brandywine River, in Pennsylvania for over ten years. The stream studies are part of the advanced placement environmental science and aquatic science courses that are offered as electives to juniors and seniors at the Downingtown High School West Campus in Downingtown, Pennsylvania. Physical data collected includes: temperature, turbidity, width, depth, velocity, and volume of flow or discharge. The chemical tests conducted are: dissolved oxygen, carbon dioxide, pH, nitrates, alkalinity and phosphates. Macroinvertebrates are collected with a kick net, identified and then released. Students collect the data from several locations while traveling by canoe. In the classroom, students prepare a water quality data analysis and interpretation report based on their collected data. The summary of the results from longitudinal water quality data collection by students, as well as the strengths and weaknesses of student data collection will be presented.

Keywords: place-based, student data collection, sustainability, water quality monitoring

Procedia PDF Downloads 156
24050 Visual Analytics of Higher Order Information for Trajectory Datasets

Authors: Ye Wang, Ickjai Lee

Abstract:

Due to the widespread use of mobile sensing, there is a strong need to handle trails of moving objects, i.e., trajectories. This paper proposes three visual analytic approaches for higher order information of trajectory data sets based on the higher order Voronoi diagram data structure. The proposed approaches reveal geometrical, topological, and directional information. Experimental results demonstrate the applicability and usefulness of the three approaches.
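
As a baseline illustration, the sketch below builds the ordinary (order-1) Voronoi diagram over trajectory sample points with SciPy, indicating the kind of spatial partition that the paper's higher order structures generalize; SciPy itself does not provide higher order Voronoi diagrams, and the trajectories are synthetic.

```python
# Order-1 Voronoi diagram over trajectory sample points with SciPy, as a
# baseline for the higher-order Voronoi structures used in the paper.
# Points are synthetic; SciPy does not compute higher-order diagrams.
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(4)
# Two synthetic trajectories: sequences of (x, y) positions over time.
t = np.linspace(0, 1, 20)
traj_a = np.column_stack([t, 0.5 + 0.2 * np.sin(6 * t)]) + rng.normal(0, 0.01, (20, 2))
traj_b = np.column_stack([t, 0.2 + 0.6 * t]) + rng.normal(0, 0.01, (20, 2))

points = np.vstack([traj_a, traj_b])
vor = Voronoi(points)

print("generator points:", len(vor.points))
print("Voronoi vertices:", len(vor.vertices))
# Each generator's region: indices into vor.vertices (-1 marks an open region).
first_region = vor.regions[vor.point_region[0]]
print("region of first trajectory point:", first_region)
```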

Keywords: visual analytics, higher order information, trajectory datasets, spatio-temporal data

Procedia PDF Downloads 402
24049 Reducing Uncertainty of Monte Carlo Estimated Fatigue Damage in Offshore Wind Turbines Using FORM

Authors: Jan-Tore H. Horn, Jørgen Juncher Jensen

Abstract:

Uncertainties related to fatigue damage estimation of non-linear systems are highly dependent on the tail behaviour and extreme values of the stress range distribution. By using a combination of the First Order Reliability Method (FORM) and Monte Carlo simulations (MCS), the accuracy of the fatigue estimations may be improved for the same computational efforts. The method is applied to a bottom-fixed, monopile-supported large offshore wind turbine, which is a non-linear and dynamically sensitive system. Different curve fitting techniques to the fatigue damage distribution have been used depending on the sea-state dependent response characteristics, and the effect of a bi-linear S-N curve is discussed. Finally, analyses are performed on several environmental conditions to investigate the long-term applicability of this multistep method. Wave loads are calculated using state-of-the-art theory, while wind loads are applied with a simplified model based on rotor thrust coefficients.
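
For orientation, the quantity whose tail-driven uncertainty the FORM-assisted approach reduces can be written as a plain Monte Carlo estimate of fatigue damage via an S-N curve and Miner's rule over simulated stress ranges, as sketched below; the Weibull stress-range model and S-N parameters are illustrative only, not the paper's monopile data, and no FORM step is included.

```python
# Plain Monte Carlo estimate of fatigue damage (Miner's rule with a one-slope
# S-N curve) on simulated stress ranges. Distribution and S-N parameters are
# illustrative, not the paper's monopile data.
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical S-N curve: N = K * S^(-m)  (cycles to failure at range S, MPa).
m, logK = 3.0, 12.0
K = 10.0 ** logK

def damage(stress_ranges):
    """Miner's rule: sum of 1/N over all simulated stress cycles."""
    return np.sum(stress_ranges**m / K)

n_cycles = 100_000                       # cycles in the simulated period
n_mc = 200                               # Monte Carlo repetitions

damages = np.empty(n_mc)
for i in range(n_mc):
    # Heavy-ish tailed stress-range distribution (scaled Weibull, shape 1.2).
    s = 20.0 * rng.weibull(1.2, size=n_cycles)
    damages[i] = damage(s)

est = damages.mean()
cov = damages.std(ddof=1) / est          # scatter of the plain MC estimate
print(f"mean damage = {est:.3e}, MC coefficient of variation = {cov:.2%}")
```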

Keywords: fatigue damage, FORM, monopile, Monte Carlo, simulation, wind turbine

Procedia PDF Downloads 263
24048 Self-Supervised Pretraining on Sequences of Functional Magnetic Resonance Imaging Data for Transfer Learning to Brain Decoding Tasks

Authors: Sean Paulsen, Michael Casey

Abstract:

In this work we present a self-supervised pretraining framework for transformers on functional Magnetic Resonance Imaging (fMRI) data. First, we pretrain our architecture on two self-supervised tasks simultaneously to teach the model a general understanding of the temporal and spatial dynamics of human auditory cortex during music listening. Our pretraining results are the first to suggest a synergistic effect of multitask training on fMRI data. Second, we finetune the pretrained models and train additional fresh models on a supervised fMRI classification task. We observe significantly improved accuracy on held-out runs with the finetuned models, which demonstrates the ability of our pretraining tasks to facilitate transfer learning. This work contributes to the growing body of literature on transformer architectures for pretraining and transfer learning with fMRI data, and serves as a proof of concept for our pretraining tasks and multitask pretraining on fMRI data.
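
A minimal PyTorch sketch of one ingredient of such a framework, self-supervised masked-timepoint reconstruction on ROI time series with a transformer encoder, is shown below. The dimensions and data are placeholders, and the paper's full two-task multitask setup is not reproduced.

```python
# Minimal sketch of self-supervised pretraining on fMRI sequences: mask random
# timepoints of ROI time series and train a transformer encoder to reconstruct
# them. Dimensions and data are placeholders; the paper's multitask setup
# (two simultaneous pretraining tasks) is not reproduced here.
import torch
import torch.nn as nn

n_rois, seq_len, d_model = 64, 40, 128   # hypothetical: 64 ROIs, 40 TRs per run

embed = nn.Linear(n_rois, d_model)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=d_model, nhead=8, batch_first=True),
    num_layers=4,
)
decode = nn.Linear(d_model, n_rois)
params = list(embed.parameters()) + list(encoder.parameters()) + list(decode.parameters())
opt = torch.optim.Adam(params, lr=1e-4)

for step in range(100):                           # toy loop on random data
    x = torch.randn(8, seq_len, n_rois)           # batch of fMRI runs (stand-in)
    mask = torch.rand(8, seq_len) < 0.15          # mask ~15% of timepoints
    x_masked = x.clone()
    x_masked[mask] = 0.0                          # zero out masked timepoints

    recon = decode(encoder(embed(x_masked)))
    loss = ((recon - x) ** 2)[mask].mean()        # loss only on masked positions

    opt.zero_grad()
    loss.backward()
    opt.step()

print("final reconstruction loss:", float(loss))
```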

Keywords: transfer learning, fMRI, self-supervised, brain decoding, transformer, multitask training

Procedia PDF Downloads 91
24047 Lessons Learned from Ransomware-as-a-Service (RaaS) Organized Campaigns

Authors: Vitali Kremez

Abstract:

The researcher monitored an organized ransomware campaign in order to gain significant visibility into the tactics, techniques, and procedures employed by a campaign boss operating a ransomware scheme out of Russia. As the Russian hacking community has lowered the access requirements for unsophisticated Russian cybercriminals to engage in ransomware campaigns, corporations and individuals face a commensurately greater challenge in effectively protecting their data and operations from being held to ransom. This report discusses two notorious ransomware campaigns. Though the loss of data can be devastating, the findings demonstrate that sending ransom payments does not always help recover data. Key learnings: 1. From the ransomware affiliate perspective, such campaigns have significantly lowered the barriers to entry for low-tier cybercriminals. 2. Ransomware revenue amounts are not as glamorous and fruitful as they are often publicly reported; ransomware crime bosses make only about $90K per year on average. 3. The data gathered indicate that sending ransom payments does not always help recover data. 4. The talk presents the complete payout structure and Bitcoin laundering operation related to the ransomware-as-a-service campaign.

Keywords: bitcoin, cybercrime, ransomware, Russia

Procedia PDF Downloads 196
24046 Comparative Analysis of Sigmoidal Feedforward Artificial Neural Networks and Radial Basis Function Networks Approach for Localization in Wireless Sensor Networks

Authors: Ashish Payal, C. S. Rai, B. V. R. Reddy

Abstract:

With the increasing use and application of Wireless Sensor Networks (WSNs), the need has arisen to explore them in a more effective and efficient manner. An important area which can bring efficiency to WSNs is the localization process, which refers to the estimation of the position of wireless sensor nodes in an ad hoc network setting, with reference to a coordinate system that may be internal or external to the network. In this paper, we compare and analyse Sigmoidal Feedforward Artificial Neural Networks (SFFANNs) and Radial Basis Function (RBF) networks for developing a localization framework in WSNs. The presented work utilizes the Received Signal Strength Indicator (RSSI), measured by a static node on a 100 x 100 m2 grid from three anchor nodes. The comprehensive evaluation of these approaches is done using MATLAB software. The simulation results effectively demonstrate that SFFANN-based sensor motes show better localization accuracy as compared to RBF networks.
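
A sketch of the SFFANN-style approach, using scikit-learn's MLPRegressor with sigmoidal activations rather than the MATLAB toolchain used in the paper, is given below; the RSSI values are synthesized from a log-distance path-loss model with illustrative parameters.

```python
# Sketch of SFFANN-style localization: map RSSI from three anchors to an
# (x, y) position with a sigmoidal MLP. RSSI is synthesized from a
# log-distance path-loss model with assumed parameters (the paper uses MATLAB).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(6)
anchors = np.array([[0.0, 0.0], [100.0, 0.0], [50.0, 100.0]])   # anchor positions (m)

def rssi(positions, noise_db=2.0):
    d = np.linalg.norm(positions[:, None, :] - anchors[None, :, :], axis=2)
    # Log-distance path loss: RSSI = -40 - 10*n*log10(d), with n = 2.5 assumed.
    return -40.0 - 25.0 * np.log10(np.maximum(d, 1.0)) + rng.normal(0, noise_db, d.shape)

train_pos = rng.uniform(0, 100, size=(2000, 2))     # static node positions on the grid
test_pos = rng.uniform(0, 100, size=(200, 2))

model = MLPRegressor(hidden_layer_sizes=(32, 32), activation="logistic",
                     max_iter=2000, random_state=0)
model.fit(rssi(train_pos), train_pos)

pred = model.predict(rssi(test_pos))
rmse = np.sqrt(np.mean(np.sum((pred - test_pos) ** 2, axis=1)))
print(f"localization RMSE: {rmse:.2f} m")
```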

Keywords: localization, wireless sensor networks, artificial neural network, radial basis function, multi-layer perceptron, backpropagation, RSSI, GPS

Procedia PDF Downloads 341
24045 Estimation of Grinding Force and Material Characterization of Ceramic Matrix Composite

Authors: Lakshminarayanan, Vijayaraghavan, Krishnamurthy

Abstract:

The ever-increasing demand for high efficiency in automotive and aerospace applications requires new materials suited to high-temperature applications. Ceramic matrix composites nowadays find application in high-strength and high-temperature environments. In this paper, Al2O3 and SiC ceramic materials are taken in particulate form as matrix and reinforcement, respectively. They are blended together by ball milling and compacted in a cold compaction machine by the powder metallurgy technique. Scanning electron microscope images are taken of the samples in order to confirm proper blending of the powders. Micro hardness testing is also carried out on the samples using Vickers micro hardness testing equipment. Surface grinding of the samples is carried out on a surface grinding machine in order to obtain estimates of the grinding force. The surface roughness of the ground samples is also measured with a surface profilometer. These experiments are yielding promising results.

Keywords: ceramic matrix composite, cold compaction, material characterization, particulate and surface grinding

Procedia PDF Downloads 244
24044 Analysis of Cross-Sectional and Retrograde Data on the Prevalence of Marginal Gingivitis

Authors: Ilma Robo, Saimir Heta, Nedja Hysi, Vera Ostreni

Abstract:

Introduction: Marginal gingivitis is a disease with considerable frequency among patients who present routinely for periodontal control and treatment. In fact, this disease may not cause alarming symptoms and may go unnoticed by patients themselves when personal hygiene conditions are optimal. The aim of this study was to collect retrograde data on the prevalence of marginal gingivitis in the respective group of patients, evaluated according to specific periodontal diagnostic tools. Materials and methods: The study was conducted in two patient groups. The first group comprised 34 patients examined during December 2019-January 2020, and the second group comprised 64 patients examined during 2010-2018 (each year in the same monthly period). The bacterial plaque index, hemorrhage index, amount of gingival fluid, presence of xerostomia, and presence of candidiasis were recorded for each patient. Results: Analysis of the collected data showed that susceptibility to marginal gingivitis shows higher values according to the retrograde data compared with the cross-sectional data. Susceptibility to candidiasis and the occurrence of xerostomia, even in combination, as risk factors for the occurrence of marginal gingivitis, also show higher values according to the retrograde data. Female patients presented with a lower bacterial plaque index than males; more importantly, in females this index was also associated with a lower gingival hemorrhage index, in contrast to males. Conclusions: The cross-sectional data show a lower prevalence of marginal gingivitis than the retrograde data, based on the hemorrhage index and the bacterial plaque index together. Changes in the amount of gingival fluid produced show a higher prevalence of marginal gingivitis in the cross-sectional data than in the retrograde data; this is attributed to the increasing sophistication of the way data are recorded, which evolves over time, and to professional sensitivity to this phenomenon.

Keywords: marginal gingivitis, cross-sectional, retrograde, prevalence

Procedia PDF Downloads 162
24043 Why Do We Need Hierarchical Linear Models?

Authors: Mustafa Aydın, Ali Murat Sunbul

Abstract:

Hierarchical or nested data structures are common in many research areas. Especially in the field of education, if we examine most studies, we can see nested structures: students in classes, classes in schools, schools in cities, and cities in regions are all similar nested structures. In a hierarchical structure, students in the same class share the same physical conditions and similar experiences and learn from the same teachers, so they demonstrate behavior more similar to one another than to students in other classes.
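
A minimal hierarchical (multilevel) model capturing this structure is a random-intercept model for students nested in classrooms, as sketched below with statsmodels; the data are simulated and the variable names are illustrative.

```python
# Minimal hierarchical (multilevel) model: students nested in classrooms,
# fitted as a random-intercept model with statsmodels. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n_classes, n_students = 20, 25

classroom = np.repeat(np.arange(n_classes), n_students)
class_effect = rng.normal(0, 5, n_classes)[classroom]    # shared classroom influence
study_hours = rng.uniform(0, 10, n_classes * n_students)
score = 50 + 2.0 * study_hours + class_effect + rng.normal(0, 3, classroom.size)

df = pd.DataFrame({"score": score, "hours": study_hours, "classroom": classroom})

# A random intercept per classroom captures the similarity of students who
# share a class; ignoring it would violate the independence assumption of OLS.
model = smf.mixedlm("score ~ hours", df, groups=df["classroom"]).fit()
print(model.summary())
```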

Keywords: hierarchical linear modeling, nested data, hierarchical structure, data structure

Procedia PDF Downloads 653
24042 Asymptotic Confidence Intervals for the Difference of Coefficients of Variation in Gamma Distributions

Authors: Patarawan Sangnawakij, Sa-Aat Niwitpong

Abstract:

In this paper, we propose two new confidence intervals, CIw and CIs, for the difference of coefficients of variation in two independent gamma distributions. These proposed confidence intervals use the closed-form method of variance estimation presented by Donner and Zou (2010), based on the concepts of the Wald and score confidence intervals, respectively. A Monte Carlo simulation study is used to evaluate the performance, coverage probability, and expected length of these confidence intervals. The results indicate that the coverage probabilities of the new confidence intervals based on the Wald and score statistics satisfy the nominal coverage and are close to the nominal level of 0.95 in various situations; in particular, the former confidence interval performs better when sample sizes are small. Moreover, the expected lengths of the proposed confidence intervals are nearly the same when sample sizes are moderate to large. Therefore, in this study, the confidence interval for the difference of coefficients of variation based on the Wald statistic is preferable to the other confidence interval.
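
The coverage-probability evaluation can be sketched as follows: simulate gamma samples, form an interval for the CV difference, and count how often it contains the true value (for a gamma distribution with shape k, the true CV is 1/sqrt(k)). The interval below is a generic Wald-type interval with a bootstrap standard error, not the authors' closed-form CIw/CIs; it only illustrates how coverage is estimated by Monte Carlo.

```python
# Monte Carlo coverage check for a Wald-type interval for the difference of
# coefficients of variation of two gamma samples. The standard error here is
# bootstrapped, i.e. this is NOT the authors' closed-form CIw/CIs intervals.
import numpy as np

rng = np.random.default_rng(8)

def cv(x):
    return x.std(ddof=1) / x.mean()

k1, k2, theta = 4.0, 9.0, 2.0                   # gamma shapes and common scale
true_diff = 1 / np.sqrt(k1) - 1 / np.sqrt(k2)   # CV of gamma(k) is 1/sqrt(k)
n, n_sim, n_boot, z = 30, 2000, 300, 1.959964

covered = 0
for _ in range(n_sim):
    x = rng.gamma(k1, theta, n)
    y = rng.gamma(k2, theta, n)
    est = cv(x) - cv(y)
    boot = np.array([
        cv(rng.choice(x, n)) - cv(rng.choice(y, n)) for _ in range(n_boot)
    ])
    se = boot.std(ddof=1)
    lo, hi = est - z * se, est + z * se
    covered += (lo <= true_diff <= hi)

print(f"empirical coverage at nominal 0.95: {covered / n_sim:.3f}")
```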

Keywords: confidence interval, score’s interval, wald’s interval, coefficient of variation, gamma distribution, simulation study

Procedia PDF Downloads 428
24041 Investigating Cloud Forensics: Challenges, Tools, and Practical Case Studies

Authors: Noha Badkook, Maryam Alsubaie, Samaher Dawood, Enas Khairullah

Abstract:

Cloud computing has introduced transformative benefits in data storage and accessibility while posing unique forensic challenges. This paper explores cloud forensics, focusing on investigating and analyzing evidence from cloud environments to address issues such as unauthorized data access, manipulation, and breaches. The research highlights the practical use of open-source forensic tools like Autopsy and Bulk Extractor in real-world scenarios, including unauthorized data sharing via Google Drive and the misuse of personal cloud storage for sensitive information leaks. This work underscores the growing importance of robust forensic procedures and accessible tools in ensuring data security and accountability in cloud ecosystems.

Keywords: cloud forensic, tools, challenge, autopsy, bulk extractor

Procedia PDF Downloads 18
24040 The Disposable Identities; Enabling Trust-by-Design to Build Sustainable Data-Driven Value

Authors: Lorna Goulden, Kai M. Hermsen, Jari Isohanni, Mirko Ross, Jef Vanbockryck

Abstract:

This article introduces disposable identities, with reference use cases, and explores possible technical approaches. The proposed approach, when fully developed as an open-source toolkit, enables developers of mobile or web apps to employ a self-sovereign identity and data privacy framework in order to rebuild trust in digital services by providing greater transparency, decentralized control, and GDPR compliance. With a user interface for the management of self-sovereign identities, digital authorizations, and associated data-driven transactions, the advantage of disposable identities is that they may also contain verifiable data such as the owner's photograph and official or even biometric identifiers for more proactive prevention of identity abuse. Disposable identities designed for decentralized privacy management can also be time-, purpose-, and context-bound through a secure digital contract, with verification functionalities based on tamper-proof technology.
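
The time- and purpose-bound aspect can be illustrated with a small sketch: a credential payload carrying a subject, purpose, and expiry, signed with Ed25519 so that any verifier holding the issuer's public key can check integrity and expiry. This shows the general mechanism only, not the proposed toolkit's actual credential format (for example, W3C Verifiable Credentials and DIDs); the identifiers and purpose string are placeholders.

```python
# Sketch of a time- and purpose-bound credential: a signed payload that any
# verifier can check for integrity and expiry with the issuer's public key.
import json, time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

issuer_key = Ed25519PrivateKey.generate()
issuer_pub = issuer_key.public_key()

def issue(subject, purpose, ttl_seconds):
    payload = json.dumps({
        "sub": subject, "purpose": purpose,
        "exp": time.time() + ttl_seconds,
    }, sort_keys=True).encode()
    return payload, issuer_key.sign(payload)

def verify(payload, signature):
    try:
        issuer_pub.verify(signature, payload)      # raises if tampered with
    except InvalidSignature:
        return None
    claims = json.loads(payload)
    return claims if claims["exp"] > time.time() else None   # disposed after expiry

payload, sig = issue("did:example:alice", "parking-access", ttl_seconds=3600)
print("verified claims:", verify(payload, sig))
```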

Keywords: dentity, trust, self-sovereign, disposable identity, privacy toolkit, decentralised identity, verifiable credential, cybersecurity, data driven business, PETs, GDPRdentity, trust, self-sovereign, disposable identity, privacy toolkit, decentralised identity, verifiable credential, cybersecurity, data driven business, PETs, GDPRI

Procedia PDF Downloads 220
24039 Best Practices to Enhance Patient Security and Confidentiality When Using E-Health in South Africa

Authors: Lethola Tshikose, Munyaradzi Katurura

Abstract:

Information and communication technology (ICT) plays a critical role in improving daily healthcare processes. South African healthcare organizations have adopted information systems to integrate their patient records. This has made things much easier for healthcare organizations because patient information is now accessible at any time. The primary purpose of this research study was to investigate the best practices that can be applied to enhance patient security and confidentiality when using e-health systems in South Africa. Security and confidentiality are critical in healthcare organizations as they ensure the safety of EHRs. The research study used an inductive research approach that included a thorough literature review; therefore, no primary data were collected. The scope of the paper included patient data and possible security threats associated with healthcare systems. According to the study, South African healthcare organizations have experienced various patient data security and confidentiality issues. The study also revealed that health professionals sometimes make mistakes when handling patient data; some may not be computer literate, which creates problems and causes data to be tampered with. The paper recommends that healthcare organizations ensure that security measures are adequately supported and promoted by their IT departments. This will ensure that adequate resources are allocated to keep patient data secure and confidential. Healthcare organizations must correctly apply the standards set by IT specialists to solve patient data security and confidentiality issues, and must make sure that their organizational structures are adaptable enough to improve security and confidentiality.

Keywords: E-health, EHR, security, confidentiality, healthcare

Procedia PDF Downloads 60
24038 The Effect of Data Integration to the Smart City

Authors: Richard Byrne, Emma Mulliner

Abstract:

Smart cities are a vision for the future that is increasingly becoming a reality. While a key concept of the smart city is the ability to capture, communicate, and process data that have long been produced through the day-to-day activities of the city, many of the assessment models in place neglect this fact and focus instead on 'smartness' concepts. Although technology often provides the opportunity to capture and communicate data in more effective ways, there are also human processes involved that are just as important. The growing importance of the use and ownership of data in society can be seen by all, with companies such as Facebook and Google increasingly coming under the microscope; why, however, is the same scrutiny not applied to cities? The research area is therefore of great importance to the future of our cities here and now, while the findings will be of just as great importance to our children in the future. This research aims to understand the influence data are having on organisations operating throughout the smart cities sector and employs a mixed-method research approach in order to best answer the following question: would a data-based evaluation model for smart cities be more appropriate than a smartness-based model in assessing the development of the smart city? A comprehensive literature review concluded that there is a requirement for a data-driven assessment model for smart cities. This was followed by a documentary analysis to understand the root sources of data integration in the smart city. A content analysis of city data platforms enquired into the alternative approaches employed by cities throughout the UK and drew on best practice from New York to compare and contrast. Grounded in theory, the research findings to this point formed a qualitative analysis framework comprising: the changing environment influenced by data, the value of data in the smart city, the data ecosystem of the smart city, and the organisational response to the data-orientated environment. The framework was applied to analyse primary data collected through interviews with both public and private organisations operating throughout the smart cities sector. The work to date represents the first stage of data collection and will be built upon by a quantitative research investigation into the feasibility of data network effects in the smart city. An analysis of the benefits of data interoperability supporting services to the smart city in the areas of health and transport will conclude the research, with the aim of inductively forming a framework that can be applied to future smart city policy. To conclude, the research recognises the influence of technological perspectives on the development of smart cities to date and highlights this as a challenge for introducing theory applied with a planning dimension. The primary researcher has drawn on their experience working in the public sector throughout the investigation to reflect upon what is perceived as a gap in practice between where we are today and where we need to be tomorrow.

Keywords: data, planning, policy development, smart cities

Procedia PDF Downloads 312