Search results for: data security assurance
24970 Energy Initiatives for Turkey
Authors: A.Beril Tugrul, Selahattin Cimen
Abstract:
Humanity's dependency on energy is ever-increasing, and energy policies are reaching undeniable and un-ignorable dimensions, steering political events as well. Therefore, energy has the highest priority for Turkey, as for any other country. In this study, the energy supply security of Turkey is evaluated according to the strategic criteria of energy policy. Under these circumstances, different alternatives are described and assessed in terms of the energy expansion of Turkey. With this study, different opportunities in the energy expansion of Turkey are clarified and emphasized.
Keywords: energy policy, energy strategy, future projection, Turkey
Procedia PDF Downloads 390
24969 Sourcing and Compiling a Maltese Traffic Dataset MalTra
Authors: Gabriele Borg, Alexei De Bono, Charlie Abela
Abstract:
There is a constant rise in the availability of high volumes of data gathered from multiple sources, resulting in an abundance of unprocessed information that can be used to monitor patterns and trends in user behaviour. Similarly, year after year, Malta is experiencing ongoing population growth and an increase in mobility demand. This research takes advantage of data which is continuously being sourced and converts it into useful information related to the traffic problem on the Maltese roads. The scope of this paper is to provide a methodology to create a custom dataset (MalTra - Malta Traffic) compiled from multiple participants at various locations across the island, in order to identify the most common routes taken and expose the main areas of activity. Such uses of big data appear across a range of technologies referred to as Intelligent Transportation Systems (ITSs), and it has been concluded that there is significant potential in utilising such sources of data on a nationwide scale.
Keywords: big data, vehicular traffic, traffic management, mobile data patterns
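The route-frequency idea described above can be sketched in a few lines. The origin-destination pairs below are invented placeholders, not MalTra data:

```python
from collections import Counter

# Hypothetical trip records: (origin, destination) pairs derived from
# participants' location data. Place names and counts are illustrative only.
trips = [
    ("Valletta", "Sliema"), ("Mosta", "Valletta"), ("Valletta", "Sliema"),
    ("Birkirkara", "Valletta"), ("Valletta", "Sliema"), ("Mosta", "Valletta"),
]

# Count how often each origin-destination route appears, exposing the
# most common routes and hence the main areas of activity.
route_counts = Counter(trips)
most_common = route_counts.most_common(2)  # top routes first
```

A real pipeline would first map raw GPS traces to routes; here the mapping is assumed to have been done already.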
Procedia PDF Downloads 109
24968 Comparative Study of Accuracy of Land Cover/Land Use Mapping Using Medium Resolution Satellite Imagery: A Case Study
Authors: M. C. Paliwal, A. K. Jain, S. K. Katiyar
Abstract:
Accuracy assessment is very important for the classification of satellite imagery. In order to determine the accuracy of a classified image, the assumed-true data are usually derived from ground truth data collected using the Global Positioning System. The data derived from the satellite imagery and the ground truth data are then compared to find out the accuracy of the classification, and error matrices are prepared. Overall and individual accuracies are calculated using different methods. The study illustrates advanced classification and accuracy assessment of land use/land cover mapping using satellite imagery. IRS-1C-LISS IV data were used for the classification. The satellite image was classified into fourteen classes, including water bodies, agricultural fields, forest land, urban settlement, barren land, and unclassified area. Classification of the satellite imagery and calculation of accuracy were done using ERDAS Imagine software to find out the best method. The study is based on data collected for the Bhopal city boundaries of Madhya Pradesh State, India.
Keywords: resolution, accuracy assessment, land use mapping, satellite imagery, ground truth data, error matrices
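The error-matrix accuracy calculations mentioned above can be sketched as follows; the three-class matrix and its values are illustrative, not the study's results:

```python
import numpy as np

# Illustrative error (confusion) matrix: rows = classified image,
# columns = ground truth. Class names and counts are invented.
error_matrix = np.array([
    [50,  3,  2],   # water bodies
    [ 4, 60,  6],   # agricultural fields
    [ 1,  5, 40],   # forest land
])

# Overall accuracy: correctly classified pixels over all reference pixels.
overall_accuracy = np.trace(error_matrix) / error_matrix.sum()
# Producer's accuracy per class: correct / column total (omission errors).
producers = np.diag(error_matrix) / error_matrix.sum(axis=0)
# User's accuracy per class: correct / row total (commission errors).
users = np.diag(error_matrix) / error_matrix.sum(axis=1)
```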
Procedia PDF Downloads 508
24967 Effect of Genuine Missing Data Imputation on Prediction of Urinary Incontinence
Authors: Suzan Arslanturk, Mohammad-Reza Siadat, Theophilus Ogunyemi, Ananias Diokno
Abstract:
Missing data is a common challenge in statistical analyses of most clinical survey datasets. A variety of methods have been developed to enable the analysis of survey data with missing values. Imputation is the most commonly used among these methods. However, in order to minimize the bias introduced by imputation, one must choose the right imputation technique and apply it to the correct type of missing data. In this paper, we have identified different types of missing values: missing data due to skip pattern (SPMD), undetermined missing data (UMD), and genuine missing data (GMD), and applied rough set imputation on only the GMD portion of the missing data. We have used rough set imputation to evaluate the effect of such imputation on prediction by generating several simulation datasets based on an existing epidemiological dataset (MESA). To measure how well each dataset lends itself to the prediction model (logistic regression), we have used p-values from the Wald test. To evaluate the accuracy of the prediction, we have considered the width of the 95% confidence interval for the probability of incontinence. Both imputed and non-imputed simulation datasets were fit to the prediction model, and both turned out to be significant (p-value < 0.05). However, the Wald score shows a better fit for the imputed compared to the non-imputed datasets (28.7 vs. 23.4). The average confidence interval width decreased by 10.4% when the imputed dataset was used, meaning higher precision. The results show that using the rough set method for missing data imputation on GMD data improves the predictive capability of the logistic regression. Further studies are required to generalize this conclusion to other clinical survey datasets.
Keywords: rough set, imputation, clinical survey data simulation, genuine missing data, predictive index
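One intuition behind the reported precision gain is that successful imputation restores sample size, which narrows confidence intervals. A minimal sketch of that effect, with an assumed survey size, missing-data share, and incontinence probability (none taken from the paper):

```python
import math

def ci_width(p_hat, n, z=1.96):
    """Width of the normal-approximation 95% confidence interval for a proportion."""
    return 2 * z * math.sqrt(p_hat * (1 - p_hat) / n)

# Hypothetical survey: 1000 respondents, of whom 200 have genuine missing
# data (GMD). Dropping GMD rows leaves n = 800; imputing restores n = 1000.
p_hat = 0.30  # assumed probability of incontinence, for illustration
width_dropped = ci_width(p_hat, 800)
width_imputed = ci_width(p_hat, 1000)
precision_gain = 1 - width_imputed / width_dropped  # fraction narrower
```

With these assumed numbers the interval narrows by roughly 10%, the same order as the 10.4% reported; the real gain also depends on how well the imputed values match the truth.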
Procedia PDF Downloads 168
24966 Real-Time Online Tracking Platform
Authors: Denis Obrul, Borut Žalik
Abstract:
We present an extendable online real-time tracking platform that can be used to track a wide variety of location-aware devices. These range from GPS devices mounted inside a vehicle and closed, secure systems such as Teltonika, to mobile phones running multiple platforms. Special consideration is given to a decentralized approach, security, and flexibility. A number of different use cases are presented as a proof of concept.
Keywords: real-time, online, GPS, tracking, web application
Procedia PDF Downloads 353
24965 Climate Adaptations to Traditional Milpa Farming Practices in Mayan Communities of Southern Belize: A Socio-Ecological Systems Approach
Authors: Kristin Drexler
Abstract:
Climate change has exacerbated food and livelihood insecurity for Mayan milpa farmers in Central America. For centuries, milpa farming has been sustainable for subsistence; however, in the last 50 years, milpas have become less reliable due to accelerating climate change, resource degradation, declining markets, poverty, and other factors. Using interviews with extension leaders and milpa farmers in Belize, this qualitative study examines the capacity for increasing climate-smart agriculture (CSA) aspects of existing traditional milpa practices, specifically no-burn mulching, soil enrichment, and the use of cover plants. Applying community capitals and socio-ecological systems frameworks, this study finds that four key capitals were perceived by farmers and agriculture extension leaders as barriers to increasing CSA practices: (1) human-capacity, (2) financial, (3) infrastructure, and (4) governance-justice capitals. The key barriers include a lack of CSA technology and pest management knowledge-sharing (human-capacity), unreliable roads and utility services (infrastructure), the closure of small markets and crop-buying programs in Belize (financial), and constraints on extension services that exacerbate a sense of marginalization in Maya communities (governance-justice). Recommendations are presented for government action to reduce barriers and facilitate an increase in milpa crop productivity, promote food and livelihood security, and enable climate resilience of Mayan milpa communities in Belize.
Keywords: socio-ecological systems, community capitals, climate-smart agriculture, food security, milpa, Belize
Procedia PDF Downloads 91
24964 Database Management System for Orphanages to Help Track of Orphans
Authors: Srivatsav Sanjay Sridhar, Asvitha Raja, Prathit Kalra, Soni Gupta
Abstract:
A database management system keeps track of details about the people in an organisation. Few orphanages these days have shifted to a computer- and program-based system; unfortunately, most keep only pen-and-paper records, which not only consume space but are also not eco-friendly. It is a hassle to view a person's record, as one has to search through multiple files, which consumes time. This program organises all the data and can pull out any information about anyone whose data has been entered. It is also a safe form of storage, as physical records degrade over time or, worse, are destroyed by natural disasters. In this developing world, it is only sensible to shift all data to an electronic storage system. The program comes with all the essential features, including creating, inserting, searching, and deleting records, as well as printing them.
Keywords: database, orphans, programming, C++
Procedia PDF Downloads 156
24963 Modelling the Dynamics and Optimal Control Strategies of Terrorism within the Southern Borno State Nigeria
Authors: Lubem Matthew Kwaghkor
Abstract:
Terrorism, which remains one of the largest threats faced by nations and communities around the world, including Nigeria, is the calculated use of violence to create a general climate of fear in a population in order to attain goals that may be political, religious, or economic. Several terrorist groups are currently active in Nigeria, leading to attacks on both civil and military targets. Among these groups, Boko Haram is the deadliest terrorist group, operating mainly in Borno State. The southern part of Borno State in North-Eastern Nigeria has been plagued by terrorism, insurgency, and conflict for several years. Understanding the dynamics of terrorism is crucial for developing effective strategies to mitigate its impact on communities and to facilitate peace-building efforts. This research aims to develop a mathematical model that captures the dynamics of terrorism within the southern part of Borno State, Nigeria, incorporating both government and local community intervention strategies as control measures for combating terrorism. A compartmental model of five nonlinear differential equations is formulated. The model analyses show that a feasible solution set of the model exists and is bounded. Stability analyses show that both the terrorism-free equilibrium and the terrorism-endemic equilibrium are asymptotically stable, giving the model biological meaning. Optimal control theory will be employed to identify the most effective strategy to prevent or minimize acts of terrorism. The research outcomes are expected to contribute towards enhancing security and stability in Southern Borno State while providing valuable insights for policymakers, security agencies, and researchers. This is ongoing research.
Keywords: modelling, terrorism, optimal control, susceptible, non-susceptible, community intervention
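The abstract does not give the five equations, but the general shape of such a compartmental model can be sketched with a toy system: the compartments, rates, control term, and forward-Euler scheme below are all assumptions for illustration, not the authors' model.

```python
# Toy compartmental sketch: susceptible S, active terrorists T, removed R.
# Recruitment is a nonlinear mass-action term beta*S*T; a control effort u
# (e.g. community intervention) raises the removal rate gamma. All
# parameter values are invented for the sketch.
def step(S, T, R, beta=0.0005, gamma=0.1, u=0.5, dt=0.1):
    recruits = beta * S * T               # nonlinear recruitment into T
    removed = gamma * (1 + u) * T         # control boosts removal from T
    return S - recruits * dt, T + (recruits - removed) * dt, R + removed * dt

S, T, R = 1000.0, 50.0, 0.0
for _ in range(1000):                     # forward-Euler integration to t = 100
    S, T, R = step(S, T, R)
```

Because every outflow is another compartment's inflow, the total population is conserved, which is the boundedness property the abstract refers to.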
Procedia PDF Downloads 22
24962 Creation of Computerized Benchmarks to Facilitate Preparedness for Biological Events
Abstract:
Introduction: Communicable diseases and pandemics pose a growing threat to the well-being of the global population. A vital component of protecting public health is the creation and sustenance of continuous preparedness for such hazards. A joint Israeli-German task force was deployed in order to develop an advanced tool for self-evaluation of emergency preparedness for various types of biological threats. Methods: Based on a comprehensive literature review and interviews with leading content experts, an evaluation tool was developed, based on quantitative and qualitative parameters and indicators. A modified Delphi process was used to achieve consensus among over 225 experts from Germany and Israel concerning the items to be included in the evaluation tool. The validity and applicability of the tool for medical institutions were examined in a series of simulation and field exercises. Results: Over 115 German and Israeli experts reviewed and examined the proposed parameters as part of the modified Delphi cycles. A consensus of over 75% of the experts was attained for 183 out of 188 items. The relative importance of each parameter was rated as part of the Delphi process, in order to define its impact on overall emergency preparedness. The parameters were integrated into computerized web-based software that enables the calculation of emergency preparedness scores for biological events. Conclusions: The parameters developed in the joint German-Israeli project serve as benchmarks that delineate the actions to be implemented in order to create and maintain ongoing preparedness for biological events. The computerized evaluation tool enables continuous monitoring of the level of readiness, so that strengths and gaps can be identified and corrected appropriately. Adoption of such a tool is recommended as an integral component of quality assurance of public health and safety.
Keywords: biological events, emergency preparedness, bioterrorism, natural biological events
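The scoring mechanism described above (parameters rated and weighted by relative importance) can be sketched as a simple weighted sum; the parameter names, weights, and ratings below are invented for illustration:

```python
# Each parameter gets a rating (0-100) and a relative-importance weight
# (summing to 1), as the Delphi process is described to produce.
# All names and numbers here are hypothetical.
parameters = {
    "isolation_capacity":     {"weight": 0.30, "rating": 80},
    "staff_training":         {"weight": 0.25, "rating": 60},
    "ppe_stockpile":          {"weight": 0.20, "rating": 90},
    "surveillance_reporting": {"weight": 0.25, "rating": 70},
}

# Overall preparedness score: importance-weighted average of ratings.
score = sum(p["weight"] * p["rating"] for p in parameters.values())
# Gap identification: parameters rated below an assumed threshold.
gaps = [name for name, p in parameters.items() if p["rating"] < 70]
```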
Procedia PDF Downloads 423
24961 New Two-Way Map-Reduce Join Algorithm: Hash Semi Join
Authors: Marwa Hussein Mohamed, Mohamed Helmy Khafagy, Samah Ahmed Senbel
Abstract:
MapReduce is a programming model used to handle and support massive data sets. Rapidly increasing data sizes and big data make the analysis of such data one of the most important issues today. MapReduce is used to analyze data and extract more helpful information using two simple functions, map and reduce, which are written by the programmer, and it provides load balancing, fault tolerance, and high scalability. The most important operation in data analysis is the join, but MapReduce does not support joins directly. This paper explains two two-way MapReduce join algorithms, semi-join and per-split semi-join, and proposes a new algorithm, hash semi-join, which uses a hash table to increase performance by eliminating unused records as early as possible and by applying the join with a hash table rather than matching join keys against the other table in a map function in the second phase. Using hash tables does not inflate memory usage, because only the matched records from the second table are saved. Our experimental results show that the hash semi-join algorithm has higher performance than the two other algorithms as the data size grows from 10 million records to 500 million, with running time increasing according to the number of joined records between the two tables.
Keywords: map reduce, hadoop, semi join, two way join
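The core of the hash semi-join idea, stripped of the MapReduce machinery, can be sketched as follows; the table contents are illustrative:

```python
# Phase 1: build a hash table of join keys from the first table.
# Phase 2: stream the second table and keep only records whose key
# matches, discarding unused records as early as possible. Only matched
# records are retained, which keeps memory use small.
orders = [(101, "2024-01-05"), (102, "2024-01-06"), (103, "2024-01-07")]
line_items = [(101, "widget"), (104, "gadget"), (101, "bolt"), (103, "nut")]

key_hash = {order_id for order_id, _ in orders}          # phase 1
matched = [item for item in line_items if item[0] in key_hash]  # phase 2
```

In the actual algorithm these two phases run as MapReduce jobs with the hash table distributed to the mappers; the sketch only shows the record-elimination logic.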
Procedia PDF Downloads 513
24960 Using Implicit Data to Improve E-Learning Systems
Authors: Slah Alsaleh
Abstract:
In recent years, with the popularity of the internet and technology, e-learning has become a major part of most education systems. One of the advantages e-learning systems provide is the large amount of information available about students' behaviour while they communicate with the e-learning system. Such information is very rich, and it can be used to improve the capability and efficiency of e-learning systems. This paper discusses how e-learning can benefit from implicit data in different ways, including creating homogeneous groups of students, evaluating students' learning, creating behaviour profiles for students, and identifying students through their behaviour.
Keywords: e-learning, implicit data, user behavior, data mining
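The behaviour-profile idea above amounts to aggregating implicit event logs per student; a minimal sketch with invented clickstream events:

```python
from collections import defaultdict

# Hypothetical implicit data from an e-learning system: (student, action)
# events logged as students use the platform.
events = [
    ("s1", "watch_video"), ("s1", "quiz"), ("s2", "forum_post"),
    ("s1", "watch_video"), ("s2", "quiz"), ("s2", "forum_post"),
]

# Behaviour profile: per-student action counts.
profiles = defaultdict(lambda: defaultdict(int))
for student, action in events:
    profiles[student][action] += 1

# A crude homogeneous grouping: students sharing the same dominant action.
dominant = {s: max(acts, key=acts.get) for s, acts in profiles.items()}
```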
Procedia PDF Downloads 310
24959 Enabling Quantitative Urban Sustainability Assessment with Big Data
Authors: Changfeng Fu
Abstract:
Sustainable urban development has been widely accepted as common sense in modern urban planning and design. However, the measurement and assessment of urban sustainability, especially quantitative assessment, has always been an issue for planning and design professionals. This paper presents on-going research on principles and technologies for quantitative urban sustainability assessment, which aims to integrate indicators, geospatial and geo-referenced data, and assessment techniques into a single mechanism. It is based on the principles and techniques of geospatial analysis with GIS and statistical analysis methods. Decision-making technologies and methods such as AHP and SMART are also adopted to derive overall assessment conclusions. Possible interfaces and presentations of the data and of the quantitative assessment results are also described. This research is based on the knowledge, situations, and data sources of the UK, but it is potentially adaptable to other countries or regions. The implementation potential of the mechanism is also discussed.
Keywords: urban sustainability assessment, quantitative analysis, sustainability indicator, geospatial data, big data
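As a sketch of the AHP step mentioned above, indicator weights can be derived from a pairwise-comparison matrix; the geometric-mean approximation of the principal eigenvector is shown here, with three invented indicators and judgments:

```python
import numpy as np

# AHP pairwise-comparison matrix: A[i, j] says how much more important
# indicator i is than indicator j. The indicators (land use, transport,
# energy) and the judgments are hypothetical.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Geometric-mean method: row geometric means, normalized to sum to 1,
# approximate the principal eigenvector (the priority weights).
geo_means = A.prod(axis=1) ** (1.0 / A.shape[0])
weights = geo_means / geo_means.sum()
```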
Procedia PDF Downloads 359
24958 Leveraging NFT Secure and Decentralized Lending: A DeFi Solution
Authors: Chandan M. S., Darshan G. A., Vyshnavi, Abhishek T.
Abstract:
In the evolving world of technology and digital assets, non-fungible tokens (NFTs) have emerged as the latest advancement. These digital assets represent ownership of intangible items and hold significant value. Unlike cryptocurrencies such as Ethereum or Bitcoin, NFTs cannot be exchanged one-for-one due to their nature; each NFT has an indivisible value. NFTs not only pave the way for financial services but also open up fresh opportunities for creators, buyers, and artists. To revolutionize financing in the DeFi space, the proposed approach utilizes NFTs generated from digital art. By eliminating intermediaries, this innovative method ensures trust and security in transactions. The idea entails automating borrower-lender interactions through smart contracts while securely storing data using blockchain technology. Borrowers can obtain funding by leveraging assets such as real estate, artwork, and collectibles that are often illiquid. The key component of this system is smart contracts that independently execute lending agreements and collateral transfers within predefined parameters. By leveraging the Ethereum blockchain, this project aims to provide consumers with access to a platform offering a wide range of financial services. The demonstration illustrates how NFT lending and borrowing is managed through smart contracts, providing a secure and trustworthy transaction environment.
Keywords: blockchain, defi, NFT, ethereum, marketplace
Procedia PDF Downloads 53
24957 Development of Generalized Correlation for Liquid Thermal Conductivity of N-Alkane and Olefin
Authors: A. Ishag Mohamed, A. A. Rabah
Abstract:
The objective of this research is to develop a generalized correlation for the prediction of the thermal conductivity of n-alkanes and alkenes. There is little research and a lack of correlations for the thermal conductivity of liquids in the open literature. The available experimental data were collected, covering the groups of n-alkanes and alkenes. The data were assumed to correlate with temperature using the Filippov correlation. Nonparametric regression with the Grace algorithm was used to develop the generalized correlation model. A spreadsheet program based on Microsoft Excel was used to plot the data and calculate the values of the coefficients. The results obtained were compared with the data found in Perry's Chemical Engineers' Handbook. The experimental data correlated with temperature over the range 273.15 to 673.15 K, with R² = 0.99. The developed correlation reproduced experimental data that were not included in the regression with an absolute average percent deviation (AAPD) of less than 7%. Thus the spreadsheet is quite accurate and produces reliable data.
Keywords: n-alkanes, n-alkenes, nonparametric, regression
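The AAPD validation metric quoted above is straightforward to compute; the experimental and predicted thermal-conductivity values below are invented for the sketch, not the study's data:

```python
# Absolute average percent deviation (AAPD) between a correlation's
# predictions and experimental values. Numbers are illustrative,
# in W/(m.K).
experimental = [0.145, 0.132, 0.120, 0.108]
predicted    = [0.149, 0.130, 0.115, 0.111]

aapd = 100.0 * sum(
    abs(p - e) / e for p, e in zip(predicted, experimental)
) / len(experimental)
```

A correlation meeting the paper's criterion would keep this figure under 7% on data held out of the regression.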
Procedia PDF Downloads 654
24956 Survey on Arabic Sentiment Analysis in Twitter
Authors: Sarah O. Alhumoud, Mawaheb I. Altuwaijri, Tarfa M. Albuhairi, Wejdan M. Alohaideb
Abstract:
Large-scale data stream analysis has become one of the important business and research priorities lately. Social networks like Twitter and other micro-blogging platforms hold an enormous amount of data that is large in volume, velocity, and variety. Extracting valuable information and trends from these data would aid in better understanding and decision-making. Multiple analysis techniques have been deployed for English content. Meanwhile, one of the languages that produces a large amount of data over social networks, and is least analyzed, is Arabic. This paper is a survey of the research efforts to analyze Arabic content on Twitter, focusing on the tools and methods used to extract sentiment from that content.
Keywords: big data, social networks, sentiment analysis, twitter
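Many of the surveyed methods are lexicon-based; the core scoring step can be sketched as below. The tiny lexicon uses romanized stand-ins for Arabic words, and real systems add Arabic-script normalization, stemming, and far larger lexicons:

```python
# Minimal lexicon-based sentiment scorer. The lexicon entries are
# hypothetical romanized placeholders, not a real Arabic resource.
lexicon = {"jameel": +1, "raae": +1, "sayyi": -1, "mumil": -1}

def score(tweet_tokens):
    """Sum token polarities and map the total to a sentiment label."""
    s = sum(lexicon.get(tok, 0) for tok in tweet_tokens)
    return "positive" if s > 0 else "negative" if s < 0 else "neutral"

label = score(["film", "jameel", "raae"])
```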
Procedia PDF Downloads 576
24955 Experimental Research and Analyses of Yoruba Native Speakers’ Chinese Phonetic Errors
Authors: Obasa Joshua Ifeoluwa
Abstract:
Phonetics is the foundation and most important part of language learning. This article, through an acoustic experiment using Praat software, carries out a visual comparison of Yoruba students’ pronunciation of Chinese consonants, vowels, and tones with that of native Chinese speakers. The article is aimed at Yoruba native speakers learning Chinese phonetics; therefore, Yoruba students were selected. The students surveyed were required to be at an elementary level and to have learned Chinese for less than six months. The students selected are all undergraduates majoring in Chinese Studies at the University of Lagos. These students have already learned Chinese Pinyin and are all familiar with the pinyin used in the provided questionnaire. The Chinese students selected are those that have passed the level-two Mandarin proficiency examination, which serves as an assurance that their pronunciation is standard. This work finds that, in terms of Mandarin consonant pronunciation, Yoruba students cannot distinguish the voiced/voiceless or the aspirated/unaspirated phonetic features. For instance, when [ph] is pronounced, the spectrogram clearly shows that the Voice Onset Time (VOT) of a Chinese speaker is higher than that of a Yoruba native speaker, which means that the Yoruba speaker is pronouncing the unaspirated counterpart [p]. Another difficulty is pronouncing affricates and fricatives such as [tʂ], [tʂʰ], [ʂ], [ʐ], [tɕ], [tɕʰ], and [ɕ], because these sounds are not in the phonetic system of the Yoruba language. In terms of vowels, some students find it difficult to pronounce the allophonic high vowels [ɿ] and [ʅ], pronouncing them instead as their phoneme [i]; another pronunciation error is pronouncing [y] as [u], and, as shown in the spectrogram, one student pronounced [y] as [iu].
In terms of tone, it is most difficult for students to differentiate between the second (rising) and third (falling-rising) tones, because both tones emphasize a rising pitch. This work concludes that the major errors made by Yoruba students while pronouncing Chinese sounds are caused by interference from their first language (L1) and sometimes from their lingua franca.
Keywords: Chinese, Yoruba, error analysis, experimental phonetics, consonant, vowel, tone
Procedia PDF Downloads 111
24954 Estimating Current Suicide Rates Using Google Trends
Authors: Ladislav Kristoufek, Helen Susannah Moat, Tobias Preis
Abstract:
Data on the number of people who have committed suicide tend to be reported with a substantial time lag of around two years. We examine whether online activity measured by Google searches can help us improve estimates of the number of suicide occurrences in England before official figures are released. Specifically, we analyse how data on the number of Google searches for the terms “depression” and “suicide” relate to the number of suicides between 2004 and 2013. We find that estimates drawing on Google data are significantly better than estimates using previous suicide data alone. We show that a greater number of searches for the term “depression” is related to fewer suicides, whereas a greater number of searches for the term “suicide” is related to more suicides. Data on suicide-related search behaviour can be used to improve current estimates of the number of suicide occurrences.
Keywords: nowcasting, search data, Google Trends, official statistics
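The model shape described above, lagged official counts augmented with search regressors, can be sketched with ordinary least squares on synthetic series; all numbers are simulated, with the reported signs built in for illustration:

```python
import numpy as np

# Synthetic stand-ins: previous-period suicide counts and search volumes
# for "depression" and "suicide". The construction of y mirrors the
# reported signs (more "depression" searches -> fewer suicides, more
# "suicide" searches -> more).
rng = np.random.default_rng(0)
n = 40
lagged = rng.normal(100, 10, n)
dep_searches = rng.normal(50, 5, n)
sui_searches = rng.normal(30, 5, n)
y = 0.8 * lagged - 0.5 * dep_searches + 0.7 * sui_searches + rng.normal(0, 2, n)

X_base = np.column_stack([np.ones(n), lagged])                       # lag only
X_full = np.column_stack([np.ones(n), lagged, dep_searches, sui_searches])

sse_base = np.linalg.lstsq(X_base, y, rcond=None)[1][0]  # residual sum of squares
sse_full = np.linalg.lstsq(X_full, y, rcond=None)[1][0]
beta_full = np.linalg.lstsq(X_full, y, rcond=None)[0]
```

In-sample, adding regressors can only reduce the least-squares residual; the paper's claim that search data improve *out-of-sample* estimates is the stronger and more interesting result.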
Procedia PDF Downloads 357
24953 Analysis of Policy Issues on Computer-Based Testing in Nigeria
Authors: Samuel Oye Bandele
Abstract:
A policy is a system of principles that guides the activities and strategic decisions of an organisation in order to achieve stated objectives and meet expected outcomes. A Computer-Based Test (CBT) policy is therefore a statement of intent to drive CBT programmes, and it should be implemented as a procedure or protocol. Policies are generally adopted by an organization or a nation. The concern of this paper is the consideration and analysis of the issues significant to evolving an acceptable policy that will drive the new CBT innovation in Nigeria. Public examinations and internal examinations in higher educational institutions in Nigeria are gradually making a radical shift from paper-based (paper-and-pencil) to computer-based testing. The need for an objective and empirical analysis of policy issues relating to CBT thus became expedient. The following are some of the issues in CBT evolution in Nigeria identified as requiring policy backing: requirements for establishing CBT centres, the purpose of CBT, types and acquisition of CBT equipment, qualifications of staff (professional, technical, and regular), and security plans and the curbing of cheating during examinations, among others. A descriptive research design was employed, based on a population consisting of principal officers (policy makers), staff (teaching and non-teaching; policy implementers), CBT staff (technical and professional; policy support), and candidates (internal and external). A fifty-item researcher-constructed questionnaire on policy issues was employed to collect data from 600 subjects drawn from higher institutions in South-West Nigeria, using purposive and stratified random sampling techniques. Data collected were analysed using descriptive (frequency counts, means, and standard deviations) and inferential (t-test, ANOVA, regression, and factor analysis) techniques.
Findings from the study showed, among others, that the factor loadings had significant weights on the organizational and national policy issues on CBT innovation in Nigeria.
Keywords: computer-based testing, examination, innovation, paper-based testing, paper pencil based testing, policy issues
Procedia PDF Downloads 248
24952 On the Network Packet Loss Tolerance of SVM Based Activity Recognition
Authors: Gamze Uslu, Sebnem Baydere, Alper K. Demir
Abstract:
In this study, the data loss tolerance of a Support Vector Machine (SVM) based activity recognition model, and its multi-activity classification performance when data are received over a lossy wireless sensor network, are examined. Initially, the classification algorithm we use is evaluated in terms of resilience to random data loss with 3D acceleration sensor data for the sitting, lying, walking, and standing actions. The results show that the proposed classification method can recognize these activities successfully despite high data loss. Secondly, the effect of differentiated quality-of-service performance on activity recognition success is measured with activity data acquired from a multi-hop wireless sensor network, which introduces high data loss. The effect of the number of nodes on reliability and multi-activity classification success is demonstrated in a simulation environment. To the best of our knowledge, the effect of data loss in a wireless sensor network on the activity detection success rate of an SVM-based classification algorithm has not been studied before.
Keywords: activity recognition, support vector machines, acceleration sensor, wireless sensor networks, packet loss
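The random-loss experiment can be sketched as below, using scikit-learn's SVC; the four synthetic clusters are invented stand-ins for sit/lie/walk/stand acceleration features, not the study's data:

```python
import numpy as np
from sklearn.svm import SVC  # assumes scikit-learn is available

# Synthetic 3D-acceleration-like feature vectors: one Gaussian cluster
# per activity. Cluster centres and noise level are invented.
rng = np.random.default_rng(1)
centers = np.array([[0.0, 0.0, 9.8], [0.0, 9.8, 0.0],
                    [3.0, 3.0, 7.0], [0.0, 1.0, 9.0]])
X = np.vstack([c + rng.normal(0, 0.3, (50, 3)) for c in centers])
y = np.repeat(np.arange(4), 50)

def accuracy_with_loss(loss_rate):
    """Simulate packet loss by dropping training windows at random."""
    keep = rng.random(len(y)) > loss_rate
    clf = SVC(kernel="rbf").fit(X[keep], y[keep])
    return clf.score(X, y)  # evaluate against the full data set

acc_no_loss = accuracy_with_loss(0.0)
acc_half_lost = accuracy_with_loss(0.5)
```

With well-separated activity clusters, accuracy degrades only mildly even when half the windows are lost, which is the qualitative effect the study reports.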
Procedia PDF Downloads 475
24951 The Development and Provision of a Knowledge Management Ecosystem, Optimized for Genomics
Authors: Matthew I. Bellgard
Abstract:
The field of bioinformatics has made, and continues to make, substantial progress and contributions to life science research and development. However, this paper contends that a systems approach integrates bioinformatics activities for any project in a defined manner. The application of critical control points within this bioinformatics systems approach may be useful for identifying and evaluating points in a pathway where specified activity risk can be reduced, monitored, and quality-enhanced.
Keywords: bioinformatics, food security, personalized medicine, systems approach
Procedia PDF Downloads 422
24950 GIS Data Governance: GIS Data Submission Process for Build-in Project, Replacement Project at Oman Electricity Transmission Company
Authors: Rahma Saleh Hussein Al Balushi
Abstract:
Oman Electricity Transmission Company's (OETC) vision is to be a renowned world-class transmission grid by 2025, and one of the indications of achieving this vision is obtaining Asset Management ISO 55001 certification, which requires setting out documented Standard Operating Procedures (SOPs). Hence, a documented SOP for the Geographic Information System (GIS) data process has been established. Also, to effectively manage and improve OETC power transmission, asset data and information need to be governed by the Asset Information & GIS department. This paper describes in detail the current GIS data submission process and the journey of developing it. The methodology used to develop the process is based on three main pillars: system and end-user requirements, risk evaluation, and data availability and accuracy. The output of this paper shows the dramatic change in the process used, which subsequently results in more efficient, accurate, and up-to-date data. Furthermore, thanks to this process, GIS has been, and is ready to be, integrated with other systems, as well as serving as the source of data for all OETC users. Some decisions, such as issuing no-objection certificates (NOCs) for excavation permits and scheduling asset maintenance plans in the Computerized Maintenance Management System (CMMS), are consequently made upon GIS data availability. On the other hand, defining agreed and documented procedures for data collection, data system updates, data release/reporting, and data alterations has also contributed to reducing missing attributes and enhancing the data quality index of the GIS transmission data. A considerable difference in geodatabase (GDB) completeness percentage was observed between 2017 and 2022. Overall, it is concluded that, through governance, the Asset Information & GIS department can control the GIS data process and collect, properly record, and manage asset data and information within the OETC network. This control extends to other applications and systems integrated with or related to GIS systems.
Keywords: asset management ISO 55001, standard procedures process, governance, CMMS
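A completeness percentage of the kind tracked between 2017 and 2022 can be sketched as the share of required attributes that are populated across asset records; the field names and records below are invented for illustration:

```python
# Hypothetical required attributes for a transmission asset record.
required = ["asset_id", "voltage_kv", "commission_date", "location"]

# Two illustrative records, one fully populated and one with gaps.
records = [
    {"asset_id": "TX-001", "voltage_kv": 132,
     "commission_date": "2017-03-01", "location": "Muscat"},
    {"asset_id": "TX-002", "voltage_kv": None,
     "commission_date": None, "location": "Nizwa"},
]

# Completeness: populated required fields over all required fields.
filled = sum(1 for r in records for f in required if r.get(f) is not None)
completeness_pct = 100.0 * filled / (len(records) * len(required))
```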
Procedia PDF Downloads 125
24949 Formal Models of Sanitary Inspection Teams' Activities
Authors: Tadeusz Nowicki, Radosław Pytlak, Robert Waszkowski, Jerzy Bertrandt, Anna Kłos
Abstract:
This paper presents methods for the formal modeling of sanitary inspection teams' activities during outbreaks of food-borne diseases. The models make it possible to measure the characteristics of sanitary inspection activities and, as a result, to improve the performance of sanitary services and thus food security.
Keywords: food-borne disease, epidemic, sanitary inspection, mathematical models
Procedia PDF Downloads 302
24948 Effects of Data Correlation in a Sparse-View Compressive Sensing Based Image Reconstruction
Authors: Sajid Abas, Jon Pyo Hong, Jung-Ryun Le, Seungryong Cho
Abstract:
Computed tomography and laminography are heavily investigated in a compressive sensing based image reconstruction framework to reduce the dose to patients as well as to radiosensitive devices such as multilayer microelectronic circuit boards. Nowadays, researchers are actively working on optimizing compressive sensing based iterative image reconstruction algorithms to obtain better quality images. However, the effects of the sampled data's properties on the quality of the reconstructed image, particularly under insufficiently sampled data conditions, have not been explored in computed laminography. In this paper, we investigated the effects of two data properties, i.e., sampling density and data incoherence, on the image reconstructed by conventional computed laminography and by a recently proposed method called the spherical sinusoidal scanning scheme. We found that in a compressive sensing based image reconstruction framework, the image quality mainly depends upon the data incoherence when the data is uniformly sampled.Keywords: computed tomography, computed laminography, compressive sensing, low-dose
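The data incoherence the abstract refers to is commonly quantified in compressive sensing work by the mutual coherence of the sensing matrix: the largest normalized inner product between any two of its columns. A minimal numpy sketch (the matrix shape and the Gaussian sampling operator are illustrative assumptions, not the paper's actual laminography geometry):

```python
import numpy as np

def mutual_coherence(A):
    # Standard CS incoherence measure: largest absolute inner product
    # between any two distinct, unit-normalized columns of the sensing matrix.
    A = A / np.linalg.norm(A, axis=0, keepdims=True)
    gram = np.abs(A.T @ A)
    np.fill_diagonal(gram, 0.0)
    return gram.max()

# Illustrative dense Gaussian sampling operator (64 measurements, 128 unknowns);
# lower coherence generally favors recovery from sparse-view data.
rng = np.random.default_rng(0)
mu = mutual_coherence(rng.standard_normal((64, 128)))
```

Lower values of `mu` indicate more incoherent sampling, which is the property the abstract identifies as driving image quality.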
Procedia PDF Downloads 464
24947 Navigating Top Management Team Characteristics for Ambidexterity in Small and Medium-Sized African Businesses: The Key to Unlocking Success
Authors: Rumbidzai Sipiwe Zimvumi
Abstract:
The study aimed to identify the top management team (TMT) attributes for ambidexterity in small and medium-sized enterprises (SMEs) by utilizing the upper echelons theory. The conventional opinion holds that an organization's ability to pursue both exploitative and explorative innovation methods at the same time is reflected in its ambidexterity. Top-level managers are critical to this matrix because they forecast and explain the strategic choices that guarantee success by improving organizational performance. Since the focus of the study was on the unique characteristics of TMTs that can facilitate ambidexterity, the primary goal was to understand how TMTs in SMEs can better manage ambidexterity. The study used document analysis to collect information on ambidexterity and TMT traits; papers were reviewed and evaluated by finding, choosing, assessing, and synthesizing data from peer-reviewed publications. The fact that SMEs will perform better if they can achieve a balance between exploration and exploitation cannot be overstated. Unfortunately, exploitation is the main priority for most SMEs. The results showed that some of the noteworthy TMT traits that support ambidexterity in SMEs are age diversity, shared responsibility, leadership impact, psychological safety, and self-confidence. It has been shown that most SMEs confront significant obstacles in recruiting talent, formalizing their management, and assembling executive teams with seniority. SMEs are often owned by families or individuals who fail to keep their personal affairs separate from the firm, which removes the opportunity for management and staff to take the initiative. This helps to explain why exploitative strategies, which preserve present success, are used rather than explorative strategies, which open new economic opportunities and dimensions. It is evident that psychological safety deteriorates and creativity is hindered in the process.
The study makes the case that TMTs can be motivated to become ambidextrous. According to the report, small and medium-sized business owners should value the opinions of all parties involved and give their managers and regular staff the freedom to think creatively in a safe environment. TMTs who experience psychological safety are more likely to be inventive, creative, and productive. Team psychological safety is a team's collective perception that it is acceptable to take chances, voice opinions and concerns, ask questions, and own up to mistakes without fear of unfavorable outcomes. Thus, traits like age diversity, leadership influence, learning agility, psychological safety, and self-assurance are critical to the success of SMEs. To ensure that ambidexterity is attained, the study suggests a clear separation of ownership and control, the adoption of technology to stimulate creativity, team spirit and excitement, shared accountability, and good management of diversity. Among the suggestions for SME success are resource allocation and key collaborations.Keywords: navigating, ambidexterity, top management team, small and medium enterprises
Procedia PDF Downloads 58
24946 Fuzzy Wavelet Model to Forecast the Exchange Rate of IDR/USD
Authors: Tri Wijayanti Septiarini, Agus Maman Abadi, Muhammad Rifki Taufik
Abstract:
The IDR/USD exchange rate can serve as an indicator for analyzing the Indonesian economy. The exchange rate is an important factor because it has a large effect on the Indonesian economy overall, so analysis of exchange rate data is needed. The IDR/USD exchange rate data are decomposed into frequency and time components, which can help the government monitor the Indonesian economy. This method is very effective at identifying such cases, yields highly accurate results, and has a simple structure. In this paper, the exchange rate data used are weekly data from December 17, 2010 to November 11, 2014.Keywords: the exchange rate, fuzzy mamdani, discrete wavelet transforms, fuzzy wavelet
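The decomposition into frequency and time components that the abstract describes is what a discrete wavelet transform provides. A minimal one-level Haar DWT sketch in numpy (the series values are made up for illustration and are not the paper's actual IDR/USD data):

```python
import numpy as np

def haar_dwt(x):
    # One level of the Haar discrete wavelet transform: a low-frequency
    # approximation (trend) band and a high-frequency detail (fluctuation) band.
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

# Toy weekly-rate series (illustrative values only).
rate = np.array([9023.0, 9041.0, 9035.0, 9060.0, 9058.0, 9072.0, 9069.0, 9080.0])
approx, detail = haar_dwt(rate)
```

Because the Haar basis is orthonormal, the transform preserves the signal's energy; in a fuzzy-wavelet model, the forecasting rules would then be fitted on the resulting bands rather than on the raw series.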
Procedia PDF Downloads 571
24945 Humanising Digital Healthcare to Build Capacity by Harnessing the Power of Patient Data
Authors: Durhane Wong-Rieger, Kawaldip Sehmi, Nicola Bedlington, Nicole Boice, Tamás Bereczky
Abstract:
Patient-generated health data should be seen as the expression of patients' experience, including outcomes reflecting the impact a treatment or service had on their physical health and wellness. We discuss how the healthcare system can reach a place where digital is a determinant of health, where data generated by patients is respected and their contribution to science is acknowledged, and we explore the biggest barriers facing this. The International Experience Exchange with Patient Organisations' position paper is based on a global patient survey conducted in Q3 2021 that received 304 responses; results were discussed and validated by 15 patient experts and supplemented with literature research. The results presented here are a subset of that work. Our research showed that patient communities want to influence how their data is generated, shared, and used, and our study concludes that a reasonable framework is needed to protect the integrity of patient data, minimise abuse, and build trust. The results also demonstrated a need for patient communities to have more influence and control over how health data is generated, shared, and used, and they clearly highlight that the community feels there is a lack of clear policies on sharing data.Keywords: digital health, equitable access, humanise healthcare, patient data
Procedia PDF Downloads 82
24944 Study of the Influence of Eccentricity Due to Configuration and Materials on Seismic Response of a Typical Building
Authors: A. Latif Karimi, M. K. Shrimali
Abstract:
Seismic design is a critical stage in the process of designing and constructing a building. It includes strategies for designing earthquake-resistant buildings to ensure the health, safety, and security of the building occupants and assets. Hence, it becomes very important to understand the behavior of structural members precisely for the construction of buildings that can yield a better response to seismic forces. This paper investigates the behavior of a typical structure when subjected to ground motion. The corresponding mode shapes and modal frequencies are studied to interpret the response of an actual structure using different fabricated models and 3D visual models. In this study, three different structural configurations are subjected to horizontal ground motion, and the effects of “stiffness eccentricity” and the placement of infill walls are checked to determine how each parameter contributes to a building's response to dynamic forces. The deformation data from lab experiments and the analysis in SAP2000 software are reviewed to obtain the results. This study revealed that the seismic response of a building can be improved by introducing higher deformation capacity in the building. Also, proper design of infill walls and maintaining a symmetrical configuration are the key factors in building stability during an earthquake.Keywords: eccentricity, seismic response, mode shape, building configuration, building dynamics
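The mode shapes and modal frequencies the abstract studies come from the structure's eigenvalue problem. A minimal numpy sketch for an idealized two-story shear frame (all masses and stiffnesses below are invented illustrative values, not the paper's test structure):

```python
import numpy as np

# Two-story shear-frame idealization: M x'' + K x = 0.
m1, m2 = 2.0e4, 2.0e4          # floor masses [kg] (illustrative)
k1, k2 = 3.0e7, 3.0e7          # story stiffnesses [N/m] (illustrative)

M = np.diag([m1, m2])
K = np.array([[k1 + k2, -k2],
              [-k2,      k2]])

# Generalized eigenproblem K phi = w^2 M phi, solved via eig of M^-1 K.
w2, phi = np.linalg.eig(np.linalg.inv(M) @ K)
order = np.argsort(w2.real)
freqs_hz = np.sqrt(w2[order].real) / (2.0 * np.pi)   # modal frequencies [Hz]
modes = phi[:, order].real                            # mode shapes (columns)
```

Stiffness eccentricity, as discussed in the abstract, would enter such a model as an asymmetric stiffness matrix, shifting the frequencies and coupling the mode shapes.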
Procedia PDF Downloads 200
24943 Use of Machine Learning in Data Quality Assessment
Authors: Bruno Pinto Vieira, Marco Antonio Calijorne Soares, Armando Sérgio de Aguiar Filho
Abstract:
Nowadays, a massive amount of information is produced by different data sources, including mobile devices and transactional systems. In this scenario, concerns arise about how to maintain or establish data quality, which is now treated as a product to be defined, measured, analyzed, and improved to meet the needs of consumers, who use these data in decision-making and company strategies. Information of low quality can lead to issues that consume time and money, such as missed business opportunities, inadequate decisions, and bad risk management actions. The step of identifying, evaluating, and selecting data sources of adequate quality has become a costly task for users, since the sources do not provide information about their quality. Traditional data quality control methods are based on user experience or business rules, limiting performance and slowing down the process with less than desirable accuracy. Using advanced machine learning algorithms, it is possible to take advantage of computational resources to overcome challenges and add value to companies and users. In this study, machine learning is applied to data quality analysis on different datasets, seeking to compare the performance of the techniques according to the dimensions of quality assessment. As a result, we were able to create a ranking of the approaches used, as well as a system able to carry out data quality assessment automatically.Keywords: machine learning, data quality, quality dimension, quality assessment
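One hedged way to picture machine learning applied to quality dimensions, as the abstract describes, is to score each record on a few standard dimensions and train a classifier to flag low-quality rows. This is a synthetic sketch, not the paper's method or data; the dimension names and the toy labeling rule are assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 400
completeness = rng.uniform(0, 1, n)   # share of non-missing fields per record
validity     = rng.uniform(0, 1, n)   # share of fields passing format rules
consistency  = rng.uniform(0, 1, n)   # share of cross-field checks passed

X = np.column_stack([completeness, validity, consistency])
y = (X.mean(axis=1) > 0.5).astype(int)   # toy "acceptable quality" label

# Cross-validated accuracy of a model learning the quality rule.
scores = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5)
```

Comparing such cross-validated scores across algorithms is one plausible way to produce the ranking of approaches the abstract mentions.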
Procedia PDF Downloads 148
24942 Exploring Data Leakage in EEG Based Brain-Computer Interfaces: Overfitting Challenges
Authors: Khalida Douibi, Rodrigo Balp, Solène Le Bars
Abstract:
In the medical field, applications related to human experiments are frequently linked to reduced sample sizes, which makes the training of machine learning models quite sensitive and therefore neither very robust nor generalizable. This is notably the case in Brain-Computer Interface (BCI) studies, where the sample size rarely exceeds 20 subjects or a small number of trials. To address this problem, several resampling approaches are often used during the data preparation phase, which is a critical step in a data science analysis process. One of the naive approaches usually applied by data scientists consists in transforming the entire database before the resampling phase. However, this can cause the model's performance to be incorrectly estimated when making predictions on unseen data. In this paper, we explored the effect of data leakage observed during our BCI experiments for device control through the real-time classification of SSVEPs (Steady State Visually Evoked Potentials). We also studied potential ways to ensure optimal validation of the classifiers during the calibration phase to avoid overfitting. The results show that the scaling step is crucial for some algorithms, and it should be applied after the resampling phase to avoid data leakage and improve results.Keywords: data leakage, data science, machine learning, SSVEP, BCI, overfitting
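The pitfall the abstract describes, scaling the whole dataset before resampling, can be contrasted with the leakage-free pattern in a short scikit-learn sketch. The random features standing in for SSVEP feature vectors are illustrative assumptions, not real EEG data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for SSVEP feature vectors and binary class labels.
rng = np.random.default_rng(0)
X = rng.standard_normal((80, 16))
y = rng.integers(0, 2, size=80)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Leaky variant: the scaler is fitted on ALL data, so test-set statistics
# bleed into training and performance estimates become optimistic.
leaky_scaler = StandardScaler().fit(X)

# Leakage-free variant: inside a Pipeline, the scaler is refitted on the
# training portion only, i.e. scaling happens after the split/resampling.
clf = make_pipeline(StandardScaler(), LogisticRegression())
clf.fit(X_tr, y_tr)
score = clf.score(X_te, y_te)
```

Wrapping the scaler in the pipeline is what guarantees the "scaling after resampling" ordering the paper recommends, including inside cross-validation folds.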
Procedia PDF Downloads 153
24941 Free Radical Scavenging Activity and Total Phenolic Assessment of Drug Repurposed Medicinal Plant Metabolites: Promising Tools against Post COVID-19 Syndromes and Non-Communicable Diseases in Botswana
Authors: D. Motlhanka, M. Mine, T. Bagaketse, T. Ngakane
Abstract:
There is a plethora of evidence from numerous sources highlighting the success of naturally derived medicinal plant metabolites with antioxidant capability in repurposed therapeutics. As post-COVID-19 syndromes and non-communicable diseases are on the rise, there is an urgent need for new therapeutic strategies to address the problem. Non-communicable diseases and post-COVID-19 syndromes are classified as socio-economic diseases and rank high among threats to health security due to the economic burden they pose to any government budget commitment. Research has shown a strong link between the accumulation of free radicals and the oxidative stress that is critical to the pathogenesis of non-communicable diseases and COVID-19 syndromes. Botswana has embarked on a robust programme, derived from ethno-pharmacognosy and drug repurposing, to address these threats to health security. In the current approach, a number of medicinally active plant-derived polyphenolics are repurposed and combined into new medicinal tools to target diabetes, hypertension, prostate cancer, and oxidative stress induced post-COVID-19 syndromes such as “brain fog”. All four formulations demonstrated free radical scavenging capacities above 95% at 200 µg/ml using the 2,2-diphenyl-1-picrylhydrazyl (DPPH) free radical scavenging assay, and total phenolic contents between 6899 and 15000 GAE (g/L) using the Folin-Ciocalteu assay. These repurposed medicinal tools offer new hope and potential in the fight against emerging health threats driven by hyper-inflammation and free radical-induced oxidative stress.Keywords: drug repurposed plant polyphenolics, free radical damage, non-communicable diseases, post COVID 19 syndromes
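The scavenging percentages the abstract reports from the DPPH assay are conventionally computed from the drop in absorbance relative to a radical-only control. A minimal sketch (the absorbance readings below are illustrative, not the study's raw data):

```python
def dpph_scavenging_pct(a_control, a_sample):
    # Percent radical scavenging from DPPH assay absorbances (usually read
    # near 517 nm): the drop in absorbance relative to the radical-only control.
    return (a_control - a_sample) / a_control * 100.0

# Illustrative readings: a strongly scavenging extract, consistent with the
# >95% capacities reported in the abstract.
pct = dpph_scavenging_pct(0.820, 0.035)
```

The same pattern of comparing against a gallic acid standard curve underlies the GAE totals reported from the Folin-Ciocalteu assay.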
Procedia PDF Downloads 128