Search results for: harmonic data
22356 Analyzing the Effectiveness of a Bank of Parallel Resistors, as a Burden Compensation Technique for Current Transformer's Burden, Using LabVIEW™ Data Acquisition Tool
Authors: Dilson Subedi
Abstract:
Current transformers (CTs) are an integral part of power systems because they provide a proportionally scaled, safe amount of current for protection and measurement applications. However, due to the upgrade of electromechanical relays to numerical relays and of electromechanical energy meters to digital meters, the connected burden, which defines some of the CT characteristics, has drastically reduced. This has led to the system experiencing high currents that damage the connected relays and meters. Since protection and metering equipment is designed to withstand only a certain amount of current for a given time, these high currents pose a risk to both personnel and equipment. During such instances, the CT saturation characteristics therefore have a strong influence on the safety of personnel and equipment and on the reliability of the protection and metering system. This paper shows the effectiveness of a bank of parallel-connected resistors, as a burden compensation technique, in compensating the burden of under-burdened CTs. The response of the CT in the case of failure of one or more resistors at different levels of overcurrent is captured using LabVIEW™ data acquisition hardware (DAQ). The analysis is performed on the real-time data gathered using LabVIEW™. Variation of the current transformer saturation characteristics with changes in burden is discussed.
Keywords: accuracy limiting factor, burden, burden compensation, current transformer
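A minimal sketch of the burden arithmetic behind this technique, assuming the bank is inserted as a series element made of N identical resistors in parallel; the resistor value and count below are illustrative, not taken from the paper:

```python
# Illustrative only: effective resistance of a bank of N identical parallel
# resistors after k of them fail open, and hence the burden the bank adds.

def bank_burden(r_ohms: float, n_total: int, n_failed: int = 0) -> float:
    """Equivalent resistance of (n_total - n_failed) resistors of r_ohms in parallel."""
    n_active = n_total - n_failed
    if n_active <= 0:
        # All resistors failed open; for a series-connected bank this would
        # break the CT secondary loop, a hazardous condition for a CT.
        return float("inf")
    return r_ohms / n_active

# Example: compensating an under-burdened CT with 4 x 60 ohm resistors.
for failed in range(5):
    print(f"{failed} failed -> bank resistance = {bank_burden(60.0, 4, failed):.1f} ohm")
```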
Procedia PDF Downloads 245
22355 Exploring Ways Early Childhood Teachers Integrate Information and Communication Technologies into Children's Play: Two Case Studies from the Australian Context
Authors: Caroline Labib
Abstract:
This paper reports on a qualitative study exploring the approaches teachers used to integrate computers or smart tablets into their program planning. Their aim was to integrate ICT into children’s play, thereby supporting children’s learning and development. Data were collected in preschool settings in Melbourne in 2016. Interviews with teachers, observations of teacher interactions with children, and copies of teachers’ planning and observation documents informed the study. The paper looks closely at findings from two early childhood settings and focuses on exploring the differing approaches two EC teachers adopted when integrating iPads or computers into their settings. Data analysis revealed three key approaches, which have been labelled free digital play, guided digital play, and teacher-led digital use. Importantly, teacher decisions were influenced by the interplay between the opportunities that the ICT tools offered, the teachers’ prior knowledge and experience of ICT, and children’s learning needs and contexts. This paper is a snapshot of two early childhood settings, and further research will encompass data from six more early childhood settings in Victoria with the aim of exploring a wide range of motivating factors for early childhood teachers trying to integrate ICT into their programs.
Keywords: early childhood education (ECE), digital play, information and communication technologies (ICT), play, teachers' interaction approaches
Procedia PDF Downloads 212
22354 Maximum Likelihood Estimation Methods on a Two-Parameter Rayleigh Distribution under Progressive Type-II Censoring
Authors: Daniel Fundi Murithi
Abstract:
Data from economic, social, clinical, and industrial studies are often incomplete or distorted due to censoring. Such data may have adverse effects if used in the estimation problem. We propose the use of Maximum Likelihood Estimation (MLE) under a progressive type-II censoring scheme to remedy this problem. In particular, maximum likelihood estimates (MLEs) for the location (µ) and scale (λ) parameters of the two-parameter Rayleigh distribution are obtained under a progressive type-II censoring scheme using the Expectation-Maximization (EM) and the Newton-Raphson (NR) algorithms. These algorithms are compared because both iteratively produce satisfactory results in the estimation problem. The progressive type-II censoring scheme is used because it allows the removal of test units before the termination of the experiment. Approximate asymptotic variances and confidence intervals for the location and scale parameters are derived and constructed. The efficiency of the EM and NR algorithms is compared in terms of root mean squared error (RMSE), bias, and coverage rate. The simulation study showed that in most simulation cases, the estimates obtained using the EM algorithm had smaller biases, smaller variances, narrower confidence intervals, and smaller RMSE than those generated via the NR algorithm. Further, the analysis of a real-life data set (data from simple experimental trials) showed that the EM algorithm performs better than the NR algorithm in all simulation cases under the progressive type-II censoring scheme.
Keywords: expectation-maximization algorithm, maximum likelihood estimation, Newton-Raphson method, two-parameter Rayleigh distribution, progressive type-II censoring
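As a hedged illustration of the Newton-Raphson side of this comparison, the sketch below fits the scale parameter of one common two-parameter Rayleigh form, f(x) = ((x-µ)/σ²)exp(-(x-µ)²/(2σ²)), on complete (uncensored) data; the paper's progressive type-II censoring and EM steps are omitted for brevity:

```python
# Hedged illustration on complete data: Newton-Raphson for the Rayleigh scale,
# with the location parameter replaced by a crude plug-in estimate.
import numpy as np

rng = np.random.default_rng(1)
mu_true, s_true = 2.0, 1.5
x = mu_true + s_true * np.sqrt(-2.0 * np.log(rng.uniform(size=500)))  # inverse-CDF draw

mu = x.min() - 1e-6           # crude plug-in for the location parameter
d2 = np.sum((x - mu) ** 2)
n = x.size

s = 1.0                       # Newton-Raphson on the scale parameter
for _ in range(50):
    score = -2 * n / s + d2 / s**3          # dl/ds of the log-likelihood
    hess = 2 * n / s**2 - 3 * d2 / s**4     # d2l/ds2
    step = score / hess
    s -= step
    if abs(step) < 1e-10:
        break

print(f"NR estimate: {s:.4f}, closed form: {np.sqrt(d2 / (2 * n)):.4f}")
```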
Procedia PDF Downloads 163
22353 Deepfake Detection for Compressed Media
Authors: Sushil Kumar Gupta, Atharva Joshi, Ayush Sonawale, Sachin Naik, Rajshree Khande
Abstract:
The use of artificially created videos and audio generated by deep learning is a major problem in the current media landscape, as it fuels misinformation and distrust. The objective of this work is therefore to build a reliable deepfake detection model using deep learning that helps detect forged videos accurately. In this work, CelebDF v1, one of the largest deepfake benchmark datasets in the literature, is adopted to train and test the proposed models. The data include authentic and synthetic videos of high quality, thereby allowing an assessment of the model’s performance against realistic distortions.
Keywords: deepfake detection, CelebDF v1, convolutional neural network (CNN), Xception model, data augmentation, media manipulation
Procedia PDF Downloads 9
22352 The Impact of Financial Risk on Banks’ Financial Performance: A Comparative Study of Islamic Banks and Conventional Banks in Pakistan
Authors: Mohammad Yousaf Safi Mohibullah Afghan
Abstract:
This study of Islamic and conventional banks scrutinizes the risks associated with credit and liquidity and their effect on the profitability of Islamic and conventional banks operating in Pakistan. Only 4 Islamic and 18 conventional banks were selected, to enrich the comparison of Islamic banks' performance with that of conventional banks. The selection of banks for the panel is based on quarterly unbalanced data ranging from the first quarter of 2007 to the last quarter of 2017. The data were collected from the banks' websites and the State Bank of Pakistan. The Delta-method test is used to derive the empirical results. In the study, return on assets and return on equity are used as significant proxies in determining the profitability of the banks. Moreover, credit and liquidity risks are measured by the ratio of loan loss provision to total loans and the ratio of liquid assets to total liabilities, respectively. Meanwhile, in line with the previous literature, other variables such as bank size, bank capital, bank branches, and bank employees are used to control for the impact of factors whose direct and indirect effects on profitability are understood. In conclusion, the study finds that credit risk affects return on assets and return on equity positively, and there is no significant difference in terms of credit risk between Islamic and conventional banks. Similarly, liquidity risk has a significant impact on the banks' profitability, though the marginal effect of liquidity risk is higher for Islamic banks than for conventional banks.
Keywords: Islamic & conventional banks, performance, return on equity, return on assets, Pakistan banking sector, profitability
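A hedged sketch of the proxy construction described above, with a pooled OLS standing in for the paper's panel specification; all figures are synthetic, not the Pakistani bank data:

```python
# Hedged sketch (synthetic figures): build the profitability and risk proxies
# named above and fit a pooled OLS as a simple stand-in for the panel model.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "llp":           rng.normal(20, 5, n),     # loan loss provision
    "total_loans":   rng.normal(600, 80, n),
    "liquid_assets": rng.normal(300, 50, n),
    "total_liab":    rng.normal(800, 90, n),
    "total_assets":  rng.normal(1000, 100, n),
})
df["credit_risk"] = df["llp"] / df["total_loans"]             # LLP-to-total-loans ratio
df["liquidity_risk"] = df["liquid_assets"] / df["total_liab"]  # liquid assets to liabilities
df["size"] = np.log(df["total_assets"])                        # control variable
df["roa"] = (0.05 - 0.3 * df["credit_risk"] + 0.02 * df["liquidity_risk"]
             + rng.normal(0, 0.005, n))                        # synthetic outcome

X = sm.add_constant(df[["credit_risk", "liquidity_risk", "size"]])
print(sm.OLS(df["roa"], X).fit().summary().tables[1])
```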
Procedia PDF Downloads 164
22351 Development of Time Series Forecasting Model for Dengue Cases in Nakhon Si Thammarat, Southern Thailand
Authors: Manit Pollar
Abstract:
Identifying dengue epidemic periods early would help in taking the necessary actions to prevent dengue outbreaks. Providing an accurate prediction of dengue epidemic seasons allows local authorities sufficient time to make the necessary decisions and take actions to safeguard the situation. This study aimed to develop a forecasting model for the number of dengue incidences in Nakhon Si Thammarat Province, Southern Thailand, using time series analysis. We developed Seasonal Autoregressive Integrated Moving Average (SARIMA) models on the monthly data collected between 2003-2011 and validated the models using data collected between January-September 2012. The results revealed that the SARIMA(1,1,0)(1,2,1)₁₂ model closely described the trends and seasonality of dengue incidence in Nakhon Si Thammarat for the years 2003-2011. The study showed that the one-step approach for predicting dengue incidences provided significantly more accurate predictions than the twelve-step approach. The model, even if based purely on statistical data analysis, can provide a useful basis for the allocation of resources for disease prevention.
Keywords: SARIMA, time series model, dengue cases, Thailand
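A minimal sketch of the reported specification using statsmodels' SARIMAX; the monthly counts below are synthetic placeholders, not the Nakhon Si Thammarat series:

```python
# Minimal sketch: SARIMA(1,1,0)(1,2,1)_12 as reported in the abstract,
# fitted on a synthetic monthly series covering 2003-2011.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(7)
idx = pd.date_range("2003-01", "2011-12", freq="MS")
seasonal = 50 + 30 * np.sin(2 * np.pi * idx.month / 12)   # stand-in seasonal pattern
cases = pd.Series(rng.poisson(seasonal), index=idx)

model = SARIMAX(cases, order=(1, 1, 0), seasonal_order=(1, 2, 1, 12))
fit = model.fit(disp=False)
print(fit.forecast(steps=9))   # forecast for Jan-Sep 2012, the validation window
```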
Procedia PDF Downloads 358
22350 Spatial Mapping of Variations in Groundwater of Taluka Islamkot Thar Using GIS and Field Data
Authors: Imran Aziz Tunio
Abstract:
Islamkot is an underdeveloped sub-district (taluka) in the Tharparkar district of Sindh province, Pakistan, located between latitude 24°25'19.79"N to 24°47'59.92"N and longitude 70°1'13.95"E to 70°32'15.11"E. Islamkot has an arid desert climate, and the region is generally devoid of perennial rivers, canals, and streams. It is highly dependent on rainfall, which is not considered a reliable surface water source, and groundwater has been the only key source of water for many centuries. To assess groundwater potential, an electrical resistivity survey (ERS) was conducted in Islamkot Taluka. Groundwater investigations comprising 128 Vertical Electrical Soundings (VES) were carried out to determine the groundwater potential and obtain layered resistivity parameters both qualitatively and quantitatively. The PASI Model 16 GL-N Resistivity Meter was used with a Schlumberger electrode configuration, with half current electrode spacing (AB/2) ranging from 1.5 to 100 m and potential electrode spacing (MN/2) from 0.5 to 10 m. The data were acquired with a maximum current electrode spacing of 200 m. Data processing for the delineation of dune sand aquifers involved data inversion, and the interpretation of the inversion results was aided by forward modeling. The measured geo-electrical parameters were examined with Interpex IX1D software, and apparent resistivity curves and synthetic layered model parameters were mapped in the ArcGIS environment using the Inverse Distance Weighting (IDW) interpolation technique. Qualitative interpretation of the VES data shows that the number of geo-electrical layers in the area varies from three to four, with different resistivity values detected. Of the 128 VES model curves, 42 are three-layered and 86 are four-layered. The resistivity of the first subsurface layer (loose surface sand) varied from 16.13 Ωm to 3353.3 Ωm and its thickness from 0.046 m to 17.52 m. The resistivity of the second subsurface layer (semi-consolidated sand) varied from 1.10 Ωm to 7442.8 Ωm and its thickness from 0.30 m to 56.27 m. The resistivity of the third subsurface layer (consolidated sand) varied from 0.00001 Ωm to 3190.8 Ωm and its thickness from 3.26 m to 86.66 m. The resistivity of the fourth subsurface layer (silt and clay) varied from 0.0013 Ωm to 16264 Ωm and its thickness from 13.50 m to 87.68 m. The Dar Zarrouk parameters are: longitudinal unit conductance S from 0.00024 to 19.91 mho; transverse unit resistance T from 7.34 to 40080.63 Ωm²; longitudinal resistivity RS from 1.22 to 3137.10 Ωm; and transverse resistivity RT from 5.84 to 3138.54 Ωm. The ERS data and Dar Zarrouk parameters were mapped, revealing that the study area has groundwater potential in the subsurface.
Keywords: electrical resistivity survey, GIS & RS, groundwater potential, environmental assessment, VES
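The Dar Zarrouk quantities quoted above follow the standard definitions (S = Σhᵢ/ρᵢ, T = Σhᵢρᵢ, RS = H/S, RT = T/H); a minimal sketch for one interpreted station, with illustrative layer values rather than the Islamkot data:

```python
# Hedged sketch: Dar Zarrouk parameters for one interpreted VES station.
# h_i are layer thicknesses (m), rho_i are layer resistivities (ohm-m).
import numpy as np

h = np.array([5.0, 20.0, 40.0])       # example thicknesses
rho = np.array([300.0, 50.0, 800.0])  # example resistivities

S = np.sum(h / rho)          # longitudinal unit conductance (mho)
T = np.sum(h * rho)          # transverse unit resistance (ohm-m^2)
H = h.sum()
RS = H / S                   # longitudinal resistivity (ohm-m)
RT = T / H                   # transverse resistivity (ohm-m)
print(f"S={S:.4f} mho, T={T:.1f} ohm-m^2, RS={RS:.1f} ohm-m, RT={RT:.1f} ohm-m")
```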
Procedia PDF Downloads 110
22349 Precipitation Intensity-Duration Based Threshold Analysis for Initiation of Landslides in Upper Alaknanda Valley
Authors: Soumiya Bhattacharjee, P. K. Champati Ray, Shovan L. Chattoraj, Mrinmoy Dhara
Abstract:
The entire Himalayan range is globally renowned for rainfall-induced landslides. The prime focus of the study is to determine a rainfall-based threshold for the initiation of landslides that can be used as an important component of an early warning system for alerting stakeholders. This research deals with the temporal dimension of slope failures due to extreme rainfall events along National Highway-58 from Karanprayag to Badrinath in the Garhwal Himalaya, India. Post-processed 3-hourly rainfall intensity data and the corresponding durations derived from daily rainfall data available from the Tropical Rainfall Measuring Mission (TRMM) were used as the prime source of rainfall data. Landslide event records from the Border Roads Organisation (BRO) and some ancillary landslide inventory data for 2013 and 2014 were used to determine an Intensity-Duration (ID) rainfall threshold. The derived governing threshold equation, I = 4.738D^(-0.025), has been adopted for the prediction of landslides in the study region and was validated with an accuracy of 70% against landslides during August and September 2014. From the obtained results and validation, it can be inferred that this equation can be used for the initiation of landslides in the study area as part of an early warning system. Results can improve significantly with ground-based rainfall estimates and a better database of landslide records. Thus, the study has demonstrated a very low-cost method to get first-hand information on the possibility of an impending landslide in any region, thereby providing alerts and better preparedness for landslide disaster mitigation.
Keywords: landslide, intensity-duration, rainfall threshold, TRMM, slope, inventory, early warning system
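A minimal sketch of applying the derived threshold I = 4.738D^(-0.025) to flag potentially triggering rainfall events; the intensity-duration pairs below are illustrative:

```python
# Minimal sketch: events at or above the intensity-duration curve are flagged
# as potential landslide triggers. Example values are illustrative only.
def exceeds_threshold(intensity_mm_hr: float, duration_hr: float) -> bool:
    return intensity_mm_hr >= 4.738 * duration_hr ** -0.025

for i, d in [(3.0, 6), (5.2, 12), (4.8, 48)]:
    print(f"I={i} mm/hr, D={d} h -> alert: {exceeds_threshold(i, d)}")
```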
Procedia PDF Downloads 273
22348 Governance Challenges of Consolidated Destinations: The Case of Barcelona
Authors: Montserrat Crespi-Vallbona; Oscar Mascarilla-Miró
Abstract:
Mature destinations face different challenges in trying to attract tourism and please their citizens. First, they have to maintain their touristic interest for standard demand while not disappointing those tourists seeking more advanced experiences. Second, they have to be concerned with the daily life of citizens and avoid the negative effects of touristification. This balance is quite delicate and often has to do with the sensitivity and commitment of the party in local government. What is generally agreed, however, is the need for destinations to differentiate themselves from the homogeneous rest of the regions and create new content, consumable resources, or marketing events to guarantee their positioning. In this sense, the main responsibility of destinations is to satisfy users, tourists, and citizens. Hence, their aim has to do with holistic experiences, which encompass these broad approaches. Specifically, this research aims to analyze the volume and growth of tourist houses in the central tourist neighborhoods of Barcelona (that is, Ciutat Vella) as the starting point for identifying the behavior of tourists regarding their interest in local heritage attractiveness and community atmosphere. Then, different cases are analyzed in order to show how Barcelona struggles to keep its brand attractive for visitors as well as for its inhabitants. Methodologically, the secondary data used in this research come from officially registered tourist houses (Catalunya Government), Open Data (Barcelona municipality), the Airbnb tourist platform, the Incasol Data, and the Municipal Register of Inhabitants. Primary data are collected through in-depth interviews with neighbors, social movement managers, and political representatives from Turisme de Barcelona (the local DMO, Destination Management Organization). Results show what the opportunities and priorities are for key actors in designing policies that strike a balance between all the different interests.
Keywords: touristification, tourist houses, governance, tourism demand, airbnbfication
Procedia PDF Downloads 65
22347 Predicting Recessions with Bivariate Dynamic Probit Model: The Czech and German Case
Authors: Lukas Reznak, Maria Reznakova
Abstract:
A recession has a profound negative effect on all stakeholders involved. It follows that timely prediction of recessions has been of utmost interest both in theoretical research and in practical macroeconomic modelling. The current mainstream of recession prediction is based on standard OLS models of continuous GDP using macroeconomic data. This approach is unsuitable for two reasons: the standard continuous models are proving to be obsolete, and the macroeconomic data are unreliable, often revised many years retroactively. The aim of the paper is to explore a different branch of recession forecasting research and verify the findings on real data for the Czech Republic and Germany. In the paper, the authors present a family of discrete choice probit models with parameters estimated by the method of maximum likelihood. In the basic form, the probits model a univariate series of recessions and expansions in the economic cycle for a given country. The majority of the paper deals with more complex model structures, namely dynamic and bivariate extensions. The dynamic structure models the autoregressive nature of recessions, taking previous economic activity into consideration to predict the development in subsequent periods. Bivariate extensions utilize information from a foreign economy by incorporating the correlation of error terms, thus modelling the dependencies between the two countries. Bivariate models predict a bivariate time series of economic states in both economies and thus enhance predictive performance. A vital enabler of timely and successful recession forecasting is reliable and readily available data. Leading indicators, namely the yield curve and stock market indices, represent an ideal data base, as the information is available in advance and does not undergo any retroactive revisions. As importantly, the combination of the yield curve and stock market indices reflects a range of macroeconomic and financial-market investor trends which influence the economic cycle. These theoretical approaches are applied to real data for the Czech Republic and Germany. Two models were identified for each country: one for in-sample and one for out-of-sample prediction. All four followed a bivariate structure, and three contained a dynamic component.
Keywords: bivariate probit, leading indicators, recession forecasting, Czech Republic, Germany
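A simplified, hedged sketch of the dynamic probit idea on synthetic data: the recession state is regressed on the lagged yield-curve slope and the lagged state. The paper's bivariate error-correlation structure is beyond this illustration:

```python
# Simplified sketch (synthetic data): univariate dynamic probit of the
# recession state on the lagged term spread and the lagged state.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(3)
n = 300
spread = rng.normal(1.0, 1.0, n)              # yield-curve slope (synthetic)
state = np.zeros(n, dtype=int)
for t in range(1, n):                         # autoregressive data-generating process
    p = norm.cdf(-0.5 - 0.8 * spread[t - 1] + 1.5 * state[t - 1])
    state[t] = rng.uniform() < p

df = pd.DataFrame({"state": state,
                   "spread_lag": np.roll(spread, 1),
                   "state_lag": np.roll(state, 1)}).iloc[1:]
X = sm.add_constant(df[["spread_lag", "state_lag"]])
print(sm.Probit(df["state"], X).fit(disp=False).params)
```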
Procedia PDF Downloads 248
22346 African Folklore for Critical Self-Reflection, Reflective Dialogue, and Resultant Attitudinal and Behaviour Change: University Students’ Experiences
Authors: T. M. Buthelezi, E. O. Olagundoye, R. G. L. Cele
Abstract:
This article argues that whilst African folklore has mainly been used for entertainment, it also has an educational value with the power to change young people's attitudes and behaviour. The paper is informed by findings from data generated from 154 university students who came from diverse backgrounds. The qualitative data were thematically analysed. Referring to the six steps of the behaviour change model, we found that African folklore provides relevant cultural knowledge and instils values that enable young people to engage in self-reflection that eventually leads them towards attitudinal change and behaviour modification. Using transformative learning theory, we argue that African folklore is in itself a pedagogical strategy that integrates cultural knowledge and values with entertainment elements concisely enough to take young people through a transformative phase encompassing psychological, convictional, and lifestyle adaptation. During the data production stage, all ethical considerations were observed, including obtaining a gatekeeper's permission letter and an ethical clearance certificate from the Ethics Committee of the University. The paper recommends that the African folklore approach be incorporated into the school curriculum, particularly in life skills education, with the aim of changing behaviour.
Keywords: African folklore, young people, attitudinal, behavior change, university students
Procedia PDF Downloads 263
22345 A Comparative Study of the Impact of Membership in International Climate Change Treaties and the Environmental Kuznets Curve (EKC) in Line with Sustainable Development Theories
Authors: Mojtaba Taheri, Saied Reza Ameli
Abstract:
In this research, we have calculated the effect of membership in international climate change treaties for 20 developed countries, selected on the basis of the Human Development Index (HDI), and compared this effect with the process of pollutant reduction described by Environmental Kuznets Curve (EKC) theory. For this purpose, data on real GDP per capita at constant 2010 prices are taken from the World Development Indicators (WDI) database. The Ecological Footprint (ECOFP) is the amount of biologically productive land needed to meet human needs and absorb carbon dioxide emissions; it is measured in global hectares (gha), and the data are retrieved from the Global Ecological Footprint (2021) database. We proceed step by step, performing several series of targeted statistical regressions, and examine the effects of several control variables. Energy Consumption Structure (ECS) is counted as the share of fossil fuel consumption in total energy consumption and is extracted from the United States Energy Information Administration (EIA) (2021) database. Energy Production (EP) refers to the total production of primary energy by all energy-producing enterprises in a country at a specific time; it is a comprehensive indicator of a country's energy production capacity, and its data, like those for ECS, are obtained from the EIA (2021). Financial development (FND) is defined as the ratio of private credit to GDP, and to some extent is based on stock market value, also as a ratio to GDP; it is taken from the WDI (2021). Trade Openness (TRD) is the sum of exports and imports of goods and services measured as a share of GDP; we use WDI (2021) data. Urbanization (URB) is defined as the share of the urban population in the total population; these data also come from the WDI (2021). The descriptive statistics of all the investigated variables are presented in the results section. Among the theories of sustainable development considered, the Environmental Kuznets Curve (EKC) proves the most significant over the study period. In this research, we use more than fourteen targeted statistical regressions to isolate the net effects of each approach and examine the results.
Keywords: climate change, globalization, environmental economics, sustainable development, international climate treaty
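A hedged sketch of the EKC test implied above: regressing the log ecological footprint on log GDP per capita and its square, where an inverted U (β₁ > 0, β₂ < 0) supports the EKC. The data are synthetic stand-ins for the WDI and footprint series:

```python
# Hedged sketch: quadratic EKC regression on synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
ln_gdp = rng.uniform(8, 11.5, 400)
ln_ef = 0.9 * ln_gdp - 0.048 * ln_gdp**2 + rng.normal(0, 0.05, 400)

X = sm.add_constant(np.column_stack([ln_gdp, ln_gdp**2]))
res = sm.OLS(ln_ef, X).fit()
b0, b1, b2 = res.params
# The EKC turning point sits where d(ln_ef)/d(ln_gdp) = b1 + 2*b2*ln_gdp = 0.
print(f"b1={b1:.3f}, b2={b2:.3f}, turning point at ln(GDP)={-b1 / (2 * b2):.2f}")
```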
Procedia PDF Downloads 71
22344 Value Relevance of Accounting Information: A Study of Steel Sector in India
Authors: Pradyumna Mohanty
Abstract:
The paper aims to explore whether the accounting information of Indian companies in the steel sector is value relevant. Ohlson's model, which usually takes into consideration book value per share (BV) and earnings per share (EARN), has been used and expanded to include two more variables: cash flow from operations (CFO) and return on equity (ROE). The data were collected from the CMIE-Prowess database for BSE-listed steel companies, and the time frame spans from 2010 to 2014. OLS regression has been used to test the value relevance of these accounting numbers. Results indicate that both CFO and BV have a significant influence on the stock price in two out of the five years of the study. However, BV emerges as the most significant and most value relevant of all four variables over the entire study period.
Keywords: value relevance, accounting information, book value per share, earnings per share
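A minimal sketch of the extended Ohlson regression, P = α + β₁BV + β₂EARN + β₃CFO + β₄ROE + ε, on synthetic numbers standing in for the CMIE-Prowess panel:

```python
# Minimal sketch: extended Ohlson price-level regression on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 120
df = pd.DataFrame({
    "BV":   rng.normal(100, 20, n),    # book value per share
    "EARN": rng.normal(12, 4, n),      # earnings per share
    "CFO":  rng.normal(15, 5, n),      # cash flow from operations per share
    "ROE":  rng.normal(0.12, 0.04, n),
})
df["price"] = 10 + 1.1 * df["BV"] + 4.0 * df["EARN"] + rng.normal(0, 15, n)

X = sm.add_constant(df[["BV", "EARN", "CFO", "ROE"]])
print(sm.OLS(df["price"], X).fit().summary().tables[1])
```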
Procedia PDF Downloads 158
22343 Implementation of Achterbahn-128 for Image Encryption and Decryption
Authors: Aissa Belmeguenai, Khaled Mansouri
Abstract:
In this work, an efficient implementation of Achterbahn-128 for image encryption and decryption is introduced. The implementation for this simulated project is written in MATLAB 7.5. First, two different original images are used to validate the proposed design. Then, our developed program transforms the original image data into a file of image digits. Finally, the implemented program is used to encrypt and decrypt the image data. Several tests are performed to prove the design's performance, including visual tests and a security analysis; we discuss the security analysis of the proposed image encryption scheme, including important aspects such as key sensitivity analysis, key space analysis, and statistical attacks.
Keywords: Achterbahn-128, stream cipher, image encryption, security analysis
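For orientation only, the sketch below shows the keystream-XOR structure that a stream cipher such as Achterbahn-128 follows for image data; the key-seeded PRNG here is a stand-in, not the Achterbahn-128 NLFSR keystream generator, and is not secure:

```python
# Structural sketch only (NOT Achterbahn-128 itself, and not secure):
# image encryption by XOR with a key-seeded keystream.
import numpy as np

def xor_with_keystream(img: np.ndarray, key: int) -> np.ndarray:
    ks = np.random.default_rng(key).integers(0, 256, img.shape, dtype=np.uint8)
    return img ^ ks  # XOR is an involution: the same call decrypts

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in image
enc = xor_with_keystream(img, key=0xC0FFEE)
assert (xor_with_keystream(enc, key=0xC0FFEE) == img).all()
```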
Procedia PDF Downloads 532
22342 Groundwater Monitoring Using a Community-Science Approach
Authors: Shobha Kumari Yadav, Yubaraj Satyal, Ajaya Dixit
Abstract:
In addressing groundwater depletion, it is important to develop an evidence base that can be used to assess the state of its degradation. Groundwater data are limited compared to meteorological data, which impedes groundwater use and management planning. Monitoring of groundwater levels provides an information base to assess the condition of aquifers and their responses to water extraction, land-use change, and climatic variability. It is important to maintain a network of spatially distributed, long-term monitoring wells to support a groundwater management plan. Monitoring involving the local community is a cost-effective approach that generates real-time data to manage groundwater use effectively. This paper presents the relationship between rainfall and spring flow, which are the main sources of freshwater for drinking, household consumption, and agriculture in the hills of Nepal. The supply and withdrawal of water from springs depend upon local hydrology and meteorological characteristics such as rainfall, evapotranspiration, and interflow. The study offers evidence of the use of scientific methods and community-based initiatives for managing groundwater and springsheds. The approach presents a method that can be replicated in other parts of the country to maintain the integrity of springs.
Keywords: citizen science, groundwater, water resource management, Nepal
Procedia PDF Downloads 202
22341 A Dynamic Spatial Panel Data Analysis of Renter-Occupied Multifamily Housing in DC
Authors: Jose Funes, Jeff Sauer, Laixiang Sun
Abstract:
This research examines the determinants of multifamily housing development and spillovers in the District of Columbia. A range of socioeconomic factors related to income distribution, productivity, and land use policies is thought to influence development in contemporary U.S. multifamily housing markets. The analysis leverages data from the American Community Survey to construct panel datasets spanning 2010 to 2019. Using spatial regression, we identify several socioeconomic measures and land use policies positively and negatively associated with new housing supply. We contextualize housing estimates related to race in relation to uneven development in the contemporary D.C. housing supply.
Keywords: neighborhood effect, sorting, spatial spillovers, multifamily housing
Procedia PDF Downloads 102
22340 Potentials and Challenges of Implementing Participatory Irrigation Management, Tanzania
Authors: Pilly Joseph Kagosi
Abstract:
The study assesses the challenges observed during the implementation of the participatory irrigation management (PIM) approach for food security in the semi-arid areas of Tanzania. Data were collected through questionnaires, PRA tools, key informant discussions, Focus Group Discussions (FGDs), participant observation, and literature review. Data collected from the questionnaires were analyzed using SPSS, while the PRA data were analyzed with the help of local communities during the PRA exercise. Data from the other methods were analyzed using content analysis. The study revealed that the PIM approach has contributed to improved food security at the household level due to the involvement of communities in water management activities and decision making, which enhanced the availability of water for irrigation and increased crop production. However, challenges were observed during the implementation of the approach, including minimal participation of beneficiaries in decision making during the planning and designing stages, implying inadequate devolution of power among scheme owners; inadequate transparency on income expenditure in Water Utilization Associations (WUAs); water conflicts among WUA members; conflicts between farmers and livestock keepers; and conflicts between WUA leaders and village governments regarding training opportunities and status. Furthermore, WUA rules and regulations are not legally recognized by the national court, and few farmers are involved in planting trees around water sources. It was realized, however, that some of the mentioned challenges were rectified by the farmers themselves, facilitated by government officials. The study recommends that the identified challenges be rectified for farmers to realize the importance of the PIM approach, as has been realized in other Asian countries.
Keywords: potentials of implementing participatory approach, challenges of participatory approach, irrigation management, Tanzania
Procedia PDF Downloads 305
22339 Judicial Analysis of the Burden of Proof on the Perpetrator of Corruption Criminal Act
Authors: Rahmayanti, Theresia Simatupang, Ronald H. Sianturi
Abstract:
Corruption has developed rapidly since the transition era, in which there were weaknesses in the law. Consequently, a few people had the opportunity to commit fraud and illegal acts and to misuse their positions and formal functions in order to enrich themselves, and these criminal acts were carried out systematically and sophisticatedly. Some people believe that the legal provisions which specifically regulate the criminal act of corruption, namely Law No. 31/1999 in conjunction with Law No. 20/2001 on the Eradication of the Corruption Criminal Act, are no longer effective, especially regarding onus probandi (the burden of proof) on corruptors. The research was a descriptive analysis, a method used to obtain a description of a certain situation or condition by explaining the data, with the conclusion drawn through several analyses. The research used a normative juridical approach, relying on secondary data as the main data obtained through library research. The system of the burden of proof, which follows the principle of reversal of the burden of proof stipulated in Article 12B paragraph 1 a and b, Article 37A, and Article 38B of Law No. 20/2001 on the Amendment of Law No. 31/1999, is used only as supporting evidence once the principal case is proved. Meanwhile, how to maximize the implementation of the burden of proof on the perpetrators of corruption when the public prosecutor brings a corruption case to court depends upon the nature of the case and the type of indictment. The system of the burden of proof can be used to eradicate corruption in the courts if certain policies and general principles of justice, such as independence, impartiality, and legal certainty, are applied.
Keywords: burden of proof, perpetrator, corruption criminal act
Procedia PDF Downloads 321
22338 Bayesian Estimation of Hierarchical Models for Genotypic Differentiation of Arabidopsis thaliana
Authors: Gautier Viaud, Paul-Henry Cournède
Abstract:
Plant growth models have been used extensively for the prediction of the phenotypic performance of plants. However, they most often remain calibrated for a given genotype and therefore do not take into account genotype-by-environment interactions. One way of taking these interactions into account is to consider Bayesian hierarchical models. Three levels can be identified in such models: the first level describes how a given growth model relates individual parameters to the phenotype of the plant, the second level describes how these individual parameters are distributed within a plant population, and the third level corresponds to the attribution of priors to the population parameters. Thanks to the Bayesian framework, choosing appropriate priors for the population parameters makes it possible to derive analytical expressions for the full conditional distributions of these population parameters. As plant growth models are of a nonlinear nature, individual parameters cannot be sampled explicitly, and a Metropolis step must be performed. This allows for the use of a hybrid Gibbs-Metropolis sampler. A generic approach was devised for the implementation of both general state space models and estimation algorithms within a programming platform. It was designed using the Julia language, which combines an elegant syntax with metaprogramming capabilities and high efficiency. Results were obtained for Arabidopsis thaliana on both simulated and real data. An organ-scale Greenlab model for the latter is thus presented, where the surface area of each individual leaf can be simulated. It is assumed that the error made on the measurement of leaf areas is proportional to the leaf area itself; multiplicative normal noises for the observations are therefore used. Real data were obtained via image analysis of zenithal images of Arabidopsis thaliana over a period of 21 days, using a two-step segmentation and tracking algorithm which notably takes advantage of the Arabidopsis thaliana phyllotaxy. Since the model formulation is rather flexible, the data for a single individual need not be available at all times, nor do the observation times need to be the same across individuals. This makes it possible to discard image-analysis data that are not considered reliable enough, thereby providing large quantities of low-bias leaf-area data. The proposed model precisely reproduces the dynamics of Arabidopsis thaliana’s growth while accounting for the variability between genotypes. In addition to the estimation of the population parameters, the level of variability is an interesting indicator of the genotypic stability of model parameters. A promising perspective is to test whether some of the latter should be considered as fixed effects.
Keywords: Bayesian, genotypic differentiation, hierarchical models, plant growth models
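A toy, hedged sketch of the hybrid Gibbs-Metropolis idea for a three-level hierarchy, with the organ-scale Greenlab model replaced by a simple nonlinear map exp(θ) purely for illustration, and written in Python rather than the authors' Julia platform:

```python
# Toy hybrid Gibbs-Metropolis: individual parameters theta_i ~ N(mu, tau^2),
# nonlinear observations y_ij ~ N(exp(theta_i), sigma^2); variances assumed known.
import numpy as np

rng = np.random.default_rng(2)
n_ind, n_obs = 20, 15
tau2, sigma2 = 0.04, 0.25
theta_true = rng.normal(1.0, np.sqrt(tau2), n_ind)
y = rng.normal(np.exp(theta_true)[:, None], np.sqrt(sigma2), (n_ind, n_obs))

theta = np.ones(n_ind)
mu, mus = 0.0, []
for it in range(2000):
    # Metropolis step for each individual parameter (model is nonlinear in theta)
    def logpost(t):
        return (-0.5 * np.sum((y - np.exp(t)[:, None]) ** 2, axis=1) / sigma2
                - 0.5 * (t - mu) ** 2 / tau2)
    prop = theta + rng.normal(0, 0.1, n_ind)
    accept = np.log(rng.uniform(size=n_ind)) < logpost(prop) - logpost(theta)
    theta = np.where(accept, prop, theta)
    # Gibbs step for the population mean (conjugate under a flat prior)
    mu = rng.normal(theta.mean(), np.sqrt(tau2 / n_ind))
    mus.append(mu)

print(f"posterior mean of mu: {np.mean(mus[500:]):.3f} (truth 1.0)")
```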
Procedia PDF Downloads 303
22337 Poultry in Motion: Text Mining Social Media Data for Avian Influenza Surveillance in the UK
Authors: Samuel Munaf, Kevin Swingler, Franz Brülisauer, Anthony O’Hare, George Gunn, Aaron Reeves
Abstract:
Background: Avian influenza, more commonly known as bird flu, is a viral zoonotic respiratory disease found in various species of poultry as well as pets and migratory birds. Researchers have argued that the accessibility of health information online, together with the low-cost data collection methods the internet provides, has revolutionized the ways in which epidemiological and disease surveillance data are utilized. This paper examines the feasibility of using internet data sources, such as Twitter and livestock forums, for the early detection of avian flu outbreaks, through the use of text mining algorithms and social network analysis. Methods: Social media mining was conducted on Twitter between 01/01/2021 and 31/12/2021 via the Twitter API in Python. The results were filtered first by hashtags (#avianflu, #birdflu) and word occurrences (avian flu, bird flu, H5N1), and then refined further by location to include only results from within the UK. The text was analysed in a time-series manner to determine keyword frequencies, and topic modeling was applied to uncover insights in the text prior to a confirmed outbreak. Further analysis examined clinical signs (e.g., swollen head, blue comb, dullness) within the time series prior to the avian flu outbreak confirmed by the Animal and Plant Health Agency (APHA). Results: Increased Google search results and avian flu-related tweets correlated in time with the confirmed cases. Topic modeling uncovered clusters of word occurrences relating to livestock biosecurity, disposal of dead birds, and prevention measures. Conclusions: Text mining social media data can be useful for analysing discussed topics for epidemiological surveillance purposes, especially given the lack of applied research in the veterinary domain. However, the small sample size of tweets for certain weekly periods makes it difficult to obtain statistically robust results, and the data contain a great amount of textual noise.
Keywords: veterinary epidemiology, disease surveillance, infodemiology, infoveillance, avian influenza, social media
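A hedged sketch of the filtering and time-series step described in the Methods: keep UK tweets matching the avian-flu terms, then count weekly mentions. The `tweets` frame is hypothetical, and the Twitter API calls are omitted:

```python
# Hedged sketch: keyword filtering and weekly counts on a hypothetical frame.
import re
import pandas as pd

pattern = re.compile(r"#?(avian ?flu|bird ?flu|H5N1)", re.IGNORECASE)

tweets = pd.DataFrame({
    "created_at": pd.to_datetime(["2021-02-01", "2021-02-03", "2021-11-05"]),
    "text": ["Bird flu confirmed near the farm", "lovely weather", "#AvianFlu alert"],
    "country": ["GB", "GB", "GB"],
})

mask = (tweets["country"] == "GB") & tweets["text"].str.contains(pattern)
weekly = tweets[mask].set_index("created_at").resample("W").size()
print(weekly)
```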
Procedia PDF Downloads 105
22336 Earthquake Risk Assessment Using Out-of-Sequence Thrust Movement
Authors: Rajkumar Ghosh
Abstract:
Earthquakes are natural disasters that pose a significant risk to human life and infrastructure. Effective earthquake mitigation measures require a thorough understanding of the dynamics of seismic occurrences, including thrust movement. Traditionally, estimating thrust movement has relied on conventional techniques that may not capture the full complexity of these events. Therefore, investigating alternative approaches, such as incorporating out-of-sequence thrust movement data, could enhance earthquake mitigation strategies. This review aims to provide an overview of the applications of out-of-sequence thrust movement in earthquake mitigation. By examining existing research and studies, the objective is to understand how precise estimation of thrust movement can contribute to improving structural design, analyzing infrastructure risk, and developing early warning systems. The study demonstrates how to estimate out-of-sequence thrust movement using multiple data sources, including GPS measurements, satellite imagery, and seismic recordings. By analyzing and synthesizing these diverse datasets, researchers can gain a more comprehensive understanding of thrust movement dynamics during seismic occurrences. The review identifies potential advantages of incorporating out-of-sequence data in earthquake mitigation techniques. These include improving the efficiency of structural design, enhancing infrastructure risk analysis, and developing more accurate early warning systems. By considering out-of-sequence thrust movement estimates, researchers and policymakers can make informed decisions to mitigate the impact of earthquakes. This study contributes to the field of seismic monitoring and earthquake risk assessment by highlighting the benefits of incorporating out-of-sequence thrust movement data. By broadening the scope of analysis beyond traditional techniques, researchers can enhance their knowledge of earthquake dynamics and improve the effectiveness of mitigation measures. The study collects data from various sources, including GPS measurements, satellite imagery, and seismic recordings. These datasets are then analyzed using appropriate statistical and computational techniques to estimate out-of-sequence thrust movement. The review integrates findings from multiple studies to provide a comprehensive assessment of the topic. The study concludes that incorporating out-of-sequence thrust movement data can significantly enhance earthquake mitigation measures. By utilizing diverse data sources, researchers and policymakers can gain a more comprehensive understanding of seismic dynamics and make informed decisions. However, challenges exist, such as data quality issues, modelling uncertainties, and computational complications. To address these obstacles and improve the accuracy of estimates, further research and advancements in methodology are recommended. Overall, this review serves as a valuable resource for researchers, engineers, and policymakers involved in earthquake mitigation, as it encourages the development of innovative strategies based on a better understanding of thrust movement dynamics.
Keywords: earthquake, out-of-sequence thrust, disaster, human life
Procedia PDF Downloads 77
22335 Different Sampling Schemes for Semi-Parametric Frailty Model
Authors: Nursel Koyuncu, Nihal Ata Tutkun
Abstract:
The frailty model is a survival model that takes into account unobserved heterogeneity when exploring the relationship between the survival of an individual and several covariates. In recent years, proposed survival models have become more complex, and this feature causes convergence problems, especially in large data sets. Therefore, the selection of samples from these big data sets is very important for the estimation of parameters. In the sampling literature, some authors have defined new sampling schemes to estimate parameters correctly. To this end, we examine the effect of sampling design in the semi-parametric frailty model. We conducted a simulation study in the R programme to estimate the parameters of the semi-parametric frailty model for different sample sizes and censoring rates under classical simple random sampling and ranked set sampling schemes. In the simulation study, we used as the population a data set recording 17,260 male civil servants aged 40–64 years with complete 10-year follow-up. Time to death from coronary heart disease is treated as the survival time, and age and systolic blood pressure are used as covariates. We selected 1000 samples from the population using the different sampling schemes and estimated the parameters. From the simulation study, we concluded that the ranked set sampling design performs better than simple random sampling in each scenario.
Keywords: frailty model, ranked set sampling, efficiency, simple random sampling
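A minimal sketch of ranked set sampling with set size m, the scheme compared above: draw m sets of m units, rank each set, and keep the i-th order statistic from the i-th set. Ranking here uses the variable itself; in practice a cheap concomitant variable may be used:

```python
# Minimal sketch of one ranked set sampling (RSS) cycle of set size m.
import numpy as np

def ranked_set_sample(population: np.ndarray, m: int, rng) -> np.ndarray:
    sample = []
    for i in range(m):
        candidates = rng.choice(population, size=m, replace=False)
        sample.append(np.sort(candidates)[i])   # keep the i-th order statistic
    return np.array(sample)

rng = np.random.default_rng(42)
pop = rng.exponential(scale=10.0, size=17260)   # stand-in for survival times
print(ranked_set_sample(pop, m=5, rng=rng))
```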
Procedia PDF Downloads 211
22334 The Role of Oceanic Environmental Conditions on Catch of Sardinella spp. in Ghana
Authors: Emmanuel Okine Neokye Serge Dossou Martin Iniga Bortey Nketia Alabi-Doku
Abstract:
Fish stock distribution is greatly influenced by oceanographic environmental conditions. Temporal variations in temperature and other oceanic properties resulting from climate change have been documented to have a strong impact on fisheries and aquaculture. In Ghana, Sardinella species are among the most important fisheries resources; they constitute about 60% of the total catch of coastal fisheries and are most abundant during the upwelling season. The present study investigated the role of physical oceanographic environmental conditions in the catches of two Sardinella species, S. aurita and S. maderensis, landed in Ghana. Furthermore, we examined the relationship between environmental conditions and catches of Sardinella species on seasonal and interannual scales between 2005 and 2015. For oceanographic environmental factors, we used comprehensive datasets consisting of: (1) daily in situ SST data obtained at two coastal stations in Ghana, (i) Cape Three Points (4.7° N, 2.09° W) and (ii) Tema (5° N, 0° E), for the period 2005-2015; (2) monthly SST data (MOAA GPV) from JAMSTEC; and (3) gridded 10-metre wind data from the CCMP reanalysis. The analysis of the collected data showed that higher (lower) wind velocity produces stronger (weaker) coastal upwelling, detected as lower (higher) SST, resulting in a higher (lower) catch of Sardinella spp. in both seasonal and interannual variations. It was also observed that the catchability of small pelagic fish species such as Sardinella spp. depends on the intensity of the coastal upwelling. Moreover, the Atlantic Meridional Mode index (a climatic index) is now known to be a possible contributor to the interannual variation in the catch of small pelagic fish species.
Keywords: Sardinella spp., fish, climate change, Ghana
Procedia PDF Downloads 12
22333 Space Telemetry Anomaly Detection Based on Statistical PCA Algorithm
Authors: Bassem Nassar, Wessam Hussein, Medhat Mokhtar
Abstract:
The crucial concern of satellite operations is to ensure the health and safety of satellites. The worst case in this perspective is probably the loss of a mission, but the more common interruption of satellite functionality can also result in compromised mission objectives. All the data acquired from the spacecraft are known as telemetry (TM), which contains a wealth of information related to the health of all its subsystems. Each single item of information is contained in a telemetry parameter, which represents a time-variant property (i.e., a status or a measurement) to be checked. As a consequence, TM monitoring systems are continuously improved in order to reduce the time required to respond to changes in a satellite's state of health. A fast assessment of the current state of the satellite is thus very important in order to respond to occurring failures. Statistical multivariate latent techniques are among the vital learning tools used to tackle the aforementioned problem coherently. Information extraction from such rich data sources using advanced statistical methodologies is a challenging task due to the massive volume of data. To solve this problem, in this paper, we present a proposed unsupervised learning algorithm based on the Principal Component Analysis (PCA) technique. The algorithm is applied to an actual remote sensing spacecraft. Data from the Attitude Determination and Control System (ADCS) were acquired under two operating conditions: normal and faulty states. The models were built and tested under these conditions, and the results show that the algorithm can successfully differentiate between the two operating conditions. Furthermore, the algorithm provides competent information for prediction as well as adding more insight and physical interpretation to ADCS operation.
Keywords: space telemetry monitoring, multivariate analysis, PCA algorithm, space operations
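A hedged sketch of the PCA monitoring idea: fit PCA on normal-state telemetry and flag samples whose squared reconstruction error (the SPE/Q statistic) exceeds a threshold set from the training data. The telemetry below is synthetic, not ADCS data:

```python
# Hedged sketch: PCA-based anomaly detection via the SPE/Q statistic.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(9)
normal = rng.normal(0, 1, (500, 8)) @ rng.normal(0, 1, (8, 20))   # correlated channels
faulty = normal[:50] + rng.normal(0, 5, (50, 20))                  # injected fault

pca = PCA(n_components=5).fit(normal)

def spe(x):
    recon = pca.inverse_transform(pca.transform(x))
    return np.sum((x - recon) ** 2, axis=1)   # squared reconstruction error

threshold = np.percentile(spe(normal), 99)    # control limit from normal data
print("faulty samples flagged:", np.mean(spe(faulty) > threshold))
```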
Procedia PDF Downloads 415
22332 Testing the Life Cycle Theory on the Capital Structure Dynamics of Trade-Off and Pecking Order Theories: A Case of Retail, Industrial and Mining Sectors
Authors: Freddy Munzhelele
Abstract:
Setting: the empirical research has shown that the life cycle theory has an impact on firms' financing decisions, particularly dividend pay-outs. Accordingly, the life cycle theory posits that as a firm matures, it reaches a level and capacity at which it distributes more cash as dividends. On the other hand, young firms prioritise investment opportunity sets and their financing; thus, they pay little or no dividends. The research on firms' financing decisions has also demonstrated, among others, the adoption of trade-off and pecking order theories in the dynamics of firms' capital structure. The trade-off theory speaks to firms holding a favourable position regarding debt structures, particularly as to the costs and benefits thereof; the pecking order theory is concerned with firms preferring a hierarchical order in choosing financing sources. The case of the life cycle hypothesis explaining financial managers' decisions regarding firms' capital structure dynamics appears to be an interesting link, yet this link has been neglected in corporate finance research. If this link is explored empirically, financial decision-making alternatives will be enhanced immensely, since no conclusive evidence has yet been found on the dynamics of capital structure. Aim: the aim of this study is to examine the impact of the life cycle theory on the capital structure dynamics, in terms of the trade-off and pecking order theories, of firms listed in the retail, industrial, and mining sectors of the JSE. These sectors are among the key contributors to GDP in the South African economy. Design and methodology: following the postpositivist research paradigm, the study is quantitative in nature and utilises secondary data obtained from the financial statements of sampled firms for the period 2010-2022. The firms' financial statements will be extracted from the IRESS database. Since the data are in panel form, a combination of static and dynamic panel data estimators will be used to analyse them. The overall data analysis will be done using the STATA program. Value add: this study directly investigates the link between the life cycle theory and the dynamics of capital structure decisions, particularly the trade-off and pecking order theories.
Keywords: life cycle theory, trade-off theory, pecking order theory, capital structure, JSE listed firms
Procedia PDF Downloads 61
22331 Neural Synchronization - The Brain’s Transfer of Sensory Data
Authors: David Edgar
Abstract:
To understand how the brain's subconscious and conscious functions work, we must conquer the physics of Unity, which leads to duality's algorithm, where the subconscious (bottom-up) and conscious (top-down) processes function together to produce and consume intelligence. We use terms like 'time is relative,' but we rarely understand the meaning. In the brain, there are different processes and, therefore, different observers. These different processes experience time at different rates. A sensory system such as the eyes cycles measurement around 33 milliseconds, the conscious process of the frontal lobe cycles at 300 milliseconds, and the subconscious process of the thalamus cycles at 5 milliseconds. Three different observers experience time differently. To bridge observers, the thalamus, which is the fastest of the processes, maintains a synchronous state and entangles the different components of the brain's physical process. The entanglements form a synchronous cohesion between the brain components, allowing them to share the same state and execute in the same measurement cycle. The thalamus uses the shared state to control the firing sequence of the brain's linear subconscious process. Sharing state also allows the brain to cheat on the amount of sensory data that must be exchanged between components. Only unpredictable motion is transferred through the synchronous state, because predictable motion already exists in the shared framework. The brain's synchronous subconscious process is entirely based on energy conservation, where prediction regulates energy usage. So, every 33 milliseconds, the eyes dump their sensory data into the thalamus. The thalamus then performs a motion measurement to identify the unpredictable motion in the sensory data. Here is the trick: the thalamus conducts its measurement based on the original observation time of the sensory system (33 ms), not its own process time (5 ms). This creates a data payload of synchronous motion that preserves the original sensory observation: basically, a frozen moment in time (flat 4D). That single moment in time can then be processed through the single state maintained by the synchronous process. Other processes, such as consciousness (300 ms), can interface with the synchronous state to generate awareness of that moment. Now, synchronous data traveling through a separate, faster synchronous process creates a theoretical time tunnel in which observation time is tunneled through the synchronous process and reproduced on the other side in the original time-relativity. The synchronous process eliminates time dilation by simply removing itself from the equation, so that its own process time does not alter the experience. To the original observer, the measurement appears to be instantaneous, but in the thalamus, a linear subconscious process generating sensory perception and thought production is being executed. It all simply occurs in the time available, because other observation times are slower than thalamic measurement time. For life to exist in the physical universe requires a linear measurement process; it just hides by operating at a faster time relativity. What is interesting is that time dilation is not the problem; it is the solution. Einstein said there was no universal time.
Keywords: neural synchronization, natural intelligence, 99.95% IoT data transmission savings, artificial subconscious intelligence (ASI)
Procedia PDF Downloads 126
22330 Dogmatic Analysis of Legal Risks of Using Artificial Intelligence: The European Union and Polish Perspective
Authors: Marianna Iaroslavska
Abstract:
ChatGPT is becoming commonplace. However, only a few people think about the legal risks of using Large Language Models in their daily work. The main dilemmas concern the following areas: who owns the copyright to what somebody creates through ChatGPT; what OpenAI can do with the prompt you enter; whether you can accidentally infringe on another creator's rights through ChatGPT; and what happens to the protection of the data somebody enters into the chat. This paper will present these and other legal risks of using large language models at work, using dogmatic methods and case studies. The paper will present a legal analysis of AI risks against the background of European Union law and Polish law. This analysis will answer questions about how to protect data, how to make sure you do not violate copyright, and what is at stake with the AI Act, which recently came into force in the EU. If your work is related to the EU area and you use AI in your work, this paper will be a real goldmine for you. The copyright law in force in Poland does not protect your rights to a work that is created with the help of AI. So if you start selling such a work, you may face two main problems. First, someone may steal your work, and you will not be entitled to any protection, because a work created with AI has no legal protection. Second, the AI may have created the work by infringing on another person's copyright, so that person will be able to claim damages from you. In addition, the EU's AI Act, which is currently in force, imposes a number of additional obligations related to the use of large language models. The AI Act divides artificial intelligence into four risk levels and imposes different requirements depending on the level of risk. The EU regulation is aimed primarily at those developing and marketing artificial intelligence systems in the EU market. In addition to the above obstacles, personal data protection comes into play, which is very strictly regulated in the EU. If you violate personal data rules by entering information into ChatGPT, you will be liable for the violations. When using AI within the EU or in cooperation with entities located in the EU, you have to take many risks into account. This paper will highlight such risks and explain how they can be avoided.
Keywords: EU, AI Act, copyright, Polish law, LLM
Procedia PDF Downloads 21
22329 The Impact of Electronic Commerce on Organisational Effectiveness: A Study of Zenith Bank Plc
Authors: Olusola Abiodun Arinde
Abstract:
This research work was prompted by the very important role e-commerce plays in every organization, be it private or public. The underlying objective of this study is to critically appraise the extent to which e-commerce impacts organizational effectiveness. The research was carried out using Zenith Bank Plc as a case study. Relevant data were collected through a structured questionnaire, oral interviews, journals, newspapers, and textbooks. The data collected were analyzed, and hypotheses were tested. Based on the results of the hypotheses, it was observed that e-commerce is significant to every organization. Through e-commerce, fast service delivery is guaranteed to customers, which leads to higher productivity and profit for organizations. E-commerce should be managed in such a way that it does not alienate customers; it should also prevent the enormous risks associated with e-commerce.
Keywords: e-commerce, fast service, productivity, profit
Procedia PDF Downloads 244
22328 Actionable Personalised Learning Strategies to Improve a Growth-Mindset in an Educational Setting Using Artificial Intelligence
Authors: Garry Gorman, Nigel McKelvey, James Connolly
Abstract:
This study will evaluate a growth mindset intervention with Junior Cycle Coding and Senior Cycle Computer Science students in Ireland, where gamification will be used to incentivise growth-mindset behaviour. An artificial intelligence (AI) driven personalised learning system will be developed to present computer programming learning tasks in the manner best suited to individuals' own learning preferences, while incentivising and rewarding the growth-mindset behaviours of persistence, mastery responses to challenge, and challenge seeking. This research endeavours to measure mindset with before-and-after surveys (conducted nationally) and by recording growth-mindset behaviour while a digital game is played. This study will harness the capabilities of AI and aims to determine how a personalised learning (PL) experience can impact the mindset of a broad range of students. The focus of this study will be to determine how personalising the learning experience influences female and disadvantaged students' sense of belonging in the computer science classroom when tasks are presented in the manner best suited to the individual. Whole Brain Learning will underpin this research and will be used as a framework to guide it in identifying key areas such as thinking and learning styles, cognitive potential, motivators and fears, and emotional intelligence. This research will be conducted in multiple school types over one academic year. Digital games will be played multiple times over this period, and the data gathered will be used to inform the AI algorithm. The three data sets are described as follows: (i) before-and-after survey data to determine the grit scores and mindsets of the participants; (ii) growth-mindset data from the game, measuring multiple growth-mindset behaviours such as persistence, response to challenge, and use of strategy; (iii) AI data to guide PL. This study will highlight the effectiveness of an AI-driven personalised learning experience. The data will position AI within the Irish educational landscape, with a specific focus on the teaching of CS. These findings will benefit coding and computer science teachers by providing a clear pedagogy for the effective delivery of personalised learning strategies for computer science education. This pedagogy will help prevent students from developing a fixed mindset while helping pupils to exhibit persistence of effort, use of strategy, and mastery responses to challenges.
Keywords: computer science education, artificial intelligence, growth mindset, pedagogy
Procedia PDF Downloads 87
22327 Theoretical Study of the Structural and Elastic Properties of Semiconducting Rare Earth Chalcogenide Sm1-xEuxS under Pressure
Authors: R. Dubey, M. Sarwan, S. Singh
Abstract:
We have investigated the phase transition pressure and associated volume collapse in the Sm1-xEuxS alloy (0 ≤ x ≤ 1), which shows a transition from discontinuous to continuous as x is reduced. The calculated results from the present approach are in good agreement with the experimental data available for the end-point members (x = 0 and x = 1). The results for the alloy counterparts are also in fair agreement with experimental data generated from Vegard's law. An improved interaction potential model has been developed, which includes the Coulomb interaction, three-body interaction, polarizability effect, and overlap repulsive interaction operative up to second-neighbour ions. It is found that the inclusion of the polarizability effect has improved our results.
Keywords: elastic constants, high pressure, phase transition, rare earth compound
Procedia PDF Downloads 419