Search results for: open source data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30062

27782 An Approximation of Daily Rainfall by Using a Pixel Value Data Approach

Authors: Sarisa Pinkham, Kanyarat Bussaban

Abstract:

The research aims to approximate the amount of daily rainfall by using a pixel value data approach. The daily rainfall maps from the Thailand Meteorological Department for the period January to December 2013 were used as the data in this study. The results showed that this approach can approximate the amount of daily rainfall with an RMSE of 3.343.
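
As a rough illustration of the idea (not the authors' code), the sketch below maps rainfall-map pixels to rainfall amounts through a hypothetical colour legend and computes the RMSE against gauge observations; the legend colours, rainfall values, and function names are all assumptions.

```python
import numpy as np

# Hypothetical colour-to-rainfall lookup built from a map legend:
# each legend colour (RGB) maps to a representative rainfall amount (mm).
LEGEND = {
    (255, 255, 255): 0.0,    # no rain
    (170, 220, 255): 5.0,    # light rain
    (60, 120, 255): 20.0,    # moderate rain
    (0, 0, 160): 50.0,       # heavy rain
}

def pixels_to_rainfall(pixels):
    """Map an array of RGB pixel values to rainfall amounts via the legend."""
    legend_colours = np.array(list(LEGEND.keys()), dtype=float)
    legend_values = np.array(list(LEGEND.values()))
    flat = pixels.reshape(-1, 3).astype(float)
    # Assign each pixel the rainfall of its nearest legend colour.
    dists = np.linalg.norm(flat[:, None, :] - legend_colours[None, :, :], axis=2)
    return legend_values[dists.argmin(axis=1)]

def rmse(predicted, observed):
    """Root mean square error between approximated and gauge rainfall."""
    return float(np.sqrt(np.mean((np.asarray(predicted) - np.asarray(observed)) ** 2)))
```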

Keywords: daily rainfall, image processing, approximation, pixel value data

Procedia PDF Downloads 387
27781 A Next-Generation Blockchain-Based Data Platform: Leveraging Decentralized Storage and Layer 2 Scaling for Secure Data Management

Authors: Kenneth Harper

Abstract:

The rapid growth of data-driven decision-making across various industries necessitates advanced solutions to ensure data integrity, scalability, and security. This study introduces a decentralized data platform built on blockchain technology to improve data management processes in high-volume environments such as healthcare and financial services. The platform integrates blockchain networks built with the Cosmos SDK and Polkadot Substrate alongside decentralized storage solutions such as IPFS and Filecoin, coupled with decentralized computing infrastructure built on top of Avalanche. By leveraging advanced consensus mechanisms, we create a scalable, tamper-proof architecture that supports both structured and unstructured data. Key features include secure data ingestion, cryptographic hashing for robust data lineage, and Zero-Knowledge Proof mechanisms that enhance privacy while ensuring compliance with regulatory standards. Additionally, we implement performance optimizations through Layer 2 scaling solutions, including ZK-Rollups, which provide low-latency data access and trustless data verification across a distributed ledger. The findings from this exercise demonstrate significant improvements in data accessibility, reduced operational costs, and enhanced data integrity when tested in real-world scenarios. This platform reference architecture offers a decentralized alternative to traditional centralized data storage models, providing scalability, security, and operational efficiency.
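
As one hedged illustration of how cryptographic hashing can provide data lineage on such a platform (the paper does not publish its implementation), the sketch below chains SHA-256 digests so that tampering with any stored record breaks the chain; the record fields and function names are assumptions.

```python
import hashlib
import json
import time

def record_lineage(prev_hash: str, payload: dict) -> dict:
    """Append one lineage record: the hash links this record to its
    predecessor, so later tampering with stored data breaks the chain."""
    record = {
        "timestamp": time.time(),
        "payload_digest": hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode()
        ).hexdigest(),
        "prev_hash": prev_hash,
    }
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return record

# Usage: chain two ingestion events and verify the linkage.
genesis = record_lineage("0" * 64, {"file": "scan-001.dcm", "source": "clinic-A"})
nxt = record_lineage(genesis["hash"], {"file": "scan-001.dcm", "op": "anonymise"})
assert nxt["prev_hash"] == genesis["hash"]
```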

Keywords: blockchain, Cosmos SDK, decentralized data platform, IPFS, ZK-Rollups

Procedia PDF Downloads 27
27780 Communicating Meaning through Translanguaging: The Case of Multilingual Interactions of Algerians on Facebook

Authors: F. Abdelhamid

Abstract:

Algeria is a multilingual speech community where individuals constantly mix between codes in spoken discourse. Code is used as a cover term to refer to the existing languages and language varieties, which include, among others, the mother tongue of the majority, Algerian Arabic, the official language, Modern Standard Arabic, and the foreign languages French and English. The present study explores whether Algerians mix between these codes in online communication as well. Facebook is the selected platform from which data is collected because it is the preferred and most used social media site among Algerians. Adopting the notion of translanguaging, this study attempts to explain how Facebook users deploy multilingual messages to communicate meaning. Accordingly, multilingual interactions are not approached from a pejorative perspective but rather as a creative linguistic behavior that multilinguals utilize to achieve intended meanings. The study is intended as a contribution to research on multilingualism online because, although an extensive literature has investigated multilingualism in spoken discourse, limited research has investigated it online. Its aim is twofold. First, it aims to ensure that the selected platform, namely Facebook, can serve as a source of multilingual data for qualitative analysis. This is done by measuring the frequency of multilingual instances. Second, once enough multilingual instances are encountered, it aims to describe and interpret a selection of them. 120 posts and 16335 comments were collected from two Facebook pages. Analysis revealed that a third of the collected data are multilingual messages. Facebook users mixed between the four codes mentioned above in writing their messages. The most frequent cases are mixing between Algerian Arabic and French and between Algerian Arabic and Modern Standard Arabic. A focused qualitative analysis followed, in which selected examples are interpreted and explained. It appears that Algerians mix between codes when communicating online even though writing is a conscious type of communication. This suggests that such behavior is not a random and corrupted way of communicating but rather an intentional and natural one.

Keywords: Algerian speech community, computer mediated communication, languages in contact, multilingualism, translanguaging

Procedia PDF Downloads 131
27779 Moderating Effect of Owner's Influence on the Relationship between the Probability of Client Failure and Going Concern Opinion Issuance

Authors: Mohammad Noor Hisham Osman, Ahmed Razman Abdul Latiff, Zaidi Mat Daud, Zulkarnain Muhamad Sori

Abstract:

The problem that Malaysian auditors do not issue going concern opinions (GC opinions) to seriously financially distressed companies is still a pressing issue. Policy makers, particularly the Financial Statement Review Committee (FSRC) of the Malaysian Institute of Accountants, have raised this issue as early as 2009. Similar problems have occurred in the US, the UK, and many developing countries. It is important for auditors to issue GC opinions properly because such opinions are a signal about the viability of a company that is much needed by stakeholders. There are at least two unanswered questions, or research gaps, in the literature on determinants of GC opinions. Firstly, is a client's probability of failure associated with GC opinion issuance? Secondly, to what extent do influential owners (management, family, and institutions) moderate the association between a client's probability of failure and GC opinion issuance? The objective of this study is, therefore, twofold: (1) to examine the extent of the relationship between the probability of client failure and the issuance of GC opinions, and (2) to examine the extent to which management, family, and institutional ownership moderate this association. This study is quantitative in nature, and the sources of data are secondary (mainly companies' annual reports). A total of four hypotheses have been developed and tested on data accumulated from annual reports of seriously financially distressed Malaysian public listed companies. Data from 2006 to 2012, comprising a sample of 644 observations, have been analyzed using panel logistic regression. It is found that certainty (rather than probability) of client failure affects the issuance of GC opinions. In addition, it is found that only the level of family ownership positively moderates the relationship between client probability of failure and GC opinion issuance. This study contributes to the auditing literature, as its findings can enhance our understanding of audit quality, particularly the variables associated with the issuance of GC opinions. The findings shed light on the role of family owners in the GC opinion issuance process, and this opens ways for researchers to suggest measures that can be used to tackle auditors' reluctance to issue GC opinions to financially distressed clients. The measures suggested can be useful to policy makers in formulating future promulgations.

Keywords: audit quality, auditing, auditor characteristics, going concern opinion, Malaysia

Procedia PDF Downloads 260
27778 Study on Surface Morphology and Reflectance of Solar Cells Applied in Pyramid Structures

Authors: Zong-Sheng Chen

Abstract:

With the advancement of technology, human activities have increased greenhouse gas emissions and fossil fuel energy production, leading to increasingly severe global warming. To mitigate global warming, energy conservation and carbon reduction have become global goals. Solar energy, a renewable source derived from sunlight, is an endless and promising energy source capable of meeting high energy demands sustainably while supporting energy conservation and carbon reduction. In recent years, many countries around the world have been developing the solar energy industry, and Taiwan is no exception. Positioned in the subtropical region, Taiwan possesses geographical advantages conducive to solar energy utilization. Furthermore, Taiwan's well-developed semiconductor technology and sophisticated equipment make it highly suitable for the development of high-efficiency solar cells. This study focuses on investigating the anti-reflection properties of solar cells. Pyramid structures are etched through metal-assisted chemical etching so that incident sunlight undergoes secondary or higher-order reflections on the structured surface. Trapping light within the substrate in this way reduces reflectance and increases conversion efficiency.

Keywords: solar cell, reflectance, pyramidal structure, potassium hydroxide

Procedia PDF Downloads 67
27777 Analyses of Extent of Effects of Siting Boreholes Nearby Open Landfill Dumpsite at Obosi Anambra Southeast of Nigeria

Authors: George Obinna Akuaka

Abstract:

Solid waste disposal techniques in Nigeria pose a threat to the environment and to nearby residents. The presence of microbial, physical, and chemical contaminants in borehole samples near a dumpsite implies that groundwater is being contaminated by leachate infiltration from the open landfill dumpsite. In this study, physicochemical and microbial analyses were carried out on water samples from a hand-dug well on the site and from boreholes around the active landfill at different distances (50 m to 200 m). Leachate samples were collected to ascertain the extent of contamination of groundwater quality. A total of 5 leachate samples and 5 groundwater samples were collected, and all samples were analyzed for various physical and chemical parameters according to standard methods. These include pH, electrical conductivity, total dissolved solids, BOD, DO, temperature, major cations such as Mg²⁺, Ca²⁺, Fe²⁺, and Cu²⁺, major anions such as NO₃⁻, Cl⁻, SO₄²⁻, and PO₄³⁻, and the heavy metals and metalloids Zn, As, Cd, Cr, Hg, Pb, and Ni. The mean values of the physical and chemical parameters obtained from both sites were compared with the established standards of the World Health Organization (WHO). The leachate samples were found to have higher concentrations than the borehole water, and the recorded mean values of heavy metals were above the approved standard limits. The results indicated that mercury and copper were not found in any of the borehole water samples. Microbial analyses showed that the total heterotrophic bacteria mean count ranged from 10.6 × 10⁷ cfu/ml to 2.04 × 10⁷ cfu/ml in leachate samples and from 9.5 × 10⁷ cfu/ml to 18.9 × 10⁷ cfu/ml in borehole samples. The analyses also revealed that almost all the bacteria isolated in the leachate were also found in the water samples. These results indicate heavy pollution in all the samples: most physicochemical parameters and microbes showed traceable pollution, which occurred as a result of leachate infiltration into the groundwater.

Keywords: physicochemical, landfill dumpsite, microbial, leachate, groundwater

Procedia PDF Downloads 204
27776 The Effect of Measurement Distribution on System Identification and Detection of Behavior of Nonlinearities of Data

Authors: Mohammad Javad Mollakazemi, Farhad Asadi, Aref Ghafouri

Abstract:

In this paper, we consider and apply parametric modeling to experimental data from a dynamical system. We investigate different distributions of output measurements from dynamical systems. By processing the variance of the experimental data, we obtain the region of nonlinearity in the data, and identification of the output section is then applied under different situations and data distributions. Finally, the effect of the spread of the measurements, such as their variance, on identification, and the limitations of this approach, are explained.

Keywords: Gaussian process, nonlinearity distribution, particle filter, system identification

Procedia PDF Downloads 516
27775 Automated Evaluation Approach for Time-Dependent Question Answering Pairs on Web Crawler Based Question Answering System

Authors: Shraddha Chaudhary, Raksha Agarwal, Niladri Chatterjee

Abstract:

This work demonstrates a web crawler-based generalized end-to-end open domain Question Answering (QA) system. An efficient QA system requires a significant amount of domain knowledge to answer any question, with the aim of finding an exact and correct answer in the form of a number, a noun, a short phrase, or a brief piece of text for the user's questions. Analysis of the question, searching the relevant documents, and choosing an answer are three important steps in a QA system. This work uses a web scraper (Beautiful Soup) to extract K documents from the web. The value of K can be calibrated on the basis of a trade-off between time and accuracy. This is followed by a passage ranking process using the MS-Marco dataset trained on 500K queries to extract the most relevant text passage, to shorten the lengthy documents. Further, a QA system is used to extract the answers from the shortened documents based on the query and return the top 3 answers. For evaluation of such systems, accuracy is judged by the exact match between predicted answers and gold answers. But automatic evaluation methods fail due to the linguistic ambiguities inherent in the questions. Moreover, reference answers are often not exhaustive or are out of date. Hence correct answers predicted by the system are often judged incorrect according to the automated metrics. One such scenario arises from the original Google Natural Questions (GNQ) dataset, which was collected and made available in the year 2016. Use of any such dataset proves to be inefficient with respect to questions that have time-varying answers. For illustration, consider the query "Where will the next Olympics be held?" The gold answer for this query, as given in the GNQ dataset, is "Tokyo". Since the dataset was collected in 2016, and the next Olympics after 2016 were held in Tokyo in 2020, this answer was correct at the time. But if the same question is asked in 2022, then the answer is "Paris, 2024". Consequently, any evaluation based on the GNQ dataset will be incorrect. Such erroneous predictions are usually given to human evaluators for further validation, which is quite expensive and time-consuming. To address this erroneous evaluation, the present work proposes an automated approach for evaluating time-dependent question-answer pairs. In particular, it proposes a metric using the current timestamp along with the top-n predicted answers from a given QA system. To test the proposed approach, the GNQ dataset was used, and the system achieved an accuracy of 78% for a test dataset comprising 100 QA pairs. This test data was automatically extracted using an analysis-based approach from 10K QA pairs of the GNQ dataset. The results obtained are encouraging. The proposed technique appears to have the possibility of developing into a useful scheme for gathering precise, reliable, and specific information in a real-time and efficient manner. Our subsequent experiments will be guided towards establishing the efficacy of the above system for a larger set of time-dependent QA pairs.
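
A minimal sketch of what a time-aware evaluation metric of this kind could look like (the paper's exact metric is not reproduced here): each gold answer carries a validity window, and a prediction scores a match only against the answer valid at the current timestamp. The record format and function names are assumptions.

```python
from datetime import datetime

# Hypothetical record format: each question stores gold answers together with
# the date ranges during which they are valid.
qa_item = {
    "question": "Where will the next Olympics be held?",
    "gold": [
        {"answer": "tokyo", "valid_from": "2016-01-01", "valid_to": "2021-08-08"},
        {"answer": "paris", "valid_from": "2021-08-09", "valid_to": "2024-08-11"},
    ],
}

def time_aware_match(top_n_predictions, gold_spans, now=None):
    """Score 1 if any top-n prediction matches the gold answer valid *now*."""
    now = now or datetime.utcnow()
    current = [
        g["answer"]
        for g in gold_spans
        if datetime.fromisoformat(g["valid_from"]) <= now
        <= datetime.fromisoformat(g["valid_to"])
    ]
    return int(any(p.strip().lower() in current for p in top_n_predictions))

# Asked in mid-2022, "Paris" is the currently valid answer -> score 1.
print(time_aware_match(["Paris", "London"], qa_item["gold"],
                       now=datetime(2022, 6, 1)))
```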

Keywords: web-based information retrieval, open domain question answering system, time-varying QA, QA evaluation

Procedia PDF Downloads 101
27774 Building a Scalable Telemetry Based Multiclass Predictive Maintenance Model in R

Authors: Jaya Mathew

Abstract:

Many organizations are faced with the challenge of how to analyze and build machine learning models using their sensitive telemetry data. In this paper, we discuss how users can leverage the power of R without having to move their big data around, as well as a cloud-based solution for organizations willing to host their data in the cloud. By using ScaleR technology to benefit from parallelization and remote computing, or R Services on premise or in the cloud, users can work with R at scale without moving their data.

Keywords: predictive maintenance, machine learning, big data, cloud-based, on-premise solution, R

Procedia PDF Downloads 379
27773 Trusting the Big Data Analytics Process from the Perspective of Different Stakeholders

Authors: Sven Gehrke, Johannes Ruhland

Abstract:

Data is the oil of our time; without it, progress would come to a halt [1]. On the other hand, mistrust of data mining is increasing [2]. The paper at hand discusses different aspects of the concept of trust and describes the information asymmetry between the typical stakeholders of a data mining project using the CRISP-DM phase model. Based on the identified influencing factors relating to trust, problematic aspects of the current approach are verified using interviews with the stakeholders. The results of the interviews confirm the theoretically identified weak points of the phase model with regard to trust and show potential research areas.

Keywords: trust, data mining, CRISP-DM, stakeholder management

Procedia PDF Downloads 94
27772 Design of Mobile Teaching for Students Collaborative Learning in Distance Higher Education

Authors: Lisbeth Amhag

Abstract:

The aim of the study is to describe and analyze the design of mobile teaching for students' collaborative learning in distance higher education, with a focus on mobile technologies such as online webinars (web-based seminars or conferencing) accessed via laptops, smartphones, or tablets. These multimedia tools can provide face-to-face interactions, recorded flipped-classroom videos, and parallel chat communications. The data collection consists of interviews with 22 students and observations of online face-to-face webinars, as well as two surveys. Theoretically, the study joins the research tradition of Computer Supported Collaborative Learning (CSCL), as well as Computer Self-Efficacy (CSE), concerned with individuals' media and information literacy. Important conclusions from the study are that mobile interactions increased student-centered learning. As the students appreciated the working methods, they became more engaged and motivated. The use of mobile technology among students also contributes to increased flexibility of space and place, as well as to media and information literacy.

Keywords: computer self-efficacy, computer supported collaborative learning, distance and open learning, educational design and technologies, media and information literacy, mobile learning

Procedia PDF Downloads 358
27771 Occurrence and Geological Setting of the Black Shales Outcrops in Malaysia

Authors: Hassan M. Baioumy, Yuniarti Ulfa

Abstract:

Paleozoic, Mesozoic, and Cenozoic black shales that can be a potential source of energy and precious metals are widely distributed in Peninsular Malaysia, Sarawak, and Sabah. Two Paleozoic black shale outcrops have been reported on Langkawi Island, belonging to the Cambrian fluvial Machinchang Formation and the Silurian glaciomarine Singa Formation. More than seventeen occurrences of Paleozoic black shale outcrops, of Devonian, Carboniferous, and Permian age, have been found in Peninsular Malaysia, in the Terengganu, Perlis, Pahang, and Perak States. Mesozoic black shale outcrops occur in several places in both Peninsular Malaysia and Sarawak. In Peninsular Malaysia, Triassic black shales occur in the Nami area, northern Kedah, and in the Pahang area. In Sarawak, Triassic black shales have been reported in the Bau area. Cenozoic black shale outcrops have been reported in both Sarawak, in the Miri area, and Sabah, in the Ranau and Tenom areas. Preliminary mineralogical and geochemical investigations of some of these outcrops showed distinct compositional variations among them, probably due to variations in source-area composition and/or in the depositional and diagenetic settings of these shales. Some of these shales were also subjected to post-depositional hydrothermal mineralization that enriched them with Au-bearing minerals such as pyrite, chalcopyrite, and arsenopyrite. Many of the studied black shale outcrops appear rich in organic matter, which increases the possibility of using these black shales as an unconventional energy resource.

Keywords: black shales, energy, mineralization, Malaysia

Procedia PDF Downloads 428
27770 Intelligent Software Architecture and Automatic Re-Architecting Based on Machine Learning

Authors: Gebremeskel Hagos Gebremedhin, Feng Chong, Heyan Huang

Abstract:

A software system is the combination of architecture and organized components to accomplish a specific function or set of functions. A good software architecture facilitates application system development, promotes achievement of functional requirements, and supports system reconfiguration. We describe three studies demonstrating the utility of our architecture in the subdomain of mobile office robots and identify software engineering principles embodied in the architecture. The main aim of this paper is to analyze software architecture design and automatic re-architecting using machine learning. The intelligent software architecture and automatic re-architecting process reorganizes the software's organizational structure into a more suitable one, using a user-access dataset to create relationships among the components of the system. A three-step data mining approach was used to analyze effective recovery, transformation, and implementation with the use of a clustering algorithm. Automatic re-architecting without changing the source code therefore makes it possible to address the software complexity problem and to promote software reuse.
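
As a hedged sketch of the clustering step described above (not the authors' implementation), the toy example below clusters components by their user-access patterns with k-means, so that components touched by the same sessions are grouped into one candidate module; the access matrix, component names, and cluster count are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical user-access matrix: rows are software components, columns are
# user sessions; entry (i, j) = 1 if session j touched component i.
access = np.array([
    [1, 1, 0, 0, 1],   # login
    [1, 1, 0, 0, 1],   # session manager
    [0, 0, 1, 1, 0],   # report builder
    [0, 0, 1, 1, 0],   # chart renderer
])
components = ["login", "session_mgr", "report_builder", "chart_renderer"]

# Components co-accessed by the same sessions land in the same cluster,
# suggesting they belong in the same module of the re-architected system.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(access)
for name, label in zip(components, labels):
    print(f"{name} -> module {label}")
```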

Keywords: intelligence, software architecture, re-architecting, software reuse, high-level design

Procedia PDF Downloads 119
27769 One Step Further: Pull-Process-Push Data Processing

Authors: Romeo Botes, Imelda Smit

Abstract:

In today's age of technology, vast amounts of data need to be processed in real time to keep users satisfied. This data comes from various sources and in many formats, including electronic and mobile devices such as GPRS modems and GPS devices. These make use of different protocols, including TCP, UDP, and HTTP/S, for data communication to web servers and eventually to users. The data obtained from these devices may provide valuable information to users, but is mostly in an unreadable format that needs to be processed to yield information and business intelligence. This data is not always current; it is mostly historical data, and it is not subject to the consistency and redundancy measures that most other data usually is. Most important to users is that the data be pre-processed into a readable format when it is entered into the database. To accomplish this, programmers build processing programs and scripts to decode and process the information stored in databases. Programmers make use of various techniques in such programs, but sometimes neglect the effect some of these techniques may have on database performance. One technique generally used is to pull data from the database server, process it, and push it back to the database server in one single step. Since the processing of the data usually takes some time, this keeps the database busy and locked for the period that the processing takes place, decreasing the overall performance of the database server and therefore of the system. This paper follows on a paper discussing the performance increase that may be achieved by utilizing array lists along with a pull-process-push data processing technique split into three steps. The purpose of this paper is to expand the number of clients when comparing the two techniques, to establish the impact this may have on CPU performance, storage, and processing time.
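
A minimal sketch of the three-step pull-process-push technique discussed above, assuming a sqlite3-style DB-API connection `conn` and a hypothetical `telemetry` table: the point is that the database is locked only during the short pull and push steps, not during the slow decode step in between.

```python
# Sketch only: table and column names are assumptions, not the paper's schema.

def pull_process_push(conn, decode_record):
    # Step 1 (pull): read the raw rows, then release the database immediately.
    with conn:
        rows = conn.execute(
            "SELECT id, raw_payload FROM telemetry WHERE processed = 0"
        ).fetchall()

    # Step 2 (process): decode in memory (an array list in the original work);
    # the database stays unlocked while this potentially slow step runs.
    processed = [(decode_record(raw), row_id) for row_id, raw in rows]

    # Step 3 (push): write the results back in one short transaction.
    with conn:
        conn.executemany(
            "UPDATE telemetry SET decoded = ?, processed = 1 WHERE id = ?",
            processed,
        )
```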

Keywords: performance measures, algorithm techniques, data processing, push data, process data, array list

Procedia PDF Downloads 244
27768 Extreme Temperature Forecast in Mbonge, Cameroon Through Return Level Analysis of the Generalized Extreme Value (GEV) Distribution

Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph

Abstract:

In this paper, temperature extremes are forecast by employing the block maxima method of the generalized extreme value (GEV) distribution to analyse temperature data from the Cameroon Development Corporation (CDC). Considering two sets of data (raw and simulated) and two models (stationary and non-stationary) of the GEV distribution, return level analysis is carried out. It was found that in the stationary model the return levels are constant over time for the raw data, while for the simulated data the return levels show an increasing trend with an upper bound. In the non-stationary model, the return levels of both the raw and simulated data show an increasing trend with an upper bound. This clearly shows that although temperatures in the tropics show signs of increasing in the future, there is a maximum temperature beyond which there is no exceedance. The results of this paper are of considerable value for agricultural and environmental research.
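
For readers who want to reproduce the mechanics, the sketch below fits a stationary GEV to synthetic block maxima and reads off return levels as quantiles; the data values are illustrative, not the CDC series used in the paper.

```python
import numpy as np
from scipy.stats import genextreme

# Annual (block) maxima of daily temperature, in degrees C - illustrative only.
block_maxima = np.array([33.1, 34.0, 33.6, 35.2, 34.4, 33.9, 35.0, 34.7,
                         34.1, 35.5, 34.9, 35.3])

# Fit the stationary GEV by maximum likelihood (scipy's shape `c` is -xi).
c, loc, scale = genextreme.fit(block_maxima)

# The T-year return level is the (1 - 1/T) quantile of the fitted GEV.
for T in (10, 50, 100):
    level = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T}-year return level: {level:.2f} C")
```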

Keywords: forecasting, generalized extreme value (GEV), meteorology, return level

Procedia PDF Downloads 478
27767 Impact of Stack Caches: Locality Awareness and Cost Effectiveness

Authors: Abdulrahman K. Alshegaifi, Chun-Hsi Huang

Abstract:

Treating data based on its location in memory has received much attention in recent years because data in different regions has different properties, which offer important aspects for cache utilization. Stack data and non-stack data may interfere with each other's locality in the data cache. One important property of stack data is its high spatial and temporal locality. In this work, we simulate a non-unified cache design that splits the data cache into stack and non-stack caches, keeping stack data and non-stack data separate in different caches. We observe that the overall hit rate of the non-unified cache design is sensitive to the size of the non-stack cache. We then investigate the appropriate size and associativity for the stack cache to achieve a high hit ratio, especially when over 99% of accesses are directed to the stack cache. The result shows that, on average, a stack cache hit rate of more than 99% is achieved using 2 KB of capacity and 1-way associativity. Further, we analyze the improvement in hit rate when adding a small, fixed-size stack cache at level 1 to a unified cache architecture. The result shows that the overall hit rate of the unified cache design with an added 1 KB stack cache improves by approximately 3.9% on average for the Rijndael benchmark. The stack cache is simulated using the SimpleScalar toolset.
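
A toy sketch of the split-cache idea (far simpler than the SimpleScalar simulation used in the paper): a direct-mapped stack cache and non-stack cache are driven by a synthetic trace in which stack accesses are confined to a small, heavily reused region. The trace parameters and address ranges are assumptions.

```python
import random

BLOCK = 32  # cache block size in bytes

class DirectMappedCache:
    def __init__(self, size_bytes):
        self.lines = [None] * (size_bytes // BLOCK)
        self.hits = self.accesses = 0

    def access(self, addr):
        self.accesses += 1
        tag, index = divmod(addr // BLOCK, len(self.lines))
        if self.lines[index] == tag:
            self.hits += 1
        else:
            self.lines[index] = tag  # miss: fill the line

stack_cache = DirectMappedCache(2 * 1024)      # 2 KB, 1-way, as in the paper
nonstack_cache = DirectMappedCache(14 * 1024)

random.seed(0)
for _ in range(100_000):
    if random.random() < 0.6:
        # Stack access: small, heavily reused region near the stack top.
        stack_cache.access(0x7FF00000 + random.randrange(512))
    else:
        # Non-stack access: spread over a much larger heap region.
        nonstack_cache.access(random.randrange(1 << 20))

print(f"stack hit rate:     {stack_cache.hits / stack_cache.accesses:.3f}")
print(f"non-stack hit rate: {nonstack_cache.hits / nonstack_cache.accesses:.3f}")
```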

Keywords: hit rate, locality of program, stack cache, stack data

Procedia PDF Downloads 303
27766 The Significance of Translating Folklore in Teaching and Learning Open Distance e-Learning

Authors: M. A. Mabasa, O. Ramokolo, M. Z. Mnikathi, D. Mathabatha, T. Manyapelo

Abstract:

The study examines the importance of translating South African folklore from oral into written literature in a multilingual education. The study postulates that translation can be regarded as a valuable tool when oral and written literature is transmitted from one generation to another. Translation does not take place in a haphazard fashion; for that reason, skills such as translation principles are required to translate folklore significantly and effectively. The purpose of the study is to indicate the significance of using translation relating to folklore in teaching and learning. The study also observes that modernism in literature should be shared among a variety of cultures, because folklore is interactive in narrating stories, folktales, and myths to sharpen the reader's knowledge and intellect; they are informative and educative in nature. As a technological tool, translation is of paramount importance in the sense that the meanings of different data can be made available in all South African official languages using oral and written forms of folklore. The study considers how tradition and customary beliefs and practices can be accommodated in institutions of higher learning. The study envisages ways in which the literature of folklore can be juxtaposed to ensure that translated folklore meets quality-assured standards. The study holds that well-translated folklore can serve as oral and written literature, which may contribute to the child's learning and acquisition of knowledge and insights during cognitive development toward maturity. Methodologically, the study adopts a qualitative research approach and selects content analysis as an instrument for data gathering; the data will be analyzed qualitatively in consideration of the significance of translating folklore as written and spoken literature in a documented way. The study reveals that the translation of folktales promotes functional multilingualism in high-function formal contexts such as a university. The study emphasizes that translated and preserved literary folklore may serve as a language repository from one generation to another because of the archival storage of information in the form of a term bank.

Keywords: translation, editing, teaching, learning, folklore

Procedia PDF Downloads 31
27765 Autonomic Threat Avoidance and Self-Healing in Database Management System

Authors: Wajahat Munir, Muhammad Haseeb, Adeel Anjum, Basit Raza, Ahmad Kamran Malik

Abstract:

Databases are key components of software systems. Given the exponential growth of data, a central concern is that data remain accurate and available. The data in databases is vulnerable to internal and external threats, especially when it contains sensitive data, as in medical or military applications. Whenever data is changed with malicious intent, data analysis results may lead to disastrous decisions. Autonomic self-healing in computer systems is modeled on the autonomic system of the human body. In order to guarantee the accuracy and availability of data, we propose a technique which, on a priority basis, tries to prevent any malicious transaction from executing; in case a malicious transaction does affect the system, it heals the system in an isolated mode in such a way that the availability of the system is not compromised. Using this autonomic system, the management cost and time of DBAs can be minimized. In the end, we test our model and present the findings.

Keywords: autonomic computing, self-healing, threat avoidance, security

Procedia PDF Downloads 504
27764 Information Extraction Based on Search Engine Results

Authors: Mohammed R. Elkobaisi, Abdelsalam Maatuk

Abstract:

Search engines are large-scale information retrieval tools for the Web that are currently freely available to all. This paper explains how to convert the raw result counts returned by search engines into useful information, representing a new method of data gathering compared with traditional methods. Submitting queries for a number of keywords one at a time takes a long time and much effort; hence, we developed a user interface program that automates the search, taking multiple keywords at the same time and collecting the wanted data automatically. The collected raw data is processed using mathematical and statistical methods to eliminate unwanted data and convert it into usable data.
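
A hedged sketch of the batch-search idea: a hypothetical search_hit_count helper stands in for the real engine query, and simple statistics drop outlying counts as unwanted data. The function names and the filtering rule are assumptions, not the authors' program.

```python
import statistics

def collect_counts(keywords, search_hit_count):
    """Query the hit count for every keyword, then filter statistical outliers."""
    counts = {kw: search_hit_count(kw) for kw in keywords}
    mean = statistics.mean(counts.values())
    stdev = statistics.pstdev(counts.values())
    # Keep only keywords whose counts are not extreme outliers (unwanted data).
    usable = {kw: n for kw, n in counts.items() if abs(n - mean) <= 2 * stdev}
    return counts, usable

# Usage with a stubbed search function standing in for a real engine:
fake_engine = {"open data": 120_000, "open source data": 95_000, "xyzzy": 3}
counts, usable = collect_counts(list(fake_engine), lambda kw: fake_engine[kw])
print(usable)
```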

Keywords: search engines, information extraction, agent system

Procedia PDF Downloads 430
27763 Online Delivery Approaches of Post Secondary Virtual Inclusive Media Education

Authors: Margot Whitfield, Andrea Ducent, Marie Catherine Rombaut, Katia Iassinovskaia, Deborah Fels

Abstract:

In North America, learning how to create inclusive media, such as closed captioning (CC) and audio description (AD), is restricted to private-sector, proprietary company-based training. We are delivering (through synchronous and asynchronous online learning) the first Canadian post-secondary, practice-based continuing education course package in inclusive media for broadcast production and processes. Despite the prevalence of CC and AD taught within the field of translation studies in Europe, North America has no comparable field of study. This novel approach to audio visual translation (AVT) education develops evidence-based methodology innovations, stemming from user study research with blind/low vision and Deaf/hard of hearing audiences for television and theatre, undertaken at Ryerson University. Knowledge outcomes from the courses include a) Understanding how CC/AD fit within disability/regulatory frameworks in Canada. b) Knowledge of how CC/AD could be employed in the initial stages of production development within broadcasting. c) Writing and/or speaking techniques designed for media. d) Hands-on practice in captioning re-speaking techniques and open source technologies, or in AD techniques. e) Understanding of audio production technologies and editing techniques. The case study of the curriculum development and deployment, involving first-time online course delivery from academic and practitioner-based instructors in the introductory Captioning and Audio Description courses (CDIM 101 and 102), will compare the two instructors' approaches to learning design, including the ratio of synchronous and asynchronous classroom time and technological engagement tools on meeting software platforms, such as breakout rooms and polling. Student reception of these two approaches will be analysed using qualitative thematic and quantitative survey analysis. Thus far, anecdotal conversations with students suggest that they prefer synchronous over asynchronous learning within our hands-on online course delivery method.

Keywords: inclusive media theory, broadcasting practices, AVT post secondary education, respeaking, audio description, learning design, virtual education

Procedia PDF Downloads 183
27762 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography

Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya

Abstract:

In today's era, data security is an important concern and among the most demanding issues, because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques used for secure communication are cryptography and steganography. Cryptographic algorithms scramble data so that an intruder will not be able to retrieve it; steganography, however, hides the data in some cover file so that the presence of communication itself is concealed. This paper presents the implementation of the Ron Rivest, Adi Shamir, and Leonard Adleman (RSA) algorithm with image and audio steganography, and of the Data Encryption Standard (DES) algorithm with image and audio steganography. The coding for both algorithms has been done using MATLAB, and it is observed that these combined techniques performed better than the individual techniques. The risk of unauthorized access is alleviated to a certain extent by using these techniques. These techniques could be used in banks, agencies such as RAW, etc., where highly confidential data is transferred. Finally, a comparison of the two techniques is also given in tabular form.
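
As a simplified illustration of the steganography half of the scheme (the paper's MATLAB code is not reproduced here), the sketch below embeds a payload in the least significant bits of a cover image; in the paper the payload would be the DES- or RSA-encrypted ciphertext, for which plain bytes stand in here.

```python
import numpy as np

def embed(cover: np.ndarray, payload: bytes) -> np.ndarray:
    """Write the payload bits into the LSBs of a uint8 cover image."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.flatten()
    if bits.size > flat.size:
        raise ValueError("payload too large for cover image")
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits  # overwrite LSBs
    return flat.reshape(cover.shape)

def extract(stego: np.ndarray, n_bytes: int) -> bytes:
    """Read n_bytes back out of the stego image's LSBs."""
    bits = stego.flatten()[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

cover = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
secret = b"ciphertext-from-DES-or-RSA"   # stand-in for the encrypted payload
stego = embed(cover, secret)
assert extract(stego, len(secret)) == secret
```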

Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography

Procedia PDF Downloads 290
27761 Data Monetisation by E-commerce Companies: A Need for a Regulatory Framework in India

Authors: Anushtha Saxena

Abstract:

This paper examines the process of data monetisation by e-commerce companies operating in India. Data monetisation is collecting, storing, and analysing consumers' data in order to use the data that is generated for profits, revenue, etc. Data monetisation enables e-commerce companies to obtain better business opportunities, innovative products and services, and a competitive edge over others, and to generate millions in revenues. This paper analyses the issues and challenges that arise from the process of data monetisation. Some of the issues highlighted in the paper pertain to the right to privacy and the protection of the data of e-commerce consumers. At the same time, data monetisation cannot be prohibited, but it can be regulated and monitored by stringent laws and regulations. The right to privacy is a fundamental right guaranteed to the citizens of India through Article 21 of the Constitution of India. The Supreme Court of India recognized the right to privacy as a fundamental right in the landmark judgment of Justice K.S. Puttaswamy (Retd) and Another v. Union of India. This paper highlights the legal issue of how e-commerce businesses violate individuals' right to privacy by using the data they collect and store for economic gains and monetisation, and the related issue of data protection. The researcher has mainly focused on e-commerce companies such as online shopping websites to analyse the legal issue of data monetisation. In the age of the Internet of Things and the digital era, people have shifted to online shopping as it is convenient, easy, flexible, comfortable, time-saving, etc. But at the same time, e-commerce companies store the data of their consumers and use it by selling it to third parties or generating more data from the data stored with them. This violates individuals' right to privacy, because consumers know nothing of what happens to their data when they provide it online. Many times, data is also collected without the consent of individuals. Data, whether structured or unstructured, is used by analytics for monetisation. Indian legislation such as the Information Technology Act, 2000, does not effectively protect e-consumers with respect to their data and how it is used by e-commerce businesses to monetise and generate revenues. The paper also examines the draft Data Protection Bill, 2021, pending in the Parliament of India, and how this Bill could have a substantial impact on data monetisation. This paper also aims to study the European Union General Data Protection Regulation and how this legislation can be helpful in the Indian scenario concerning e-commerce businesses with respect to data monetisation.

Keywords: data monetization, e-commerce companies, regulatory framework, GDPR

Procedia PDF Downloads 120
27760 Psychological Resilience Factors Associated with Climate Change Adaptations by Subsistence Farmers in a Rural Community, South Africa

Authors: Kgopa Bontle, Tholen Sodi

Abstract:

Climate change poses a major threat to the well-being of both people and the environment, with subsistence farmers most affected, as they rely on local supply systems that are sensitive to climate variation. This study documented psychological resilience factors associated with climate change adaptations by subsistence farmers in Maruleng Municipality, Limpopo Province. A qualitative study was conducted to examine subsistence farmers' notions of climate change, their psychological resilience factors, their strategies for coping with climate change and methods of adaptation, and to develop a model of subsistence farmers' psychological resilience factors. Data were collected through direct interaction with participants, using a grounded theory research design. Open-ended interviews were used to collect data from a sample of 15 participants selected through theoretical sampling in Maruleng Municipality. The participants, drawn from two villages, were Sepedi and Xitsonga speakers, mostly unemployed pensioners dependent on social grants; the study included both males and females, predominantly elderly. The research findings indicate that farmers have limited knowledge of what climate change is and what causes it. The research also reflects that, although their responses were non-scientific, they were sensible enough to show the farmers understood what they were dealing with. They mentioned extreme weather, including hot days, less rainfall, and changes in seasons, as some of the impacts brought by climate change. The results also indicated that participants have learned to adapt through several adaptation strategies, including mulching, changing irrigation time slots, and being innovative. The resilience factors that emerged from the study were a passion for farming, hope, enthusiasm, courage, acceptance/tolerance, livelihood, and belief systems. Looking at the socio-economic conditions of the study setting, the study concludes that government should assist subsistence farmers, as participants felt neglected by the government and policymakers because, as small-scale farmers, they are not supported in the way commercial farmers are.

Keywords: climate change, psychological resilience factors, human adaptation, subsistence farmers

Procedia PDF Downloads 122
27759 Relevance of Lecture Method in Modern Era: A Study from Nepal

Authors: Hari Prasad Nepal

Abstract:

Research on the lecture method confirms that this teaching method has been practiced from the very beginnings of schooling. Many teachers, lecturers, and professors are convinced that the lecture still represents the main tool of the contemporary instructional process. The central purpose of this study is to uncover the extent of the use of the lecture method in higher education. The study was carried out in the Nepalese context, employing a mixed-methods research design. To obtain the primary data, this study employed a questionnaire involving items with closed and open answers. 120 teachers, lecturers, and professors participated in this study. The findings indicated that 75 percent of the respondents use the lecture method in their classroom teaching. The study reveals that the lecture method has advantages: it is easy to practice, takes less time to prepare, yields high pass rates and high student satisfaction, draws few complaints about instructors, and is appropriate for large classes and high-level students. In addition, the study divulged the instructors' reflections on, and measures to improve, the lecture method. This research concludes that the practice of the lecture method is still significantly prevalent in colleges and universities in the Nepalese context. There are thus no significant changes in the application of the lecture method in higher education classrooms despite the emergence of new learning approaches and strategies.

Keywords: instructors, learning approaches, learning strategies, lecture method

Procedia PDF Downloads 238
27758 Wood Energy, Trees outside Forests and Agroforestry Wood Harvesting and Conversion Residues Preparing and Storing

Authors: Adeiza Matthew, Oluwadamilola Abubakar

Abstract:

Wood energy, also known as wood fuel, is a renewable energy source that is derived from woody biomass, which is organic matter that is harvested from forests, woodlands, and other lands. Woody biomass includes trees, branches, twigs, and other woody debris that can be used as fuel. Wood energy can be classified based on its sources, such as trees outside forests, residues from wood harvesting and conversion, and energy plantations. There are several policy frameworks that support the use of wood energy, including participatory forest management and agroforestry. These policies aim to promote the sustainable use of woody biomass as a source of energy while also protecting forests and wildlife habitats. There are several options for using wood as a fuel, including central heating systems, pellet-based systems, wood chip-based systems, log boilers, fireplaces, and stoves. Each of these options has its own benefits and drawbacks, and the most appropriate option will depend on factors such as the availability of woody biomass, the heating needs of the household or facility, and the local climate. In order to use wood as a fuel, it must be harvested and stored properly. Hardwood or softwood can be used as fuel, and the heating value of firewood depends on the species of tree and the degree of moisture content. Proper harvesting and storage of wood can help to minimize environmental impacts and improve wildlife habitats. The use of wood energy has several environmental impacts, including the release of greenhouse gases during combustion and the potential for air pollution from combustion by-products. However, wood energy can also have positive environmental impacts, such as the sequestration of carbon in trees and the reduction of reliance on fossil fuels. The regulation and legislation of wood energy vary by country and region, and there is an ongoing debate about the potential use of wood energy in renewable energy technologies. Wood energy is a renewable energy source that can be used to generate electricity, heat, and transportation fuels. Woody biomass is abundant and widely available, making it a potentially significant source of energy for many countries. The use of wood energy can create local economic and employment opportunities, particularly in rural areas. Wood energy can be used to reduce reliance on fossil fuels and reduce greenhouse gas emissions. Properly managed forests can provide a sustained supply of woody biomass for energy, helping to reduce the risk of deforestation and habitat loss. Wood energy can be produced using a variety of technologies, including direct combustion, co-firing with fossil fuels, and the production of biofuels. The environmental impacts of wood energy can be minimized through the use of best practices in harvesting, transportation, and processing. Wood energy is regulated and legislated at the national and international levels, and there are various standards and certification systems in place to promote sustainable practices. Wood energy has the potential to play a significant role in the transition to a low-carbon economy and the achievement of climate change mitigation goals.

Keywords: biomass, timber, charcoal, firewood

Procedia PDF Downloads 100
27757 Changes in Foreign Direct Investment Policy of India and Its Impact on Economic Development

Authors: Kishor P. Kadam

Abstract:

Foreign direct investment (FDI) is defined as an investment involving a long-term relationship and reflecting a lasting interest and control of a resident entity in the home country (the foreign direct investor or parent firm) in an enterprise in the host country. India has had one of the most transparent and liberal FDI regimes among the emerging and developing economies, with clearly specified sectoral caps for foreign investment. The policy problems that surveys have identified from time to time as additional hurdles for FDI are laws, regulatory systems, and government monopolies that lack contemporary relevance. Foreign investment policies in the post-reform period have emphasized greater encouragement and mobilization of non-debt-creating private inflows to reduce reliance on debt flows. This paper focuses on how foreign direct investment policy has changed from 1990-91 to the present. A time series of 25 years of data is used to analyse the policy changes. It is observed that India now has a more liberal policy. The growth in the number of greenfield investments in India has been more impressive than the number of M&A deals, and FDI inflows of equity capital into incorporated bodies increased continuously up to 2014-15. India has made major changes in FDI policy, and these have had a positive impact on economic development.

Keywords: FDI, India, economic development, government

Procedia PDF Downloads 361
27756 Experiments on Weakly-Supervised Learning on Imperfect Data

Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler

Abstract:

Supervised predictive models require labeled data for training. Complete and accurate labeled data, i.e., a 'gold standard', is not always available, and imperfectly labeled data may need to serve as an alternative. An important question is whether the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments which demonstrated that models trained on imperfect data can (but do not always) outperform the accuracy of the training data; e.g., the area under the curve for some models is higher than 80% when trained on data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.
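
The simulation result is easy to reproduce in spirit: the hedged sketch below trains a linear SVM on labels with 40% symmetric noise and scores it against clean labels, showing that a model can exceed the accuracy of its own training labels. The data generation and parameters are illustrative, not the study's simulated dataset.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Linearly separable data with true labels.
X = rng.normal(size=(5000, 20))
w_true = rng.normal(size=20)
y_true = (X @ w_true > 0).astype(int)

# Corrupt 40% of the *training* labels to mimic a weak gold standard.
X_tr, X_te, y_tr, y_te = train_test_split(X, y_true, test_size=0.3,
                                          random_state=0)
noisy = y_tr.copy()
flip = rng.random(noisy.size) < 0.40
noisy[flip] = 1 - noisy[flip]
print("training-label accuracy:", (noisy == y_tr).mean())   # ~0.60

# The linear model trained on noisy labels is scored on clean test labels;
# it typically recovers far more than 60% accuracy.
model = LinearSVC(dual=False).fit(X_tr, noisy)
print("model accuracy on clean test labels:", model.score(X_te, y_te))
```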

Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation

Procedia PDF Downloads 199
27755 The High Precision of Magnetic Detection with Microwave Modulation in Solid Spin Assembly of NV Centres in Diamond

Authors: Zongmin Ma, Shaowen Zhang, Yueping Fu, Jun Tang, Yunbo Shi, Jun Liu

Abstract:

Solid-state quantum sensors are attracting wide interest because of their high sensitivity at room temperature. In particular, the spin properties of nitrogen-vacancy (NV) color centres in diamond make them outstanding sensors of magnetic fields, electric fields, and temperature under ambient conditions. Much of the work on NV magnetic sensing has aimed at achieving the smallest volume and highest sensitivity of NV-ensemble-based magnetometry, using micro-cavities, light-trapping diamond waveguides (LTDW), and nano-cantilevers combined with MEMS (Micro-Electro-Mechanical System) techniques. Recently, frequency-modulated microwaves with continuous optical excitation have been proposed to achieve a high sensitivity of 6 μT/√Hz using individual NV centres at the nanoscale. In this research, we built an experiment to measure a static magnetic field using the frequency-modulated microwave method under continuous illumination with green pump light at 532 nm, on a bulk diamond sample with a high density of NV centers (1 ppm). The output of the confocal microscope was collected by an objective (NA = 0.7) and detected by a high-sensitivity photodetector. We designed a microstrip antenna for uniform and efficient excitation, which couples well to the spin ensembles at 2.87 GHz, the zero-field splitting of the NV centers. The photodetector output was sent to a lock-in amplifier (LIA); the modulated signal was generated from the microwave source by an IQ mixer. The detected signal is received by the photodetector, and the reference signal enters the lock-in amplifier to realize open-loop detection with the NV atomic magnetometer. We can thus record ODMR spectra under continuous-wave (CW) microwaves. Owing to the high sensitivity of the lock-in amplifier, the minimum detectable voltage can be measured, and the minimum detectable frequency shift can be obtained from this minimum voltage and the slope of the signal. The magnetic field sensitivity can be derived from η = δB√T; this corresponds to a 10 nT minimum detectable shift in the magnetic field. Further, frequency analysis of the noise in the system indicates a sensitivity below 10 nT/√Hz at 10 Hz.

Keywords: nitrogen-vacancy (NV) centers, frequency-modulated microwaves, magnetic field sensitivity, noise density

Procedia PDF Downloads 440
27754 Transforming Healthcare Data Privacy: Integrating Blockchain with Zero-Knowledge Proofs and Cryptographic Security

Authors: Kenneth Harper

Abstract:

Blockchain technology presents solutions for managing healthcare data, addressing critical challenges in privacy, integrity, and access. This paper explores how privacy-preserving technologies, such as zero-knowledge proofs (ZKPs) and homomorphic encryption (HE), enhance decentralized healthcare platforms by enabling secure computations and patient data protection. We examine the mathematical foundations of these methods, their practical applications, and how they meet the evolving demands of healthcare data security. Using real-world examples, this research highlights industry-leading implementations and offers a roadmap for future applications in secure, decentralized healthcare ecosystems.

Keywords: blockchain, cryptography, data privacy, decentralized data management, differential privacy, healthcare, healthcare data security, homomorphic encryption, privacy-preserving technologies, secure computations, zero-knowledge proofs

Procedia PDF Downloads 18
27753 Kinetics of Growth Rate of Microalga: The Effect of Carbon Dioxide Concentration

Authors: Retno Ambarwati Sigit Lestari

Abstract:

Microalgae are among the organisms that can be considered ideal and potential raw materials for bioenergy production, because their lipid content is relatively high. A microalga is an aquatic organism that produces complex organic compounds from inorganic molecules, using carbon dioxide as a carbon source and sunlight for energy supply. Microalga-based CO₂ fixation has potential advantages over other carbon capture and storage approaches, such as wide distribution, high photosynthetic rate, good environmental adaptability, and ease of operation. The rates of growth and CO₂ capture of microalgae are influenced by CO₂ concentration and light intensity. This study quantitatively investigates the effects of CO₂ concentration on the rates of growth and CO₂ capture of a type of microalga cultivated in bioreactors. The work includes laboratory experiments as well as mathematical modelling. The mathematical models were solved numerically, and the accuracy of the model was tested against the experimental data. It turned out that the proposed mathematical model can quantitatively describe the growth and CO₂ capture of the microalga well, and the effects of CO₂ concentration can be observed from it.
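
Since the paper's equations are not given here, the sketch below shows one plausible model form only: logistic biomass growth with a Monod-type CO₂ dependence, solved numerically. The model structure and all parameter values are assumptions, not the authors' fitted model.

```python
import numpy as np
from scipy.integrate import odeint

MU_MAX, K_CO2, X_MAX = 1.2, 0.5, 3.0   # illustrative parameters

def model(state, t, co2):
    x = state[0]                              # biomass concentration, g/L
    mu = MU_MAX * co2 / (K_CO2 + co2)         # Monod-type CO2 dependence
    dxdt = mu * x * (1 - x / X_MAX)           # logistic growth
    return [dxdt]

t = np.linspace(0, 10, 101)                   # days
for co2 in (0.2, 1.0, 5.0):                   # % CO2 in the sparged gas
    x = odeint(model, [0.05], t, args=(co2,))
    print(f"CO2 = {co2}%: biomass after 10 d = {x[-1, 0]:.2f} g/L")
```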

Keywords: microalga, CO₂ concentration, photobioreactor, mathematical model

Procedia PDF Downloads 125