Search results for: market data
25870 Wireless Transmission of Big Data Using Novel Secure Algorithm
Authors: K. Thiagarajan, K. Saranya, A. Veeraiah, B. Sudha
Abstract:
This paper presents a novel algorithm for secure, reliable and flexible transmission of big data in two-hop wireless networks using a cooperative jamming scheme. A two-hop wireless network consists of source, relay and destination nodes. Big data has to be transmitted from source to relay and from relay to destination with security deployed at the physical layer. The cooperative jamming scheme makes the transmission of big data more secure by protecting it from eavesdroppers and malicious nodes of unknown location. The novel algorithm, which ensures secure and energy-balanced transmission of big data, includes selecting the data-transmitting region, segmenting the selected region, determining a probability ratio for each node (capture, non-capture and eavesdropper node) in every segment, and evaluating that probability using a binary evaluation. If the evaluation indicates a secure path, the two-hop transmission of big data resumes; otherwise, the attackers are suppressed by the cooperative jamming scheme and the data is then transmitted over the two hops.
Keywords: big data, two-hop transmission, physical layer wireless security, cooperative jamming, energy balance
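The decision procedure described above (region selection, segmentation, per-segment probability ratios over capture, non-capture and eavesdropper nodes, then a binary secure/not-secure evaluation) can be sketched roughly as follows; the node counts, threshold and function name are illustrative assumptions, not the authors' implementation:

```python
def secure_two_hop_decision(segments, threshold=0.5):
    """For each segment of the selected transmission region, evaluate a
    simplified probability ratio from the node counts and return a binary
    decision: 1 = resume two-hop transmission, 0 = apply cooperative
    jamming first. `segments` maps a segment id to counts of
    (capture, non_capture, eavesdropper) nodes."""
    decisions = {}
    for seg, (capture, non_capture, eaves) in segments.items():
        total = capture + non_capture + eaves
        p_secure = capture / total if total else 0.0
        # Binary evaluation: secure only if the capture-node ratio clears
        # the threshold and no eavesdropper is present in the segment.
        decisions[seg] = 1 if p_secure >= threshold and eaves == 0 else 0
    return decisions
```

Any other per-segment rule could be substituted here; the point is the binary gate between resuming two-hop transmission and falling back to cooperative jamming.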
Procedia PDF Downloads 490
25869 Etiquette Learning and Public Speaking: Early Etiquette Learning and Its Impact on Higher Education and Working Professionals
Authors: Simran Ballani
Abstract:
The purpose of this paper is to call on education professionals to implement etiquette and public speaking skills for preschool, primary, middle and high school students. The author aims to present the importance of an etiquette learning and public speaking curriculum for preschoolers, reflect on experiences from implementing the curriculum, and discuss the effect of that implementation on higher education and the global job market. The author's aim in introducing this curriculum was to provide children with innovative learning and all-around development. Training in these soft skills at kindergarten level can have a long-term effect on social behaviors, which in turn can contribute to professional success once students are ready for campus recruitment and global job markets. Additionally, if preschoolers learn polite, appropriate behavior at an early age, it will enable them to become more socially attentive and display good manners as adults. It is easier to nurture these skills in a child than to change bad manners in adulthood. Preschool and kindergarten education can provide the platform for children to learn these crucial soft skills irrespective of their ethnicity or economic or social background. Skills developed in such early years can go a long way toward shaping children into better and more confident individuals. Unfortunately, access to etiquette learning and public speaking education is not standardized at the pre-primary or primary level, and it is rarely embedded in kindergarten curricula. All young children should be given an equal opportunity to learn these soft skills, which are essential for finding their place in the job market.
Keywords: early childhood learning, public speaking, confidence building, innovative learning
25868 An Analysis of the Wheat Export Performance of Ukraine in Europe
Authors: Kiran Bala Das
Abstract:
This paper examines the condition of Ukrainian wheat exports after the Russian-Ukrainian military confrontation. The political conflict in Ukraine and Russia's recent military intervention in Crimea are raising concern; the full effect of the events there is still uncertain, but some hints can be seen in the wheat market by analyzing the trend and pattern of Ukrainian wheat exports. Crimea is extremely important because most of Ukraine's grain is exported by ship from its Black Sea ports. Ukraine is again seeking to establish itself as a significant exporter of agricultural products. Its rich black soil, the chernozem topsoil layer, makes the country's land so fertile that Ukraine has become one of the major wheat exporters in the world; this generous supply of wheat has earned it the title 'breadbasket of Europe'. Ukraine possesses 30% of the world's richest black soil, and its agricultural industry has huge potential, especially in grains. The European Union (EU) is a significant trading partner of Ukraine, but geopolitical tension adversely affects the wheat trade from the Black Sea, which threatens Europe's breadbasket. This study also uses an index of export intensity to analyze the intensity of existing trade between Ukraine and the EU countries for the period 2011-2014. The results show that exports intensified over the years, but trade intensity was low in the most recent year. The overall consequence is hard to determine, but if the situation deteriorates and Ukraine cuts off exports, international wheat prices will spike. Recent developments indicate how the grain market is affected under the current circumstances and that Ukraine's agricultural future is in danger; it is forecast that Ukraine will harvest a low wheat crop this year, with a projected decline in wheat exports.
Keywords: breadbasket of Europe, export intensity index, growth rate, wheat export
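The export intensity index referred to above is commonly computed as the share of Ukraine's exports going to a partner, divided by that partner's share of world imports; a value above 1 signals more-intense-than-expected trade. A minimal sketch of the standard formulation (the figures are illustrative, not the paper's data):

```python
def export_intensity_index(x_ij, X_i, m_j, M_w):
    """Export intensity index of country i with partner j:
    (x_ij / X_i) / (m_j / M_w), where x_ij is i's exports to j,
    X_i is i's total exports, m_j is j's total imports and M_w is
    total world imports. A value above 1 means i exports more to j
    than j's weight in world trade would predict."""
    return (x_ij / X_i) / (m_j / M_w)

# Illustrative figures only (billions, hypothetical):
tii = export_intensity_index(x_ij=2.0, X_i=10.0, m_j=50.0, M_w=1000.0)
```

Here 20% of the exporter's trade goes to a partner holding 5% of world imports, giving an index of 4, i.e. strongly intensified trade.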
25867 A Qualitative Inquiry of Institutional Responsiveness in Public Land Development in the Urban Areas in Sri Lanka
Authors: Priyanwada I. Singhapathirana
Abstract:
Public land ownership is a common phenomenon in many countries of the world; however, the development approaches and the institutional structures that govern it are greatly diverse. The existing scholarship on public land development has been largely limited to Europe and the advanced Asian economies. Inferences from such studies seem inadequate and inappropriate for comprehending the peculiarities of public land development in developing Asian economies. The absence of critical inquiry into public land ownership and the long-established institutional structures that govern development has restrained these countries from institutional innovation. In this context, this research investigates the issues related to public land development and the institutional responses in Sri Lanka. The study introduces the concept of 'institutional responsiveness' in public land development, conceptualized as the ability of institutions to respond to spatial, market and fiscal stimuli. The inquiry was carried out through in-depth interviews with five key informants from apex public agencies in order to explore the responsiveness of land institutions from decision-makers' perspectives. Analysis of grey literature and recent media reports supplements the interviews. The findings show that long-term abandonment of public lands and high transaction costs are among the key issues in public land development. The inability of institutions to respond to market and fiscal stimuli has left many potential public lands underutilized. As a result, neither the public sector itself nor urban citizens have been able to relish the benefits of public lands in cities. Spatial analysis at the local scale is suggested for future studies in order to capture the multiple dimensions of institutions' responsiveness to development stimuli.
Keywords: institutions, public land, responsiveness, under-utilization
25866 Live Music Promotion in Burundi Country
Authors: Aster Anderson Rugamba
Abstract:
Context: Live music in Burundi is currently facing neglect and a decline in popularity, resulting in artists struggling to generate income from this field. Additionally, live music from Burundi has not been able to gain traction in the international market. It is essential to establish various structures and organizations to promote cultural events and support artistic endeavors in music and performing arts. Research Aim: The aim of this research is to seek new knowledge and understanding in the field of live music and its content in Burundi. Furthermore, it aims to connect with other professionals in the industry, make new discoveries, and explore potential collaborations and investments. Methodology: The research will utilize both quantitative and qualitative research methodologies. The quantitative approach will involve a sample size of 57 musician artists in Burundi. It will employ closed-ended questions and gather quantitative data to ensure a large sample size and high external validity. The qualitative approach will provide deeper insights and understanding through open-ended questions and in-depth interviews with selected participants. Findings: The research expects to find new theories, methodologies, empirical findings, and applications of existing knowledge that can contribute to the development of live music in Burundi. By exploring the challenges faced by artists and identifying potential solutions, the study aims to establish live music as a catalyst for development and generate a positive impact on both the Burundian and international community. Theoretical Importance: Theoretical contributions of this research will expand the current understanding of the live music industry in Burundi. It will propose new theories and models to address the issues faced by artists and highlight the potential of live music as a lucrative and influential industry. 
By bridging the gap between theory and practice, the research aims to provide valuable insights for academics, professionals, and policymakers. Data Collection and Analysis Procedures: Data will be collected through surveys, interviews, and archival research. Surveys will be administered to the sample of 57 musician artists, while interviews will be conducted to gain in-depth insights from selected participants. The collected data will be analyzed using both quantitative and qualitative methods, including statistical analysis and thematic analysis, respectively. This mixed-method approach will ensure a comprehensive and rigorous examination of the research questions addressed.
Keywords: business music in Burundi, music in Burundi, promotion of art, Burundi music culture
25865 Valorization of the Waste Generated in Building Energy-Efficiency Rehabilitation Works as Raw Materials for Gypsum Composites
Authors: Paola Villoria Saez, Mercedes Del Rio Merino, Jaime Santacruz Astorqui, Cesar Porras Amores
Abstract:
In construction, the circular economy covers the whole cycle of building construction: from production and consumption to waste management and the market for secondary raw materials. The circular economy will definitely contribute to 'closing the loop' of construction product lifecycles through greater recycling and re-use, helping to build a market for reused construction materials salvaged from demolition sites, boosting global competitiveness and fostering sustainable economic growth. In this context, this paper presents the latest research of the 'Waste to Resources (W2R)' project funded by the Spanish Government, which seeks new solutions to improve energy efficiency in buildings by developing new building materials and products that are less expensive, more durable, of higher quality and more environmentally friendly. This project differs from others in that its main objective is to reduce to almost zero the construction and demolition waste (CDW) generated in building rehabilitation works. In order to achieve this objective, the group is looking for new ways of recycling CDW as raw material for new conglomerate materials. With these new materials, construction elements that reduce building energy consumption will be proposed. In this paper, the results obtained in the project are presented. Several tests were performed on gypsum samples containing different percentages of the CDW generated in Spanish building retrofitting works. The results were further analyzed, and one of the gypsum composites is highlighted and discussed. Acknowledgements: This research was supported by the Spanish State Secretariat for Research, Development and Innovation of the Ministry of Economy and Competitiveness under the 'Waste 2 Resources' project (BIA2013-43061-R).
Keywords: building waste, CDW, gypsum, recycling, resources
25864 The Importance of Value Added Services Provided by Science and Technology Parks to Boost Entrepreneurship Ecosystem in Turkey
Authors: Faruk Inaltekin, Imran Gurakan
Abstract:
This paper discusses the importance of the value-added services provided by Science and Technology Parks for entrepreneurship development in Turkey. Entrepreneurship is a vital subject for all countries: it not only fosters economic development but also promotes innovation at local and international levels. To foster a high-tech entrepreneurship ecosystem, the technopark (Science and Technology Park, STP) concept was initiated with the establishment of Silicon Valley in the 1950s. The success and rise of Silicon Valley led to the spread of technopark activities, and developed economies have been setting up projects to plan and build STPs since the 1960s and 1970s. To promote the establishment of STPs in Turkey, the Ministry of Science, Industry, and Technology enacted the Technology Development Zones Law (No. 4691) in 2001, which was revised in 2016 to provide more support. The basic aim of STPs is to provide customers with high-quality office space and various 'value-added services' such as business development, network connections, cooperation programs, investor and customer meetings, and internationalization services. To this end, STPs should help startups deal with difficulties in the early stages and support mature companies' export activities in foreign markets. STPs should support the production, commercialization and, more significantly, the internationalization of technology-intensive businesses and foster the growth of companies. Within these value-added services, internationalization is currently a very popular subject worldwide: most STPs design clusters or accelerator programs to support their companies' penetration of foreign markets. If startups are not ready for international competition, STPs should help them get ready through training and mentoring sessions. These sessions should take a goal-based approach to working with companies. Each company has different needs and goals;
therefore, the definition of 'success' varies for each company. For this reason, it is very important to create customized value-added services that meet the needs of startups. After local support, STPs should also be able to support their startups in foreign markets; organizing a well-defined international accelerator program plays an important role in this mission. Turkey is strategically placed between key markets in Europe, Russia, Central Asia and the Middle East, and its population is young and well educated, so both government agencies and the private sector endeavor to foster and encourage the entrepreneurship ecosystem with many forms of support. In sum, the task of technoparks in providing these and similar value-added services is very important for developing the entrepreneurship ecosystem. The priority of all value-added services is to identify the commercialization and growth obstacles faced by entrepreneurs and remove them with one-to-one customized services. Also, in order to have a healthy startup ecosystem and create sustainable entrepreneurship, stakeholders (technoparks, incubators, accelerators, investors, universities, governmental organizations, etc.) should fulfill their roles and duties and collaborate with each other. STPs play an important role as a bridge between these stakeholders and entrepreneurs, and they should continually benchmark and renew the services they offer in order to help startups survive, develop their business and benefit from these stakeholders.
Keywords: accelerator, cluster, entrepreneurship, startup, technopark, value added services
25863 One Step Further: Pull-Process-Push Data Processing
Authors: Romeo Botes, Imelda Smit
Abstract:
In today's modern age of technology, vast amounts of data need to be processed in real time to keep users satisfied. This data comes from various sources and in many formats, including electronic and mobile devices such as GPRS modems and GPS devices, which use different protocols, including TCP, UDP, and HTTP/S, for data communication to web servers and eventually to users. The data obtained from these devices may provide valuable information to users but is mostly in an unreadable format that needs to be processed to yield information and business intelligence. This data is not always current; it is mostly historical, and it is not subject to the consistency and redundancy measures that most other data usually is. Most important to users is that the data be pre-processed into a readable format when it is entered into the database. To accomplish this, programmers build processing programs and scripts to decode and process the information stored in databases. Programmers use various techniques in such programs, but sometimes neglect the effect some of these techniques may have on database performance. One technique generally used is to pull data from the database server, process it and push it back to the database server in one single step. Since the processing of the data usually takes some time, it keeps the database busy and locked for the period that the processing takes place. This decreases the overall performance of the database server and therefore of the system. This paper follows on a paper discussing the performance increase that may be achieved by utilizing array lists along with a pull-process-push data processing technique split into three steps.
The purpose of this paper is to expand the number of clients when comparing the two techniques, to establish the impact this may have on CPU, storage and processing-time performance.
Keywords: performance measures, algorithm techniques, data processing, push data, process data, array list
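The three-step technique the paper builds on can be sketched as follows: pull in one short read, process in application memory (an array list), and push in a second short write, so the database is never held locked during processing. A minimal illustration with an in-memory dictionary standing in for the database:

```python
def pull(db):
    """Step 1: read the raw rows in one short transaction, then
    release the database immediately."""
    return list(db["raw"])

def process(rows):
    """Step 2: decode/transform entirely in application memory
    (an array list), keeping the database free for other clients."""
    return [row.strip().upper() for row in rows]

def push(db, rows):
    """Step 3: write the processed rows back in a second short
    transaction."""
    db["processed"] = rows

# Illustrative device payloads; a real system would read/write a DBMS.
db = {"raw": [" gps:12.5 ", " gprs:ok "], "processed": []}
push(db, process(pull(db)))
```

The contrast with the single-step pull-process-push is that the database is touched only in steps 1 and 3; the (slow) processing in step 2 holds no locks.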
25862 Extreme Temperature Forecast in Mbonge, Cameroon Through Return Level Analysis of the Generalized Extreme Value (GEV) Distribution
Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph
Abstract:
In this paper, temperature extremes are forecast by employing the block maxima method of the generalized extreme value (GEV) distribution to analyse temperature data from the Cameroon Development Corporation (CDC). By considering two sets of data (raw data and simulated data) and two models of the GEV distribution (stationary and non-stationary), a return level analysis is carried out. It was found that, in the stationary model, the return values are constant over time for the raw data, while for the simulated data the return values show an increasing trend with an upper bound. In the non-stationary model, the return levels of both the raw and the simulated data show an increasing trend with an upper bound. This clearly shows that although temperatures in the tropics show signs of increasing in the future, there is a maximum temperature that is never exceeded. The results of this paper are vital for agricultural and environmental research.
Keywords: forecasting, generalized extreme value (GEV), meteorology, return level
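For a fitted GEV(mu, sigma, xi), the T-block return level used in such an analysis has a closed form, and a negative shape parameter xi produces exactly the behaviour described: return levels that rise with T but stay under the upper bound mu - sigma/xi. A sketch of the standard formula (the parameter values in the test are illustrative, not fitted to the CDC data):

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """Return level z_T of the GEV(mu, sigma, xi) distribution:
    the value expected to be exceeded once every T blocks (e.g. years).
    z_T = mu - (sigma / xi) * (1 - y_T**(-xi)),  y_T = -log(1 - 1/T),
    with the xi -> 0 (Gumbel) limit handled separately."""
    y = -math.log(1.0 - 1.0 / T)
    if abs(xi) < 1e-9:                      # Gumbel limit
        return mu - sigma * math.log(y)
    return mu - (sigma / xi) * (1.0 - y ** (-xi))
```

With xi < 0 the levels increase in T yet never reach mu - sigma/xi, matching the "increasing trend with an upper bound" reported in the abstract.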
25861 Internationalization Process Model for Construction Firms: Stages and Strategies
Authors: S. Ping Ho, R. Dahal
Abstract:
The global economy has drastically changed how firms operate and compete. Although the construction industry is 'local' by nature, the internationalization of the construction industry has become an inevitable reality. As a result of global competition, staying domestic is no longer safe from competition; on the contrary, growing into an MNE (multi-national enterprise) has become an important strategy for a firm to survive global competition. For successful entry into competing markets, firms need to re-define their competitive advantages and re-identify the sources of those advantages. A firm's initiation of internationalization is not necessarily a result of strategic planning; it can also involve idiosyncratic events that pave the path to internationalization. For example, a local firm's incidental or unintentional collaboration with an MNE can become the initiating point of its internationalization process. However, because of the intensive competition in today's global market, many firms are compelled to initiate internationalization as a strategic response to the competition. Understanding the process of internationalization and appropriately implementing the strategies at each of its stages leads construction firms to a successful internationalization journey. This study develops a model of the internationalization process from which appropriate strategies that construction firms can implement at each stage are derived. The proposed model integrates two major and complementary views of internationalization and expresses the dynamic process of internationalization in three stages: the pre-international (PRE) stage, the foreign direct investment (FDI) stage, and the multi-national enterprise (MNE) stage.
The strategies implied in the proposed model are derived with a focus on capability building, market locations, and entry modes, based on the resource-based view: value, rareness, imitability, and substitutability (VRIN). Construction firms willing to expand their market area can benefit from the proposed dynamic process model. Strategies for internationalization, such as core competence strategy, market selection, partner selection, and entry mode strategy, can be derived from it. The internationalization process is expressed in two different forms. First, we discuss the construction internationalization process, identify the driving factors of the process, and explain strategy formation within it. Second, we define the stages of internationalization along the process and the corresponding strategies in each stage. The strategies may include how to exploit existing advantages for competition at the current stage and how to develop or explore additional advantages appropriate for the next stage. In particular, the additionally developed advantages accumulate and drive forward the firm's stage of internationalization, which in turn determines the subsequent strategies, and so on, spiraling up to higher degrees of internationalization. However, the formation of additional strategies for the next stage does not happen automatically; strategy evolution rests on the firm's dynamic capabilities.
Keywords: construction industry, dynamic capabilities, internationalization process, internationalization strategies, strategic management
25860 Impact of Stack Caches: Locality Awareness and Cost Effectiveness
Authors: Abdulrahman K. Alshegaifi, Chun-Hsi Huang
Abstract:
Treating data based on its location in memory has received much attention in recent years because different kinds of data have different properties that matter for cache utilization. Stack data and non-stack data may interfere with each other's locality in the data cache. One important property of stack data is its high spatial and temporal locality. In this work, we simulate a non-unified cache design that splits the data cache into a stack cache and a non-stack cache, keeping stack data and non-stack data separate. We observe that the overall hit rate of the non-unified design is sensitive to the size of the non-stack cache. We then investigate the size and associativity the stack cache needs to achieve a high hit ratio, especially when over 99% of accesses are directed to the stack cache. The results show that, on average, a stack cache hit rate of more than 99% is achieved with 2 KB of capacity and 1-way associativity. Further, we analyze the improvement in hit rate from adding a small, fixed-size stack cache at level 1 to a unified cache architecture. The results show that the overall hit rate of the unified design with 1 KB of stack cache added improves by approximately 3.9% on average for the Rijndael benchmark. The stack cache is simulated using the SimpleScalar toolset.
Keywords: hit rate, locality of program, stack cache, stack data
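A toy version of the split (non-unified) design can be simulated by routing each access to a stack or non-stack cache and tracking per-cache hit rates. The cache sizes, line size and stack-address boundary below are illustrative assumptions, not the paper's SimpleScalar configuration:

```python
class DirectMappedCache:
    """Minimal direct-mapped cache model: one tag per line."""
    def __init__(self, lines, line_size=32):
        self.tags = [None] * lines
        self.line_size = line_size
        self.hits = self.accesses = 0

    def access(self, addr):
        self.accesses += 1
        block = addr // self.line_size
        idx = block % len(self.tags)
        if self.tags[idx] == block:
            self.hits += 1            # spatial/temporal locality pays off
        else:
            self.tags[idx] = block    # miss: fill the line

def split_cache_hit_rate(trace, stack_base=0xF000):
    """Route stack addresses (>= stack_base) to a small stack cache and
    everything else to a larger non-stack cache, then report per-cache
    hit rates. The boundary and sizes are illustrative."""
    stack, other = DirectMappedCache(64), DirectMappedCache(256)
    for addr in trace:
        (stack if addr >= stack_base else other).access(addr)
    return (stack.hits / max(stack.accesses, 1),
            other.hits / max(other.accesses, 1))
```

With a real trace, sweeping the stack-cache size in such a model is exactly how the sensitivity of the overall hit rate to cache sizing is explored.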
25859 Dynamic Correlations and Portfolio Optimization between Islamic and Conventional Equity Indexes: A Vine Copula-Based Approach
Authors: Imen Dhaou
Abstract:
This study examines conditional Value at Risk by applying a GJR-EVT-copula model and finds the optimal portfolio for eight Dow Jones Islamic-conventional index pairs. Our methodology consists of modeling the data with a bivariate GJR-GARCH model, from which we extract the filtered residuals; we then apply the peaks-over-threshold (POT) model to fit the residual tails in order to model the marginal distributions. After that, we use pair-copulas to find the optimal portfolio risk dependence structure. Finally, with Monte Carlo simulations, we estimate the Value at Risk (VaR) and the conditional Value at Risk (CVaR). The empirical results show the VaR and CVaR values for an equally weighted portfolio of Dow Jones Islamic-conventional pairs. In sum, we find that the optimal investment focuses on the Islamic-conventional US market index pair, which receives a high investment proportion, whereas all other index pairs receive low investment proportions. These results carry real repercussions for portfolio managers and policymakers concerning optimal asset allocation, portfolio risk management and the diversification advantages of these markets.
Keywords: CVaR, Dow Jones Islamic index, GJR-GARCH-EVT-pair copula, portfolio optimization
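Once Monte Carlo draws of portfolio returns are available, VaR and CVaR at level alpha reduce to an empirical quantile and a tail average of the losses. A minimal sketch of that final step (the return series in the test is arbitrary, not the Dow Jones data, and the full GJR-GARCH-EVT-copula simulation is not reproduced here):

```python
def var_cvar(returns, alpha=0.95):
    """Empirical Value at Risk and conditional VaR (expected shortfall)
    at confidence level alpha, reported as positive loss figures.
    In the study the input would be copula-based Monte Carlo draws;
    any list of portfolio returns works here."""
    losses = sorted(-r for r in returns)                  # losses, ascending
    k = min(int(alpha * len(losses)), len(losses) - 1)    # quantile index
    var = losses[k]                                       # alpha-quantile loss
    tail = losses[k:]                                     # losses at/beyond VaR
    return var, sum(tail) / len(tail)                     # (VaR, CVaR)
```

By construction CVaR is never below VaR, since it averages the losses in the tail beyond the VaR quantile.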
25858 Seismic Assessment of Passive Control Steel Structure with Modified Parameter of Oil Damper
Authors: Ahmad Naqi
Abstract:
Today, passively controlled buildings are becoming extensively popular due to their excellent lateral load resistance. Typically, these buildings are enhanced with a damping device that is in high market demand, and some manufacturers falsify the damping device's parameters during production in order to meet that demand. This paper therefore evaluates the seismic performance of buildings equipped with damping devices whose parameters were modified to simulate falsified devices. For this purpose, three benchmark buildings of 4, 10, and 20 stories were selected from the JSSI (Japan Society of Seismic Isolation) manual. The buildings are special moment-resisting steel frames with oil dampers in the longitudinal direction only. For each benchmark building, two structural designs were prepared, resisting the lateral load with and without reliance on the damping devices (hereafter the Trimmed and Conventional Buildings, respectively). The target buildings were modeled using STERA-3D, finite-element-based software coded for research purposes. With this software, one can develop either a three-dimensional model (3DM) or a lumped mass model (LMM). First, the seismic performance of the 3DM and LMM was evaluated and found to coincide closely for the target buildings. The simplified LMM was then used to produce 66 cases for the buildings, with the device parameters modified by ±40% and ±20% to represent the possible range of falsification. It is verified that buildings designed to sustain the lateral load with the support of a damping device (Trimmed Buildings) are much more threatened by device falsification than buildings merely strengthened by a damping device (Conventional Buildings).
Keywords: passive control system, oil damper, seismic assessment, lumped mass model
25857 Autonomic Threat Avoidance and Self-Healing in Database Management System
Authors: Wajahat Munir, Muhammad Haseeb, Adeel Anjum, Basit Raza, Ahmad Kamran Malik
Abstract:
Databases are key components of software systems. Due to the exponential growth of data, ensuring that data are accurate and available is a central concern. The data in databases are vulnerable to internal and external threats, especially when they contain sensitive data, as in medical or military applications. Whenever data are changed with malicious intent, data analysis results may lead to disastrous decisions. Autonomic self-healing in computer systems is modeled on the autonomic system of the human body. In order to guarantee the accuracy and availability of data, we propose a technique that, on a priority basis, tries to prevent any malicious transaction from executing; in case a malicious transaction does affect the system, the technique heals the system in an isolated mode in such a way that the availability of the system is not compromised. Using this autonomic system, the management cost and time of DBAs can be minimized. Finally, we test our model and present the findings.
Keywords: autonomic computing, self-healing, threat avoidance, security
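The avoid-then-heal idea can be sketched as follows: a transaction flagged as high-risk is blocked before execution, and lower-risk transactions run against an isolated copy so that a failure never corrupts the live store. The risk score, threshold and shadow-copy mechanics are hypothetical stand-ins for the paper's detection and healing machinery:

```python
import copy

def guarded_execute(db, txn, risk, threshold=0.7):
    """Avoid-then-heal sketch. A transaction whose estimated risk is at
    or above the threshold is blocked before execution (threat
    avoidance); otherwise it runs on an isolated copy, and only a
    successful run is merged back, so the live store never ends up
    corrupted (healing = discarding the shadow). `risk` stands in for
    a real malicious-transaction detector."""
    if risk >= threshold:
        return False                      # avoided before execution
    shadow = copy.deepcopy(db)            # isolated execution area
    try:
        txn(shadow)
    except Exception:
        return False                      # shadow discarded; live db intact
    db.clear()
    db.update(shadow)                     # commit the verified state
    return True
```

Because the live dictionary is only replaced after the shadow run succeeds, availability is preserved throughout, which mirrors the isolation property the abstract emphasizes.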
25856 Information Extraction Based on Search Engine Results
Authors: Mohammed R. Elkobaisi, Abdelsalam Maatuk
Abstract:
Search engines are large-scale information retrieval tools for the Web that are currently freely available to all. This paper explains how to convert the raw results returned by search engines into useful information, which represents a new method of data gathering compared with traditional methods. When queries must be submitted for a large number of keywords, this takes a long time and much effort; hence we develop a user interface program that searches automatically, taking multiple keywords at the same time, and leave this program to collect the wanted data automatically. The collected raw data are processed using mathematical and statistical techniques to eliminate unwanted data and convert the remainder into usable data.
Keywords: search engines, information extraction, agent system
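The automated multi-keyword collection step might look like the following, where `fetch` is a caller-supplied wrapper around whichever search-engine API is used; the cleaning rules (trimming and de-duplication) are illustrative, not the paper's statistical processing:

```python
def batch_search(keywords, fetch):
    """Submit one query per keyword through the caller-supplied `fetch`
    callable and merge the raw results, dropping duplicates and empty
    entries -- a simplified version of automated multi-keyword data
    gathering from search-engine results."""
    seen, cleaned = set(), []
    for kw in keywords:
        for item in fetch(kw):
            item = item.strip()          # basic raw-data cleanup
            if item and item not in seen:
                seen.add(item)
                cleaned.append(item)
    return cleaned
```

Keeping `fetch` as a parameter means the same collection loop works against any engine's API, or against canned data in tests.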
25855 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography
Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya
Abstract:
In today's era, data security is an important and demanding concern because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques used for secure communication are cryptography and steganography. Cryptographic algorithms scramble the data so that an intruder will not be able to retrieve it, whereas steganography hides the data in some cover file so that the very presence of communication is concealed. This paper presents the implementation of the Rivest-Shamir-Adleman (RSA) algorithm with image and audio steganography, and of the Data Encryption Standard (DES) algorithm with image and audio steganography. The coding for both algorithms has been done using MATLAB, and it is observed that the combined techniques performed better than the individual techniques. The risk of unauthorized access is alleviated to a certain extent by these techniques, which could be used in banks, agencies such as RAW, etc., where highly confidential data are transferred. Finally, comparisons of the two techniques are given in tabular form.
Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography
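The steganographic half of such a pipeline can be illustrated with classic LSB embedding, hiding one payload bit in the least-significant bit of each cover byte (e.g. a pixel or audio sample value). In the paper the payload would first be encrypted with DES or RSA, and the implementation is in MATLAB; the Python sketch below shows only the embed/extract mechanics:

```python
def embed_lsb(cover, payload):
    """Hide `payload` bytes in the least-significant bits of the
    `cover` byte sequence. One payload bit per cover byte, so the
    cover must be at least 8x the payload length."""
    bits = [(byte >> (7 - i)) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for payload")
    stego = bytearray(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # overwrite only the LSB
    return bytes(stego)

def extract_lsb(stego, n_bytes):
    """Recover `n_bytes` of payload from the stego bytes, reading the
    LSBs back in the same MSB-first order used when embedding."""
    out = bytearray()
    for b in range(n_bytes):
        byte = 0
        for i in range(8):
            byte = (byte << 1) | (stego[b * 8 + i] & 1)
        out.append(byte)
    return bytes(out)
```

Because only the lowest bit of each cover byte changes, the cover's appearance is nearly untouched, which is what hides the presence of the (ideally pre-encrypted) message.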
25854 Data Monetisation by E-commerce Companies: A Need for a Regulatory Framework in India
Authors: Anushtha Saxena
Abstract:
This paper examines the process of data monetisation by e-commerce companies operating in India. Data monetisation is the collecting, storing, and analysing of consumers’ data in order to use the generated data further for profits, revenue, etc. Data monetisation enables e-commerce companies to obtain better business opportunities, offer innovative products and services, gain a competitive edge, and generate substantial revenue. This paper analyses the issues and challenges arising from the process of data monetisation; some of the issues highlighted pertain to the right to privacy and the protection of e-commerce consumers’ data. At the same time, data monetisation cannot be prohibited, but it can be regulated and monitored by stringent laws and regulations. The right to privacy is a fundamental right guaranteed to the citizens of India through Article 21 of the Constitution of India, and the Supreme Court of India recognized it as such in the landmark judgment of Justice K.S. Puttaswamy (Retd) and Another v. Union of India. This paper highlights the legal issue of how e-commerce businesses violate individuals’ right to privacy by using the data they collect and store for economic gain and monetisation, and the related issue of data protection. The researcher mainly focuses on e-commerce companies such as online shopping websites to analyse the legal issue of data monetisation. In the age of the Internet of Things, people have shifted to online shopping because it is convenient, easy, flexible, comfortable, time-saving, etc. At the same time, however, e-commerce companies store their consumers’ data and use it by selling it to third parties or by generating more data from the data already stored with them. This violates individuals’ right to privacy, because consumers know nothing about what happens to the data they give online; many times, data is also collected without individuals’ consent.
Data can be structured, unstructured, etc., and is used by analytics for monetisation. Indian legislation such as the Information Technology Act, 2000 does not effectively protect e-consumers with respect to their data and how e-commerce businesses use it to monetise and generate revenue. The paper also examines the draft Data Protection Bill, 2021, pending in the Parliament of India, and how this Bill could make a significant impact on data monetisation. It further aims to study the European Union's General Data Protection Regulation and how that legislation could be helpful in the Indian scenario concerning e-commerce businesses and data monetisation.
Keywords: data monetization, e-commerce companies, regulatory framework, GDPR
Procedia PDF Downloads 120
25853 Marketing of Non Timber Forest Products and Forest Management in Kaffa Biosphere Reserve, Ethiopia
Authors: Amleset Haile
Abstract:
Non-timber forest products (NTFPs) are harvested for both subsistence and commercial use and play a key role in the livelihoods of millions of rural people. NTFPs are important in Kaffa, rural southwest Ethiopia, as a source of household income. Market players at various levels of the marketing chains were interviewed to gather information on elements of the marketing system: products, product differentiation, value addition, pricing, promotion, distribution, and marketing chains. The study was therefore conducted in the Kaffa Biosphere Reserve of southwest Ethiopia with the main objectives of assessing and analyzing the contribution of NTFPs to rural livelihoods and to the conservation of the biosphere reserve, and of identifying factors influencing the marketing of NTFPs. Five villages were selected based on a proximity gradient from Bonga town and the availability of NTFPs. A formal survey was carried out on rural households selected using stratified random sampling. The results indicate that local people practice diverse livelihood activities for survival, mainly crop cultivation (cereals and cash crops) and livestock husbandry, gathering of forest products, and off-farm/off-forest activities. NTFP trade is not a common phenomenon in southwest Ethiopia; the greatest opportunity exists for local-level marketing of spices and other non-timber forest products. Very little local value addition takes place within the region, and as a result local market players have little control. Policy interventions are required to enhance the returns to local collectors, which will also contribute to the sustainable management of forest resources in the Kaffa Biosphere Reserve.
Keywords: forest management, biosphere reserve, marketing, local people
Procedia PDF Downloads 540
25852 Experiments on Weakly-Supervised Learning on Imperfect Data
Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler
Abstract:
Supervised predictive models require labeled data for training purposes. Complete and accurate labeled data, i.e., a ‘gold standard’, is not always available, and imperfectly labeled data may need to serve as an alternative. An important question is whether the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments which demonstrated that models trained on imperfect data can (but do not always) outperform the accuracy of the training data, e.g., the area under the curve for some models is higher than 80% when trained on data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.
Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation
Procedia PDF Downloads 199
25851 Kou Jump Diffusion Model: An Application to the S&P 500, Nasdaq 100 and Russell 2000 Index Options
Authors: Wajih Abbassi, Zouhaier Ben Khelifa
Abstract:
The present research points towards the empirical validation of three option valuation models: the ad-hoc Black-Scholes model proposed by Berkowitz (2001), the constant elasticity of variance model of Cox and Ross (1976), and the Kou jump-diffusion model (2002). Our empirical analysis is conducted on a sample of 26,974 options written on three indexes, the S&P 500, Nasdaq 100, and Russell 2000, negotiated during 2007, just before the sub-prime crisis. We start by presenting the theoretical foundations of the models of interest. We then use the trust-region-reflective algorithm to estimate the structural parameters of these models from a cross-section of option prices. The empirical analysis shows the superiority of the Kou jump-diffusion model. This superiority arises from the ability of this model to portray the behavior of market participants and to be closest to the true distribution that characterizes the evolution of these indices. Indeed, the double-exponential distribution covers three interesting properties: the leptokurtic feature, the memoryless property, and the psychological aspect of market participants, as numerous empirical studies have shown that markets tend to overreact to good news and underreact to bad news. Despite these advantages, there are few empirical studies based on this model, partly because its probability distribution and option valuation formula are rather complicated. This paper is the first to use nonlinear curve-fitting through the trust-region-reflective algorithm on cross-section option prices to estimate the structural parameters of the Kou jump-diffusion model.
Keywords: jump-diffusion process, Kou model, leptokurtic feature, trust-region-reflective algorithm, US index options
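Although the Kou valuation formula itself is complicated, the model is straightforward to simulate. The following sketch (an illustration, not the authors' calibration code; all parameter values are made up) prices a European call by Monte Carlo under Kou dynamics, drawing jump counts from a Poisson distribution and log-jump sizes from the double-exponential law; with jump intensity lam = 0 it reduces to Black-Scholes, which serves as a sanity check:

```python
import math
import random

random.seed(1)

def sample_poisson(mean):
    # Knuth's method; adequate for the small jump counts used here
    L, k, prod = math.exp(-mean), 0, random.random()
    while prod > L:
        k += 1
        prod *= random.random()
    return k

def sample_jump(p, eta1, eta2):
    # double-exponential log-jump: Exp(eta1) up w.p. p, -Exp(eta2) down
    if random.random() < p:
        return random.expovariate(eta1)
    return -random.expovariate(eta2)

def kou_call_mc(S0, K, r, sigma, lam, p, eta1, eta2, T, n_paths=100_000):
    # martingale correction kappa = E[e^Y - 1] for double-exponential Y
    kappa = p * eta1 / (eta1 - 1) + (1 - p) * eta2 / (eta2 + 1) - 1
    drift = (r - 0.5 * sigma ** 2 - lam * kappa) * T
    payoff_sum = 0.0
    for _ in range(n_paths):
        x = drift + sigma * math.sqrt(T) * random.gauss(0.0, 1.0)
        for _ in range(sample_poisson(lam * T)):
            x += sample_jump(p, eta1, eta2)
        payoff_sum += max(S0 * math.exp(x) - K, 0.0)
    return math.exp(-r * T) * payoff_sum / n_paths

def bs_call(S0, K, r, sigma, T):
    # Black-Scholes benchmark: the lam = 0 special case of the Kou model
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda d: 0.5 * (1.0 + math.erf(d / math.sqrt(2.0)))
    return S0 * N(d1) - K * math.exp(-r * T) * N(d2)
```

The authors' calibration goes the other way: given market prices, a nonlinear least-squares routine (trust-region-reflective) searches for the (sigma, lam, p, eta1, eta2) that best reproduce the observed cross-section.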
Procedia PDF Downloads 429
25850 Domestic Trade, Misallocation and Relative Prices
Authors: Maria Amaia Iza Padilla, Ibai Ostolozaga
Abstract:
The objective of this paper is to analyze how transportation costs between regions within a country affect not only domestic trade but also the allocation of resources within a region, aggregate productivity, and relative domestic prices (tradable versus non-tradable). On the one hand, there is a vast literature analyzing the transportation costs countries face when trading with the rest of the world; this paper focuses instead on the effect of transportation costs on domestic trade, as countries differ in their domestic road infrastructure and transport quality. There is also some literature on the effect of road infrastructure on price differences between regions, but not on relative prices at the aggregate level. On the other hand, this work relates to the literature on resource misallocation, and also to the literature analyzing the effect of trade on the development of the manufacturing sector. Using the World Bank Enterprise Survey database, we observe cross-country differences in the proportion of firms that consider transportation an obstacle. From the International Comparison Program, we obtain a significant negative correlation between GDP per worker and relative prices (manufacturing-sector prices relative to service-sector prices). Furthermore, there is a significant negative correlation between a country’s transportation quality and the relative price of manufactured goods with respect to the price of services in that country. This is consistent with the empirical evidence of a negative correlation between transportation quality and GDP per worker, on the one hand, and the negative correlation between GDP per worker and domestic relative prices, on the other. It is also shown that the share of a country's manufacturing firms whose main market is local (regional) is negatively related to the quality of the transportation infrastructure within the country.
Similarly, this index is positively related to the share of manufacturing firms whose main market is national or international. The data also show that countries with a higher proportion of manufacturing firms operating locally have higher relative prices. With this information in hand, the paper first attempts to quantify the effects of the allocation of resources between and within sectors: the higher the trade barriers caused by transportation costs, the less efficient the allocation, which lowers aggregate productivity. Second, we build a two-sector model in which regions within a country trade with each other. We find that, with respect to the manufacturing sector, countries with less trade between their regions are characterized by a smaller variety of goods, less productive manufacturing firms on average, and higher prices for manufactured goods relative to service-sector prices. Thus, the decline in the relative price of manufactured goods in more advanced countries could also be explained by the degree of trade between regions, which allows for efficient intra-industry allocation (traders are more productive, and resources are allocated more efficiently).
Keywords: misallocation, relative prices, TFP, transportation cost
Procedia PDF Downloads 84
25849 Transforming Healthcare Data Privacy: Integrating Blockchain with Zero-Knowledge Proofs and Cryptographic Security
Authors: Kenneth Harper
Abstract:
Blockchain technology presents solutions for managing healthcare data, addressing critical challenges in privacy, integrity, and access. This paper explores how privacy-preserving technologies, such as zero-knowledge proofs (ZKPs) and homomorphic encryption (HE), enhance decentralized healthcare platforms by enabling secure computations and patient data protection. We examine the mathematical foundations of these methods, their practical applications, and how they meet the evolving demands of healthcare data security. Using real-world examples, this research highlights industry-leading implementations and offers a roadmap for future applications in secure, decentralized healthcare ecosystems.
Keywords: blockchain, cryptography, data privacy, decentralized data management, differential privacy, healthcare, healthcare data security, homomorphic encryption, privacy-preserving technologies, secure computations, zero-knowledge proofs
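As a concrete illustration of the "secure computations" that homomorphic encryption enables, here is a textbook toy Paillier cryptosystem in Python (tiny primes, for exposition only; not an implementation from the paper). Multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so an untrusted party could aggregate encrypted health metrics without ever reading them:

```python
import math
import random

random.seed(0)

def keygen(p=61, q=53):
    # toy primes for illustration; real deployments use moduli of 2048+ bits
    n = p * q
    lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    mu = pow(lam, -1, n)  # valid because we take g = n + 1
    return (n, n + 1), (lam, mu, n)

def encrypt(pub, m):
    n, g = pub
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:          # r must be invertible mod n
        r = random.randrange(2, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(priv, c):
    lam, mu, n = priv
    x = pow(c, lam, n * n)
    return ((x - 1) // n * mu) % n      # L(x) = (x - 1) / n, then scale by mu

pub, priv = keygen()
n = pub[0]
c_sum = (encrypt(pub, 42) * encrypt(pub, 99)) % (n * n)  # homomorphic addition
```

Note that each encryption uses fresh randomness `r`, so two encryptions of the same value look unrelated, which matters for patient privacy.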
Procedia PDF Downloads 19
25848 PMEL Marker Identification of Dark and Light Feather Colours in Local Canary
Authors: Mudawamah Mudawamah, Muhammad Z. Fadli, Gatot Ciptadi, Aulanni’am
Abstract:
Canary breeding has spread throughout the regions of Indonesia among low- and middle-income communities and has become a source of income for them. An interesting phenomenon of the canary market is that feather colour is one of the factors determining price. This research contributes to the molecular database used as a basis for selection and mating by Indonesian canary breeders. The method was experimental, using genomes obtained by isolation from canary blood; the genomes underwent PCR amplification with the PMEL marker, followed by sequencing. Twenty-four canaries with light- and dark-coloured feathers were used. The results showed that all samples amplified with the PMEL gene at a fragment length of 500 bp. At base position 40, cytosine (C) was found in the light-coloured canaries, while thymine (T) was found at the same position in the dark-coloured canaries. The sequencing results yielded fragments of 286-415 bp and 10 haplotypes. The conclusion is that the PMEL gene (a white-pigment gene) can likely be used to detect molecular genetic variation between dark- and light-coloured feather canaries.
Keywords: canary, haplotype, PMEL, sequence
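The two analyses the abstract reports, calling the C/T variant at base 40 and counting haplotypes, can be sketched as follows (hypothetical toy sequences; the study's real data are 286-415 bp PMEL amplicons):

```python
# Illustrative sketch with made-up sequences, not the study's reads.
def call_feather_colour(seq, pos=40):
    # pos is 1-based, matching the abstract's "base sequence of 40"
    allele = seq[pos - 1].upper()
    return {"C": "light", "T": "dark"}.get(allele, "unknown")

def count_haplotypes(seqs):
    # distinct sequence variants among the samples
    return len({s.upper() for s in seqs})
```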
Procedia PDF Downloads 237
25847 Operating Speed Models on Tangent Sections of Two-Lane Rural Roads
Authors: Dražen Cvitanić, Biljana Maljković
Abstract:
This paper presents models for predicting operating speeds on tangent sections of two-lane rural roads, developed on continuous speed data. The data correspond to 20 drivers of different ages and driving experience, driving their own cars along an 18 km section of a state road. The data were first used to determine maximum operating speeds on tangents and to compare them with speeds in the middle of tangents, i.e., the speed data used in most operating speed studies. Analysis of the continuous speed data indicated that spot speed data are not reliable indicators of the relevant speeds. Operating speed models for tangent sections were then developed. There was no significant difference between models developed using speeds in the middle of tangent sections and models developed using maximum operating speeds on tangent sections. All developed models have a higher coefficient of determination than models developed on spot speed data. Thus, it can be concluded that the method of measurement has a more significant impact on the quality of an operating speed model than the location of measurement.
Keywords: operating speed, continuous speed data, tangent sections, spot speed, consistency
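The two speed definitions the paper compares can be illustrated with a small sketch (hypothetical speed profile, not the study's recordings): from continuously recorded samples on one tangent, extract both the maximum operating speed and the spot speed at the tangent midpoint:

```python
def tangent_speeds(profile, start, end):
    # profile: continuously recorded (position_m, speed_kmh) samples
    inside = [(pos, v) for pos, v in profile if start <= pos <= end]
    v_max = max(v for _, v in inside)
    midpoint = (start + end) / 2.0
    # spot speed: the sample nearest the middle of the tangent
    _, v_mid = min(inside, key=lambda s: abs(s[0] - midpoint))
    return v_max, v_mid
```

On a profile that keeps accelerating past the midpoint, the two values diverge, which is exactly why the paper argues spot speeds can misrepresent the relevant operating speed.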
Procedia PDF Downloads 452
25846 Effectiveness of Climate Smart Agriculture in Managing Field Stresses in Robusta Coffee
Authors: Andrew Kirabira
Abstract:
This study investigates the effectiveness of climate-smart agriculture (CSA) technologies in improving productivity by managing biotic and abiotic stresses in the coffee agroecological zones of Uganda, with the motive of enhancing farmer livelihoods. The study was initiated in response to the crop's decreasing productivity in Uganda, caused by the increasing prevalence of pests, diseases, and abiotic stresses. Despite 9 years of farmers' application of CSA, productivity has stagnated between 700-800 kg/ha/yr, only about 26% of the 3-5 tn/ha/yr that CSA is capable of delivering if properly applied. This has negatively affected the incomes of the 10.6 million people along the crop value chain and, in turn, the country's national income. In the 2019/20 FY, for example, Uganda suffered a deficit of $40m solely from the increasing incidence of one pest, BCTB. The amalgamation of such trends cripples the realization of SDG #1 and SDG #13, the eradication of poverty and the mitigation of climate change, respectively. In probing CSA's effectiveness in curbing this trend, the study is guided by the following objectives: determining farmers' existing knowledge and perceptions of CSA in the diverse coffee agroecological zones of Uganda; examining the relationship between the use of CSA and the prevalence of selected coffee pests, diseases, and abiotic stresses; ascertaining the difference in market organization and pricing between conventionally and CSA-produced coffee; and analyzing the prevailing policy environment concerning the use of CSA in coffee production. The research design is descriptive, collecting data from farmers and agricultural extension workers in the districts of Ntungamo, Iganga, and Luweero, each representing a distinct coffee agroecological zone.
Policy custodians at the district, cooperative, and national crop authority levels were also interviewed.
Keywords: climate change, food security, field stresses, productivity
Procedia PDF Downloads 57
25845 A Neural Network Based Clustering Approach for Imputing Multivariate Values in Big Data
Authors: S. Nickolas, Shobha K.
Abstract:
The treatment of incomplete data is an important step in data pre-processing. Missing values create a noisy environment in all applications and are an unavoidable problem in big data management and analysis. Numerous techniques, such as discarding rows with missing values, mean imputation, expectation maximization, neural networks with evolutionary or optimized algorithms, and hot-deck imputation, have been introduced by researchers for handling missing data. Among these, imputation techniques play a positive role in filling in missing values when it is necessary to use all records in the data rather than discarding records with missing values. In this paper, we propose a novel artificial-neural-network-based clustering algorithm, Adaptive Resonance Theory-2 (ART2), for the imputation of missing values in mixed-attribute data sets. ART2 can recognize learned models quickly and adapt to new objects rapidly; it carries out model-based clustering using competitive learning and a self-stabilizing mechanism in a dynamic environment, without supervision. The proposed approach not only imputes the missing values but also provides information about handling outliers.
Keywords: ART2, data imputation, clustering, missing data, neural network, pre-processing
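ART2 itself involves resonance and vigilance testing; as a simplified stand-in, the sketch below shows the general cluster-based imputation idea with a plain k-means step: cluster the complete records, then fill a record's missing attribute from the nearest cluster centroid, with nearness measured on the record's observed attributes only:

```python
import math

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def mean(points):
    return tuple(sum(col) / len(points) for col in zip(*points))

def kmeans(points, k=2, iters=20):
    # deterministic initialisation, adequate for this two-cluster sketch
    centers = [points[0], points[-1]]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda j: dist(p, centers[j]))].append(p)
        centers = [mean(g) if g else centers[j] for j, g in enumerate(groups)]
    return centers

def impute(row, centers):
    # compare only on the attributes the row actually has
    observed = [d for d, v in enumerate(row) if v is not None]
    nearest = min(centers,
                  key=lambda c: sum((row[d] - c[d]) ** 2 for d in observed))
    return tuple(v if v is not None else nearest[d]
                 for d, v in enumerate(row))
```

The paper's contribution is to replace the k-means step with ART2, whose competitive learning adapts the cluster prototypes online and flags records that resonate with no prototype as potential outliers.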
Procedia PDF Downloads 274
25844 The Effect That the Data Assimilation of Qinghai-Tibet Plateau Has on a Precipitation Forecast
Authors: Ruixia Liu
Abstract:
The Qinghai-Tibet Plateau has an important influence on the precipitation of its lower reaches. Remote sensing data has advantages of its own, and a numerical prediction model that assimilates RS data will outperform one that does not. We obtained assimilation data for MHS, terrestrial, and sounding observations from GSI, introduced the result into WRF, and then obtained relative humidity (RH) and precipitation forecasts. By comparing the 1 h, 6 h, 12 h, and 24 h results, we found that assimilating the MHS, terrestrial, and sounding data made the forecast of precipitation amount, area, and center more accurate. Analyzing the differences in the initial field, we found that data assimilation over the Qinghai-Tibet Plateau influences the forecast for its lower reaches by affecting the initial temperature and RH.
Keywords: Qinghai-Tibet Plateau, precipitation, data assimilation, GSI
Procedia PDF Downloads 234
25843 Positive Affect, Negative Affect, Organizational and Motivational Factor on the Acceptance of Big Data Technologies
Authors: Sook Ching Yee, Angela Siew Hoong Lee
Abstract:
Big data technologies have become a trend for exploiting business opportunities and providing valuable business insights through the analysis of big data. However, many organizations have yet to adopt big data technologies, especially small and medium enterprises (SMEs). This study uses the technology acceptance model (TAM) to examine several TAM constructs together with four additional constructs: positive affect, negative affect, organizational factor, and motivational factor. The conceptual model proposed in the study is tested on the relationship and influence of positive affect, negative affect, organizational factor, and motivational factor on the intention to use big data technologies. Empirical data are collected by conducting a survey.
Keywords: big data technologies, motivational factor, negative affect, organizational factor, positive affect, technology acceptance model (TAM)
Procedia PDF Downloads 362
25842 Big Data Analysis with Rhipe
Authors: Byung Ho Jung, Ji Eun Shin, Dong Hoon Lim
Abstract:
Rhipe, which integrates the R and Hadoop environments, makes it possible to process and analyze massive amounts of data in a distributed processing environment. In this paper, we implemented multiple regression analysis using Rhipe on actual data of various sizes. Experimental results comparing the performance of Rhipe with the stats and biglm packages (run on bigmemory) showed that Rhipe was faster than the other packages, owing to parallel processing: the number of map tasks grows with the size of the data. We also compared the computing speeds of the pseudo-distributed and fully-distributed modes of configuring a Hadoop cluster. The results showed that fully-distributed mode was faster than pseudo-distributed mode, and that the computing speed of fully-distributed mode increased further with the number of data nodes.
Keywords: big data, Hadoop, parallel regression analysis, R, Rhipe
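The decomposition that makes regression parallelizable under a map-reduce framework like Rhipe can be sketched in Python (illustrative, not R/Rhipe code): each map task emits its chunk's sufficient statistics, and a reduce step combines them and solves the normal equations, so the fit is exact regardless of how the data are partitioned:

```python
# Map step: each data chunk is reduced to five sufficient statistics.
def map_chunk(chunk):
    n = len(chunk)
    sx = sum(x for x, _ in chunk)
    sy = sum(y for _, y in chunk)
    sxx = sum(x * x for x, _ in chunk)
    sxy = sum(x * y for x, y in chunk)
    return (n, sx, sy, sxx, sxy)

# Reduce step: pool the statistics and solve the normal equations.
def reduce_stats(stats):
    n, sx, sy, sxx, sxy = (sum(s[i] for s in stats) for i in range(5))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept
```

In Rhipe, the map step would run as Hadoop map tasks over HDFS blocks, which is why adding data nodes speeds the computation: the per-chunk statistics are tiny, so only the reduce step is serial.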
Procedia PDF Downloads 497
25841 Security in Resource Constraints Network Light Weight Encryption for Z-MAC
Authors: Mona Almansoori, Ahmed Mustafa, Ahmad Elshamy
Abstract:
A wireless sensor network is formed by a combination of nodes that systematically transmit data to their base stations. This transmitted data can easily be compromised, given the nodes' limited processing power and the need for data consistency, and securing data transfer in real time remains an open discussion. This paper presents a mechanism to securely transmit data over a chain of sensor nodes, without compromising network throughput, by utilizing the battery resources available in the sensor node. Our methodology takes advantage of the efficiency of the Z-MAC protocol and provides a unique key through a sharing mechanism that uses neighbour-node MAC addresses. We present a lightweight data integrity layer embedded in the Z-MAC protocol, and show that our protocol performs better than plain Z-MAC when we introduce different attack scenarios.
Keywords: hybrid MAC protocol, data integrity, lightweight encryption, neighbor-based key sharing, sensor node data processing, Z-MAC
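The abstract's key idea, deriving a shared key from neighbour MAC addresses and adding a lightweight integrity layer, can be sketched as follows (an assumed construction using SHA-256/HMAC for illustration; the paper does not specify its cipher):

```python
import hashlib
import hmac

def pairwise_key(mac_a, mac_b):
    # sorted, so both neighbours derive the same key from the two addresses
    a, b = sorted((mac_a.lower(), mac_b.lower()))
    return hashlib.sha256(f"{a}|{b}".encode()).digest()

def keystream(key, nonce, length):
    # counter-mode-style keystream built from HMAC blocks
    out, counter = b"", 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(4, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def seal(key, nonce, plaintext):
    ct = bytes(p ^ k for p, k in
               zip(plaintext, keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()[:8]  # integrity
    return ct, tag

def unseal(key, nonce, ct, tag):
    expect = hmac.new(key, nonce + ct, hashlib.sha256).digest()[:8]
    if not hmac.compare_digest(tag, expect):
        raise ValueError("integrity check failed")
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))
```

A per-frame nonce must never repeat under the same key, and a MAC-address-derived key alone is guessable by anyone who can observe addresses; a deployed scheme would mix in a pre-loaded network secret.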
Procedia PDF Downloads 144