Search results for: heterogeneous massive data
25442 Signal Processing Techniques for Adaptive Beamforming with Robustness
Authors: Ju-Hong Lee, Ching-Wei Liao
Abstract:
Adaptive beamforming using an antenna array of sensors is useful for adaptively detecting and preserving the presence of the desired signal while suppressing the interference and the background noise. Conventional adaptive array beamforming requires prior information about either the impinging direction or the waveform of the desired signal to adapt the weights. The adaptive weights of an antenna array beamformer under a steered-beam constraint are calculated by minimizing the output power of the beamformer subject to the constraint that forces the beamformer to maintain a constant response in the steering direction. Hence, the performance of the beamformer is very sensitive to the accuracy of the steering operation. In the literature, it is well known that the performance of an adaptive beamformer deteriorates under any steering angle error encountered in many practical applications, e.g., wireless communication systems with massive antennas deployed at the base station and user equipment. Hence, developing effective signal processing techniques to deal with steering angle error in array beamforming systems has become an important research topic. In this paper, we present an effective signal processing technique for constructing an adaptive beamformer that is robust against steering angle error. The proposed array beamformer adaptively estimates the actual direction of the desired signal by using the presumed steering vector and the received array data snapshots. Based on the presumed steering vector and a preset angle range for steering mismatch tolerance, we first create a matrix related to the direction vectors of the signal sources. Two projection matrices are generated from this matrix. The projection matrix associated with the desired signal information and the received array data are utilized to iteratively estimate the actual direction vector of the desired signal.
The estimated direction vector of the desired signal is then used to find an appropriate quiescent weight vector. The other projection matrix is set to be the signal blocking matrix required for performing adaptive beamforming. Accordingly, the proposed beamformer consists of adaptive quiescent weights and partially adaptive weights. Several computer simulation examples are provided for evaluating and comparing the proposed technique with existing robust techniques.
Keywords: adaptive beamforming, robustness, signal blocking, steering angle error
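The steered-beam constraint described in this abstract (minimize output power subject to a constant response in the look direction) has the classical closed-form minimum-variance (MVDR) solution w = R⁻¹a / (aᴴR⁻¹a). The sketch below illustrates only that textbook formulation, not the authors' robust iterative estimator; the array size, angles, and noise levels are illustrative assumptions.

```python
import numpy as np

def mvdr_weights(R, a):
    """Minimum-variance weights: minimize w^H R w subject to w^H a = 1."""
    Ri_a = np.linalg.solve(R, a)
    return Ri_a / (a.conj() @ Ri_a)

def steering_vector(theta_deg, n_elements, spacing=0.5):
    """Uniform linear array steering vector (element spacing in wavelengths)."""
    k = 2 * np.pi * spacing * np.sin(np.deg2rad(theta_deg))
    return np.exp(1j * k * np.arange(n_elements))

n = 8
a_desired = steering_vector(0.0, n)    # presumed look direction
a_interf = steering_vector(40.0, n)    # interference direction
rng = np.random.default_rng(0)

# Sample covariance from 500 snapshots of interference plus weak noise
g = rng.standard_normal(500) + 1j * rng.standard_normal(500)
noise = 0.1 * (rng.standard_normal((n, 500)) + 1j * rng.standard_normal((n, 500)))
snapshots = a_interf[:, None] * g + noise
R = snapshots @ snapshots.conj().T / 500

w = mvdr_weights(R, a_desired)
print(abs(w.conj() @ a_desired))   # distortionless response: ~1
print(abs(w.conj() @ a_interf))    # deep null toward the interference
```

A steering angle error would mean `a_desired` no longer matches the true signal direction, which is exactly the mismatch the paper's technique is designed to correct.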
Procedia PDF Downloads 124
25441 Strongly Disordered Conductors and Insulators in Holography
Authors: Matthew Stephenson
Abstract:
We study the electrical conductivity of strongly disordered, strongly coupled quantum field theories, holographically dual to non-perturbatively disordered uncharged black holes. The computation reduces to solving a diffusive hydrostatic equation for an emergent horizon fluid. We demonstrate that a large class of theories in two spatial dimensions have a universal conductivity independent of disorder strength, and rigorously rule out disorder-driven conductor-insulator transitions in many theories. We present a (fine-tuned) axion-dilaton bulk theory which realizes the conductor-insulator transition, interpreted as a classical percolation transition in the horizon fluid. We address aspects of strongly disordered holography that can and cannot be addressed via mean-field modeling, such as massive gravity.
Keywords: theoretical physics, black holes, holography, high energy
Procedia PDF Downloads 178
25440 Optimizing Hydrogen Production from Biomass Pyro-Gasification in a Multi-Staged Fluidized Bed Reactor
Authors: Chetna Mohabeer, Luis Reyes, Lokmane Abdelouahed, Bechara Taouk
Abstract:
In the transition to sustainability and the increasing use of renewable energy, hydrogen will play a key role as an energy carrier. Biomass has the potential to accelerate the realization of hydrogen as a major fuel of the future. Pyro-gasification converts organic matter mainly into synthesis gas, or “syngas”, composed chiefly of CO, H2, CH4, and CO2. A second, condensable fraction of biomass pyro-gasification products is “tars”. Under certain conditions, tars may decompose into hydrogen and other light hydrocarbons. These conditions include two types of cracking: homogeneous cracking, where tars decompose under the effect of temperature (> 1000 °C), and heterogeneous cracking, where catalysts such as olivine, dolomite, or biochar are used. The latter process favors cracking of tars at temperatures close to pyro-gasification temperatures (~850 °C). Pyro-gasification of biomass coupled with water-gas shift is the most widely practiced process route for biomass to hydrogen today. In this work, an innovative solution is proposed for this conversion route, in which all the pyro-gasification products, not only methane, undergo processes aimed at optimizing hydrogen production. First, a heterogeneous cracking step was included in the reaction scheme, using biochar (the solid remaining from the pyro-gasification reaction) as catalyst and CO2 and H2O as gasifying agents. This process was followed by a catalytic steam methane reforming (SMR) step. For this, a Ni-based catalyst was tested under different reaction conditions to optimize the H2 yield. Finally, a water-gas shift (WGS) reaction step with a Fe-based catalyst was added to optimize the H2 yield from CO. The reactor used for cracking was a fluidized bed reactor, and the one used for SMR and WGS was a fixed bed reactor. The gaseous products were analyzed continuously using a µ-GC (Fusion PN 074-594-P1F).
With biochar as bed material, more H2 was obtained with steam as the gasifying agent (32 mol. % vs. 15 mol. % with CO2 at 900 °C). CO and CH4 productions were also higher with steam than with CO2. Steam as the gasifying agent and biochar as the bed material were hence deemed efficient parameters for the first step. Among all the parameters tested, CH4 conversions approaching 100% were obtained from SMR reactions using Ni/γ-Al2O3 as a catalyst at 800 °C and a steam/methane ratio of 5. This gave rise to about 45 mol. % H2. Experiments on the WGS reaction are currently being conducted. At the end of this phase, the four reactions will be performed consecutively and the results analyzed. The final aim is the development of a global kinetic model of the whole system in a multi-staged fluidized bed reactor that can be transferred to ASPEN Plus™.
Keywords: multi-staged fluidized bed reactor, pyro-gasification, steam methane reforming, water-gas shift
Procedia PDF Downloads 138
25439 Capacities of Early Childhood Education Professionals for the Prevention of Social Exclusion of Children
Authors: Dejana Bouillet, Vlatka Domović
Abstract:
Both policymakers and researchers recognize that participating in early childhood education and care (ECEC) is useful for all children, especially for those exposed to a high risk of social exclusion. Social exclusion of children is understood as a multidimensional construct including economic, social, cultural, health, and other aspects of disadvantage and deprivation, which individually or combined can have an unfavorable effect on a child's current life and development, as well as on later development and life chances in adulthood. ECEC institutions should be able to promote educational approaches that reflect developmental, cultural, language, and other diversity amongst children. However, little is known about the ways in which Croatian ECEC institutions recognize and respect the diversity of children and their families and how they respond to their educational needs. That is why this paper is dedicated to the analysis of the capacities of ECEC professionals to respond to the educational needs of this very diverse group of children and their families. The results obtained within the framework of the project “Models of response to educational needs of children at risk of social exclusion in ECEC institutions,” funded by the Croatian Science Foundation, will be presented. The research methodology arises from explanations of educational processes and risks of social exclusion as a complex and heterogeneous phenomenon. The preliminary results of the qualitative data analysis of educational practices regarding capacities to identify and appropriately respond to the requirements of children at risk of social exclusion will be presented. The data have been collected by interviewing educational staff in 10 Croatian ECEC institutions (n = 10). The questions in the interviews were related to various aspects of inclusive institutional policy, culture, and practices.
According to the analysis, it is possible to conclude that Croatian ECEC professionals still face great challenges in implementing inclusive policies, culture, and practices. There are several baselines for this conclusion. The interviewed educational professionals are not familiar enough with the whole complexity and diversity of needs of children at risk of social exclusion, and the ECEC institutions do not have enough resources to provide all the interventions that these children and their families need.
Keywords: children at risk of social exclusion, ECEC professionals, inclusive policies, culture and practices, qualitative analysis
Procedia PDF Downloads 114
25438 Estimation of Natural Pozzolan Reserves in the Volcanic Province of the Moroccan Middle Atlas Using a Geographic Information System in Order to Valorize Them
Authors: Brahim Balizi, Ayoub Aziz, Abdelilah Bellil, Abdellali El Khadiri, Jamal Mabrouki
Abstract:
Mio-Plio-Quaternary volcanism of the Tabular Middle Atlas, which corresponds to prospective levels of exploitable raw minerals, is a feature of Morocco's Middle Atlas, especially the Azrou-Timahdite region. Given its importance to national policy in terms of human development, supporting the sociological and economic components, this area has been the focus of various research and prospecting efforts aimed at developing these reserves. The outcome of this work is a massive amount of data that needs to be managed appropriately, because it comes from multiple sources and formats, including side points, contour lines, geology, hydrogeology, hydrology, geological and topographical maps, satellite photos, and more. In this regard, putting in place a Geographic Information System (GIS) is essential in order to offer a site plan that makes it possible to see the most recent topography of the area being exploited, to compute the daily exploitation volume, and to make decisions with the fewest possible restrictions, so that the reserves can be used for the production of ecological lightweight mortars. The three sites' mining will follow the contour lines in five descending benches six meters high. It is anticipated that each quarry will produce about 90,000 m3/year. For a single quarry, this translates to a daily production of about 450 m3 (200 days/year). The potential net exploitable volume in place is about 3,540,240 m3 for a single quarry and 10,620,720 m3 for the three exploitable zones.
Keywords: GIS, topography, exploitation, quarrying, lightweight mortar
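The production figures quoted in this abstract are internally consistent, as a quick back-of-the-envelope check shows (all input values are taken from the text above; the ~39-year lifetime is a derived estimate, not a figure from the study):

```python
# Consistency check of the quarry production figures quoted in the abstract.
annual_per_quarry = 450 * 200          # m3/day * working days/year
reserve_per_quarry = 3_540_240         # m3, net exploitable volume, one quarry
reserve_three_zones = 10_620_720       # m3, the three exploitable zones

print(annual_per_quarry)                                  # 90000 m3/year, as stated
print(reserve_three_zones / reserve_per_quarry)           # 3.0 -> three equal quarries
print(round(reserve_per_quarry / annual_per_quarry, 1))   # ~39.3 years per quarry
```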
Procedia PDF Downloads 26
25437 The Pricing-Out Phenomenon in the U.S. Housing Market
Authors: Francesco Berald, Yunhui Zhao
Abstract:
The COVID-19 pandemic further extended the multi-year housing boom in advanced economies and emerging markets alike, amid massive monetary easing during the pandemic. In this paper, we analyze the pricing-out phenomenon in the U.S. residential housing market due to higher house prices associated with monetary easing. We first set up a stylized general equilibrium model and show that although monetary easing decreases the mortgage payment burden, it raises house prices, lowers housing affordability for first-time homebuyers (through the initial housing wealth channel and the liquidity constraint channel that increases repeat buyers’ housing demand), and increases housing wealth inequality between first-time and repeat homebuyers. We then use U.S. household-level data to quantify the effect of the house price change on housing affordability relative to that of the interest rate change. We find evidence of the pricing-out effect for all homebuyers; moreover, we find that the pricing-out effect is stronger for first-time homebuyers than for repeat homebuyers. The paper highlights the importance of accounting for general equilibrium effects and distributional implications of monetary policy when assessing housing affordability. It also calls for complementing monetary easing with well-targeted policy measures that can boost housing affordability, particularly for first-time and lower-income households. Such measures are also needed during aggressive monetary tightening, given that the fall in house prices may be insufficient or too slow to fully offset the immediate adverse impact of higher rates on housing affordability.
Keywords: pricing-out, U.S. housing market, housing affordability, distributional effects, monetary policy
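The trade-off the authors describe, where lower mortgage rates cut the payment burden while higher house prices erode affordability, can be seen in the standard fixed-rate annuity formula M = P·r / (1 − (1 + r)^−n). The numbers below are purely illustrative assumptions, not figures or estimates from the paper:

```python
def monthly_payment(principal, annual_rate, years=30):
    """Standard fixed-rate mortgage annuity payment."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

# Illustrative numbers only: a 1 pp rate cut vs. a 15% house-price rise.
base = monthly_payment(300_000, 0.04)
cheaper_credit = monthly_payment(300_000, 0.03)
priced_out = monthly_payment(345_000, 0.03)   # same rate cut, 15% higher price

print(round(base))            # payment at the original rate and price
print(round(cheaper_credit))  # the rate cut alone lowers the burden
print(round(priced_out))      # the price rise more than offsets the rate cut
```

In this toy example the 15% price increase fully absorbs the affordability gain from the rate cut, which is the pricing-out mechanism in miniature.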
Procedia PDF Downloads 34
25436 The Economic Limitations of Defining Data Ownership Rights
Authors: Kacper Tomasz Kröber-Mulawa
Abstract:
This paper will address the topic of data ownership from an economic perspective, and examples of the economic limitations of data property rights will be provided, identified using the methods and approaches of the economic analysis of law. To build a proper background for the economic focus, a short overview of data and data ownership in the EU's legal system will be provided first. It will include a short introduction to their political and social importance and highlight relevant viewpoints. This will stress the importance of a Single Market for data, but also far-reaching regulations of data governance and privacy (including the distinction between personal and non-personal data, and between data held by public bodies and private businesses). The main discussion of this paper will build upon the briefly outlined legal basis as well as the methods and approaches of the economic analysis of law.
Keywords: antitrust, data, data ownership, digital economy, property rights
Procedia PDF Downloads 82
25435 Protecting the Cloud Computing Data Through the Data Backups
Authors: Abdullah Alsaeed
Abstract:
Virtualized computing and cloud computing infrastructures are no longer buzzwords or marketing terms. They are a core reality in today’s corporate Information Technology (IT) organizations. Hence, developing effective and efficient methodologies for data backup and data recovery is required more than ever. The purpose of data backup and recovery techniques is to assist organizations in strategizing their business continuity and disaster recovery approaches. In order to accomplish this strategic objective, a variety of mechanisms have been proposed in recent years. This research paper will explore and examine the latest techniques and solutions for providing data backup and restoration for cloud computing platforms.
Keywords: data backup, data recovery, cloud computing, business continuity, disaster recovery, cost-effective, data encryption
Procedia PDF Downloads 87
25434 Missing Link Data Estimation with Recurrent Neural Network: An Application Using Speed Data of Daegu Metropolitan Area
Authors: JaeHwan Yang, Da-Woon Jeong, Seung-Young Kho, Dong-Kyu Kim
Abstract:
In terms of ITS, information on link characteristics is an essential factor for planning and operations. But in practical cases, not every link has sensors installed on it. A link that does not have data is called a “missing link”. The purpose of this study is to impute the data of these missing links. To obtain these data, this study applies machine learning, and in particular deep learning: missing link data can be estimated from the data of the instrumented links. For the deep learning process, this study uses a recurrent neural network to handle the time-series nature of road data. As input data, Dedicated Short-Range Communications (DSRC) data from Dalgubul-daero in the Daegu Metropolitan Area were fed into the learning process. The neural network structure takes the 17 links with present data as input, has 2 hidden layers, and outputs the data of 1 missing link. As a result, the forecasted data of the target link show about 94% accuracy compared with the actual data.
Keywords: data estimation, link data, machine learning, road network
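The network described above maps the time series of 17 instrumented links to an estimate for one missing link. The sketch below shows only the shape of such a recurrent estimator, using a single randomly initialized recurrent layer for brevity (the study used two hidden layers and trained on DSRC data; neither the trained weights nor the data are reproduced here, so the output value is meaningless and only the structure is illustrated):

```python
import numpy as np

rng = np.random.default_rng(42)

# Dimensions from the abstract: 17 instrumented links in, 1 missing link out.
n_in, n_hidden, n_out, t_steps = 17, 32, 1, 12

# Randomly initialized Elman-style RNN weights (structural sketch only).
W_xh = rng.standard_normal((n_hidden, n_in)) * 0.1
W_hh = rng.standard_normal((n_hidden, n_hidden)) * 0.1
W_hy = rng.standard_normal((n_out, n_hidden)) * 0.1

def forecast_missing_link(speed_series):
    """speed_series: (t_steps, n_in) speeds of the 17 instrumented links."""
    h = np.zeros(n_hidden)
    for x in speed_series:              # unroll over time
        h = np.tanh(W_xh @ x + W_hh @ h)
    return W_hy @ h                     # estimated speed for the missing link

series = rng.uniform(30, 90, size=(t_steps, n_in))   # synthetic km/h readings
print(forecast_missing_link(series).shape)           # one value per missing link
```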
Procedia PDF Downloads 510
25433 Customer Data Analysis Model Using Business Intelligence Tools in Telecommunication Companies
Authors: Monica Lia
Abstract:
This article presents a customer data analysis model that uses business intelligence tools for data modelling, transformation, data visualization, and dynamic report building. The analysis of an economic organization's customers is based on information from the organization's transactional systems. The paper presents how to develop the data model starting from the data that companies hold inside their own operational systems. The owned data can be transformed into useful information about customers using a business intelligence tool. In a mature market, understanding the information inside the data and making forecasts for strategic decisions become more important. Business intelligence tools are used in business organizations as support for decision-making.
Keywords: customer analysis, business intelligence, data warehouse, data mining, decisions, self-service reports, interactive visual analysis, dynamic dashboards, use case diagram, process modelling, logical data model, data mart, ETL, star schema, OLAP, data universes
Procedia PDF Downloads 430
25432 The Problem of the Use of Learning Analytics in Distance Higher Education: An Analytical Study of the Open and Distance University System in Mexico
Authors: Ismene Ithai Bras-Ruiz
Abstract:
Learning Analytics (LA) is employed by universities not only as a tool but as a specialized field to support students and professors. However, not all academic programs apply LA with the same goal or use the same tools. In fact, LA is formed by five main fields of study (academic analytics, action research, educational data mining, recommender systems, and personalized systems). These fields can help not just to inform academic authorities about the situation of the program, but also to detect at-risk students, professors with needs, or general problems. At the highest level, Artificial Intelligence techniques are applied to support learning practices. LA has adopted different techniques: statistics, ethnography, data visualization, machine learning, natural language processing, and data mining. It is expected that each academic program decides which field to utilize on the basis of its academic interests, but also its capacities in terms of professors, administrators, systems, logistics, data analysts, and academic goals. The Open and Distance University System (SUAYED in Spanish) of the National Autonomous University of Mexico (UNAM) has been working for forty years as an alternative to traditional programs; one of its main supports has been the use of new information and communication technologies (ICT). Today, UNAM has one of the largest networked higher education programs, with twenty-six academic programs in different faculties. This means that every faculty works with heterogeneous populations and academic problems. In this sense, every program has developed its own learning analytics techniques to improve academic issues. In this context, an investigation was carried out to determine the state of the application of LA in the academic programs of the different faculties.
The premise of the study was that not all the faculties have utilized advanced LA techniques, and it is probable that they do not know which field of study is closest to their program goals. Consequently, not all the programs know about LA, but this does not mean they do not work with LA in a veiled or less explicit sense. It is very important to know the degree of knowledge about LA for two reasons: 1) it allows appreciation of the administration's work to improve the quality of teaching, and 2) it shows whether it is possible to adopt other LA techniques. For this purpose, three instruments were designed to determine experience and knowledge of LA. These were applied to ten faculty coordinators and their personnel; thirty members were consulted (academic secretary, systems manager or data analyst, and coordinator of the program). The final report showed that almost all the programs work with basic statistical tools and techniques; this helps the administration only to know what is happening inside the academic program, but they are not ready to move up to the next level, which means applying Artificial Intelligence or recommender systems to reach a personalized learning system. This situation is not related to knowledge of LA, but to the clarity of the long-term goals.
Keywords: academic improvements, analytical techniques, learning analytics, personnel expertise
Procedia PDF Downloads 128
25431 Understanding Patterns of Hard Coral Demographics in Kenyan Reefs to Inform Restoration
Authors: Swaleh Aboud, Mishal Gudka, David Obura
Abstract:
Background: Coral reefs are becoming increasingly vulnerable due to several threats ranging from climate change to overfishing. This has resulted in increased management and conservation efforts to protect reefs from degradation and facilitate recovery. Recruitment of new individuals is important in the recovery process and critical for the persistence of coral reef ecosystems. Local coral community structure can be influenced by successful recruit settlement, survival, and growth. Understanding coral recruitment patterns can help quantify reef resilience and connectivity, establish baselines, track changes, and evaluate the effectiveness of reef restoration and conservation efforts. This study examines the abundance and spatial pattern of coral recruits, how this relates to adult community structure, including the distribution of thermally resistant and sensitive genera, and their distribution in different management regimes. Methods: Coral recruit and demography surveys were conducted from 2020 to 2022, covering 35 sites in 19 coral reef locations along the Kenyan coast. These included marine parks, reserves, community conservation areas (CMAs), and open access areas from the north (Marereni) to the south (Kisite) coast of Kenya and across different reef habitats. The data were collected through the underwater visual census (UVC) technique. We counted adult corals (>10 cm diameter) of 23 selected genera using belt transects (25 by 1 m) and sampled 1 m2 quadrats (at intervals of 5 m) for all colonies less than 10 cm in diameter. The benthic cover data were collected using photo quadrats. The surveys were only done during the northeast monsoon season. The data were analyzed using the R program to examine distribution patterns, with the Kruskal-Wallis test used to assess whether differences were significant. Spearman correlation was also applied to assess the relationship between the distribution of coral genera in recruits and adults.
Results: A total of 44 different coral genera were recorded for recruits, ranging from 3 at Marereni to 30 at Watamu Marine Reserve. Recruit densities ranged from 1.2 ± 1.5 recruits m-2 (mean ± SD) at Likoni to 10.3 ± 8.4 recruits m-2 at Kisite Marine Park. The overall density of recruits differed significantly between reef locations, with Kisite Marine Park and Reserve and Likoni differing significantly from all the other locations, while Vuma, Watamu, Malindi, and Kilifi showed significantly smaller differences from the other locations. The recruit genera density along the Kenyan coast was divided into two clusters, one of which only included sites in Kisite Marine Park. Adult colonies were dominated by massive Porites, Acropora, Platygyra, and Favites, whereas recruits were dominated by branching Porites, massive Porites, Galaxea, and Acropora. However, correlation analysis revealed a statistically significant positive correlation (r = 0.81, p < 0.05) between recruit and adult coral densities across the 23 coral genera. Marereni, which had the lowest density of recruits, has only thermally resistant coral genera, while Kisite Marine Park, with the highest recruit densities, has over 90% thermally sensitive coral genera. A weak positive correlation was found between recruit density and coralline algae, dead standing corals, and turf algae, whereas a weak negative correlation was found between recruit density and bare substrate and macroalgae. Between management regimes, marine reserves were found to have more recruits than no-take zones (marine parks and CMAs) and open access areas, although the difference was not significant. Conclusion: There was a statistically significant difference in the density of recruits between different reef locations along the Kenyan coast.
Although the dominant genera of adults and recruits differed, there was a strong positive correlation between their coral communities, which could indicate self-recruitment processes or consistent distance seeding (of the same recruit genera). Sites such as Kisite Marine Park, with high recruit densities but dominated by thermally sensitive genera, will, on the other hand, be adversely affected by future thermal stress. This could imply that reducing threats to coral reefs, such as overfishing, could allow for their natural regeneration and recovery.
Keywords: coral recruits, coral adult size-class, coral demography, resilience
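The two statistical tests named in the methods — Kruskal-Wallis for differences in recruit density between locations and Spearman correlation between recruit and adult densities — can be run with SciPy. The data below are fabricated stand-ins shaped loosely like the reported means, not the survey data, so only the workflow is meaningful:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic recruit densities (recruits per m2) for three hypothetical locations.
likoni = rng.normal(1.2, 1.0, 30).clip(0)
watamu = rng.normal(4.0, 2.0, 30).clip(0)
kisite = rng.normal(10.3, 4.0, 30).clip(0)

# Kruskal-Wallis: do recruit densities differ between locations?
h_stat, p_kw = stats.kruskal(likoni, watamu, kisite)
print(p_kw < 0.05)   # densities clearly differ in this synthetic example

# Spearman: association between adult and recruit densities across 23 genera.
adult = np.arange(23) + rng.normal(0, 2, 23)
recruit = 0.5 * adult + rng.normal(0, 2, 23)
rho, p_sp = stats.spearmanr(adult, recruit)
print(rho > 0)       # positive association, as the study reports
```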
Procedia PDF Downloads 124
25430 Analysis of Tactile Perception of Textiles by Fingertip Skin Model
Authors: Izabela L. Ciesielska-Wrόbel
Abstract:
This paper presents finite element models of the fingertip skin which have been created to simulate the contact of textile objects with the skin, to gain a better understanding of the perception of textiles through the skin, the so-called Hand of Textiles (HoT). Many objective and subjective techniques have been developed to analyze HoT; however, none of them provide exact overall information concerning the sensation of textiles through the skin. As the human skin is a complex heterogeneous hyperelastic body composed of many particles, some simplifications had to be made at the stage of building the models. The same concerns the models of woven structures; however, their utilitarian value was maintained. The models reflect only friction between the skin and woven textiles, deformation of the skin and fabrics when “touching” textiles, and heat transfer from the surface of the skin in the direction of the textiles.
Keywords: fingertip skin models, finite element models, modelling of textiles, sensation of textiles through the skin
Procedia PDF Downloads 465
25429 Improved Mechanical and Electrical Properties and Thermal Stability of Post-Consumer Polyethylene Terephthalate Glycol Containing Hybrid System of Nanofillers
Authors: Iman Taraghi, Sandra Paszkiewicz, Daria Pawlikowska, Anna Szymczyk, Izabela Irska, Rafal Stanik, Amelia Linares, Tiberio A. Ezquerra, Elżbieta Piesowicz
Abstract:
Currently, the massive use of thermoplastic materials in industrial applications generates huge amounts of polymer waste. Poly(ethylene glycol-co-1,4-cyclohexanedimethanol terephthalate) (PET-G) has been widely used in food packaging and polymer foils. In this research, PET-G foils were recycled and reused as a matrix combined with different types of nanofillers, such as carbon nanotubes, graphene nanoplatelets, and nanosized carbon black. The mechanical and electrical properties, as well as the thermal stability and thermal conductivity, of the PET-G improved with the addition of the aforementioned nanofillers and their hybrid systems.
Keywords: polymer hybrid nanocomposites, carbon nanofillers, recycling, physical performance
Procedia PDF Downloads 136
25428 Opening up Government Datasets for Big Data Analysis to Support Policy Decisions
Authors: K. Hardy, A. Maurushat
Abstract:
Policy makers are increasingly looking to make evidence-based decisions. Evidence-based decisions have historically relied on the rigorous methodologies of empirical studies by research institutes, as well as on less reliable immediate surveys/polls, often with limited sample sizes. As we move into the era of Big Data analytics, policy makers are looking to different methodologies to deliver reliable empirics in real time. The question is no longer why people behaved a certain way for the last 10 years, but why they are behaving this way now, whether this behavior is undesirable, and how change can be promoted immediately. Big Data analytics relies heavily on government data that has been released into the public domain. The open data movement promises greater productivity and more efficient delivery of services; however, Australian government agencies remain reluctant to release their data to the general public. This paper considers the barriers to releasing government data as open data, and how these barriers might be overcome.
Keywords: big data, open data, productivity, data governance
Procedia PDF Downloads 371
25427 Airport Investment Risk Assessment under Uncertainty
Authors: Elena M. Capitanul, Carlos A. Nunes Cosenza, Walid El Moudani, Felix Mora Camino
Abstract:
The construction of a new airport or the extension of an existing one requires massive investments, and public-private partnerships have often been considered in order to make such projects feasible. One characteristic of these projects is uncertainty with respect to financial and environmental impacts over the medium to long term. Another is the multistage nature of these types of projects. While many airport development projects have been a success, some others have turned into a nightmare for their promoters. This communication puts forward a new approach to airport investment risk assessment. The approach explicitly takes into account the degree of uncertainty in activity-level predictions and proposes milestones for the different stages of the project in order to minimize risk. Uncertainty is represented through fuzzy dual theory, and risk management is performed using dynamic programming. An illustration of the proposed approach is provided.
Keywords: airports, fuzzy logic, risk, uncertainty
Procedia PDF Downloads 413
25426 Ag-Loaded WO3 Nanoplates for Photocatalytic Degradation of Sulfanilamide and Bacterial Removal under Visible Light
Authors: W. Y. Zhu, X. L. Yan, Y. Zhou
Abstract:
Sulfonamides (SAs) are extensively used antibiotics; photocatalysis is an effective way to remove SAs from water driven by solar energy. Here we used WO3 nanoplates and their Ag-loaded heterostructures as photocatalysts to investigate their photodegradation efficiency against sulfanilamide (SAM), which is the precursor of SAs. Results showed that the WO3/Ag composites performed much better than pure WO3, with the highest removal rate of 96.2% achieved under visible light irradiation. Ag, an excellent antibacterial agent, also endows WO3 with antibacterial efficiency, and 100% bacterial removal was achieved in 2 h under visible light irradiation for all WO3/Ag composites. Overall, WO3/Ag composites are very effective photocatalysts with potential for practical applications, mainly using cheap, clean, and green solar energy as the energy source.
Keywords: antibacterial, photocatalysis, semiconductor, sulfanilamide
Procedia PDF Downloads 360
25425 A Review on Existing Challenges of Data Mining and Future Research Perspectives
Authors: Hema Bhardwaj, D. Srinivasa Rao
Abstract:
Technology for analysing, processing, and extracting meaningful data from enormous and complicated datasets can be termed "big data." The techniques of big data mining and big data analysis are extremely helpful for business activities such as making decisions, building organisational plans, researching the market efficiently, and improving sales, because typical management tools cannot handle such complicated datasets. Big data raises special computational and statistical issues, such as measurement errors, noise accumulation, spurious correlation, and storage and scalability limitations. These unique problems call for new computational and statistical paradigms. This research paper offers an overview of the literature on big data mining, its process, and its problems and difficulties, with a focus on the unique characteristics of big data. Organizations face several difficulties when undertaking data mining, which has an impact on their decision-making. Every day, terabytes of data are produced, yet only around 1% of that data is actually analyzed. This study presents recently developed ideas in data mining, data analysis, and knowledge discovery techniques, together with practical application systems. The article's conclusion also includes a list of issues and difficulties for further research in the area, and the report discusses management's main big data and data mining challenges.
Keywords: big data, data mining, data analysis, knowledge discovery techniques, data mining challenges
Procedia PDF Downloads 110
25424 Latent Factors of Severity in Truck-Involved and Non-Truck-Involved Crashes on Freeways
Authors: Shin-Hyung Cho, Dong-Kyu Kim, Seung-Young Kho
Abstract:
Truck-involved crashes have higher severity than non-truck-involved crashes. There have been many studies on crash frequency and on the development of severity models, but those studies only analyzed relationships between observed variables. To identify why more people are injured or killed when trucks are involved in a crash, we must quantify the complex causal relationships between crash severity and risk factors by adopting latent factors of crashes. The aim of this study was to develop a structural equation model (SEM) of truck-involved and non-truck-involved crashes, including five latent variables: a crash factor, environment factor, road factor, driver factor, and severity factor. To clarify the unique characteristics of truck-involved crashes compared to non-truck-involved crashes, a confirmatory analysis method was used. To develop the model, we extracted crash data from 10,083 crashes on Korean freeways from 2008 through 2014. The results showed that the most significant variable affecting the severity of a crash is the crash factor, which can be expressed by the location, cause, and type of the crash. For non-truck-involved crashes, the crash and environment factors increase the severity of the crash; conversely, the road and driver factors tend to reduce it. For truck-involved crashes, the driver factor has a significant effect on crash severity, although its effect is slightly smaller than that of the crash factor. A multiple group analysis was employed to analyze the differences between the heterogeneous groups of drivers. Keywords: crash severity, structural equation modeling (SEM), truck-involved crashes, multiple group analysis, crashes on freeways
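As an illustration only, the measurement side of such an SEM can be sketched by simulating observed indicators that reflect a single latent factor; the indicator names, loadings, and sample size below are hypothetical, not taken from the study:

```python
import random

random.seed(0)

def simulate_indicators(n, loadings, noise_sd=1.0):
    """Generate observed indicators from one latent factor, following the
    confirmatory measurement model y_j = lambda_j * eta + epsilon_j."""
    data = []
    for _ in range(n):
        eta = random.gauss(0.0, 1.0)  # latent factor score (e.g. 'driver factor')
        row = [lam * eta + random.gauss(0.0, noise_sd) for lam in loadings]
        data.append(row)
    return data

# Three hypothetical indicators of a latent "driver factor"
# (e.g. age, violation history, fatigue), with assumed loadings.
sample = simulate_indicators(1000, loadings=[0.8, 0.6, 0.7])
```

Because all three indicators share the same latent driver, they come out positively correlated, which is exactly the pattern a confirmatory factor analysis tests for.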
Procedia PDF Downloads 383
25423 The Development of a Precision Irrigation System for Durian
Authors: Chatrabhuti Pipop, Visessri Supattra, Charinpanitkul Tawatchai
Abstract:
Durian is one of the top agricultural products exported by Thailand, and there is massive market potential in the durian industry. While the global demand for Thai durians, especially the demand from China, is very high, Thailand's durian supply is far from satisfying that demand. Poor agricultural practices result in low yields and poor quality of fruit. Most irrigation systems currently used by farmers run on a fixed schedule or at fixed rates that ignore actual weather conditions and crop water requirements. In addition, the emerging technologies are too difficult and complex, and their prices too high, for farmers to adopt and afford, so many farmers leave their durian trees to grow naturally. With improper irrigation and nutrient management, durian trees are vulnerable to a variety of issues, including stunted growth, failure to flower, disease, and death. Technical development and research on durian are much needed to support the wellbeing of the farmers and the economic development of the country. However, there are only a limited number of studies or development projects on durian, because durian is a perennial crop requiring a long time to obtain reportable results. This study therefore aims to address the problem of durian production by developing an autonomous precision irrigation system. The system is designed and equipped with an industrial programmable controller, a weather station, and a digital flow meter. Daily water requirements are computed based on weather data such as rainfall and evapotranspiration for daily irrigation with variable flow rates. A prediction model is also included in the system to enhance the irrigation schedule. Before the system was installed in the field, a simulation model was built and tested in a laboratory setting to ensure its accuracy, and water consumption was measured daily before and after the experiment for further analysis.
With this system, the crop water requirement is precisely estimated and optimized based on data from the weather station. Durian trees are irrigated with the right amount of water at the right time, offering the opportunity for higher yield and higher income to the farmers. Keywords: durian, precision irrigation, precision agriculture, smart farm
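A minimal sketch of the daily water-balance calculation described above, assuming an FAO-style crop coefficient approach; the crop coefficient, rainfall-effectiveness factor, and canopy area are illustrative assumptions, not values from the study:

```python
def daily_irrigation_mm(et0_mm, kc, rain_mm, rain_eff=0.8):
    """Net irrigation depth (mm) for one day: crop evapotranspiration
    ETc = Kc * ET0, minus effective rainfall; never negative."""
    etc = kc * et0_mm
    return max(0.0, etc - rain_eff * rain_mm)

def irrigation_volume_litres(depth_mm, canopy_area_m2):
    """Convert an irrigation depth to a per-tree volume (1 mm over 1 m2 = 1 L)."""
    return depth_mm * canopy_area_m2

# Hypothetical day: ET0 = 5 mm from the weather station, assumed Kc = 0.9
# for a mature durian tree, 2 mm of rainfall, 20 m2 wetted canopy area.
depth = daily_irrigation_mm(5.0, 0.9, 2.0)       # 4.5 - 1.6 = 2.9 mm
volume = irrigation_volume_litres(depth, 20.0)   # about 58 L for the tree
```

On a rainy day the effective rainfall can exceed ETc, in which case the controller simply skips irrigation (depth clamps to zero).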
Procedia PDF Downloads 118
25422 A Systematic Review on Challenges in Big Data Environment
Authors: Rimmy Yadav, Anmol Preet Kaur
Abstract:
Big data has demonstrated vast potential in streamlining operations, decision-making, and spotting business trends in different fields, for example, manufacturing, finance, and information technology. This paper gives a multi-disciplinary overview of the research issues in big data and of its procedures, instruments, and systems related to privacy, data storage management, network and energy utilization, fault tolerance, and data representation. In addition, the challenges and opportunities arising on the big data platform are outlined. Keywords: big data, privacy, data management, network and energy consumption
Procedia PDF Downloads 312
25421 Habits: Theoretical Foundations and a Conceptual Framework on a Managerial Trap and Chance
Authors: K. Piórkowska
Abstract:
The overarching aim of the paper is to incorporate the micro-foundations perspective into strategic management, offering possibilities to bridge the macro–micro divide; to review the concept of habits; and to propose research findings and directions for further exploring the habit construct and its impact on phenomena at a higher epistemological level (for instance, organizational routines, a domain inherently multilevel in nature). To realize this aim, the following sections have been developed: (1) habits' origins, (2) habits as cognitive constellations, (3) interrelationships between habits, mental representations, and intentions, (4) habits and organizational routines, and (5) habits' and routines' linkages with adaptation. The conclusions support recent and current studies linking the level of individual heterogeneous agents with the level of macro (organizational) outcomes. Keywords: behaviorism, habits, micro-foundations, routines
Procedia PDF Downloads 258
25420 Of Digital Games and Dignity: Rationalizing E-Sports Amidst Stereotypes Associated with Gamers
Authors: Sarthak Mohapatra, Ajith Babu, Shyam Prasad Ghosh
Abstract:
The community of gamers has long been subject to stigmatization and marginalization by the larger society, resulting in dignity erosion. India presents a unique context where e-sports have recently seen large-scale investments, a massive userbase, and appreciable demand for gaming as a career option. Yet apprehension towards gaming remains salient among parents and non-gamers, who engage in the de-dignification of gamers by advocating the discourse that video games promote violence. Even the government has been relentless in banning games over data privacy issues. The current study therefore explores the experiences of gamers and how they navigate these de-dignifying circumstances. The study follows an exploratory qualitative approach in which in-depth interviews, guided by a semi-structured questionnaire, serve as the data collection tool. A total of 25 individuals were interviewed, comprising casual gamers, professional gamers, and individuals indirectly impacted by gaming, including parents, relatives, and friends of gamers. Thematic analysis via three-level coding is used to arrive at broad themes (categories) and their sub-themes. The results indicate that the de-dignification of gamers results from attaching stereotypes of introversion, aggression, low intelligence, and low aspirations to them. It is interesting to note that the intensity of de-dignification varies and is more salient for violent shooting games, which are perceived to require few cognitive resources to master. The moral disengagement of gamers while playing violent video games becomes the basis for de-dignification. Findings reveal that circumventing de-dignification required gamers to engage in several tactics, including playing behind closed doors, consciously hiding the gamer identity, rationalizing behavior by idolizing professionals, bragging about achievements within the game, and so on.
Theoretically, the study contributes to the dignity and social identity literature by focusing on stereotyping and stigmatization. From a policy perspective, improving the legitimacy of gaming is expected to improve the social standing of gamers and professionals. For practitioners, it is important that proper channels of promotion and communication are used to educate non-gamers so that the stereotypes fade away. Keywords: dignity, social identity, stereotyping, video games
Procedia PDF Downloads 100
25419 Metal Contaminants in River Water and Human Urine after an Episode of Major Pollution by Mining Wastes in the Kasai Province of DR Congo
Authors: Remy Mpulumba Badiambile, Paul Musa Obadia, Malick Useni Mutayo, Jeef Numbi Mukanya, Patient Nkulu Banza, Tony Kayembe Kitenge, Erik Smolders, Jean-François Picron, Vincent Haufroid, Célestin Banza Lubaba Nkulu, Benoit Nemery
Abstract:
Background: In July 2021, the Tshikapa river became heavily polluted by mining wastes from a diamond mine in neighboring Angola, leading to a massive fish kill, as well as disease and even deaths among residents living along the Tshikapa and Kasai rivers, a major tributary of the Congo River. The exact nature of the pollutants was unknown. Methods: In a cross-sectional study conducted in the city of Tshikapa in August 2021, we enrolled, by opportunistic sampling, 65 residents (11 children < 16y) living alongside the polluted rivers and 65 control residents (5 children) living alongside a non-affected portion of the Kasai river (upstream from the Tshikapa-Kasai confluence). We administered a questionnaire and obtained spot urine samples for measurements of thiocyanate (a metabolite of cyanide) and 26 trace metals (by ICP-MS). Metals (and pH) were also measured in samples of river water. Results: Participants from both groups consumed river water. In the area affected by the pollution, most participants had eaten dead fish. Prevalences of reported health symptoms were higher in the exposed group than among controls: skin rashes (52% vs 0%), diarrhea (40% vs 8%), abdominal pain (8% vs 3%), nausea (3% vs 0%). In polluted water, concentrations [median (range)] were higher only for nickel [2.2(1.4–3.5)µg/L] and uranium [78(71–91)ng/L] compared with non-polluted water [0.8(0.6–1.9)µg/L; 9(7–19)ng/L]. In urine, concentrations [µg/g creatinine, median(IQR)] were significantly higher in the exposed group than in controls for lithium [19.5(12.4–27.3) vs 6.9(5.9–12.1)], thallium [0.41(0.31–0.57) vs 0.19(0.16–0.39)], and uranium [0.026(0.013–0.037) vs 0.012(0.006–0.024)]. Other elements did not differ between the groups, but levels were higher than reference values for several metals (including manganese, cobalt, nickel, and lead). Urinary thiocyanate concentrations did not differ.
Conclusion: Following an ecological disaster in the DRC, this study has documented contamination of river water by nickel and uranium and elevated urinary levels of some trace metals among the affected riverine populations. However, the exact cause of the massive fish kill and of the disease among residents remains elusive. The capacity to rapidly investigate toxic pollution events in the area must be increased. Keywords: metal contaminants, river water and human urine, pollution by mining wastes, DR Congo
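For illustration, the median (IQR) summaries reported above can be computed as follows; the individual urinary lithium values are hypothetical, chosen only to reproduce medians of the reported magnitude:

```python
import statistics

def median_iqr(values):
    """Median and interquartile range (Q1, Q3), the summary used for the
    urinary metal concentrations in the abstract."""
    q = statistics.quantiles(values, n=4, method="inclusive")
    return statistics.median(values), (q[0], q[2])

# Hypothetical urinary lithium values (µg/g creatinine) illustrating the kind
# of exposed-vs-control contrast reported (medians ~19.5 vs ~6.9).
exposed = [12.4, 15.0, 18.0, 19.5, 22.0, 25.0, 27.3]
control = [5.9, 6.2, 6.7, 6.9, 7.5, 10.0, 12.1]
m_exp, iqr_exp = median_iqr(exposed)
m_ctl, iqr_ctl = median_iqr(control)
```

Because urinary metal data are typically skewed, a non-parametric test (e.g. Mann-Whitney U) on such values, rather than a t-test on means, is the usual way to assess the exposed-vs-control difference.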
Procedia PDF Downloads 157
25418 Survey on Big Data Stream Classification by Decision Tree
Authors: Mansoureh Ghiasabadi Farahani, Samira Kalantary, Sara Taghi-Pour, Mahboubeh Shamsi
Abstract:
Nowadays, the development of computer technology and its recent applications provides access to new types of data that had not been considered by traditional data analysts. Two particularly interesting characteristics of such data sets are their huge size and streaming nature. Incremental learning techniques have been used extensively to address the data stream classification problem. This paper presents a concise survey of the obstacles and requirements in classifying data streams using decision trees. The most important issue is to maintain a balance between accuracy and efficiency: the algorithm should provide good classification performance with a reasonable response time. Keywords: big data, data streams, classification, decision tree
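One widely used way to balance accuracy and efficiency in streaming decision trees (Hoeffding/VFDT-style learners, one family such a survey covers) is the Hoeffding bound, sketched below; the numeric defaults are illustrative:

```python
import math

def hoeffding_bound(value_range, delta, n):
    """Hoeffding bound epsilon = sqrt(R^2 * ln(1/delta) / (2n)): after n
    stream examples, the true mean of a statistic with range R lies within
    epsilon of the observed mean with probability at least 1 - delta."""
    return math.sqrt(value_range ** 2 * math.log(1.0 / delta) / (2.0 * n))

def can_split(gain_best, gain_second, value_range=1.0, delta=1e-6, n=1000):
    """A streaming tree splits a node only once the observed information-gain
    advantage of the best attribute exceeds epsilon, so each example is
    inspected once and never stored."""
    return (gain_best - gain_second) > hoeffding_bound(value_range, delta, n)
```

Waiting until the gain gap exceeds epsilon is what lets the tree process an unbounded stream in one pass with statistical guarantees, instead of buffering the data for batch learning.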
Procedia PDF Downloads 521
25417 Robust and Dedicated Hybrid Cloud Approach for Secure Authorized Deduplication
Authors: Aishwarya Shekhar, Himanshu Sharma
Abstract:
Data deduplication is one of the most important data compression techniques for eliminating duplicate copies of repeating data, and it has been widely used in cloud storage to reduce the amount of storage space and save bandwidth. In this process, duplicate data are expunged, leaving only a single instance of the data to be stored, although an index of every piece of data is still maintained. Data deduplication is thus an approach for minimizing the storage space an organization requires to retain its data. In most companies, the storage systems hold identical copies of numerous pieces of data; deduplication eliminates these extra copies by saving just one copy of the data and replacing the others with pointers that lead back to the primary copy. To avoid this duplication of data and to preserve confidentiality in the cloud, we apply the concept of a hybrid cloud, a fusion of at least one public and one private cloud. As a proof of concept, we implement a Java program that provides security as well as removes all types of duplicated data from the cloud. Keywords: confidentiality, deduplication, data compression, hybridity of cloud
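A minimal sketch of the single-instance storage described above: each unique blob is kept once under its content hash, and logical file names are pointers back to the primary copy. This is an illustrative Python sketch, not the authors' Java implementation, and it omits the hybrid-cloud confidentiality layer:

```python
import hashlib

class DedupStore:
    """Single-instance store: one physical copy per unique blob, keyed by its
    SHA-256 digest; logical files are just pointers to that copy."""

    def __init__(self):
        self.blobs = {}   # digest -> data (primary copies)
        self.index = {}   # filename -> digest (pointer back to the copy)

    def put(self, name, data):
        digest = hashlib.sha256(data).hexdigest()
        self.blobs.setdefault(digest, data)  # store only if never seen before
        self.index[name] = digest
        return digest

    def get(self, name):
        return self.blobs[self.index[name]]

store = DedupStore()
store.put("report_v1.txt", b"quarterly numbers")
store.put("report_copy.txt", b"quarterly numbers")  # duplicate: no new blob
store.put("notes.txt", b"meeting notes")
# Two physical copies now back three logical files.
```

In a secure-deduplication scheme the plaintext would additionally be encrypted (typically with convergent encryption, so identical plaintexts still deduplicate) before being handed to the public cloud.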
Procedia PDF Downloads 383
25416 Electrochemical Properties of Bimetallic Silver-Platinum Core-Shell Nanoparticles
Authors: Fredrick O. Okumu, Mangaka C. Matoetoe
Abstract:
Silver-platinum (Ag-Pt) bimetallic nanoparticles (NPs) with varying mole fractions (1:1, 1:3 and 3:1) were prepared by co-reduction of hexachloroplatinate and silver nitrate with sodium citrate. Upon successful formation of both the monometallic and bimetallic (BM) core-shell nanoparticles, cyclic voltammetry (CV) was used to characterize the NPs. The drop-coated nanofilms on the GC substrate showed the characteristic peaks of the monometallic Ag NPs (the Ag+/Ag0 redox couple) as well as of the Pt NPs (the hydrogen adsorption and desorption peaks). These characteristic peaks were confirmed in the voltammograms of the bimetallic NPs. The BM NP ratios showed the following current trend: GCE/Ag-Pt 1:3 > GCE/Ag-Pt 3:1 > GCE/Ag-Pt 1:1. Fundamental electrochemical properties that directly or indirectly affect the applicability of the films, such as the diffusion coefficient (D), electroactive surface coverage, electrochemical band gap, electron transfer coefficient (α), and charge (Q), were assessed using Randles–Sevcik plots and Laviron's equations. High charge and surface coverage were observed for GCE/Ag-Pt 1:3, which supports its enhanced current. GCE/Ag-Pt 3:1 showed a high diffusion coefficient, while GCE/Ag-Pt 1:1 possessed a high electron transfer coefficient, facilitated by its high apparent heterogeneous rate constant relative to the other BM NP ratios. The surface redox reaction was determined to be adsorption-controlled in all modified GCEs. Surface coverage is inversely proportional to size; therefore, the surface coverage data suggest that the Ag-Pt 1:1 NPs have a small particle size. Overall, GCE/Ag-Pt 1:3 depicts the best electrochemical properties. Keywords: characterization, core-shell, electrochemical, nanoparticles
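As a worked illustration of the Randles–Sevcik analysis mentioned above (reversible system at 25 °C), the sketch below computes a peak current and then inverts the relation to recover D from the slope of i_p versus v^(1/2); the electrode area and concentration are hypothetical, not values from the study:

```python
def randles_sevcik_peak_current(n, area_cm2, diff_cm2_s, conc_mol_cm3, scan_v_s):
    """Peak current (A) for a reversible couple at 25 degC:
    i_p = 2.69e5 * n^(3/2) * A * D^(1/2) * C * v^(1/2)."""
    return (2.69e5 * n ** 1.5 * area_cm2 * diff_cm2_s ** 0.5
            * conc_mol_cm3 * scan_v_s ** 0.5)

def diffusion_coefficient(slope, n, area_cm2, conc_mol_cm3):
    """Invert Randles-Sevcik: D (cm2/s) from the slope of i_p vs v^(1/2)."""
    return (slope / (2.69e5 * n ** 1.5 * area_cm2 * conc_mol_cm3)) ** 2

# Hypothetical film: n = 1, A = 0.07 cm2, D = 1e-5 cm2/s, C = 1 mM, v = 0.1 V/s.
ip = randles_sevcik_peak_current(1, 0.07, 1e-5, 1e-6, 0.1)  # ~1.9e-5 A
```

In practice, D is extracted by recording i_p at several scan rates and fitting the slope of i_p against v^(1/2), which is what `diffusion_coefficient` consumes.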
Procedia PDF Downloads 269
25415 Strengthening Legal Protection of Personal Data through Technical Protection Regulation in Line with Human Rights
Authors: Tomy Prihananto, Damar Apri Sudarmadi
Abstract:
Indonesia recognizes the right to privacy as a human right and provides legal protection for data management activities because the protection of personal data is part of human rights. This paper aims to describe the arrangement of data protection and data management in Indonesia. It is descriptive research with a qualitative approach, collecting data through a literature study. The result is a comprehensive arrangement of data protection in which encryption methods have been set up as a technical requirement; the arrangements on encryption and on the protection of personal data are mutually reinforcing in protecting personal data. Indonesia has two important and immediately enacted laws that protect the privacy of information as part of human rights. Keywords: Indonesia, protection, personal data, privacy, human rights, encryption
Procedia PDF Downloads 183
25414 A Survey on Protests Against Compulsory Hejab in Iran From Iranian Women's Point of View After Mahsa Amini's Death: A Grounded Theory Approach
Authors: Shirin Arefi
Abstract:
In Iran, women and girls are treated as second-class citizens and suffer from many forms of discrimination and inequality, such as the compulsory hejab, a policy that has required all women to wear the hijab head-covering since the 1979 Islamic revolution. Now, the new government's crackdown has caused a massive uproar in the country. The morality police also continue to curb the choices of women, and the latest unfortunate incidents have accelerated the hardening of the rules. The author surveys the views of women on the compulsory hejab and on arrests by the morality and chastity police. The methodology is qualitative: the women's narratives are coded based on grounded theory, and the horizons of the process are explained through phenomenological research. The findings and results show the current attitudes of women toward the hejab and their reactions against the behavior of the morality police. Keywords: compulsory hejab, morality police, people, arrest
Procedia PDF Downloads 111
25413 Modeling Engagement with Multimodal Multisensor Data: The Continuous Performance Test as an Objective Tool to Track Flow
Authors: Mohammad H. Taheri, David J. Brown, Nasser Sherkat
Abstract:
Engagement is one of the most important factors in determining successful outcomes and deep learning in students. Existing approaches to detecting student engagement involve periodic human observations that are subject to inter-rater reliability. Our solution uses real-time multimodal multisensor data, labeled by objective performance outcomes, to infer the engagement of students. The study involved four students with a combined diagnosis of cerebral palsy and a learning disability who took part in a 3-month trial over 59 sessions. Multimodal multisensor data were collected while they participated in a continuous performance test: eye gaze, electroencephalogram, body pose, and interaction data were used to create a model of student engagement through objective labeling from the continuous performance test outcomes. To achieve this, a new type of continuous performance test, the Seek-X type, is introduced. Nine features were extracted, including high-level handpicked compound features. Using leave-one-out cross-validation, a series of different machine learning approaches was evaluated. Overall, the random forest classifier achieved the best results: 93.3% classification accuracy for engagement and 42.9% for disengagement. We compared these results to outcomes from different models: AdaBoost, decision tree, k-nearest neighbor, naïve Bayes, neural network, and support vector machine. We showed that the multisensor approach achieved higher accuracy than features from any reduced set of sensors, and that using high-level handpicked features can improve the classification accuracy in every sensor mode. Our approach is robust to both sensor dropout and occlusion, and the single most important sensor feature for classifying engagement and distraction was shown to be eye gaze.
It has been shown that we can accurately predict the level of engagement of students with learning disabilities in a real-time approach that is not subject to inter-rater reliability, does not require human observation, and is not reliant on a single mode of sensor input. This will help teachers design interventions for a heterogeneous group of students, where teachers cannot possibly attend to each student's individual needs. Our approach can be used to identify those with the greatest learning challenges so that all students are supported to reach their full potential. Keywords: affective computing in education, affect detection, continuous performance test, engagement, flow, HCI, interaction, learning disabilities, machine learning, multimodal, multisensor, physiological sensors, student engagement
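To illustrate the leave-one-out protocol used in evaluations like this one (with a simple 1-nearest-neighbour stand-in rather than the paper's random forest), here is a minimal sketch on hypothetical session features:

```python
def nearest_neighbour_predict(train, query):
    """Predict with 1-nearest-neighbour on squared Euclidean distance."""
    best = min(train,
               key=lambda ex: sum((a - b) ** 2 for a, b in zip(ex[0], query)))
    return best[1]

def leave_one_out_accuracy(dataset):
    """Leave-one-out cross-validation: each sample is held out once and
    predicted from all the others, as is common in small-N studies."""
    hits = 0
    for i, (features, label) in enumerate(dataset):
        train = dataset[:i] + dataset[i + 1:]
        hits += nearest_neighbour_predict(train, features) == label
    return hits / len(dataset)

# Hypothetical sessions: (gaze-on-task fraction, EEG engagement index) -> engaged?
sessions = [((0.9, 0.8), 1), ((0.85, 0.75), 1), ((0.8, 0.9), 1),
            ((0.2, 0.3), 0), ((0.25, 0.2), 0), ((0.3, 0.35), 0)]
accuracy = leave_one_out_accuracy(sessions)
```

With only 59 sessions per study, leave-one-out makes maximal use of the data: every session serves as a test case exactly once while the rest train the model.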
Procedia PDF Downloads 94