Search results for: earth observation data cube
25643 Machine Learning Analysis of Student Success in Introductory Calculus Based Physics I Course
Authors: Chandra Prayaga, Aaron Wade, Lakshmi Prayaga, Gopi Shankar Mallu
Abstract:
This paper presents the use of machine learning algorithms to predict the success of students in an introductory physics course. A dataset of 140 rows pertaining to the performance of two batches of students was used. The lack of sufficient data to train robust machine learning models was compensated for by generating synthetic data similar to the real data. CTGAN and CTGAN with Gaussian Copula (Gaussian) were used to generate synthetic data, with the real data as input. To check the similarity between the real data and each synthetic dataset, pair plots were made. The synthetic data were used to train machine learning models using the PyCaret package. For the CTGAN data, the Ada Boost Classifier (ADA) was found to be the best-fitting ML model, whereas the CTGAN with Gaussian Copula yielded Logistic Regression (LR) as the best model. Both models were then tested for accuracy with the real data. ROC-AUC analysis was performed for all ten classes of the target variable (Grades A, A-, B+, B, B-, C+, C, C-, D, F). The ADA model with CTGAN data showed a mean AUC score of 0.4377, while the LR model with the Gaussian data showed a mean AUC score of 0.6149. ROC-AUC plots were obtained for each Grade value separately. The LR model with Gaussian data showed consistently better AUC scores than the ADA model with CTGAN data, except for two Grade values, C- and A-.
Keywords: machine learning, student success, physics course, grades, synthetic data, CTGAN, Gaussian copula CTGAN
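The per-grade and mean one-vs-rest ROC-AUC analysis described above can be sketched in pure Python via the Mann-Whitney rank statistic; the grade labels and per-class scores below are illustrative stand-ins, not the authors' data or their PyCaret pipeline.

```python
# Sketch: one-vs-rest ROC-AUC per class and the mean AUC across classes.
# Labels and scores are hypothetical examples, not the study's dataset.

def auc_ovr(labels, scores, positive):
    """AUC for one class vs. the rest via the Mann-Whitney U statistic."""
    pos = [s for l, s in zip(labels, scores) if l == positive]
    neg = [s for l, s in zip(labels, scores) if l != positive]
    if not pos or not neg:
        return None  # AUC undefined when a class is absent
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def mean_auc(labels, score_per_class):
    """Average the defined one-vs-rest AUCs, as in a mean multiclass AUC."""
    aucs = [auc_ovr(labels, score_per_class[c], c) for c in score_per_class]
    aucs = [a for a in aucs if a is not None]
    return sum(aucs) / len(aucs)

if __name__ == "__main__":
    labels = ["A", "B", "A", "F", "B", "F"]
    scores = {  # hypothetical per-class probability scores from a classifier
        "A": [0.9, 0.2, 0.8, 0.1, 0.3, 0.2],
        "B": [0.05, 0.7, 0.1, 0.2, 0.6, 0.3],
        "F": [0.05, 0.1, 0.1, 0.7, 0.1, 0.5],
    }
    print(round(auc_ovr(labels, scores["A"], "A"), 3))
    print(round(mean_auc(labels, scores), 3))
```

A mean AUC of 0.5 corresponds to chance-level ranking, which puts the reported 0.4377 (below chance) and 0.6149 scores in context.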
Procedia PDF Downloads 44
25642 Volatile Organic Compounds (VOCs) Destruction by Catalytic Oxidation for Environmental Applications
Authors: Mohammed Nasir Kajama, Ngozi Claribelle Nwogu, Edward Gobina
Abstract:
Pt/γ-Al2O3 membrane catalysts were prepared via an evaporative-crystallization deposition method. After characterization (SEM-EDAX observation, BET measurement, permeability assessment), the activity of the Pt/γ-Al2O3 catalyst was tested in the catalytic oxidation of a selected volatile organic compound (VOC), propane, fed in a mixture with oxygen. The VOC conversion (nearly 90%) obtained by varying the operating temperature showed that the flow-through membrane reactor may perform well in the abatement of VOCs.
Keywords: VOC combustion, flow-through membrane reactor, platinum-supported alumina catalysts
Procedia PDF Downloads 544
25641 Exploring How Online Applications Help Students to Learn Music Virtually: A Study in an Australian Music Academy
Authors: Ali Shah
Abstract:
This paper outlines the case study experience of using a variety of online strategies in an Australian music academy during the COVID-19 period. The study aimed to explore how online applications help students learn music virtually, specifically playing musical instruments, composing songs, and performing. To explore this, music teachers' perceptions and experiences regarding online learning, the teaching strategies they implemented, and the challenges they faced were examined. A qualitative research design was adopted using several data collection tools: pre- and post-research individual interviews with teachers and students, analysis of their lesson plans, virtual classroom observations of the teachers followed by the researcher's own reflections, post-observation discussions, and teachers' reflective journals. The findings revealed that teachers had a theoretical understanding of virtual learning, of recent musical applications such as Flowkey, Skoove, and Piano Marvel, and of the benefits of e-learning. While teachers faced challenges in implementing strategies to teach keyboard/piano online, overall both students and teachers felt the positive impact of online applications and strategies on their learning and felt that modern technology made it possible for anyone to take music lessons at home.
Keywords: music, keyboard, piano, online learning, virtual learning
Procedia PDF Downloads 75
25640 Data Access, AI Intensity, and Scale Advantages
Authors: Chuping Lo
Abstract:
This paper presents a simple model demonstrating that, ceteris paribus, countries with lower barriers to accessing global data tend to earn higher incomes than other countries. Therefore, large countries that inherently have greater data resources tend to have higher incomes than smaller countries, such that the former may be more hesitant than the latter to liberalize cross-border data flows in order to maintain this advantage. Furthermore, countries with higher artificial intelligence (AI) intensity in production technologies tend to benefit more from economies of scale in data aggregation, leading to higher income and more trade, as they are better able to utilize global data.
Keywords: digital intensity, digital divide, international trade, economies of scale
Procedia PDF Downloads 68
25639 Secured Transmission and Reserving Space in Images Before Encryption to Embed Data
Authors: G. R. Navaneesh, E. Nagarajan, C. H. Rajam Raju
Abstract:
Nowadays, multimedia data are used to store secure information. All previous methods allocate space in an image for data embedding after encryption. In this paper, we propose a novel method that reserves space in an image, with a surrounding boundary, before encryption with a traditional RDH algorithm, which makes it easy for the data hider to reversibly embed data in the encrypted images. The proposed method achieves real-time performance; that is, data extraction and image recovery are free of any error. A secure transmission process is also discussed, which improves efficiency by ten times compared to the other processes discussed.
Keywords: secure communication, reserving room before encryption, least significant bits, image encryption, reversible data hiding
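The least-significant-bit embedding named in the keywords can be illustrated with a minimal sketch: a payload is written into the LSBs of 8-bit pixel values and later extracted without error. This toy is only the LSB mechanism, not the authors' reserving-room-before-encryption RDH scheme.

```python
# Minimal LSB data-embedding sketch. Pixels are plain ints in 0..255;
# the "image" and payload are illustrative, not from the paper.

def embed(pixels, payload):
    """Write each bit of `payload` (bytes) into the LSB of one pixel."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("payload too large for cover image")
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & 0xFE) | b  # overwrite the LSB only
    return out

def extract(pixels, n_bytes):
    """Read n_bytes back out of the pixel LSBs, error-free."""
    data = bytearray()
    for j in range(n_bytes):
        byte = 0
        for i in range(8):
            byte |= (pixels[j * 8 + i] & 1) << i
        data.append(byte)
    return bytes(data)

if __name__ == "__main__":
    cover = [120, 121, 130, 131] * 8   # 32 pixels -> room for 4 bytes
    stego = embed(cover, b"hi")
    print(extract(stego, 2))
```

Note that each pixel changes by at most 1 gray level, which is why LSB embedding is visually imperceptible; full reversibility of the cover image additionally requires the room-reserving step the abstract describes.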
Procedia PDF Downloads 412
25638 Identity Verification Using k-NN Classifiers and Autistic Genetic Data
Authors: Fuad M. Alkoot
Abstract:
DNA data have been used in forensics for decades. However, current research looks at using DNA as a biometric identity verification modality, with the goal of improving the speed of identification. We aim at using gene data that was initially collected for autism detection to find whether, and how accurately, this data can serve identification applications. Mainly, our goal is to find whether our data preprocessing technique yields data useful as a biometric identification tool. We experiment with using the nearest neighbor classifier to identify subjects. Results show that the optimal classification rate is achieved when the test set is corrupted by normally distributed noise with zero mean and a standard deviation of 1. The classification rate remains close to optimal at noise standard deviations up to 3. This shows that the data can be used for identity verification with high accuracy using a simple classifier such as the k-nearest neighbor (k-NN).
Keywords: biometrics, genetic data, identity verification, k-nearest neighbor
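The shape of the experiment above, corrupting test vectors with zero-mean Gaussian noise of a chosen standard deviation and classifying with a plain k-NN, can be sketched as follows; the two-class toy data are stand-ins, not the autism gene dataset.

```python
# Sketch: k-NN identification accuracy under additive Gaussian noise.
# Training and test vectors below are hypothetical examples.
import random

def knn_predict(train, query, k=1):
    """train: list of (vector, label); returns majority label of k nearest."""
    nearest = sorted(train, key=lambda tl: sum((a - b) ** 2
                     for a, b in zip(tl[0], query)))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

def accuracy_under_noise(train, tests, sigma, seed=0):
    """Corrupt each test vector with N(0, sigma) noise, then classify."""
    rng = random.Random(seed)
    hits = 0
    for vec, label in tests:
        noisy = [x + rng.gauss(0.0, sigma) for x in vec]
        hits += (knn_predict(train, noisy) == label)
    return hits / len(tests)

if __name__ == "__main__":
    train = [([0.0, 0.0], "A"), ([10.0, 10.0], "B")]
    tests = [([0.5, -0.5], "A"), ([9.5, 10.5], "B")]
    for sigma in (0.0, 1.0, 3.0):
        print(sigma, accuracy_under_noise(train, tests, sigma))
```

With well-separated classes, accuracy stays high even at sigma = 3, mirroring the abstract's finding that the classification rate remains near-optimal up to that noise level.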
Procedia PDF Downloads 258
25637 Landslide Vulnerability Assessment in the Context of the Indian Himalaya
Authors: Neha Gupta
Abstract:
Landslide vulnerability is considered a crucial parameter for the assessment of landslide risk. Vulnerability is defined as the damage or degree of elements at risk along different dimensions, i.e., physical, social, economic, and environmental. The Himalaya region is very prone to multi-hazards such as floods, forest fires, earthquakes, and landslides. Increasing fatality rates and losses of infrastructure and economy due to landslides in the Himalaya region motivate the assessment of vulnerability. This study presents a methodology to measure a combination of vulnerability dimensions, i.e., social vulnerability, physical vulnerability, and environmental vulnerability, in one framework. Such a combined assessment has rarely been carried out, and no such approach had been applied in the Indian scenario. The methodology was applied in an area of the east Sikkim Himalaya, India. The physical vulnerability comprises a building footprint layer extracted from remote sensing data and Google Earth imagery. The social vulnerability was assessed using population density based on land use. The land use map was derived from a high-resolution satellite image, and for environmental vulnerability, NDVI, forest, agricultural land, and distance from the river were assessed from remote sensing and a DEM. The classes of social, physical, and environmental vulnerability were normalized on a scale of 0 (no loss) to 1 (loss) to obtain a homogeneous dataset. Multi-Criteria Analysis (MCA) was then used to assign individual weights to each dimension and integrate them into one frame. The final vulnerability was further classified into four classes, from very low to very high.
Keywords: landslide, multi-criteria analysis, MCA, physical vulnerability, social vulnerability
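The normalize-weight-combine-classify step described above can be sketched as follows; the layer values, weights, and class boundaries are illustrative assumptions, not the study's MCA weights.

```python
# Sketch: min-max normalize each vulnerability layer to [0, 1], combine
# with MCA-style weights, then bin into four classes. All numbers are
# hypothetical examples.

def minmax(values):
    """Scale a layer to [0, 1]; a constant layer maps to 0."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def combine(layers, weights):
    """layers: name -> list of normalized cell values; weights sum to 1."""
    n = len(next(iter(layers.values())))
    return [sum(weights[k] * layers[k][i] for k in layers) for i in range(n)]

def classify(v):
    """Bin a combined score in [0, 1] into four vulnerability classes."""
    bins = ["very low", "low", "high", "very high"]
    return bins[min(int(v * 4), 3)]

if __name__ == "__main__":
    social = minmax([10, 40, 90])        # e.g. population density per cell
    physical = minmax([0.2, 0.5, 0.9])   # e.g. building footprint share
    environ = minmax([0.1, 0.3, 0.8])    # e.g. distance-to-river score
    weights = {"social": 0.4, "physical": 0.35, "environ": 0.25}  # assumed
    final = combine({"social": social, "physical": physical,
                     "environ": environ}, weights)
    print([classify(v) for v in final])
```

In practice the weights would come from the MCA (e.g., pairwise comparison of the three dimensions) rather than being fixed by hand as here.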
Procedia PDF Downloads 301
25636 High-Resolution Spatiotemporal Retrievals of Aerosol Optical Depth from Geostationary Satellite Using the SARA Algorithm
Authors: Muhammad Bilal, Zhongfeng Qiu
Abstract:
Aerosols, suspended particles in the atmosphere, play an important role in the Earth's energy budget, climate change, degradation of atmospheric visibility, urban air quality, and human health. To fully understand aerosol effects, retrieval of aerosol optical properties such as aerosol optical depth (AOD) at high spatiotemporal resolution is required. Therefore, in the present study, hourly AOD observations at 500 m resolution were retrieved from the Geostationary Ocean Color Imager (GOCI) using the Simplified Aerosol Retrieval Algorithm (SARA) over the urban area of Beijing for the year 2016. The SARA requires top-of-the-atmosphere (TOA) reflectance, solar and sensor geometry information, and surface reflectance observations to retrieve an accurate AOD. For validation of the GOCI-retrieved AOD, AOD measurements were obtained from the Aerosol Robotic Network (AERONET) version 3 level 2.0 (cloud-screened and quality-assured) data. The errors and uncertainties were reported using the root mean square error (RMSE), relative percent mean error (RPME), and the expected error (EE = ±(0.05 + 0.15·AOD)). Results showed that the high-spatiotemporal-resolution GOCI AOD observations were well correlated with the AERONET AOD measurements, with a correlation coefficient (R) of 0.92, an RMSE of 0.07, and an RPME of 5%, and 90% of the observations fell within the EE. The results suggest that the SARA is robust and able to retrieve high-resolution spatiotemporal AOD observations over urban areas using geostationary satellites.
Keywords: AERONET, AOD, SARA, GOCI, Beijing
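The validation metrics named above, RMSE and the fraction of retrievals falling within the expected-error envelope EE = ±(0.05 + 0.15·AOD), can be sketched as follows; the paired AOD values are illustrative, not the study's GOCI/AERONET matchups.

```python
# Sketch: RMSE and within-expected-error fraction for AOD validation.
# `reference` plays the role of AERONET AOD; values are hypothetical.
import math

def rmse(retrieved, reference):
    """Root mean square error between retrieved and reference AOD."""
    return math.sqrt(sum((r - t) ** 2
                     for r, t in zip(retrieved, reference)) / len(reference))

def within_ee(retrieved, reference):
    """Fraction of retrievals with |error| <= 0.05 + 0.15 * reference AOD."""
    hits = sum(1 for r, t in zip(retrieved, reference)
               if abs(r - t) <= 0.05 + 0.15 * t)
    return hits / len(reference)

if __name__ == "__main__":
    aeronet = [0.10, 0.30, 0.55, 0.80]   # reference AOD
    goci = [0.12, 0.26, 0.60, 1.20]      # retrieved AOD (one outlier)
    print(round(rmse(goci, aeronet), 3))
    print(round(within_ee(goci, aeronet), 2))  # 0.75: 3 of 4 within EE
```

The EE envelope widens with AOD, so a fixed absolute error counts as "within EE" for hazy scenes but not for clean ones.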
Procedia PDF Downloads 171
25635 A Review on Intelligent Systems for Geoscience
Authors: R. Palson Kennedy, P. Kiran Sai
Abstract:
This article introduces machine learning (ML) researchers to the hurdles that geoscience problems present, as well as the opportunities for improvement in both ML and the geosciences, and presents a review from the data life cycle perspective to meet that need. Numerous facets of the geosciences present unique difficulties for the study of intelligent systems: geoscience data are notoriously difficult to analyze, since they are frequently unpredictable, intermittent, sparse, multi-resolution, and multi-scale. The first half addresses data science's essential concepts and theoretical underpinnings, while the second section covers key themes and shared experiences from current publications focused on each stage of the data life cycle. Finally, themes such as open science, smart data, and team science are considered.
Keywords: data science, intelligent system, machine learning, big data, data life cycle, recent development, geoscience
Procedia PDF Downloads 135
25634 Integrating Virtual Reality and Building Information Model-Based Quantity Takeoffs for Supporting Construction Management
Authors: Chin-Yu Lin, Kun-Chi Wang, Shih-Hsu Wang, Wei-Chih Wang
Abstract:
A construction superintendent needs to know not only the quantities of cost items or materials completed each day, in order to develop a daily report or calculate the daily progress (earned value), but also the quantities of materials (e.g., reinforced steel and concrete) to be ordered (or moved onto the jobsite) for performing the in-progress or ready-to-start construction activities (e.g., erection of reinforced steel and concrete pouring). These daily construction management tasks require great effort to extract accurate quantities in a short time (usually right before getting off work every day). As a result, most superintendents can only provide these quantity data based on either what they see on the site (high inaccuracy) or the extraction of quantities from two-dimensional (2D) construction drawings (high time consumption). Hence, the current practice of reporting the quantities completed each day needs improvement in both accuracy and efficiency. Recently, three-dimensional (3D) building information model (BIM) techniques have been widely applied to support the construction quantity takeoff (QTO) process, and virtual reality (VR) makes it possible to view a building from a first-person viewpoint. Thus, this study proposes an innovative system integrating VR (using 'Unity') and BIM (using 'Revit') to extract quantities to support the above daily construction management tasks. The use of VR allows a system user to be present in a virtual building to more objectively assess the construction progress from the office. This VR- and BIM-based system is also supported by an integrated database (consisting of the information and data associated with the BIM model, QTO, and costs). Each day, a superintendent can walk through the BIM-based virtual building to quickly identify (via a developed VR shooting function) the building components (or objects) that are in progress or finished on the jobsite.
The superintendent then specifies a percentage (e.g., 20%, 50%, or 100%) of completion for each identified building object based on observation of the jobsite. Next, the system generates that day's completed quantities by multiplying the specified percentage by the full quantities of the cost items (or materials) associated with the identified object. A building construction project located in northern Taiwan is used as a case study to test the benefits (i.e., accuracy and efficiency) of the proposed system in quantity extraction for supporting the development of daily reports and the ordering of construction materials.
Keywords: building information model, construction management, quantity takeoffs, virtual reality
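The daily-quantity step described above, multiplying each object's reported completion percentage by its full quantity takeoff, can be sketched as follows; the object names and quantities are hypothetical, not from the case-study project.

```python
# Sketch: completed quantity per material = completion fraction reported by
# the superintendent x the object's full BIM quantity takeoff (QTO).
# Objects, materials, and quantities below are illustrative.

def daily_quantities(qto, progress):
    """qto: object -> {material: full quantity};
    progress: object -> completion fraction (0.0 to 1.0) reported today."""
    totals = {}
    for obj, fraction in progress.items():
        for material, qty in qto.get(obj, {}).items():
            totals[material] = totals.get(material, 0.0) + fraction * qty
    return totals

if __name__ == "__main__":
    qto = {
        "column_C1": {"rebar_kg": 400.0, "concrete_m3": 2.5},
        "slab_S2": {"rebar_kg": 1200.0, "concrete_m3": 15.0},
    }
    progress = {"column_C1": 1.0, "slab_S2": 0.5}  # 100% and 50% complete
    print(daily_quantities(qto, progress))
```

The same totals, compared against the project budget, would feed the earned-value calculation the abstract mentions.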
Procedia PDF Downloads 132
25633 Density Measurement of Underexpanded Jet Using Stripe Patterned Background Oriented Schlieren Method
Authors: Shinsuke Udagawa, Masato Yamagishi, Masanori Ota
Abstract:
The Schlieren method, which has been conventionally used to visualize high-speed flows, has disadvantages such as the complexity of the experimental setup and the inability to quantitatively analyze the amount of refraction of light. The Background Oriented Schlieren (BOS) method proposed by Meier is one measurement method that solves these problems. Like the Schlieren method, the BOS method exploits the refraction of light, but it is characterized by using a digital camera to capture images of a background behind the observation area. The images are later analyzed by a computer to quantitatively detect the amount of shift of the background image. The experimental setup for BOS does not require the concave mirrors, pinholes, or color filters necessary in the conventional Schlieren method, thus simplifying the setup. However, the BOS method causes defocusing of the observation results, since focusing the camera on the background image defocuses the observed object. The defocusing of the object increases with the distance between the background and the object; on the other hand, so does the sensitivity. Therefore, this distance must be chosen appropriately for the experiment, considering the trade-off between defocus and sensitivity. The purpose of this study is to experimentally clarify the effect of defocus on density field reconstruction. In this study, visualization experiments of an underexpanded jet were performed using a BOS measurement system we constructed with a Ronchi ruling as the background. The reservoir pressure of the jet and the distance between the camera and the jet axis were fixed, and the distance between the background and the jet axis was varied as the parameter.
The images were later analyzed on a personal computer to quantitatively detect the amount of shift of the background image by comparing the background pattern with the captured image of the underexpanded jet. The quantitatively measured shifts were reconstructed into a density field using the Abel transformation and the Gladstone-Dale equation. From the experimental results, it is found that the reconstructed density image becomes more blurred, and noise decreases, with increasing distance between the background and the axis of the underexpanded jet. Consequently, it is clarified that, at least in this experimental setup, the sensitivity constant should be greater than 20 and the circle of confusion diameter should be less than 2.7 mm.
Keywords: BOS method, underexpanded jet, Abel transformation, density field visualization
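The final Gladstone-Dale step of the reconstruction, converting a refractive-index field (obtained here after Abel inversion of the measured shifts) into density, can be sketched as follows; the constant used is the approximate Gladstone-Dale constant for air at visible wavelengths, an assumed value for illustration.

```python
# Sketch: the Gladstone-Dale relation n - 1 = K * rho, used to turn a
# recovered refractive index n into gas density rho. K_AIR is an
# approximate textbook value for air, assumed here for illustration.

K_AIR = 2.26e-4  # Gladstone-Dale constant for air, m^3/kg (approximate)

def density_from_index(n, k=K_AIR):
    """Invert n - 1 = k * rho to get density in kg/m^3."""
    return (n - 1.0) / k

def index_from_density(rho, k=K_AIR):
    """Forward relation: refractive index from density."""
    return 1.0 + k * rho

if __name__ == "__main__":
    rho = density_from_index(1.000277)   # roughly sea-level air
    print(round(rho, 3))
```

Because n - 1 is tiny for gases, small errors in the measured background shift propagate strongly into density, which is why the defocus/sensitivity trade-off above matters for the reconstruction.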
Procedia PDF Downloads 78
25632 Sorghum Resilience and Sustainability under Limiting and Non-limiting Conditions of Water and Nitrogen
Authors: Muhammad Tanveer Altaf, Mehmet Bedir, Waqas Liaqat, Gönül Cömertpay, Volkan Çatalkaya, Celaluddin Barutçular, Nergiz Çoban, Ibrahim Cerit, Muhammad Azhar Nadeem, Tolga Karaköy, Faheem Shehzad Baloch
Abstract:
Food production needs to almost double by 2050 in order to feed around 9 billion people around the globe. Plant production relies heavily on fertilizers, which also play a major role in environmental pollution. In addition, climatic conditions are unpredictable, and the Earth is expected to face severe drought conditions in the future. Therefore, water and fertilizers, especially nitrogen, are considered the main constraints for future food security. To face these challenges, developing integrative approaches for germplasm characterization and selecting resilient genotypes that perform under limiting conditions is crucial for effective breeding to meet food requirements under climate change scenarios. This study is part of a European Research Area Network (ERA-NET) project for the characterization of a diversity panel of 172 sorghum accessions and six hybrids as control cultivars under limiting (+N/-H2O, -N/+H2O) and non-limiting (+N/+H2O) conditions. The study was planned to characterize sorghum diversity in relation to resource use efficiency (RUE), with special attention to harnessing the interaction between genotype and environment (GxE) from a physiological and agronomic perspective. Experiments were conducted at Adana, in a Mediterranean climate, with an augmented design, and data on various agronomic and physiological parameters were recorded. Plentiful diversity was observed in the sorghum diversity panel, and significant variations were seen between the limiting water and nitrogen conditions and the control experiment. Potential genotypes with the best performance under limiting conditions were identified. Whole-genome resequencing was performed on the entire germplasm under investigation for diversity analysis. GWAS analysis will be performed using the genotypic and phenotypic data, and linked markers will be identified.
The results of this study will show the adaptation and improvement of sorghum under climate change conditions for future food security.
Keywords: germplasm, sorghum, drought, nitrogen, resource use efficiency, sequencing
Procedia PDF Downloads 77
25631 Analysis of Fish Preservation Methods for Traditional Fishermen Boats
Authors: Kusno Kamil, Andi Asni, Sungkono
Abstract:
According to a report of the Food and Agriculture Organization (FAO), post-harvest fish losses in Indonesia reach 30 percent of the 170 trillion rupiahs of marine fisheries reserves, so the potential loss reaches 51 trillion rupiahs (data from the end of 2016). This condition arises because traditional fish catches are vulnerable to damage due to disruption of the cold chain of preservation. Physical and chemical changes in fish flesh proceed rapidly, especially when the fish is exposed to scorching heat in the middle of the sea, exacerbated by low awareness of catch hygiene; unclean catches that contain blood are often treated without special attention and mixed with freshly caught fish, thereby accelerating spoilage. This background motivates research on preservation methods for traditional fishermen's catches, aiming to find the best and most affordable method and/or combination of fish preservation methods to help fishermen increase their fishing duration without worrying that their catch will be damaged and lose economic value by the time they return to shore to sell it. This goal is expected to be achieved through experimental treatments of fresh fish catches in containers with the addition of anti-bacterial copper, a liquid smoke solution, and the use of vacuum containers. Three further treatments combined these three treatment variables with an electrically powered cooler (0-4 °C). As control specimens, untreated fresh fish (placed in the open air and in the refrigerator) were also prepared for comparison over 1, 3, and 6 days. To test the level of freshness of the fish under each treatment, physical observations were used, complemented by tests of bacterial content in a trusted laboratory. The copper (Cu) content of the fish meat (suspected of having a negative impact on consumers) was also examined on the 6th day of the experiment.
The results of physical observations on the test specimens (organoleptic method) showed that preservation assisted by coolers was still better across all treatment variables. Among the specimens without cooling, the best preservation effectiveness was achieved by the added copper plates, followed by the vacuum containers and then liquid smoke immersion. In the case of liquid smoke, soaking for 6 days of preservation made the fish meat soft and easily crumbled, although it did not develop a bad odor. The visual observations were complemented by measurements of the growth (or retardation) of putrefactive bacteria under each treatment over the same observation periods. Laboratory measurements report that the smallest amounts of putrefactive bacteria were achieved by the treatments combining the cooler with liquid smoke (sample A+), then the cooler only (D+), the copper layer inside the cooler (B+), and the vacuum container inside the cooler (C+), respectively. The other treatments, in open air, produced a hundred times more putrefactive bacteria. In addition, the copper-layer treatment raised the copper content of the preserved fresh fish more than a thousandfold compared to the initial amount, from 0.69 to 1241.68 µg/g.
Keywords: fish, preservation, traditional, fishermen, boat
Procedia PDF Downloads 69
25630 Farm Bank: The Leveraging of Capital on a Limpopo Citrus Farm
Authors: Gabriella Vermeulen
Abstract:
This paper applies a Bourdieusian lens to a Limpopo citrus farm, referred to as Malapeng, in order to understand how conflict and authority are reproduced there in the larger context of the South African agricultural industry. The South African citrus industry is an export industry, with South Africa being the second-largest exporter of citrus in the world. Agriculture in South Africa has undergone extensive liberalisation since 1994, and many historical patterns, such as the racial divide in agriculture and the exploitation of black workers, are still continuously reproduced on South African farms. This paper looks at the institution of the 'farm bank' on Malapeng, which provides loans to workers whose livelihood strategies have otherwise been limited both by the larger agricultural context they are part of and by the owner of Malapeng. By discussing the role of the farm bank in a conflict between two permanent workers, the paper illustrates how various oppositional discourses are strategically emphasised or de-emphasised at different times by the actors on Malapeng depending on their immediate goals. The farm bank proves to be a nexus of various discourses on Malapeng, as the actors all construct it in different (and often contradictory) terms in order to explain their influence and responsibility on the farm. The findings are based on data collected during fieldwork for an MA dissertation, drawing on observation and semi-structured interviews conducted in 2021.
Keywords: agriculture, South Africa, capital, labour
Procedia PDF Downloads 68
25629 Elderly Health Care Process by Community Participation: A Sub-District in the Lower Northern Region of Thailand
Authors: Amaraporn Puraya, Roongtiva Boonpracom, Somsak Thojampa, Sirikanok Klankhajhon, Kittisak Kumpeera
Abstract:
The objective of this qualitative research was to study the elderly health care process driven by community participation. Data were collected using qualitative research methods, including secondary data study, observation, in-depth interviews, and focus group discussions, and analyzed by content analysis, reflection, and review of information. The results indicate that the elderly health care process by community participation consisted of two parts: the community participation development process in elderly health care, and the outcomes of that process. The community participation development process consisted of four steps: 1) building the leadership team, an important social capital of the community, starting from searching for both formal and informal leaders, giving the opportunity for public participation, and creating clear agreements defining roles, duties, and responsibilities; 2) investigating the problems and needs of the community; 3) designing the elderly health care activities under the concept of self-care potential development of the elderly, through participation in community forums and meetings to exchange knowledge with common goals, plans, and operations; and 4) developing a sustainable health care agreement at the local level, starting from opening communication channels to create awareness and participation in various activities at both individual and group levels, as well as pushing activities/projects into the community development plan consistent with local administration policy. The outcomes of the participation development process were as follows. 1) The elderly were brought together to carry out elderly health care activities/projects in the community, managed by the elderly themselves. 2) The service system changed from passive to proactive, focusing on health promotion rather than treating diseases or illnesses.
3) Registered nurses and public health officers can provide care for the elderly with chronic illnesses through the implementation of elderly health care activities/projects, so that the elderly can access services more easily. 4) The local government organization became the main mechanism driving the elderly health care process by community participation.
Keywords: elderly health care process, community participation, elderly, Thailand
Procedia PDF Downloads 213
25628 Cosmic Radiation Hazards and Protective Strategies in Space Exploration
Authors: Mehrnaz Mostafavi, Alireza Azani, Mahtab Shabani, Fatemeh Ghafari
Abstract:
While filled with promise and wonder, space exploration also presents significant challenges, one of the foremost being the threat of cosmic radiation to astronaut health. Recent advancements in assessing these risks and developing protective strategies have shed new light on this issue. Cosmic radiation encompasses a variety of high-energy particles originating from sources like solar particle events, galactic cosmic rays, and cosmic rays from beyond the solar system. These particles, composed of protons, electrons, and heavy ions, pose a substantial threat to human health in space due to the lack of Earth's protective atmosphere and magnetic field. Researchers have made significant progress in assessing the risks associated with cosmic radiation exposure. By employing advanced dosimetry techniques and conducting biological studies, they have gained insights into how cosmic radiation affects astronauts' health, including increasing the risk of cancer and radiation sickness. This research has led to personalized risk assessment methods tailored to individual astronaut profiles. Distinctive protection strategies have been proposed to combat the dangers of cosmic radiation. These include developing spacecraft shielding materials and designs to enhance radiation protection. Additionally, researchers are exploring pharmacological interventions such as radioprotective drugs and antioxidant therapies to mitigate the biological effects of radiation exposure and preserve astronaut well-being. The findings from recent research have significant implications for the future of space exploration. By advancing our understanding of cosmic radiation risks and developing effective protection strategies, we pave the way for safer and more sustainable human missions beyond Earth's orbit. This is especially crucial for long-duration missions to destinations like Mars, where astronauts will face prolonged exposure to cosmic radiation. 
In conclusion, recent research has marked a milestone in addressing the challenges posed by cosmic radiation in space exploration. By delving into the complexities of cosmic radiation exposure and developing innovative protection strategies, scientists are ensuring the health and resilience of astronauts as they venture into the vast expanse of the cosmos. Continued research and collaboration in this area are essential for overcoming the cosmic radiation challenge and enabling humanity to embark on new frontiers of exploration and discovery in space.
Keywords: space exploration, cosmic radiation, astronaut health, risk assessment, protective strategies
Procedia PDF Downloads 78
25627 The Promoting of Early Childhood Development in Local Government Child Center
Authors: Vorapoj Promasatayaprot, Sumattana Glangkarn
Abstract:
Background: Early childhood, the first five years of life, is a time of rapid cognitive, linguistic, social, emotional, and motor development. This descriptive study's main purpose was to examine early childhood development in a local government Child Center, emphasizing citizen and community participation in the Child Development Center. Method: The study design was action research, divided into four steps: (1) planning, (2) acting, (3) observing, and (4) reflecting. The subjects, selected by purposive sampling, consisted of 10 committee members of the Child Center in Thakhonyang Municipality, Kantharawichai District, Maha Sarakham Province, Thailand, and 50 representative parents. The instruments used in this study were questionnaires. The data were analyzed using descriptive statistics: percentage, mean, standard deviation, maximum, minimum, and median. Qualitative data were collected through observation and interviews and analyzed by content analysis. Results: The promotion of early childhood development in the child center at Thakhonyang Municipality, Kantharawichai District, Maha Sarakham Province, Thailand comprised six procedures: (1) workshop participation, (2) action-plan workshop, (3) performing the action plan, (4) follow-up supervision, (5) self-assessment, and (6) a knowledge sharing seminar. Conclusion: The key to success is that the committee must respond to performance at every stage of the plan to address future issues; another success factor is community participation with transparent procedures.
The coordinating committee should manage child center benefits among stakeholders.
Keywords: child center, develop, early childhood development, local government, promote
Procedia PDF Downloads 193
25626 Data Quality as a Pillar of Data-Driven Organizations: Exploring the Benefits of Data Mesh
Authors: Marc Bachelet, Abhijit Kumar Chatterjee, José Manuel Avila
Abstract:
Data quality is a key component of any data-driven organization; without it, organizations cannot effectively make data-driven decisions, which often leads to poor business performance. It is therefore important for an organization to ensure that the data it uses is of high quality. This is where the concept of data mesh comes in. Data mesh is a decentralized organizational and architectural approach to data management that can help organizations improve the quality of their data. The concept was first introduced in 2020; its purpose is to decentralize data ownership, making it easier for domain experts to manage the data. This can improve data quality by reducing reliance on centralized data teams and allowing domain experts to take charge of their data. This paper discusses how a set of elements, including data mesh, serve as tools capable of increasing data quality. One of the key benefits of data mesh is improved metadata management. In a traditional data architecture, metadata management is typically centralized, which can lead to data silos and poor data quality. With data mesh, metadata is managed in a decentralized manner, ensuring accurate and up-to-date metadata and thereby improving data quality. Another benefit of data mesh is the clarification of roles and responsibilities. In a traditional data architecture, data teams are responsible for managing all aspects of data, which can lead to confusion and ambiguity about responsibilities. With data mesh, domain experts are responsible for managing their own data, which provides clarity in roles and responsibilities and improves data quality. Additionally, data mesh can contribute to a form of organization that is more agile and adaptable.
By decentralizing data ownership, organizations can respond more quickly to changes in their business environment, which in turn can improve overall performance by allowing better insight into the business through better reports and visualization tools. Monitoring and analytics are also important aspects of data quality. With data mesh, monitoring and analytics are decentralized, allowing domain experts to monitor and analyze their own data; this helps identify and address data quality problems quickly, leading to improved data quality. Data culture is another major aspect of data quality. With data mesh, domain experts are encouraged to take ownership of their data, which can help create a data-driven culture within the organization, leading to improved data quality and better business outcomes. Finally, the paper explores the contribution of AI in the coming years. AI can help enhance data quality by automating many data-related tasks, such as data cleaning and data validation; by integrating AI into a data mesh, organizations can further enhance the quality of their data. The concepts above are illustrated by experience feedback from AEKIDEN, an international data-driven consultancy that has successfully implemented a data mesh approach. By sharing its experience, AEKIDEN can help other organizations understand the benefits and challenges of implementing data mesh and improving data quality.
Keywords: data culture, data-driven organization, data mesh, data quality for business success
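The domain-owned quality checks the abstract describes can be sketched as validation rules that each domain runs against its own data product; the rule names, fields and records below are hypothetical illustrations, not AEKIDEN's implementation:

```python
# Minimal sketch of domain-owned data quality rules, in the spirit of a
# data mesh: each domain registers validators for its own data product.
# All rule names, fields and records are illustrative, not from any real system.

def completeness(records, field):
    """Fraction of records where `field` is present and non-empty."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def validity(records, field, predicate):
    """Fraction of records whose `field` satisfies `predicate`."""
    ok = sum(1 for r in records if predicate(r.get(field)))
    return ok / len(records)

# The hypothetical "sales" domain monitors its own data product.
sales = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": -5.0},   # invalid: negative amount
    {"order_id": 3, "amount": None},   # incomplete: missing amount
]

report = {
    "amount_completeness": completeness(sales, "amount"),
    "amount_validity": validity(sales, "amount",
                                lambda v: isinstance(v, float) and v >= 0),
}
print(report)
```

Because the domain defines the predicates itself, the checks stay close to the experts who understand what "valid" means for their data.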
Procedia PDF Downloads 135
25625 Sustainable Desert Traditional Architecture of the Central Region of Saudi Arabia
Authors: Hisham Mortada
Abstract:
For thousands of years, mud houses have represented the practical wisdom and spirituality of people, particularly those of desert regions, who learned how to use local materials to build homes fitted to the environmental and cultural conditions in which they lived. As a case study, the central region of Saudi Arabia exhibits a tradition of earth architecture that is unique in style, culture and sustainability. Aiming to contribute to the local debate on the suitability of traditional mud architecture for the lifestyle of Saudis today, this paper explores the sustainable nature of the traditional adobe architecture of this hot arid region from environmental, social and technical points of view.
Keywords: desert architecture, alternative materials, Saudi Arabia, arid climate, green architecture
Procedia PDF Downloads 377
25624 Teaching Techno-Criticism to Digital Natives: Participatory Journalism as Pedagogical Practice
Authors: Stephen D. Caldes
Abstract:
Teaching media and digital literacy to “digital natives” presents a unique set of pedagogical obstacles, especially when critique is involved, as these early adopters tend to deify most technological and digital advancements and inventions. Knowing no other way of being, these natives are often reluctant to hear criticisms of the way they receive information, educate themselves, communicate with others, and even become enculturated, because critique often connotes generational gaps or clandestine efforts to produce neo-Luddites. To digital natives, techno-criticism is the result of an antiquated, out-of-touch agenda rather than a constructive, progressive praxis. However, the need to cultivate a techno-critical perspective among technology’s premier users has perhaps never been more pressing. In an effort to sidestep this reluctance and encourage critical thought about where we are in terms of digital technology and where exactly it may be taking us, this essay outlines a new model for teaching techno-criticism to digital natives. Specifically, it recasts the techniques of participatory journalism, helping writers and readers understand subjects outside of their specific historical context, as progressive, interdisciplinary pedagogy. The model arises out of a review of relevant literature and data gathered via literary analysis and participant observation. Given the tenuous relationships between novel digital advancements, individual identity, collective engagement, and, indeed, Truth/fact, shepherding digital natives toward routine practice of “techno-realism” seems of the utmost importance.
Keywords: digital natives, journalism education, media literacy, techno-criticism
Procedia PDF Downloads 320
25623 A Green Optically Active Hydrogen and Oxygen Generation System Employing Terrestrial and Extra-Terrestrial Ultraviolet Solar Irradiance
Authors: H. Shahid
Abstract:
Due to ozone layer depletion on Earth, incoming ultraviolet (UV) radiation is recorded at high index levels, such as 25 in southern Peru (13.5° S, 3360 m a.s.l.). UV radiation is also quite high on Mars, where human habitation is under discussion. Exposure to UV is hazardous to health and is normally avoided with UV filters. On the other hand, artificial UV sources are in use for water thermolysis to generate hydrogen and oxygen, which are later used as fuels. This paper presents the utility of employing UVA (315-400 nm) and UVB (280-315 nm) electromagnetic radiation from the solar spectrum to design and implement an optically active hydrogen and oxygen generation system via thermolysis of desalinated seawater. The proposed system finds its utility on Earth and can in the future be deployed on Mars (UVB). In this system, ultraviolet light from the sun is concentrated using Fresnel lens arrays as an optical filter, with active tracking, and then allowed to fall on the two sub-systems of the proposed system. The first sub-system generates electrical energy using UV-based tandem photovoltaic cells such as GaAs/GaInP/GaInAs/GaInAsP; the second elevates the temperature of the water to lower the electric potential required to electrolyze it. An empirical analysis was performed at 30 atm, and the electrical potential was observed to be the main factor controlling the rate of production of hydrogen and oxygen, and hence the operating point (Q-point) of the proposed system. The hydrogen production rate of a commercial system in static mode (650°C, 0.6 V) is taken as a reference. A solid oxide electrolyzer cell (SOEC) is used in the proposed (UV) system for hydrogen and oxygen production. To achieve the same amount of hydrogen as the reference system, with a minimum chamber operating temperature of 850°C in static mode, the corresponding required electrical potential was calculated as 0.3 V.
In practice, however, the hydrogen production rate was observed to be lower than that of the reference system at 850°C and 0.3 V. It was shown empirically that raising the electrical potential to 0.45 V increases the production rate to the same level as the reference system. Therefore, 850°C and 0.45 V are assigned as the Q-point of the proposed system, which is actively stabilized via proportional integral derivative controllers that adjust the axial position of the lens arrays for both sub-systems. The controllers' function is to hold the chamber at 850°C (the minimum operating temperature) and 0.45 V, the Q-point, to realize the same hydrogen production rate as the reference system.
Keywords: hydrogen, oxygen, thermolysis, ultraviolet
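The proportional integral derivative stabilization around the Q-point can be sketched as a standard PID loop; the first-order plant model, gains and time step below are illustrative assumptions, not the authors' actual lens-tracking system:

```python
# Sketch of the PID loop that holds the chamber at its Q-point
# temperature (850 C in the abstract). The toy first-order plant and
# the gains are illustrative assumptions, not the authors' system.

def pid_step(error, state, kp, ki, kd, dt):
    """One PID update; `state` carries the integral and previous error."""
    integral, prev_error = state
    integral += error * dt
    derivative = (error - prev_error) / dt
    output = kp * error + ki * integral + kd * derivative
    return output, (integral, error)

SETPOINT = 850.0        # target chamber temperature, deg C
temp = 25.0             # ambient starting temperature
state = (0.0, 0.0)      # (integral, previous error)
dt = 1.0                # seconds per control step

for _ in range(2000):
    error = SETPOINT - temp
    control, state = pid_step(error, state, kp=0.8, ki=0.05, kd=0.1, dt=dt)
    # Toy plant: heating proportional to the control signal,
    # plus losses toward ambient temperature.
    temp += dt * (0.02 * control - 0.01 * (temp - 25.0))

print(round(temp, 1))   # settles near the 850 C setpoint
```

The integral term is what removes the steady-state offset against the constant heat loss, which is why a plain proportional controller would not hold the Q-point exactly.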
Procedia PDF Downloads 133
25622 Detection of Hepatitis B by the Use of Artificial Intelligence
Authors: Shizra Waris, Bilal Shoaib, Munib Ahmad
Abstract:
Background: The use of clinical decision support systems (CDSSs) may improve chronic disease management, which requires regular visits to multiple health professionals, treatment monitoring, disease control, and patient behavior modification. The objective of this survey is to determine whether these CDSSs improve the processes of chronic care, including diagnosis, treatment, and monitoring of diseases. Though artificial intelligence is not a new idea, it has been widely documented as a new technology in computer science, and numerous areas such as education, business, medicine, and industry have made use of it. Methods: The survey covers articles extracted from relevant databases, using search terms related to information technology and viral hepatitis, published between 2000 and 2016. Results: Overall, 80% of studies asserted the benefits provided by information technology (IT); 75% of studies asserted benefits in the medical domain; 25% of studies did not clearly define the benefits added by IT. The current state of CDSSs requires many improvements to support the management of liver diseases such as HCV, liver fibrosis, and cirrhosis. Conclusion: We concluded that the proposed model gives an earlier and more accurate prediction of hepatitis B and works as a promising tool for predicting hepatitis B from clinical laboratory data.
Keywords: detection, hepatitis, observation, disease
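The kind of calculation the conclusion points to, scoring hepatitis B risk from clinical laboratory data, can be sketched as a logistic model; the features, weights and thresholds below are invented purely for illustration and are in no way clinically validated or taken from the paper:

```python
import math

# Illustrative sketch of scoring hepatitis B risk from laboratory
# values with a logistic model. The features, weights and bias are
# invented for illustration only and are NOT clinically validated.

WEIGHTS = {"alt": 0.03, "ast": 0.02, "bilirubin": 0.5}  # hypothetical
BIAS = -4.0                                             # hypothetical

def risk_score(labs):
    """Probability-like score in (0, 1) from a dict of lab values."""
    z = BIAS + sum(WEIGHTS[k] * labs[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

patient_a = {"alt": 30.0, "ast": 25.0, "bilirubin": 0.8}    # typical values
patient_b = {"alt": 180.0, "ast": 150.0, "bilirubin": 3.5}  # elevated values

print(risk_score(patient_a) < 0.5)  # low score for typical labs
print(risk_score(patient_b) > 0.5)  # high score for elevated labs
```

A real CDSS would fit such weights to labeled clinical data and validate them prospectively; the sketch only shows the shape of the computation.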
Procedia PDF Downloads 156
25621 Big Data Analysis with RHadoop
Authors: Ji Eun Shin, Byung Ho Jung, Dong Hoon Lim
Abstract:
It is almost impossible to store or analyze big data, which is increasing exponentially, with traditional technologies. Hadoop is a new technology that makes this possible. The R programming language is by far the most popular statistical tool for big data analysis based on distributed processing with Hadoop. With RHadoop, which integrates the R and Hadoop environments, we implemented parallel multiple regression analysis on actual data of different sizes. Experimental results showed that our RHadoop system became much faster as the number of data nodes increased. We also compared the performance of our RHadoop with R's lm function and the biglm package based on bigmemory. The results showed that our RHadoop was faster than the other packages, owing to parallel processing that increases the number of map tasks as the size of the data grows.
Keywords: big data, Hadoop, parallel regression analysis, R, RHadoop
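The map-tasks-plus-combine idea behind such a parallel regression can be sketched outside R as well: each map task computes sufficient statistics on its chunk of data, and a reduce step combines them and solves for the coefficients. A minimal Python sketch for simple linear regression (illustrative of the pattern, not the authors' RHadoop code):

```python
from multiprocessing.dummy import Pool  # thread pool with the Pool API;
# a process Pool or Hadoop map tasks would distribute the same work
# across cores or nodes.

# Each map task computes sufficient statistics (n, sum x, sum y,
# sum xy, sum x^2) over its chunk; the reduce step adds them up and
# solves the normal equations for slope and intercept.

def map_stats(chunk):
    xs, ys = zip(*chunk)
    return (len(xs), sum(xs), sum(ys),
            sum(x * y for x, y in chunk), sum(x * x for x in xs))

def reduce_stats(parts):
    n, sx, sy, sxy, sxx = (sum(p[i] for p in parts) for i in range(5))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Synthetic data: y = 2x + 1, split into chunks like HDFS blocks.
data = [(float(x), 2.0 * x + 1.0) for x in range(1000)]
chunks = [data[i:i + 250] for i in range(0, 1000, 250)]

with Pool(4) as pool:
    parts = pool.map(map_stats, chunks)   # the "map" phase
slope, intercept = reduce_stats(parts)    # the "reduce" phase
print(slope, intercept)  # recovers 2.0 and 1.0
```

Because only the small tuples of sufficient statistics travel between workers, the scheme scales with the number of chunks, which is the same reason adding data nodes speeds up the RHadoop version.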
Procedia PDF Downloads 437
25620 A Mutually Exclusive Task Generation Method Based on Data Augmentation
Authors: Haojie Wang, Xun Li, Rui Yin
Abstract:
To solve memorization overfitting in the meta-learning MAML algorithm, a method of generating mutually exclusive tasks based on data augmentation is proposed. This method generates a mutex task by mapping one feature of the data to multiple labels, so that the generated mutex task is inconsistent with the data distribution of the initial dataset. Because generating mutex tasks for all data would produce a large amount of invalid data and, in the worst case, lead to exponential growth of computation, this paper also proposes a key-data extraction method that extracts only part of the data to generate the mutex tasks. The experiments show that this method of generating mutually exclusive tasks can effectively solve memorization overfitting in the meta-learning MAML algorithm.
Keywords: data augmentation, mutex task generation, meta-learning, text classification
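The core trick, pairing the same feature with different labels in different tasks so that no single memorized mapping solves them all, can be sketched as follows (a simplified illustration of the idea, not the paper's implementation):

```python
import random

# Sketch of mutually exclusive task generation: the same input features
# are paired with a differently permuted set of labels in each generated
# task, so one feature corresponds to multiple labels across tasks and a
# learner must adapt per task rather than memorize a single mapping.
# Simplified illustration, not the paper's implementation.

def make_mutex_tasks(features, labels, n_tasks, rng):
    tasks = []
    for _ in range(n_tasks):
        shuffled = labels[:]
        rng.shuffle(shuffled)              # new feature-to-label mapping
        tasks.append(list(zip(features, shuffled)))
    return tasks

rng = random.Random(0)                     # seeded for reproducibility
features = ["f0", "f1", "f2", "f3"]
labels = [0, 1, 2, 3]
tasks = make_mutex_tasks(features, labels, n_tasks=3, rng=rng)

for task in tasks:
    print(task)   # same features, a different label assignment per task
```

The key-data extraction step in the abstract would then restrict `features` to a selected subset rather than augmenting the whole dataset.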
Procedia PDF Downloads 93
25619 Efficient Positioning of Data Aggregation Point for Wireless Sensor Network
Authors: Sifat Rahman Ahona, Rifat Tasnim, Naima Hassan
Abstract:
Data aggregation is a helpful technique for reducing data communication overhead in a wireless sensor network, and one of its important tasks is the positioning of the aggregation points. Much work has been done on data aggregation, but efficient positioning of the aggregation points has received little attention. In this paper, the authors focus on the positioning, or placement, of the aggregation points in a wireless sensor network, and propose an algorithm to select aggregator positions for a scenario in which aggregator nodes are more powerful than sensor nodes.
Keywords: aggregation point, data communication, data aggregation, wireless sensor network
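One simple way to frame such placement is to choose, from a set of candidate positions, the aggregators that minimize the total sensor-to-nearest-aggregator distance; a greedy baseline can be sketched as below (an illustrative baseline, not the authors' algorithm):

```python
# Greedy sketch of aggregation-point placement: from a set of candidate
# positions, repeatedly pick the candidate that most reduces the total
# sensor-to-nearest-aggregator distance. An illustrative baseline only,
# not the authors' algorithm.

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def total_cost(sensors, aggregators):
    """Sum over sensors of the distance to the nearest aggregator."""
    return sum(min(dist(s, a) for a in aggregators) for s in sensors)

def place_aggregators(sensors, candidates, k):
    chosen = []
    for _ in range(k):
        best = min((c for c in candidates if c not in chosen),
                   key=lambda c: total_cost(sensors, chosen + [c]))
        chosen.append(best)
    return chosen

# Two sensor clusters; candidates include one far-off position.
sensors = [(0, 0), (1, 0), (0, 1), (9, 9), (10, 9), (9, 10)]
candidates = [(0, 0), (9, 9), (20, 20)]
chosen = place_aggregators(sensors, candidates, k=2)
print(chosen)  # picks the two cluster centres, (9, 9) and (0, 0)
```

In the more-powerful-aggregator scenario from the abstract, the candidate set would be the positions such nodes can occupy, and the cost could weight distance by each sensor's transmission energy.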
Procedia PDF Downloads 157
25618 Spatial Econometric Approaches for Count Data: An Overview and New Directions
Authors: Paula Simões, Isabel Natário
Abstract:
This paper reviews a number of theoretical aspects of implementing an explicit spatial perspective in econometrics for modelling non-continuous data in general, and count data in particular. It provides an overview of the several spatial econometric approaches available to model data collected with reference to location in space, from classical spatial econometrics to recent developments on modelling count data in a Bayesian hierarchical setting. Considerable attention is paid to the inferential framework necessary for structurally consistent spatial econometric count models incorporating spatial lag autocorrelation, to the corresponding estimation and testing procedures under different assumptions, and to the constraints and implications embedded in the various specifications in the literature. This review combines insights from the classical spatial econometrics literature as well as from hierarchical modeling and analysis of spatial data, in order to look for possible new directions in the processing of count data in a spatial hierarchical Bayesian econometric context.
Keywords: spatial data analysis, spatial econometrics, Bayesian hierarchical models, count data
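One widely used Bayesian hierarchical specification for spatially autocorrelated counts, of the kind such a review covers, is a Poisson model with a spatially structured random effect; the notation here is generic, not tied to any single paper in the review:

```latex
% Hierarchical Poisson model with a conditionally autoregressive (CAR)
% spatial random effect. E_i is an offset (expected count), x_i the
% covariates, and w_{ij} the spatial weights.
y_i \mid \lambda_i \sim \operatorname{Poisson}(E_i \lambda_i), \qquad
\log \lambda_i = \mathbf{x}_i^{\top}\boldsymbol{\beta} + \phi_i,
\qquad
\phi_i \mid \boldsymbol{\phi}_{-i} \sim
\mathcal{N}\!\left(
  \frac{\sum_{j} w_{ij}\,\phi_j}{\sum_{j} w_{ij}},\;
  \frac{\tau^2}{\sum_{j} w_{ij}}
\right).
```

Here the CAR prior on the random effects $\phi_i$ carries the spatial autocorrelation; the spatial-lag alternative discussed in the econometrics literature instead lets a term such as $\rho \sum_j w_{ij} y_j$ enter the log-link directly, which is one source of the structural consistency issues the abstract mentions.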
Procedia PDF Downloads 593
25617 Digitalising the Instruction: Between Technology Integration and Instrumental Use
Authors: H. Zouar, I. Kassous, F. Benzert
Abstract:
The relentless pace of technology development over the last two decades has pervaded much of recent educational discourse on a nation-wide scale. Echoes of the myriad advantages new technologies bring to pedagogical activity have inevitably carried from the western world to Algerian educational contexts, and Algerian practitioners have attempted to heed this digital advancement and push their instructional practices forward. However, due to persistent first-order barriers, exemplified by deficient institutional infrastructure and the unavailability of sufficient digital materials, the results of those attempts have polarised the views of Algerian academics regarding technology integration within higher education. Hence, this study aims at assessing the possibility of integrating technology in our classrooms in a way that conforms to the philosophy of hybrid education. It also attempts to reconsider teachers' understanding of technology integration in our context, and to reveal the level of teachers' awareness of the distinction between technology integration and instrumental use. Given the nature of these aims, a mixed-methods mode of investigation was adopted to collect both qualitative and quantitative data from different perspectives; the data collection tools comprise an observation as well as students' and teachers' questionnaires. The findings show that although the examined context is not without technological limitations, technology can be successfully integrated contingent on teachers' level of knowledge and agency. Technology integration in Algerian universities does not proceed as its underlying theory entails, owing to gaps in teachers' general understanding of utilizing technology in class.
Technology tends to be treated as a mere means to an end, used by teachers to deliver lessons (PowerPoint presentations) and issue announcements (Facebook posts). Teachers' ability to clearly discern between integrating technology into their practices and employing it as an instrument of instruction needs further consideration in order to establish a solid understanding of technology integration within the higher education context.
Keywords: technology integration, hybrid education, teachers' understanding, teachers' awareness, instrumental use
Procedia PDF Downloads 124
25616 The Results of the Study of Clinical Forms of Actinic Keratosis in Uzbekistan
Authors: Ayubova Nargiza Mirzabixulaevna, Kiryakov Dmitriy Andreyevich
Abstract:
Relevance: According to experts from the World Health Organization, in 80% of cases the causes of skin cancer are external factors: polluted air, radioactive substances, solar radiation, and free radicals. In dermatology, one of the most common obligate precancerous diseases is actinic keratosis. Actinic keratosis (AK) is an area of abnormal proliferation and differentiation of keratinocytes, which carries the risk of progression into invasive squamous cell carcinoma of the skin. The purpose of the study is to determine the prevalence of the various forms of actinic keratosis among the population of Uzbekistan. Materials and methods: The study is based on the observation and clinical laboratory examination of 96 patients, who were divided by gender and age. Women made up 45% and men 55%. The youngest patient was 43 years old, and the oldest was 92. The control group consisted of 40 patients. The following clinical signs were evaluated: peeling, hyperkeratosis, erythema, pigmentation, and atrophy. Results: The studies showed that among the forms of actinic keratosis, the erythematous (36%), hyperkeratotic (27%), pigmented (12%), cutaneous horn (7%), atrophic (7%), actinic cheilitis (6%), and lichenoid (5%) forms are common. Conclusion: The data obtained indicate that the main and most pronounced clinical sign of the erythematous form is erythema, and that the hyperkeratotic form is frequently encountered. With cutaneous horn, there is sharp hyperkeratosis of the epidermis.
Keywords: actinic keratosis, patient, skin cancer, obligate diseases
Procedia PDF Downloads 27
25615 A NoSQL Based Approach for Real-Time Managing of Robotics's Data
Authors: Gueidi Afef, Gharsellaoui Hamza, Ben Ahmed Samir
Abstract:
This paper deals with the continual growth of data, which has driven the emergence of new data management solutions: NoSQL databases. They have spread across several areas, including personalization, profile management, real-time big data, content management, catalogs, customer views, mobile applications, the internet of things, digital communication, and fraud detection. Nowadays, these database management systems are increasing in number. They store data very well, and with the trend of big data, storage demands new structures and methods for managing enterprise data. New intelligent machines in the e-learning sector thrive on more data, so smart machines can learn more and faster. Robotics is the use case on which our tests focus. The implementation of NoSQL for robotics wrestles all the data robots acquire into usable form, because with ordinary approaches to robotics we face severe limits in managing and finding the exact information in real time. Our proposed approach was demonstrated by experimental studies and a running example used as a use case.
Keywords: NoSQL databases, database management systems, robotics, big data
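The schema-less storage that makes NoSQL attractive for heterogeneous robot data can be sketched as a minimal in-memory document store; this is an illustration of the document model only, not the system evaluated in the paper:

```python
import time

# Minimal in-memory document store sketching why schema-less (NoSQL)
# storage suits robotics: heterogeneous sensor readings are stored as
# documents with no fixed schema and queried by field predicates.
# An illustration of the document model, not the evaluated system.

class DocumentStore:
    def __init__(self):
        self._docs = []

    def insert(self, doc):
        doc = dict(doc, _ts=time.time())   # timestamp each reading
        self._docs.append(doc)
        return doc

    def find(self, **predicates):
        """Return documents matching all field=value predicates."""
        return [d for d in self._docs
                if all(d.get(k) == v for k, v in predicates.items())]

store = DocumentStore()
# Heterogeneous readings: different sensors carry different fields.
store.insert({"robot": "r1", "sensor": "lidar", "ranges": [1.2, 1.5, 0.9]})
store.insert({"robot": "r1", "sensor": "imu", "accel": (0.0, 0.1, 9.8)})
store.insert({"robot": "r2", "sensor": "lidar", "ranges": [2.0, 2.2]})

lidar_docs = store.find(sensor="lidar")
print(len(lidar_docs))                          # both lidar readings
print(store.find(robot="r1", sensor="imu")[0]["accel"])
```

A production NoSQL engine adds persistence, indexing and distribution on top of this model, which is what makes the real-time queries the abstract targets feasible at scale.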
Procedia PDF Downloads 354
25614 Fuzzy Optimization Multi-Objective Clustering Ensemble Model for Multi-Source Data Analysis
Authors: C. B. Le, V. N. Pham
Abstract:
In modern data analysis, multi-source data appears more and more in real applications, and multi-source data clustering has emerged as an important issue in the data mining and machine learning community. Different data sources provide different information about the data; therefore, linking multiple sources is essential to improve clustering performance. In practice, however, multi-source data is often heterogeneous, uncertain, and large, which is considered a major challenge. Ensemble learning is a versatile machine learning approach in which learning techniques can work in parallel on big data, and clustering ensembles have been shown to outperform standard clustering algorithms in terms of accuracy and robustness. However, most traditional clustering ensemble approaches are based on a single-objective function and single-source data. This paper proposes a new clustering ensemble method for multi-source data analysis: the fuzzy optimized multi-objective clustering ensemble method, called FOMOCE. First, a clustering ensemble mathematical model is introduced, based on the structure of a multi-objective clustering function, multi-source data, and dark knowledge. Then, rules for extracting dark knowledge from the input data, clustering algorithms, and base clusterings are designed and applied. Finally, a clustering ensemble algorithm is proposed for multi-source data analysis. Experiments were performed on standard sample data sets, and the results demonstrate the superior performance of the FOMOCE method compared to existing clustering ensemble methods and multi-source clustering methods.
Keywords: clustering ensemble, multi-source, multi-objective, fuzzy clustering
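The consensus step of a clustering ensemble, combining base clusterings into one result, can be sketched via a co-association matrix; this is the standard evidence-accumulation illustration of the idea, not the FOMOCE algorithm itself:

```python
# Sketch of clustering-ensemble consensus via a co-association matrix:
# count how often each pair of points shares a cluster across the base
# clusterings, then merge pairs that co-occur more than a threshold.
# Standard evidence-accumulation illustration, not FOMOCE itself.

def co_association(base_clusterings, n):
    """co[i][j] = fraction of base clusterings placing i and j together."""
    m = len(base_clusterings)
    co = [[0.0] * n for _ in range(n)]
    for labels in base_clusterings:
        for i in range(n):
            for j in range(n):
                if labels[i] == labels[j]:
                    co[i][j] += 1.0 / m
    return co

def consensus(co, threshold):
    """Union-find merge of points whose co-association exceeds threshold."""
    n = len(co)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if co[i][j] > threshold:
                parent[find(i)] = find(j)
    return [find(i) for i in range(n)]

# Three base clusterings of 6 points (e.g. from different data sources);
# points 0-2 and points 3-5 consistently co-occur within their groups.
base = [
    [0, 0, 0, 1, 1, 1],
    [0, 0, 0, 1, 1, 2],
    [0, 1, 0, 2, 2, 2],
]
labels = consensus(co_association(base, 6), threshold=0.5)
print(labels)  # points 0-2 share one label, points 3-5 another
```

A multi-source, multi-objective method such as FOMOCE would additionally weight the base clusterings per source and objective, but the accumulation-then-consensus skeleton is the same.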
Procedia PDF Downloads 189