Search results for: data driven decision making
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30557

26087 Sampled-Data Control for Fuel Cell Systems

Authors: H. Y. Jung, Ju H. Park, S. M. Lee

Abstract:

A sampled-data controller is presented for solid oxide fuel cell systems expressed by a sector-bounded nonlinear model. Sector-bounded nonlinear systems consist of a feedback connection between a linear dynamical system and a nonlinearity satisfying certain sector-type constraints. The sampled-data control scheme is also very useful because it can handle digital controllers directly, and increasing research effort has been devoted to sampled-data control systems with the development of modern high-speed computers. The proposed control law is obtained by solving a convex problem subject to several linear matrix inequalities. Simulation results are given to show the effectiveness of the proposed design method.
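The sampled-data idea can be illustrated with a toy simulation. This is a sketch only: the paper's actual design solves LMIs for a solid oxide fuel cell model, which is not reproduced here. A scalar open-loop-unstable plant with a sector-bounded nonlinearity is stabilized by a zero-order-hold controller that only reads the state at sampling instants; every constant below is an illustrative assumption.

```python
import math

# Toy sketch, not the paper's LMI-based design: a scalar plant
#   x' = a*x + phi(x) + b*u,   with phi(x) = k * sin(x)**2 * x,
# so phi(x)/x lies in the sector [0, k]. The zero-order-hold feedback
# u = -K * x(t_j) is only updated every h seconds, as a digital
# controller would. All constants are illustrative assumptions.
a, b, k = 0.5, 1.0, 0.4        # open loop is unstable since a > 0
K, h, dt = 2.0, 0.1, 0.001     # gain, sampling period, integration step

def simulate(x0, T):
    x, u, t_next = x0, -K * x0, h
    t = 0.0
    while t < T:
        if t >= t_next:                  # controller samples the state
            u, t_next = -K * x, t_next + h
        phi = k * math.sin(x) ** 2 * x   # stays inside the sector [0, k]
        x += dt * (a * x + phi + b * u)  # forward-Euler integration step
        t += dt
    return x

final_state = simulate(1.0, 5.0)         # decays toward the origin
```

Despite the controller acting on stale state information between samples, the closed loop contracts because the gain dominates the sector bound, which is the intuition the LMI conditions make rigorous.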

Keywords: sampled-data control, fuel cell, linear matrix inequalities, nonlinear control

Procedia PDF Downloads 568
26086 Designing and Costing the Concept of Servicer Satellites That Can Be Used to De-Orbit Space Debris

Authors: Paras Adlakha

Abstract:

Today the major threat to our existing and future satellites is space debris; collisions of bodies such as defunct satellites with other objects in space, including new-age ASAT (anti-satellite) weaponry, are the main causes of the increasing amount of space debris every year. After analyzing the current situation, low Earth orbit was found to have a larger density of debris than any other orbit range, which is why it was selected as the target orbit for the space debris removal mission. In this paper, data on 24,000 debris objects are studied based on size, altitude, inclination, mass, and the number of existing satellites threatened by each object, from which rocket bodies emerged as the type of wreckage most suited for removal. The chosen method of active debris removal uses a robotic arm to capture the body and attach a de-orbit kit, so that the debris is moved from its orbit without the servicer body itself contacting the debris, reducing the further threat of collision with defunct material. The major factors considered while designing the concept of debris removal are tumbling, removal of debris under a low-cost mission, and decreasing the probability of collisions during the mission.

Keywords: de-orbit, debris, servicer, satellite, space junk

Procedia PDF Downloads 141
26085 Choice Analysis of Ground Access to São Paulo/Guarulhos International Airport Using Adaptive Choice-Based Conjoint Analysis (ACBC)

Authors: Carolina Silva Ansélmo

Abstract:

Airports are demand-generating poles that affect the flow of traffic around them. The airport access system must be fast, convenient, and adequately planned, considering its potential users. An airport with good ground access conditions can provide the user with a more satisfactory access experience. When several transport options are available, service providers must understand users' preferences and the expected quality of service. The present study focuses on airport access in a comparative scenario between bus, private vehicle, subway, taxi and urban mobility transport applications to São Paulo/Guarulhos International Airport. The objectives are (i) to identify the factors that influence the choice, (ii) to measure Willingness to Pay (WTP), and (iii) to estimate the market share of each mode. The applied method was the Adaptive Choice-Based Conjoint Analysis (ACBC) technique, using Sawtooth Software. Conjoint analysis, rooted in Utility Theory, is a survey technique that quantifies the customer's perceived utility when choosing among alternatives. Assessing user preferences provides insights into their priorities for product or service attributes. An additional advantage of conjoint analysis is its requirement for a smaller sample size compared to other methods. Furthermore, ACBC provides valuable insights into consumers' preferences, willingness to pay, and market dynamics, aiding strategic decision-making on customer experience, pricing, and market segmentation. In the present research, the ACBC questionnaire had the following variables: (i) access time to the boarding point, (ii) comfort in the vehicle, (iii) number of travelers together, (iv) price, (v) supply power, and (vi) type of vehicle. The case study questionnaire reached 213 valid responses for the scenario of access from the São Paulo city center to São Paulo/Guarulhos International Airport.
As a result, price and the number of travelers together are the most relevant attributes for the sample when choosing airport access. The largest estimated market share belongs to urban mobility transport applications, followed by buses, private vehicles, taxis, and subways.
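The market-share simulation step of a conjoint study can be sketched as a multinomial-logit "share of preference" computation. The part-worth utilities below are invented for illustration (they are not the study's Sawtooth estimates, which the abstract does not report); they are chosen only so the resulting ranking mirrors the one reported above.

```python
import math

# Hypothetical total utilities per access mode, e.g. the sum of estimated
# part-worths for price, access time, comfort, etc. Values are assumptions.
utilities = {
    "ride-hailing app": 1.2,
    "bus": 0.8,
    "private vehicle": 0.5,
    "taxi": 0.2,
    "subway": 0.0,
}

def logit_shares(utilities):
    """Multinomial-logit share of preference: exp(U_i) / sum_j exp(U_j)."""
    expu = {mode: math.exp(u) for mode, u in utilities.items()}
    total = sum(expu.values())
    return {mode: e / total for mode, e in expu.items()}

shares = logit_shares(utilities)   # shares sum to 1 across the five modes
```

A respondent-level version would average these shares over individually estimated utilities rather than using one aggregate utility per mode.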

Keywords: adaptive choice-based conjoint analysis, ground access to airport, market share, willingness to pay

Procedia PDF Downloads 81
26084 Teacher Collaboration Impact on Bilingual Students’ Oral Communication Skills in Inclusive Contexts

Authors: Diana González, Marta Gràcia, Ana Luisa Adam-Alcocer

Abstract:

Incorporating digital tools into educational practices represents a valuable approach for enriching the quality of teachers' educational practices in oral competence and fostering improvements in student learning outcomes. This study aims to promote a collaborative and culturally sensitive approach to professional development between teachers and a speech therapist to enhance their self-awareness and reflection on high-quality educational practices that integrate school components to strengthen children's oral communication and pragmatic skills. The study involved five bilingual teachers fluent in both English and Spanish, with three specializing in special education and two in general education. It focused on Spanish-English bilingual students, aged 3-6, who were experiencing speech delays or disorders in a New York City public school, with the collaboration of a speech therapist. Using EVALOE-DSS (Assessment Scale of Oral Language Teaching in the School Context - Decision Support System), teachers conducted self-assessments of their teaching practices, reflected, and made decisions across six classes from March to June, focusing on students' communicative competence across various activities. Concurrently, the speech therapist observed and evaluated six classes per teacher using EVALOE-DSS during the same period. Additionally, professional development meetings were held monthly between the speech therapist and teachers, centering on discussing classroom interactions, instructional strategies, and the progress of both teachers and students in their classes. Findings highlight the digital tool EVALOE-DSS's value in analyzing communication patterns and trends among bilingual children in inclusive settings. It helps in identifying improvement areas through teacher and speech therapist collaboration. After self-reflection meetings, teachers demonstrated increased awareness of student needs in oral language and pragmatic skills.
They also exhibited enhanced use of strategies outlined in EVALOE-DSS, such as actively guiding and orienting students during oral language activities, promoting student-initiated communicative interactions, teaching students how to seek and provide information, and managing turn-taking to ensure inclusive participation. Teachers participating in the professional development program showed positive progress in assessing their classes across all dimensions of the training tool, including instructional design, teacher conversation management, pupil conversation management, communicative functions, teacher strategies, and pupil communication functions. This includes aspects related to both teacher actions and child actions, particularly in child language development. This progress underscores the effectiveness of individual reflection (conducted weekly or biweekly using EVALOE-DSS) as well as collaborative reflection among teachers and the speech therapist during meetings. The EVALOE-DSS has proven effective in supporting teachers' self-reflection, decision-making, and classroom changes, leading to improved development of students' oral language and pragmatic skills. It has facilitated culturally sensitive evaluations of communication among bilingual children, cultivating collaboration between teachers and the speech therapist to identify areas of growth. Participants in the professional development program demonstrated substantial progress across all dimensions assessed by EVALOE-DSS, including improved management of pupil communication functions, implementation of effective teaching strategies, and better classroom dynamics. Regular reflection sessions using EVALOE-DSS supported continuous improvement in instructional practices, highlighting its role in fostering reflective teaching and enriching student learning experiences.
Overall, EVALOE-DSS has proven invaluable for enhancing teaching effectiveness and promoting meaningful student interactions in diverse educational settings.

Keywords: bilingual students, collaboration, culturally sensitive, oral communication skills, self-reflection

Procedia PDF Downloads 41
26083 How Western Donors Allocate Official Development Assistance: New Evidence From a Natural Language Processing Approach

Authors: Daniel Benson, Yundan Gong, Hannah Kirk

Abstract:

Advancements in natural language processing techniques have increased data processing speeds and reduced the need for the cumbersome, manual data processing that is often required when preparing data from multilateral organizations for specific purposes. Using named entity recognition (NER) modeling and the Organisation for Economic Co-operation and Development (OECD) Creditor Reporting System database, we present the first geotagged dataset of OECD donor Official Development Assistance (ODA) projects on a global, subnational basis. The resulting data contain 52,086 ODA projects geocoded to subnational locations across 115 countries, worth a combined $87.9bn. This represents the first global OECD donor ODA project database with geocoded projects. We use these new data to revisit old questions of how 'well' donors allocate ODA to the developing world. This understanding is imperative for policymakers seeking to improve ODA effectiveness.
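The geotagging step can be caricatured as follows. This is a sketch under strong assumptions: the study uses a trained NER model over CRS project descriptions, whereas here a two-entry toy gazetteer stands in for both the NER model and the geographic lookup, and every record, place, and dollar amount is invented.

```python
# Toy stand-in for the NER + geocoding pipeline. A gazetteer lookup plays
# the role of the trained NER model; all data below are invented.
GAZETTEER = {
    "mombasa": ("Kenya", -4.05, 39.67),
    "kisumu": ("Kenya", -0.09, 34.77),
}

projects = [
    {"desc": "Road rehabilitation near Mombasa port", "usd": 5.0},
    {"desc": "Water supply works in Kisumu county", "usd": 2.5},
    {"desc": "National education programme", "usd": 10.0},  # no subnational hit
]

def geotag(projects):
    """Attach a subnational location to each project whose description
    mentions a gazetteer place; projects without a hit are dropped."""
    tagged = []
    for p in projects:
        for token in p["desc"].lower().replace(",", " ").split():
            if token in GAZETTEER:
                country, lat, lon = GAZETTEER[token]
                tagged.append({**p, "place": token, "country": country,
                               "lat": lat, "lon": lon})
                break
    return tagged

tagged = geotag(projects)   # two of three toy projects geocode subnationally
```

The third record illustrates why the geocoded total ($87.9bn) is a subset of all ODA: nationally scoped projects yield no subnational location.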

Keywords: international aid, geocoding, subnational data, natural language processing, machine learning

Procedia PDF Downloads 85
26082 Buy-and-Hold versus Alternative Strategies: A Comparison of Market-Timing Techniques

Authors: Jonathan J. Burson

Abstract:

With the rise of virtually costless, mobile-based trading platforms, stock market trading activity has increased significantly over the past decade, particularly for the millennial generation. This increased stock market attention, combined with the recent market turmoil caused by the economic upset of COVID-19, makes the topics of market-timing and forecasting particularly relevant. While the overall stock market saw an unprecedented, historically long bull market from March 2009 to February 2020, the end of that bull market reignited a search by investors for ways to reduce risk and increase return. Similar searches for outperformance occurred in the early and late 2000s as the Dotcom bubble burst and the Great Recession led to years of negative returns for mean-variance index investors. Extensive research has been conducted on fundamental analysis, technical analysis, macroeconomic indicators, microeconomic indicators, and other techniques, all using different methodologies and investment periods, in pursuit of higher returns with lower risk. The enormous variety of timeframes, data, and methodologies used by these diverse forecasting methods makes it difficult to compare the outcome of each method directly to the others. This paper establishes a process to evaluate market-timing methods in an apples-to-apples manner based on simplicity, performance, and feasibility. Preliminary findings show that certain technical analysis models provide a higher return with lower risk when compared to the buy-and-hold method and to other market-timing strategies. Furthermore, technical analysis models tend to be easier for individual investors, both in terms of acquiring the data and analyzing it, making technical analysis-based market-timing methods the preferred choice for retail investors.
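The apples-to-apples comparison can be sketched with one of the simplest technical-analysis rules, a trailing moving-average filter, against buy-and-hold on a synthetic price path. All prices are invented and chosen so the crash is visible; no claim is made about real-market performance or about which rules the paper actually tests.

```python
# Sketch only: a trailing moving-average rule versus buy-and-hold on an
# invented price path with an uptrend, a crash, and a recovery.
prices = [100, 102, 104, 106, 108, 110, 100, 90, 80, 70, 60, 62, 64, 66, 68, 70]

def buy_and_hold_equity(prices):
    """Final equity per unit invested at the first price."""
    return prices[-1] / prices[0]

def ma_crossover_equity(prices, window):
    """Hold the asset on day t only if the previous close sits above the
    trailing moving average of the `window` closes before day t; stay in
    cash otherwise (transaction costs ignored)."""
    equity = 1.0
    for t in range(window, len(prices)):
        ma = sum(prices[t - window:t]) / window
        if prices[t - 1] > ma:
            equity *= prices[t] / prices[t - 1]
    return equity

bh = buy_and_hold_equity(prices)       # 0.70 on this toy path
ma = ma_crossover_equity(prices, 3)    # sidesteps most of the crash
```

On this contrived path the filter exits during the drawdown and re-enters in the recovery, which is exactly the effect a fair comparison framework must weigh against the rule's whipsaw losses in sideways markets.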

Keywords: buy-and-hold, forecast, market-timing, probit, technical analysis

Procedia PDF Downloads 100
26081 European Commission Radioactivity Environmental Monitoring Database REMdb: A Law (Art. 36 Euratom Treaty) Transformed in Environmental Science Opportunities

Authors: M. Marín-Ferrer, M. A. Hernández, T. Tollefsen, S. Vanzo, E. Nweke, P. V. Tognoli, M. De Cort

Abstract:

Under the terms of Article 36 of the Euratom Treaty, European Union Member States (MSs) shall periodically communicate to the European Commission (EC) information on environmental radioactivity levels. Compilations of the information received have been published by the EC as a series of reports beginning in the early 1960s. The environmental radioactivity results received from the MSs have been introduced into the Radioactivity Environmental Monitoring database (REMdb) of the Institute for Transuranium Elements of the EC Joint Research Centre (JRC), sited in Ispra (Italy), as part of its Directorate General for Energy (DG ENER) support programme. The REMdb offers the scientific community dealing with environmental radioactivity countless research opportunities to exploit the nearly 200 million records received from MSs containing information on radioactivity levels in milk, water, air and mixed diet. The REM action was created shortly after the Chernobyl accident to support the EC in its responsibility of providing qualified information to the European Parliament and the MSs on the levels of radioactive contamination of the various compartments of the environment (air, water, soil). Hence, the main line of REM's activities concerns the improvement of procedures for the collection of environmental radioactivity concentrations for routine and emergency conditions, as well as making this information available to the general public. In this way, REM ensures the availability of tools for inter-communication and for access by users from the Member States and the other European countries to this information. Specific attention is given to further integrating the new MSs with the existing information exchange systems and to assisting Candidate Countries in fulfilling these obligations in view of their membership of the EU.
Article 36 of the Euratom Treaty requires the competent authorities of each MS to regularly provide the environmental radioactivity monitoring data resulting from their Article 35 obligations to the EC, in order to keep the EC informed on the levels of radioactivity in the environment (air, water, milk and mixed diet) which could affect the population. The REMdb has two main objectives: to keep a historical record of radiological accidents for further scientific study, and to collect the environmental radioactivity data gathered through the national environmental monitoring programs of the MSs to prepare the comprehensive annual monitoring reports (MR). The JRC continues its activity of collecting, assembling, analyzing and providing this information to the public and the MSs even during emergency situations. In addition, there is growing concern among the general public about radioactivity levels in the terrestrial and marine environment, as well as about the potential risk of future nuclear accidents. In this context, clear and transparent communication with the public is needed. EURDEP (European Radiological Data Exchange Platform) is both a standard format for radiological data and a network for the exchange of automatic monitoring data. The latest release of the format is version 2.0, which has been in use since the beginning of 2002.

Keywords: environmental radioactivity, Euratom, monitoring report, REMdb

Procedia PDF Downloads 452
26080 Compressed Suffix Arrays to Self-Indexes Based on Partitioned Elias-Fano

Authors: Guo Wenyu, Qu Youli

Abstract:

A practical and simple self-indexing data structure, the Partitioned Elias-Fano Compressed Suffix Array (PEF-CSA), is built in linear time for the CSA based on PEF indexes. The PEF-CSA is then compared with two classical compressed indexing methods, the Ferragina and Manzini implementation (FMI) and Sad-CSA, on files of different types and sizes from the Pizza & Chili corpus. The PEF-CSA performs better on the tested data in terms of compression ratio and count and locate times, except on evenly distributed data such as protein data. The experiments show that the distribution of φ matters more for the compression ratio than the alphabet size: unevenly distributed φ compresses better, and the larger the hit count, the longer the count and locate times.
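The core of PEF is plain Elias-Fano encoding of a monotone integer sequence, which the partitioned variant applies chunk-wise with per-chunk parameters. A minimal single-partition sketch (bit vectors kept as Python lists, with none of the rank/select machinery a real index needs):

```python
import math

# Single-partition Elias-Fano sketch: encode a sorted list of n integers
# drawn from [0, u) with l = floor(log2(u/n)) low bits stored verbatim and
# the high parts unary-coded as a gap bit vector.
def ef_encode(xs, u):
    n = len(xs)
    l = max(0, int(math.floor(math.log2(u / n))))
    lows = [x & ((1 << l) - 1) for x in xs]
    bits, prev = [], 0
    for x in xs:
        h = x >> l
        bits.extend([0] * (h - prev))  # unary gap between high parts
        bits.append(1)                 # a 1 marks one encoded element
        prev = h
    return l, lows, bits

def ef_decode(l, lows, bits):
    xs, h, i = [], 0, 0
    for b in bits:
        if b == 0:
            h += 1                     # advance the current high part
        else:
            xs.append((h << l) | lows[i])
            i += 1
    return xs
```

The high bit vector has at most 2n bits, which is what gives Elias-Fano its near-optimal space bound of roughly 2 + log2(u/n) bits per element.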

Keywords: compressed suffix array, self-indexing, partitioned Elias-Fano, PEF-CSA

Procedia PDF Downloads 254
26079 Multi-Level Priority Based Task Scheduling Algorithm for Workflows in Cloud Environment

Authors: Anju Bala, Inderveer Chana

Abstract:

Task scheduling is the key concern for the execution of performance-driven workflow applications. As efficient scheduling can have a major impact on system performance, task scheduling is often used to assign requests to resources efficiently based on cloud resource characteristics. In this paper, a priority-based task scheduling algorithm is proposed that prioritizes tasks based on the length of their instructions. The proposed scheduling approach prioritizes the tasks of cloud applications according to limits set by six sigma control charts based on dynamic threshold values. Further, the proposed algorithm has been validated through the CloudSim toolkit. The experimental results demonstrate that the proposed algorithm is effective for handling multiple task lists from workflows and considerably reduces makespan and execution time.
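The control-chart thresholding idea can be sketched as follows. The band semantics are an illustrative assumption (the paper's exact six sigma limits and CloudSim experiments are not reproduced): tasks whose instruction lengths fall outside mu ± sigma of the current batch are pushed to the extreme priority bands, with the thresholds recomputed dynamically for each batch.

```python
import statistics

# Illustrative sketch: control-chart style priority bands over the
# instruction lengths of a batch of tasks. Band semantics are assumptions.
def priority_bands(task_lengths):
    mu = statistics.mean(task_lengths)
    sigma = statistics.pstdev(task_lengths)
    bands = []
    for n in task_lengths:
        if n > mu + sigma:
            bands.append("high")    # long tasks get dispatched first
        elif n < mu - sigma:
            bands.append("low")
        else:
            bands.append("medium")
    return bands

bands = priority_bands([20, 21, 22, 23, 24, 25, 5, 40])
```

Because mu and sigma are recomputed per batch, the thresholds adapt to the workload, which is the "dynamic threshold" behavior the abstract describes.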

Keywords: cloud computing, priority based scheduling, task scheduling, VM allocation

Procedia PDF Downloads 521
26078 Data, Digital Identity and Antitrust Law: An Exploratory Study of Facebook’s Novi Digital Wallet

Authors: Wanjiku Karanja

Abstract:

Facebook has monopoly power in the social networking market. It has grown and entrenched its monopoly power through the capture of its users’ data value chains. However, antitrust law’s consumer welfare roots have prevented it from effectively addressing the role of data capture in Facebook’s market dominance. These regulatory blind spots are augmented in Facebook’s proposed Diem cryptocurrency project and its Novi Digital wallet. Novi, which is Diem’s digital identity component, shall enable Facebook to collect an unprecedented volume of consumer data. Consequently, Novi has seismic implications for internet identity, as the network effects of Facebook’s large user base could establish it as the de facto internet identity layer. Moreover, the large tracts of data Facebook shall collect through Novi shall further entrench Facebook's market power. As such, the attendant lock-in effects of this project shall be very difficult to reverse. Urgent regulatory action is therefore required to prevent this expansion of Facebook’s data resources and monopoly power. This research thus highlights the importance of data capture to competition and market health in the social networking industry. It utilizes interviews with key experts to empirically interrogate the impact of Facebook’s data capture and control of its users’ data value chains on its market power. This inquiry is contextualized against Novi’s expansive effect on Facebook’s data value chains. It thus addresses the novel antitrust issues arising at the nexus of Facebook’s monopoly power and the privacy of its users’ data. It also explores the impact of platform design principles, specifically data interoperability and data portability, in mitigating Facebook’s anti-competitive practices. As such, this study finds that Facebook is a powerful monopoly that dominates the social media industry to the detriment of potential competitors.
Facebook derives its power from its size, annexure of the consumer data value chain, and control of its users’ social graphs. Additionally, the platform design principles of data interoperability and data portability are not a panacea to restoring competition in the social networking market. Their success depends on the establishment of robust technical standards and regulatory frameworks.

Keywords: antitrust law, data protection law, data portability, data interoperability, digital identity, Facebook

Procedia PDF Downloads 125
26077 What 4th-Year Primary-School Students are Thinking: A Paper Airplane Problem

Authors: Neslihan Şahin Çelik, Ali Eraslan

Abstract:

In recent years, mathematics educators have frequently stressed the necessity of instructing students about models and modeling approaches that encompass cognitive and metacognitive thought processes, starting from the first years of school and continuing on through the years of higher education. The purpose of this study is to examine the thought processes of 4th-grade primary school students in their modeling activities and to explore the difficulties encountered in these processes, if any. The study, of qualitative design, was conducted in the 2015-2016 academic year at a public state school located in a central city in the Black Sea Region of Turkey. A preliminary study was first implemented with designated 4th grade students, after which the criterion sampling method was used to select the three students who formed the focus group. The focus group was asked to work on the model-eliciting activity of the Paper Airplane Problem, and the entire process was recorded on video. The Paper Airplane Problem required the students to determine the winner with respect to: (a) the plane that stays in the air for the longest time; (b) the plane that travels the greatest distance in a straight-line path; and (c) the overall winner for the contest. A written transcript was made of the video recording, after which the recording and the students' worksheets were analyzed using the Blum and Ferri modeling cycle. The results of the study revealed that the students tested the hypotheses they had formed from daily life, generated ideas of their own, verified their models by making connections with real life, and tried to make their models generalizable. On the other hand, the students had some difficulties in interpreting the table of data and in operating on the data during the modeling processes.

Keywords: primary school students, model eliciting activity, mathematical modeling, modeling process, paper airplane problem

Procedia PDF Downloads 362
26076 Effects of Health Information Websites on Health Care Facility Visits

Authors: M. Aljumaan, F. Alkhadra, A. Aldajani, M. Alarfaj, A. Alawami, Y. Aljamaan

Abstract:

Introduction: The internet is widely available, with 18 million users in Saudi Arabia alone. It has been shown that 58% of Saudis use the internet as a source of health-related information, which may contribute to overcrowding of the Emergency Room (ER). Few studies have examined the effect of online searching for health-related information (HRI) and its role in influencing internet users to visit various health care facilities. The main objective is therefore to determine a correlation between HRI website use and health care facility visits in Saudi Arabia. Methodology: A cross-sectional study was conducted by distributing a questionnaire, with a total of 1,095 people included in the study. Demographic data were collected, along with questions on the use of HRI websites, the types of websites used, the reason behind the internet search, which health care facility it led them to visit, and whether seeking health information on the internet influenced their attitude toward visiting health care facilities. The survey was distributed using an internet survey application. The data were then entered into an Excel sheet and analyzed with the help of a biostatistician to test for a correlation. Results: We found that 91.4% of our population had used the internet for medical information, mainly through general medical websites (77.8%), forums (34.2%), social media (21.6%), and government websites (21.6%). We also found that 66.9% used the internet for medical information to diagnose and treat their medical conditions on their own, while 34.7% did so due to the inability to obtain a close referral and 29.5% due to lack of time. Searching for health-related information online led 62.5% of respondents to visit health care facilities. Outpatient clinics were most visited at 77.9%, followed by the ER (27.9%). The remaining 37.5% did not visit because HRI websites reassured them about their condition.
Conclusion: There may be a correlation between health-information website use and health care facility visits. However, to counter potentially inaccurate medical information, we believe doctors have an important role in educating their patients and the public on where to obtain correct information and in advertising the sites that are regulated by health care officials.

Keywords: ER visits, health related information, internet, medical websites

Procedia PDF Downloads 194
26075 Recommendations for Data Quality Filtering of Opportunistic Species Occurrence Data

Authors: Camille Van Eupen, Dirk Maes, Marc Herremans, Kristijn R. R. Swinnen, Ben Somers, Stijn Luca

Abstract:

In ecology, species distribution models are commonly implemented to study species-environment relationships. These models increasingly rely on opportunistic citizen science data when high-quality species records collected through standardized recording protocols are unavailable. While these opportunistic data are abundant, uncertainty is usually high, e.g., due to observer effects or a lack of metadata. Data quality filtering is often used to reduce these types of uncertainty in an attempt to increase the value of studies relying on opportunistic data. However, filtering should not be performed blindly. In this study, recommendations are built for data quality filtering of opportunistic species occurrence data that are used as input for species distribution models. Using an extensive database of 5.7 million citizen science records from 255 species in Flanders, the impact on model performance was quantified by applying three data quality filters, and these results were linked to species traits. More specifically, presence records were filtered based on record attributes that provide information on the observation process or post-entry data validation, and changes in the area under the receiver operating characteristic (AUC), sensitivity, and specificity were analyzed using the Maxent algorithm with and without filtering. Controlling for sample size enabled us to study the combined impact of data quality filtering, i.e., the simultaneous impact of an increase in data quality and a decrease in sample size. Further, the variation among species in their response to data quality filtering was explored by clustering species based on four traits often related to data quality: commonness, popularity, difficulty, and body size. 
Findings show that model performance is affected by i) the quality of the filtered data, ii) the proportional reduction in sample size caused by filtering and the remaining absolute sample size, and iii) a species’ ‘quality profile’, resulting from a species classification based on the four traits related to data quality. The findings resulted in recommendations on when and how to filter volunteer-generated, opportunistically collected data. This study confirms that correctly processed citizen science data can make a valuable contribution to ecological research and species conservation.
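The headline metric here, AUC, has a simple rank interpretation that can be computed directly (a generic Mann-Whitney formulation, not the study's Maxent pipeline): it is the probability that a randomly chosen presence record scores above a randomly chosen absence record, with ties counting half. Filtering helps when the AUC of models fit to filtered data exceeds the unfiltered baseline.

```python
# Generic Mann-Whitney AUC over model scores: probability a random
# presence outranks a random absence, ties counting one half.
def auc(labels, scores):
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

Sensitivity and specificity, the study's other two metrics, additionally require choosing a score threshold, which is why AUC is often reported as the threshold-free summary.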

Keywords: citizen science, data quality filtering, species distribution models, trait profiles

Procedia PDF Downloads 209
26074 Nigeria's Distressed Economy and Achievement of Child-Friendly School Model

Authors: Onyeke Paul Chuks

Abstract:

Nigeria is ranked among the developing nations and is a country with a low income per capita. The consequences of this economic situation have led to achievement records below the UN benchmark, especially in the area of basic education for her citizens. The country is, however, making relentless efforts to arrest the situation by making budgetary allocations to ensure the realization of Millennium Development Goal No. 2, achieving universal basic education, her distressed economy notwithstanding. Basic education, which comprises primary and lower secondary education as well as pre-primary and/or adult literacy programs, has suffered serious setbacks orchestrated by the dwindling of the nation’s economy. This category of education, being the bedrock of all other levels of education, is regarded as a priority by developing countries and is also the focus of the Education for All movement led by UNESCO. The introduction of the child-friendly school model is one of the strategies designed to achieve this all-important Goal No. 2. The child-friendly education model is aimed at replacing the outdated, mundane, regimented and officious school administrative model, in which the basic rights of school children are trampled upon with impunity and community participation in school activities is viewed as unnecessary interference by school managers. This paper examines the potential obstacles likely to impinge on the implementation of the child-friendly school model in Nigeria, especially from the angle of her distressed economy and the colossal effects of the corrupt practices bedeviling the nation. The paper also outlines prospects for the successful implementation of the child-friendly school model in Nigeria.

Keywords: child-friendly school, distressed economy, model, Nigeria

Procedia PDF Downloads 284
26073 Data Quality Enhancement with String Length Distribution

Authors: Qi Xiu, Hiromu Hota, Yohsuke Ishii, Takuya Oda

Abstract:

Recently, the volume of collectable manufacturing data has been increasing rapidly. At the same time, mass recalls are becoming a serious social problem. Under such circumstances, there is a growing need to prevent mass recalls through defect analysis, such as root cause analysis and anomaly detection, utilizing manufacturing data. However, the time needed to classify strings in manufacturing data by traditional methods is too long to meet the requirements of quick defect analysis. Therefore, we present the String Length Distribution Classification (SLDC) method to classify strings correctly in a short time. This method learns character features, especially string length distributions, from Product IDs and Machine IDs in BOMs and asset lists. By applying the proposal to strings in actual manufacturing data, we verified that the classification time can be reduced by 80%. As a result, it can be estimated that the requirement of quick defect analysis can be fulfilled.
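While the abstract does not spell out SLDC's learning procedure, the underlying idea, scoring a field type by how well a string's length fits each learned length distribution, can be sketched like this. The field names and training strings are invented, and SLDC's actual feature set is richer than lengths alone.

```python
from collections import Counter

# Sketch of length-distribution classification with invented training data.
def learn_length_profiles(training):
    """training: {field: [example strings]} -> {field: {length: probability}}."""
    profiles = {}
    for field, strings in training.items():
        counts = Counter(len(s) for s in strings)
        total = sum(counts.values())
        profiles[field] = {ln: c / total for ln, c in counts.items()}
    return profiles

def classify(s, profiles, floor=1e-6):
    """Pick the field whose learned length distribution best explains s;
    the floor probability handles lengths never seen in training."""
    return max(profiles, key=lambda f: profiles[f].get(len(s), floor))

training = {
    "product_id": ["AB-1234", "CD-5678", "EF-9012"],  # all length 7
    "machine_id": ["M001", "M002", "M017"],           # all length 4
}
profiles = learn_length_profiles(training)
```

Because the profiles are learned once and lookup is a dictionary access per string, classification is essentially constant time per record, consistent with the large speedup the paper reports over traditional string matching.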

Keywords: string classification, data quality, feature selection, probability distribution, string length

Procedia PDF Downloads 321
26072 Fuzzy Data, Random Drift, and a Theoretical Model for the Sequential Emergence of Religious Capacity in Genus Homo

Authors: Margaret Boone Rappaport, Christopher J. Corbally

Abstract:

The ancient ape ancestral population from which living great ape and human species evolved had demographic features affecting their evolution. The population was large, had great genetic variability, and natural selection was effective at honing adaptations. The emerging populations of chimpanzees and humans were affected more by founder effects and genetic drift because they were smaller. Natural selection did not disappear, but it was not as strong. Consequences of the 'population crash' and the human effective population size are introduced briefly. The history of the ancient apes is written in the genomes of living humans and great apes. The expansion of the brain began before the human line emerged. Coalescence times for some genes are very old – up to several million years, long before Homo sapiens. The mismatch between gene trees and species trees highlights the anthropoid speciation processes, and gives the human genome history a fuzzy, probabilistic quality. However, it suggests traits that might form a foundation for capacities emerging later. A theoretical model is presented in which the genomes of early ape populations provide the substructure for the emergence of religious capacity later on the human line. The model does not search for religion, but its foundations. It suggests a course by which an evolutionary line that began with prosimians eventually produced a human species with biologically based religious capacity. The model of the sequential emergence of religious capacity relies on cognitive science, neuroscience, paleoneurology, primate field studies, cognitive archaeology, genomics, and population genetics. And, it emphasizes five trait types: (1) Documented, positive selection of sensory capabilities on the human line may have favored survival, but also eventually enriched human religious experience. 
(2) The bonobo model suggests a possible down-regulation of aggression and increase in tolerance while feeding, as well as paedomorphism – but, in a human species that remains cognitively sharp (unlike the bonobo). The two species emerged from the same ancient ape population, so it is logical to search for shared traits. (3) An up-regulation of emotional sensitivity and compassion seems to have occurred on the human line. This finds support in modern genetic studies. (4) The authors’ published model of morality's emergence in Homo erectus encompasses a cognitively based, decision-making capacity that was hypothetically overtaken, in part, by religious capacity. Together, they produced a strong, variable, biocultural capability to support human sociability. (5) The full flowering of human religious capacity came with the parietal expansion and smaller face (klinorhynchy) found only in Homo sapiens. Details from paleoneurology suggest the stage was set for human theologies. Larger parietal lobes allowed humans to imagine inner spaces, processes, and beings, and, with the frontal lobe, led to the first theologies composed of structured and integrated theories of the relationships between humans and the supernatural. The model leads to the evolution of a small population of African hominins that was ready to emerge with religious capacity when the species Homo sapiens evolved two hundred thousand years ago. By 50-60,000 years ago, when human ancestors left Africa, they were fully enabled.

Keywords: genetic drift, genomics, parietal expansion, religious capacity

Procedia PDF Downloads 344
26071 Temporally Coherent 3D Animation Reconstruction from RGB-D Video Data

Authors: Salam Khalifa, Naveed Ahmed

Abstract:

We present a new method to reconstruct a temporally coherent 3D animation from single- or multi-view RGB-D video data using unbiased feature point sampling. Given RGB-D video data, in the form of a 3D point cloud sequence, our method first extracts feature points using both color and depth information. In the subsequent steps, these feature points are used to match two 3D point clouds in consecutive frames, independent of their resolution. Our new motion-vector-based dynamic alignment method then fully reconstructs a spatio-temporally coherent 3D animation. We perform extensive quantitative validation using novel error functions to analyze the results. We show that, despite the limiting factors of temporal and spatial noise associated with RGB-D data, it is possible to extract temporal coherence and faithfully reconstruct a temporally coherent 3D animation from RGB-D video data.
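The resolution-independent matching step described above can be sketched as follows: each feature point in frame t is paired with its nearest neighbor in frame t+1, and the differences become per-point motion vectors. This is a minimal numpy illustration of the idea, not the authors' implementation; all names and the toy data are ours.

```python
import numpy as np

def match_features(points_a, points_b):
    """Match each feature point in frame A to its nearest neighbor in frame B.

    points_a, points_b: (N, 3) and (M, 3) arrays of 3D feature points.
    Returns indices into points_b and the resulting motion vectors,
    independent of the two clouds' resolutions (N need not equal M).
    """
    # Pairwise squared distances, shape (N, M)
    diff = points_a[:, None, :] - points_b[None, :, :]
    d2 = np.sum(diff ** 2, axis=2)
    idx = np.argmin(d2, axis=1)          # nearest neighbor in frame B
    motion = points_b[idx] - points_a    # per-point motion vectors
    return idx, motion

# Toy example: frame B is frame A translated by (0.1, 0, 0)
frame_a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
frame_b = frame_a + np.array([0.1, 0.0, 0.0])
idx, motion = match_features(frame_a, frame_b)
```

In a full pipeline the recovered motion vectors would then drive the dynamic alignment across the whole sequence.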

Keywords: 3D video, 3D animation, RGB-D video, temporally coherent 3D animation

Procedia PDF Downloads 375
26070 Determining Abnormal Behaviors in UAV Robots for Trajectory Control in Teleoperation

Authors: Kiwon Yeom

Abstract:

Change points are abrupt variations in a data sequence. Detection of change points is useful in modeling, analyzing, and predicting time series in application areas such as robotics and teleoperation. In this paper, a change point is defined to be a discontinuity in one of the derivatives of the data sequence. This paper presents a reliable method for detecting discontinuities within three-dimensional trajectory data. The problem of determining one or more discontinuities is considered for regular and irregular trajectory data from teleoperation. We examine the geometric detection algorithm and illustrate the use of the method on real data examples.
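Since the abstract defines a change point as a discontinuity in a derivative, the basic idea can be sketched with finite differences: flag samples where the numerical first derivative of the trajectory jumps by more than a threshold. This is an illustrative sketch, not the paper's geometric algorithm.

```python
import numpy as np

def change_points(traj, dt=1.0, threshold=1.0):
    """Flag indices where the first derivative of a 3D trajectory jumps.

    traj: (N, 3) array of positions sampled every dt seconds.
    A change point is a sample where the finite-difference velocity
    changes by more than `threshold` between consecutive intervals.
    """
    vel = np.diff(traj, axis=0) / dt                      # (N-1, 3) velocities
    jump = np.linalg.norm(np.diff(vel, axis=0), axis=1)   # (N-2,) velocity jumps
    return np.where(jump > threshold)[0] + 1              # index of the kink sample

# Piecewise-linear toy path: moves along +x, then abruptly along +y after t=3
t = np.arange(6.0)
traj = np.stack([np.minimum(t, 3.0),
                 np.maximum(t - 3.0, 0.0),
                 np.zeros_like(t)], axis=1)
cp = change_points(traj, threshold=0.5)
```

For irregularly sampled teleoperation data, dt would be replaced by per-interval time steps.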

Keywords: change point, discontinuity, teleoperation, abrupt variation

Procedia PDF Downloads 170
26069 Multidimensional Item Response Theory Models for Practical Application in Large Tests Designed to Measure Multiple Constructs

Authors: Maria Fernanda Ordoñez Martinez, Alvaro Mauricio Montenegro

Abstract:

This work presents a statistical methodology for measuring and identifying constructs in latent semantic analysis. The approach combines the qualities of factor analysis for binary data with the interpretations available in item response theory. More precisely, we propose first reducing dimensionality by applying principal component analysis to the linguistic data and then producing group axes from a clustering analysis of the semantic data. This approach allows the user to give meaning to the resulting clusters and to uncover the real latent structure present in the data. The methodology is applied to a set of real semantic data, with impressive results in coherence, speed, and precision.
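The two-stage pipeline described (dimensionality reduction, then clustering of the reduced axes) can be sketched with numpy alone. The binary response matrix, the number of components, and the tiny 2-means loop are all illustrative assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary response matrix: 40 respondents x 6 items, two planted groups.
# Items 0-2 load on construct A, items 3-5 on construct B (illustrative data).
g1 = (rng.random((20, 6)) < [0.9, 0.9, 0.9, 0.1, 0.1, 0.1]).astype(float)
g2 = (rng.random((20, 6)) < [0.1, 0.1, 0.1, 0.9, 0.9, 0.9]).astype(float)
X = np.vstack([g1, g2])

# Step 1: PCA via SVD of the centered data (dimensionality reduction).
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T          # project onto the first two principal axes

# Step 2: a tiny 2-means on the reduced scores (clustering step).
centers = scores[[0, -1]].copy()
for _ in range(20):
    d = np.linalg.norm(scores[:, None] - centers[None], axis=2)
    labels = d.argmin(axis=1)
    for k in range(2):
        centers[k] = scores[labels == k].mean(axis=0)
```

The recovered cluster labels are the objects the analyst would then interpret against the item content.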

Keywords: semantic analysis, factorial analysis, dimension reduction, penalized logistic regression

Procedia PDF Downloads 449
26068 Assessment of Groundwater Chemistry and Quality Characteristics in an Alluvial Aquifer and a Single Plane Fractured-Rock Aquifer in Bloemfontein, South Africa

Authors: Modreck Gomo

Abstract:

The evolution of groundwater chemistry and its quality is largely controlled by hydrogeochemical processes, and their understanding is therefore important for groundwater quality assessments and the protection of water resources. A study was conducted in the town of Bloemfontein, South Africa, to assess and compare the groundwater chemistry and quality characteristics in an alluvial aquifer and a single-plane fractured-rock aquifer. Nine groundwater samples were collected from monitoring boreholes drilled into the two aquifer systems during a once-off sampling exercise. Samples were collected using the low-flow purging technique and analysed for major ions and trace elements. In order to describe the hydrochemical facies and identify dominant hydrogeochemical processes, the groundwater chemistry data are interpreted using Stiff diagrams and principal component analysis (PCA) as complementary tools. The fitness of the groundwater quality for domestic and irrigation uses is also assessed. Results show that the alluvial aquifer is characterised by a Na-HCO3 hydrochemical facies while the fractured-rock aquifer has a Ca-HCO3 facies. The groundwater in both aquifers originally evolved from the dissolution of calcite rocks that are common in land surface environments. However, the groundwater in the alluvial aquifer goes through a further evolution driven by a cation exchange process, in which Na+ in the sediments exchanges with Ca2+ in the Ca-HCO3 water to produce the Na-HCO3 hydrochemical type. Despite this difference in hydrogeochemical processes between the alluvial aquifer and the single-plane fractured-rock aquifer, it did not influence the groundwater quality. The groundwater in the two aquifers is very hard, as influenced by the elevated magnesium and calcium ions that evolve from the dissolution of carbonate minerals.
Based on total dissolved solids levels (600-900 mg/L), the groundwater quality of the two aquifer systems is classified as being of fair quality. The negative potential impacts of the groundwater quality for domestic uses are highlighted.
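The "very hard" classification follows from total hardness expressed as mg/L CaCO3, computed from the calcium and magnesium concentrations. A quick sketch of the standard conversion; the sample concentrations below are illustrative, not the study's measured values.

```python
def total_hardness(ca_mg_l, mg_mg_l):
    """Total hardness as mg/L CaCO3 from calcium and magnesium (mg/L).

    Coefficients are molar-mass ratios: 50.04/20.04 for Ca, 50.04/12.15 for Mg.
    """
    return 2.497 * ca_mg_l + 4.118 * mg_mg_l

def hardness_class(h):
    """Common hardness classification (mg/L as CaCO3)."""
    if h < 60:
        return "soft"
    if h < 120:
        return "moderately hard"
    if h < 180:
        return "hard"
    return "very hard"

# Illustrative sample, not the paper's data
h = total_hardness(ca_mg_l=90.0, mg_mg_l=40.0)
```

Any sample above 180 mg/L as CaCO3 falls in the "very hard" class reported for both aquifers.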

Keywords: alluvial aquifer, fractured-rock aquifer, groundwater quality, hydrogeochemical processes

Procedia PDF Downloads 208
26067 Comfort Evaluation of Summer Knitted Clothes of Tencel and Cotton Fabrics

Authors: Mona Mohamed Shawkt Ragab, Heba Mohamed Darwish

Abstract:

Context: Comfort properties of garments are crucial for the wearer, and with the increasing demand for cotton fabric, there is a need to explore alternative fabrics that can offer similar or superior comfort properties. This study focuses on comparing the comfort properties of tencel/cotton single jersey fabric and cotton single jersey fabric, with the aim of identifying fabrics that are more suitable for summer clothes. Research Aim: The aim of this study is to evaluate the comfort properties of tencel/cotton single jersey fabric and cotton single jersey fabric, with the goal of identifying fabrics that can serve as alternatives to cotton, considering their comfort properties for summer clothing. Methodology: An experimental, analytical approach was employed in this study. Two circular knitting machines were used to produce the fabrics, one with a gauge of 24 and the other with a gauge of 28. Both fabrics were knitted with three different loop lengths (3.05 mm, 2.9 mm, and 2.6 mm) to obtain loose, medium, and tight fabrics for evaluation. Various comfort properties, including air permeability, water vapor permeability, wickability, and thermal resistance, were measured for both fabric types. Findings: The study found a significant difference in comfort properties between tencel/cotton single jersey fabric and cotton single jersey fabric. Tencel/cotton fabric exhibited higher air permeability, water vapor permeability, and wickability compared to cotton fabric. These findings suggest that tencel fabric is more suitable for summer clothes due to its superior ventilation and absorption properties. Theoretical Importance: This study contributes to the exploration of alternative fabrics to cotton by evaluating their comfort properties. By identifying fabrics that offer better comfort properties than cotton, particularly in terms of water usage, the study provides valuable insights into sustainable fabric choices for the fashion industry.
Data Collection and Analysis Procedures: The comfort properties of the fabrics were measured using appropriate testing methods. Paired comparison t-tests were conducted to determine the significant differences between tencel/cotton fabric and cotton fabric in the measured properties. Correlation coefficients were also calculated to examine the relationships between the factors under study. Question Addressed: The study addresses the question of whether tencel/cotton single jersey fabric can serve as an alternative to cotton fabric for summer clothes, considering their comfort properties. Conclusion: The study concludes that tencel/cotton single jersey fabric offers superior comfort properties compared to cotton single jersey fabric, making it a suitable alternative for summer clothes. The findings also highlight the importance of considering fabric properties, such as air permeability, water vapor permeability, and wickability, when selecting materials for garments to enhance wearer comfort. This research contributes to the search for sustainable alternatives to cotton and provides valuable insights for the fashion industry in making informed fabric choices.
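The paired-comparison t-tests used in the analysis can be sketched by hand: the statistic is the mean of the pairwise differences divided by its standard error. The readings below are illustrative numbers, not the measured fabric data.

```python
import math

def paired_t(sample_a, sample_b):
    """Paired-comparison t statistic: t = mean(d) / (sd(d) / sqrt(n)),
    where d are the pairwise differences and sd uses n-1 degrees of freedom."""
    d = [a - b for a, b in zip(sample_a, sample_b)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

# Illustrative air-permeability readings for paired fabric specimens
tencel_cotton = [105.0, 98.0, 110.0, 102.0, 108.0]
cotton        = [104.0, 96.0, 107.0,  98.0, 103.0]
t_stat = paired_t(tencel_cotton, cotton)
```

The resulting t value would then be compared against the t distribution with n-1 degrees of freedom to decide significance.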

Keywords: comfort properties, cotton fabric, tencel fabric, single jersey

Procedia PDF Downloads 81
26066 Dissimilarity-Based Coloring for Symbolic and Multivariate Data Visualization

Authors: K. Umbleja, M. Ichino, H. Yaguchi

Abstract:

In this paper, we propose a coloring method for multivariate data visualization using parallel coordinates, based on dissimilarity and tree-structure information gathered during hierarchical clustering. The proposed method extends proximity-based coloring, which suffers from a few undesired side effects when the hierarchical tree structure is not a balanced tree. We describe the algorithm for assigning colors based on dissimilarity information, show the application of the proposed method on three commonly used datasets, and compare the results with proximity-based coloring. We found the proposed method to be especially beneficial for symbolic data visualization, where many individual objects have already been aggregated into a single symbolic object.
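One way to read the core idea: walking along the dendrogram's leaf ordering, accumulate the dissimilarity between neighboring objects and map the cumulative value to a hue, so that similar neighbors get near-identical colors while large dissimilarity gaps push the hue further along the scale. The sketch below is our interpretation under that assumption, not the authors' exact algorithm.

```python
import numpy as np

def dissimilarity_hues(leaf_order, dissimilarity):
    """Assign a hue in [0, 1] to each object in dendrogram leaf order.

    leaf_order: object indices as they appear across the dendrogram leaves.
    dissimilarity: full pairwise dissimilarity matrix.
    Consecutive leaves separated by small dissimilarity receive nearly
    the same hue; large gaps spread the hues apart on the color scale.
    """
    gaps = [dissimilarity[a, b]
            for a, b in zip(leaf_order[:-1], leaf_order[1:])]
    cum = np.concatenate([[0.0], np.cumsum(gaps)])
    hues = cum / cum[-1] if cum[-1] > 0 else cum
    return dict(zip(leaf_order, hues))

# Toy example: objects 0 and 1 are near-identical, object 2 is far away
D = np.array([[0.0, 0.1, 1.0],
              [0.1, 0.0, 0.9],
              [1.0, 0.9, 0.0]])
hues = dissimilarity_hues([0, 1, 2], D)
```

Each hue would then color the corresponding polyline in the parallel-coordinates plot.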

Keywords: data visualization, dissimilarity-based coloring, proximity-based coloring, symbolic data

Procedia PDF Downloads 173
26065 Application of Matrix Converter for the Power Control of a DFIG-Based Wind Turbine

Authors: E. Bounadja, M. O. Mahmoudi, A. Djahbar, Z. Boudjema

Abstract:

This paper presents a control approach for the doubly fed induction generator (DFIG) in conjunction with a direct AC-AC matrix converter used in generating mode. This device is intended to be implemented in a variable-speed wind energy conversion system connected to the grid. Firstly, we developed a model of the matrix converter, controlled by the Venturini modulation technique. In order to control the power exchanged between the stator of the DFIG and the grid, a control law is synthesized using a high-order sliding mode controller. The use of this method provides very satisfactory performance for the DFIG control. The overall strategy has been validated on a 2-MW wind turbine driving a DFIG, using Matlab/Simulink.
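The Venturini modulation mentioned above, in its basic form (valid when the output amplitude is at most half the input amplitude), gives the duty cycle of each of the nine switches in closed form: m_kj = (1/3)(1 + 2·v_k·v_oj / V_im²), where v_k are the input phase voltages and v_oj the desired output phases. A numeric sketch with assumed voltage levels and frequencies, not the paper's converter model:

```python
import numpy as np

def venturini_duty_cycles(t, V_im=311.0, V_om=120.0, f_i=50.0, f_o=30.0):
    """Duty cycles m[k, j] of a 3x3 matrix converter at time t (basic
    Venturini solution, valid for voltage ratio q = V_om / V_im <= 0.5):

        m_kj = (1/3) * (1 + 2 * v_k * v_oj / V_im**2)

    Rows k index input phases, columns j index output phases.
    """
    wi, wo = 2 * np.pi * f_i, 2 * np.pi * f_o
    phases = np.array([0.0, -2 * np.pi / 3, 2 * np.pi / 3])
    v_in = V_im * np.cos(wi * t + phases)    # measured input phase voltages
    v_out = V_om * np.cos(wo * t + phases)   # reference output phase voltages
    return (1.0 + 2.0 * np.outer(v_in, v_out) / V_im**2) / 3.0

m = venturini_duty_cycles(t=0.004)
```

Two properties make the formula usable as a modulation law: for a balanced input, each output phase's duty cycles sum to exactly 1, and they stay within [0, 1] as long as q ≤ 0.5.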

Keywords: doubly fed induction generator (DFIG), matrix converter, high-order sliding mode controller, wind energy

Procedia PDF Downloads 526
26064 The Universal Cultural Associations in the Conceptual Metaphors Used in the Headlines of Arab News and Saudi Gazette Newspapers: A Critical Cognitive Study

Authors: Hind Hassan Arruwaite

Abstract:

Conceptual metaphor is a cognitive semantic tool that provides access to people's conceptual systems. The correlations in the human conceptual system transcend particular times and specific cultures. These universal associations provide universal schemas that organize people's conceptualization of the world. The study aims to explore how the cultural associations used in conceptual metaphors create commonalities and harmony between the peoples of the world. In the research methodology, the researcher implemented the Critical Metaphor Analysis, Metaphor Candidate Identification, and Metaphor Identification Procedure models to deliver qualitative and descriptive findings. Semantic tension was the key criterion in identifying metaphorically used words in the headlines. The research materials are the oil trade conceptual metaphors used in the headlines of the Arab News and Saudi Gazette newspapers. The data will be uploaded to a self-constructed corpus to examine electronic lists for identifying conceptual metaphors. The study investigates the types of conceptual metaphors used in the headlines of the newspapers, the cultural associations identified in those conceptual metaphors, and whether the identified cultural associations create universal conceptual schemas. The study aligns with previous seminal works on conceptual metaphor theory in emphasizing the distinctive power of conceptual metaphors in exposing the cultural associations that unify people's perceptions. The correlation of people's conceptualizations provides universal schemas that involve elements of human sensorimotor experience. The study contributes to exposing the shared cultural associations that underpin the commonality of all humankind's thinking mechanisms.

Keywords: critical discourse analysis, critical metaphor analysis, conceptual metaphor theory, primary and specific metaphors, corpus-driven approach, universal associations, image schema, sensorimotor experience, oil trade

Procedia PDF Downloads 206
26063 The Impact of Data Science on Geography: A Review

Authors: Roberto Machado

Abstract:

We conducted a systematic review using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses methodology, analyzing 2,996 studies and synthesizing 41 of them to explore the evolution of data science and its integration into geography. By employing optimization algorithms, we accelerated the review process, significantly enhancing the efficiency and precision of literature selection. Our findings indicate that data science has developed over five decades, facing challenges such as the diversified integration of data and the need for advanced statistical and computational skills. In geography, the integration of data science underscores the importance of interdisciplinary collaboration and methodological innovation. Techniques like large-scale spatial data analysis and predictive algorithms show promise in natural disaster management and transportation route optimization, enabling faster and more effective responses. These advancements highlight the transformative potential of data science in geography, providing tools and methodologies to address complex spatial problems. The relevance of this study lies in the use of optimization algorithms in systematic reviews and the demonstrated need for deeper integration of data science into geography. Key contributions include identifying specific challenges in combining diverse spatial data and the necessity for advanced computational skills. Examples of connections between these two fields encompass significant improvements in natural disaster management and transportation efficiency, promoting more effective and sustainable environmental solutions with a positive societal impact.

Keywords: data science, geography, systematic review, optimization algorithms, supervised learning

Procedia PDF Downloads 39
26062 Social Construction of Sustainability and Quality of Life Indicators for Urban Passenger Transportation

Authors: Tzay-An Shiau, Kuan-Lin Ho

Abstract:

This study developed sustainability and quality-of-life indicators for urban passenger transportation using the Social Construction of Technology (SCOT) approach. The initial indicators were proposed by referring to the literature and were summarized using an impact-based framework. Subsequently, the stakeholders were defined according to their interests and power, and then classified into scientific, operational, policy-making, policy-monitoring, and nonprofessional frames. The scientific frame consisted of nine scholars in the transportation field. Ten representatives from Taipei Rapid Transit Corporation (TRTC), Taiwan Railways Administration (TRA), and bus operators were grouped into the operational frame. The policy-making frame was composed of ten representatives from the Department of Transportation, Taipei City Government (DOT, TCG), the Department of Railways and Highways, Ministry of Transportation and Communication (DORH, MOTC), the Directorate General of Highways, Ministry of Transportation and Communication (DGOH, MOTC), and the Institute of Transportation, Ministry of Transportation and Communication (IOT, MOTC). The policy-monitoring frame consisted of 15 Taipei city councilors, legislators, and reporters. The nonprofessional frame was composed of 72 Taipei citizens. The stakeholders were asked to evaluate the relative importance of the indicators using the Delphi survey method. A social construction of 14 transport sustainability indicators and 12 transport quality-of-life indicators was obtained.
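A Delphi round like the one described is typically summarized per indicator by the median importance score, with the interquartile range (IQR) as a consensus check. The sketch below uses illustrative ratings and an assumed IQR threshold; it is a generic Delphi aggregation, not the study's actual survey data.

```python
import numpy as np

def delphi_summary(ratings, consensus_iqr=1.0):
    """Summarize one Delphi round.

    ratings: (stakeholders x indicators) importance scores, e.g. on a 1-5
    Likert scale. Returns per-indicator medians and a consensus flag: an
    interquartile range at or below `consensus_iqr` is read as agreement.
    """
    med = np.median(ratings, axis=0)
    q1, q3 = np.percentile(ratings, [25, 75], axis=0)
    return med, (q3 - q1) <= consensus_iqr

# Illustrative scores from 5 stakeholders on 3 candidate indicators
ratings = np.array([[5, 3, 1],
                    [5, 4, 2],
                    [4, 3, 5],
                    [5, 2, 1],
                    [4, 3, 4]])
medians, consensus = delphi_summary(ratings)
```

Indicators without consensus would be fed back to the stakeholders for another round.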

Keywords: sustainability, quality of life, Social Construction of Technology (SCOT), stakeholder

Procedia PDF Downloads 466
26061 Conflicts of Interest in the Private Sector and the Significance of the Public Interest Test

Authors: Opemiposi Adegbulu

Abstract:

Conflict of interest is an elusive, diverse, and engaging subject and a cross-cutting problem of governance at all levels, from local to global and from the public to the corporate and financial sectors. In all these areas, its mismanagement can lead to the distortion of decision-making processes, the corrosion of trust, and the weakening of administration. According to Professor Peters, an expert in the area, conflict of interest, a problem at the root of many scandals, has “become a pervasive ethical concern in our professional, organisational, and political life”. Conflicts of interest corrode trust, and as in the public sector, trust is mandatory for the market, consumers/clients, shareholders, and other stakeholders in the private sector. However, conflicts of interest in the private sector are distinct and must be treated as such when regulatory efforts are made to address them. The research looks at identifying conflicts of interest in the private sector and differentiating them from those in the public sector. The public interest is submitted as a criterion which allows for such differentiation. This is significant because it would allow for the use of tailor-made or sector-specific approaches to addressing this complex issue. The analysis is conducted through an extensive review of the literature and theories on the definition of conflicts of interest. The study employs theoretical, doctrinal, and comparative methods. The nature of conflicts of interest in the private sector will be explored through an analysis of the public sector, where the notion of conflicts of interest appears more clearly identified; reasons why they are of business-ethics concern will be advanced; and then, once again looking at public sector solutions and other solutions, the study will identify ways of mitigating and managing conflicts in the private sector.
An exploration of public sector conflicts of interest and their solutions will be carried out, because the typologies of conflicts of interest in both sectors appear very similar at the core, and thus lessons can be learnt regarding the management of these issues in the private sector. This research will then focus on some specific challenges to understanding and identifying conflicts of interest in the private sector: their origin, diverging theories, the psychological barrier to definition, and similarities with public sector conflicts of interest due to the notions of the corrosion of trust, ‘being in a particular kind of situation,’ etc. The notion of public interest will be submitted as a key element at the heart of the distinction between public sector and private sector conflicts of interest. It will then be proposed that the appreciation of the notion of conflicts of interest differs by sector and from country to country, based on the public interest test, using the United Kingdom (UK), the United States of America (US), France, and the Philippines as illustrations.

Keywords: conflicts of interest, corporate governance, global governance, public interest

Procedia PDF Downloads 407
26060 Conflict around the Brownfield Reconversion of the Canadian Forces Base Rockcliffe in Ottawa: A Clash of Ambitions and Visions in Canadian Urban Sustainability

Authors: Kenza Benali

Abstract:

Over the past decade, a number of remarkable projects in urban brownfield reconversion emerged across Canada, including the reconversion of former military bases owned by the Canada Lands Company (CLC) into sustainable communities. However, unlike other developments, the regeneration project of the former Canadian Forces Base Rockcliffe in Ottawa – which was announced as one of the most ambitious Smart Growth projects in Canada – faced serious obstacles in terms of social acceptance by the local community, particularly urban minorities composed of Francophones, Indigenous people, and vulnerable groups who live near or on the base. This turn of events led to the project being postponed and even reconsidered. Through an analysis of its press coverage, this research aims to understand the causes of this urban conflict, which lasted for nearly ten years. The findings reveal that the conflict is not limited to the “standard” issues common to most conflicts related to urban mega-projects in the world – e.g., proximity issues (threats to the quality of the surrounding neighbourhoods; noise, traffic, pollution, new-build gentrification) often associated with NIMBY phenomena. In this case, the local actors questioned the purpose of the project (for whom and for what types of uses is it conceived?), its local implementation (to what extent are the local history and existing environment taken into account?), and the degree of involvement of the local population in the decision-making process (with whom is the project built?). Moreover, the interests of the local actors have “jumped scales” and transcend the micro-territorial level of their daily life to take on a national and even international dimension. They defined an alternative view of how this project, considered strategic because of its location in the nation’s capital, should be a reference as well as an international showcase of Canadian ambition and achievement in terms of urban sustainability.
This vision in fact promoted a territorial and national identity approach, in which certain cultural values are highly significant (respect for social justice, inclusivity, ethnic diversity, cultural heritage, etc.), as a counterweight to the planners’ vision, which is criticized as a normative, universalist logic that ignores territorial peculiarities.

Keywords: smart growth, brownfield reconversion, sustainable neighborhoods, Canada Lands Company, Canadian Forces Base Rockcliffe, urban conflicts

Procedia PDF Downloads 385
26059 Comprehensive Strategy for Healthy City from Local Practice Networking among Citizens, Industry, University and Municipality

Authors: Yuki Hara

Abstract:

Healthy assets came to be recognized as important for all people in the world through the experience of COVID-19. Each part of life and work needed to change in response to the wide spread of COVID-19, and it is necessary to innovate the whole structure of a city beyond the sum of those parts. This study aims to create a comprehensive city strategy out of a small local practice of promoting healthier lives in collaboration with local actors. This paper employs action research as the research framework. The core practice is the 'Ken’iku Festival' run by the Ken’iku Festival Committee. The field is located in the urban-rural fringe in the northwest part of Fujisawa city, Kanagawa prefecture, Japan. The data were collected through the author's practices over three years, from observations and interviews at meetings and discussions among stakeholders; texts in municipal reports, books, and films; and three questionnaires for customers and stakeholders at the Ken’iku Festival. These data are analysed using qualitative methods. The results show that couples in their 40s with children, and couples or friends over 70, are at the heart of promoting healthy lifestyles. In contrast, 40% of the visitors at the festival are people with no awareness of, or interest in, healthier actions, whom the committee must reach by suggesting healthy activities through more appealing services. The committee was able to organize staff and local actors as the core parties involved by gradually expanding its tasks relating to the local practices. This private-sector health-promotion activity covers part of the whole-city planning of Fujisawa municipality by drawing many people, across organisations, into one community. This paper concludes from local practice networking through the festival that a comprehensive strategy for a healthy city is both a practical approach easily applied by each partner and one of the holistic services.

Keywords: communal practice network, healthy cities, health & development, health promotion, with and after COVID-19

Procedia PDF Downloads 137
26058 Development of an EEG-Based Real-Time Emotion Recognition System on Edge AI

Authors: James Rigor Camacho, Wansu Lim

Abstract:

Over the last few years, the development of new wearable and processing technologies has accelerated in order to harness physiological data such as electroencephalograms (EEGs) for EEG-based applications. EEG has been demonstrated to be the source of emotion recognition signals with the highest classification accuracy among physiological signals. However, when emotion recognition systems are used for real-time classification, the training unit is frequently left to run offline or in the cloud rather than working locally on the edge. That strategy has hampered research, and the full potential of using an edge AI device has yet to be realized. Edge AI devices are high-performance computers that can process complex algorithms. They are capable of collecting, processing, and storing data on their own, and can analyze and apply complicated algorithms such as localization, detection, and recognition in real-time applications, making them powerful embedded devices. The NVIDIA Jetson series, specifically the Jetson Nano device, was used in the implementation. The cEEGrid, which is integrated with the open-source brain-computer interface platform (OpenBCI), is used to collect EEG signals. An EEG-based real-time emotion recognition system on edge AI is proposed in this paper. Machine learning-based classifiers were used to perform graphical spectrogram categorization of EEG signals and to predict emotional states based on input data properties. The EEG signals were analyzed using the K-Nearest Neighbor (KNN) technique, a supervised learning method, until the emotional state was identified. In EEG signal processing, each EEG signal received in real time is translated from the time domain to the frequency domain using the Fast Fourier Transform (FFT) in order to observe the frequency bands in the signal. To appropriately capture the variance of each EEG frequency band, the power density, standard deviation, and mean are calculated and employed.
The next stage is to use the selected features to predict emotion in the EEG data with the K-Nearest Neighbors (KNN) technique. Arousal and valence datasets are used to train the parameters defined by the KNN technique. Because classification and recognition of specific classes, as well as emotion prediction, are conducted both online and locally on the edge, the KNN technique improved the performance of the emotion recognition system on the NVIDIA Jetson Nano. Finally, this implementation aims to bridge the research gap on cost-effective and efficient real-time emotion recognition using a resource-constrained hardware device such as the NVIDIA Jetson Nano. At the edge, EEG-based emotion identification can be employed in applications that rapidly expand its use in both research and industry.
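The FFT-then-KNN pipeline described above can be sketched end to end: compute band powers from the FFT of each signal, use them as features, and classify a new signal by majority vote among its nearest training vectors. The sampling rate, band edges, and synthetic sinusoid-plus-noise signals below are our assumptions, not cEEGrid data or the authors' code.

```python
import numpy as np

FS = 250  # assumed sampling rate, Hz

def band_powers(signal, bands=((4, 8), (8, 13), (13, 30))):
    """Mean spectral power in the theta, alpha, and beta bands via FFT."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    return np.array([spec[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in bands])

def knn_predict(train_x, train_y, x, k=3):
    """Majority vote among the k nearest training feature vectors."""
    d = np.linalg.norm(train_x - x, axis=1)
    votes = train_y[np.argsort(d)[:k]]
    return np.bincount(votes).argmax()

rng = np.random.default_rng(1)
t = np.arange(FS) / FS  # one second of signal

def synth(freq):
    """Synthetic EEG-like trace: one dominant rhythm plus noise."""
    return np.sin(2 * np.pi * freq * t) + 0.1 * rng.standard_normal(FS)

# Class 0: alpha-dominant (10 Hz) traces, class 1: beta-dominant (20 Hz)
train_x = np.array([band_powers(synth(f)) for f in (10, 10, 10, 20, 20, 20)])
train_y = np.array([0, 0, 0, 1, 1, 1])
pred = knn_predict(train_x, train_y, band_powers(synth(20)))
```

On a Jetson-class device, both the FFT feature step and the KNN vote run comfortably in real time, which is the point of keeping the whole loop on the edge.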

Keywords: edge AI device, EEG, emotion recognition system, supervised learning algorithm, sensors

Procedia PDF Downloads 109