Search results for: binary outcomes
4105 Program Level Learning Outcomes in Music and Technology: Toward Improved Assessment and Better Communication
Authors: Susan Lewis
Abstract:
The assessment of learning outcomes at the program level has attracted much international interest from the perspectives of quality assurance and ongoing curricular redesign and renewal. This paper examines program-level learning outcomes in the field of music and technology, an area of study that has seen an explosion in program development over the past fifteen years. The Audio Engineering Society (AES) maintains an online directory of educational institutions worldwide, yielding the most comprehensive inventory of programs and courses in music and technology. The inventory includes courses, programs, and degrees in music and technology, music and computer science, music production, and the music industry. This paper focuses on published student learning outcomes for undergraduate degrees in music and technology and analyses commonalities at institutions in North America, the United Kingdom, and Europe. The results of a survey of student learning outcomes at twenty institutions indicate a focus on three distinct student learning outcomes: (1) cross-disciplinary knowledge in the fields of music and technology; (2) the practical application of training through the professional industry; and (3) the acquisition of skills in communication and collaboration. The paper then analyses assessment mechanisms for tracking student learning and achievement of learning outcomes at these institutions. The results indicate highly variable assessment practices. Conclusions offer recommendations for enhancing assessment techniques and better communicating learning outcomes to students.
Keywords: quality assurance, student learning, learning outcomes, music and technology
Procedia PDF Downloads 185
4104 A Picture is worth a Billion Bits: Real-Time Image Reconstruction from Dense Binary Pixels
Authors: Tal Remez, Or Litany, Alex Bronstein
Abstract:
The pursuit of smaller pixel sizes at ever-increasing resolution in digital image sensors is mainly driven by the stringent price and form-factor requirements of sensors and optics in the cellular phone market. Recently, Eric Fossum proposed a novel concept of an image sensor with dense sub-diffraction limit one-bit pixels (jots), which can be considered a digital emulation of silver halide photographic film. This idea has been recently embodied as the EPFL Gigavision camera. A major bottleneck in the design of such sensors is the image reconstruction process, producing a continuous high dynamic range image from oversampled binary measurements. The extreme quantization of the Poisson statistics is incompatible with the assumptions of most standard image processing and enhancement frameworks. The recently proposed maximum-likelihood (ML) approach addresses this difficulty, but suffers from image artifacts and has impractically high computational complexity. In this work, we study a variant of a sensor with binary threshold pixels and propose a reconstruction algorithm combining an ML data fitting term with a sparse synthesis prior. We also show an efficient hardware-friendly real-time approximation of this inverse operator. Promising results are shown on synthetic data as well as on HDR data emulated using multiple exposures of a regular CMOS sensor.
Keywords: binary pixels, maximum likelihood, neural networks, sparse coding
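The reconstruction problem sketched in this abstract, a maximum-likelihood data term for binarized Poisson measurements combined with a sparsity prior, can be illustrated with a toy proximal-gradient loop. The sketch below is a minimal, hypothetical 1D illustration and not the authors' algorithm or their hardware-friendly approximation: it assumes each underlying intensity value is sampled by K one-bit jots, uses a DCT basis as a stand-in sparsifying transform, and applies soft-thresholding after each likelihood gradient step.

```python
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(0)

# Toy ground-truth intensity signal (nonnegative, smooth); purely illustrative
n = 256
x_true = 0.2 + 0.8 * np.exp(-((np.arange(n) - 90) / 25.0) ** 2)

# Assumed forward model: K one-bit jots per pixel, each fires if its Poisson
# count is >= 1, i.e. P(jot = 1) = 1 - exp(-g * x); we record fired jots per pixel
K, g = 64, 1.0
p_fire = 1.0 - np.exp(-g * x_true)
s = rng.binomial(K, p_fire)

def nll_grad(x):
    """Gradient of the binarized-Poisson (binomial) negative log-likelihood."""
    x = np.maximum(x, 0.01)          # floor keeps the gradient well-behaved
    e = np.exp(-g * x)
    return g * (K - s) - s * g * e / (1.0 - e)

def soft(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

# ISTA-style loop: likelihood gradient step + DCT-domain soft-thresholding
x = np.full(n, 0.5)
step, thr = 2e-4, 1e-3               # step size and threshold are tuning knobs
for _ in range(800):
    x = x - step * nll_grad(x)
    x = idct(soft(dct(x, norm="ortho"), thr), norm="ortho")
    x = np.clip(x, 0.01, None)       # keep intensities positive

print("RMSE vs. ground truth:", float(np.sqrt(np.mean((x - x_true) ** 2))))
```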
Procedia PDF Downloads 201
4103 A Two-Stage Bayesian Variable Selection Method with the Extension of Lasso for Geo-Referenced Data
Authors: Georgiana Onicescu, Yuqian Shen
Abstract:
Due to the complex nature of geo-referenced data, multicollinearity of the risk factors in public health spatial studies is a commonly encountered issue, which leads to low parameter estimation accuracy because it inflates the variance in the regression analysis. To address this issue, we proposed a two-stage variable selection method by extending the least absolute shrinkage and selection operator (Lasso) to the Bayesian spatial setting, investigating the impact of risk factors on health outcomes. Specifically, in stage I, we performed the variable selection using Bayesian Lasso and several other variable selection approaches. Then, in stage II, we performed the model selection with only the selected variables from stage I and again compared the methods. To evaluate the performance of the two-stage variable selection methods, we conducted a simulation study with different distributions for the risk factors, using geo-referenced count data as the outcome and Michigan as the research region. We considered the cases in which all candidate risk factors are independently normally distributed or follow a multivariate normal distribution with different correlation levels. Two other Bayesian variable selection methods, the binary indicator and the combination of binary indicator and Lasso, were considered and compared as alternative methods. The simulation results indicated that the proposed two-stage Bayesian Lasso variable selection method has the best performance for both the independent and dependent cases considered. When compared with the one-stage approach and the other two alternative methods, the two-stage Bayesian Lasso approach provides the highest estimation accuracy in all scenarios considered.
Keywords: Lasso, Bayesian analysis, spatial analysis, variable selection
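The two-stage idea (screen variables first, then refit a model on the survivors) can be sketched with standard frequentist tools. The snippet below is only a rough, non-Bayesian and non-spatial analogue of the approach described here: it uses a cross-validated Lasso on a log-transformed count outcome for stage I screening and then refits a Poisson GLM on the selected variables in stage II. The simulated data, thresholds, and variable names are illustrative assumptions.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Simulated correlated risk factors and a Poisson (count) outcome; toy stand-in data
n, p = 500, 12
cov = 0.6 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
X = rng.multivariate_normal(np.zeros(p), cov, size=n)
beta_true = np.array([0.8, 0.0, -0.6, 0.0, 0.4] + [0.0] * (p - 5))
y = rng.poisson(np.exp(0.5 + X @ beta_true))

# Stage I: screen variables with a cross-validated Lasso on log(1 + y)
Xs = StandardScaler().fit_transform(X)
stage1 = LassoCV(cv=5, random_state=0).fit(Xs, np.log1p(y))
selected = np.flatnonzero(np.abs(stage1.coef_) > 1e-8)
print("Stage I selected columns:", selected)

# Stage II: refit a Poisson GLM using only the selected risk factors
X2 = sm.add_constant(X[:, selected])
stage2 = sm.GLM(y, X2, family=sm.families.Poisson()).fit()
print(stage2.summary())
```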
Procedia PDF Downloads 143
4102 The Twain Shall Meet: First Year Writing Skills in Senior Year Project Design
Authors: Sana Sayed
Abstract:
The words objectives, outcomes, and assessment are commonplace in academia. Educators, especially those who use their emotional intelligence as a useful teaching tool, strive to find creative and innovative ways to connect to their students while meeting the objectives, outcomes, and assessment measures for their respective courses. However, what happens to these outcomes once the objectives have been met, students have completed a specific course, and generic letter grades have been generated? How can their knowledge and acquired skills be assessed over the course of semesters, throughout their years of study, and until their final year right before they graduate? Considering the courses students complete for different departments in various disciplines, how can these outcomes be measured, or at least maintained, across the curriculum? This research-driven paper uses the key course outcomes of first year, required writing courses and traces them in two senior level, required civil engineering design courses at the American University of Sharjah, which is located in the United Arab Emirates. The purpose of this research is two-fold: (1) to assess specific learning outcomes using a case study that focuses on courses from two different disciplines during two very distinctive years of study, and (2) to demonstrate how learning across the curriculum fosters life-long proficiencies among graduating students that are aligned with a university’s mission statement.
Keywords: assessment, learning across the curriculum, objectives, outcomes
Procedia PDF Downloads 302
4101 The Foundation Binary-Signals Mechanics and Actual-Information Model of Universe
Authors: Elsadig Naseraddeen Ahmed Mohamed
Abstract:
In contrast to the uncertainty and complementarity principles, it will be shown in the present paper that the probability of the simultaneous occupation of any definite values of coordinates by any definite values of momentum and energy at any definite instant of time can be described by a binary definite function. This function is equal to the difference between the numbers of occupation and evacuation epochs up to that time, and equally to the number of exchanges between those occupation and evacuation epochs up to that time, modulo two. These binary definite quantities can be defined at every point on the real time line, so they form a binary signal that represents a complete mechanical description of physical reality. The times of these exchanges mark the boundaries of the occupation and evacuation epochs, from which these binary signals can be calculated, using the fact that the universe's events actually extend along the positive and negative real time line in one direction of extension as the number of exchanges increases. There therefore exists a noninvertible transformation matrix, defined as the matrix product of an invertible rotation matrix and a noninvertible scaling matrix, which change the direction and magnitude of the exchange-event vector, respectively. These noninvertible transformations will be called actual transformations, in contrast to information transformations, by which we can navigate the universe's events transformed by actual transformations backward and forward along the real time line; these information transformations will be derived as elements of a group associated with their corresponding actual transformations. The actual and information model of the universe will be derived by assuming the existence of a time instant zero, before and at which there is no coordinate occupied by any definite values of momentum and energy, and after which the universe begins its expansion in spacetime. This assumption makes superfluous the need for Laplace's demon, who at one moment could measure the positions and momenta of all constituent particles of the universe and then use the laws of classical mechanics to predict all future and past events of the universe. We only need to establish analog-to-digital converters that sense the binary signals determining the boundaries of the occupation and evacuation epochs of the definite values of coordinates, relative to their origin, by the definite values of momentum and energy as present events of the universe; from these we can predict its past and future events approximately and with high precision.
Keywords: binary-signal mechanics, actual-information model of the universe, actual-transformation, information-transformation, uncertainty principle, Laplace's demon
Procedia PDF Downloads 175
4100 The Influence of the Normative Gender Binary in Diversity Management: A Multi-Method Study on Gender Diversity of Diversity Management
Authors: Robin C. Ladwig
Abstract:
Diversity Management, as a substantial element of Human Resource Management, aims to secure the economic benefit that supposedly comes with a diverse workforce. Consequently, diversity managers focus on the protection of employees and securing equality measurements to assure organisational gender diversity. Gender diversity, as one aspect of Diversity Management, seems to adhere to gender binarism and cis-normativity. Workplaces are gendered spaces which echo the binary gender-normativity presented in Diversity Management, sold under the label of gender diversity. While the expectation of Diversity Management implies the inclusion of a multiplicity of marginalised groups, such as trans and gender diverse people, in current literature and practice, the reality is curated by gender binarism and cis-normativity. The qualitative multi-method research showed a lack of knowledge about trans and gender diverse matters within the profession of Diversity Management and Human Resources. The semi-structured interviews with trans and gender diverse individuals from various backgrounds and occupations in Australia exposed missing considerations of trans and gender diverse experiences in the inclusivity and gender equity of various workplaces. Even if practitioners consider trans and gender diverse matters under gender diversity, the practical execution is limited to gender binary structures and cis-normative actions, as the photo-elicitation questionnaire with diversity managers, human resource officers, and personnel management demonstrates. Diversity Management should approach a broader source of informed practice by extending its business focus to the knowledge of humanity studies. Humanity studies could include diversity, queer, or gender studies to increase the inclusivity of marginalised groups such as trans and gender diverse employees and people. Furthermore, the definition of gender diversity should be extended beyond the gender binary and cis-normative experience. People may lose trust in Diversity Management as a supportive ally of marginalised employees if the understanding of inclusivity is limited to a gender binary and cis-normativity value system that misrepresents the richness of gender diversity.
Keywords: cis-normativity, diversity management, gender binarism, trans and gender diversity
Procedia PDF Downloads 202
4099 The Influence of Imposter Phenomenon on the Experiences of Intimacy in Non-Binary Young Adults
Authors: Muskan Jain, Baiju Gopal
Abstract:
Objectives: Intimacy in interpersonal relationships is integral to psychological health and everyday wellbeing. The focus here is on intimacy, described as feelings of closeness, connection, and belonging within relationships, which is influenced by an individual's gender identity as well as life experiences. The study aims to explore the experiences of intimacy of non-binary individuals, a marginalized community at increased risk of developing the imposter phenomenon (IP), and to explore the influence of IP on the development and sustenance of intimacy in relationships. Methods: The present study gathers detailed narratives from 10 non-binary young adults aged 18 to 25 in metropolitan cities of India. Thematic analysis was used for the data analysis. Results: Seven major themes emerged, revolving around internalized criticism and self-deprecating behavior, which create distance between partners. The four themes that result in the internalization of criticism are lack of social stability, invalidation by social units, adverse life experiences, and estrangement due to gender identity. The three themes that encapsulate major difficulties in relationships are limited self-disclosure, inhibition of physical needs, and fear of taking space. The findings have been critically compared and contrasted with the existing body of literature in the domain, which sets the agenda for further inquiry. Conclusion: It is important for future studies to capture the experiences of non-binary genders in India to provide better therapeutic support, in order to assist them in forming meaningful and authentic relationships and thus increase overall wellbeing.
Keywords: imposter phenomenon, intimacy, internalized criticism, marginalized community
Procedia PDF Downloads 57
4098 Predictive Value of Coagulopathy in Patients with Isolated Blunt Traumatic Brain Injury: A Cohort of Pakistani Population
Authors: Muhammad Waqas, Shahan Waheed, Mohsin Qadeer, Ehsan Bari, Salman Ahmed, Iqra Patoli
Abstract:
Objective: To determine the value of aPTT, platelets, and INR as predictors of unfavorable outcomes in patients with blunt isolated traumatic brain injury. Methods: This was an observational cohort study conducted in a tertiary care facility from 1st January 2008 to 31st December 2012. All patients with isolated traumatic brain injury presenting within 24 hours of injury were included in the study. Coagulation parameters at presentation were recorded, and the Glasgow Outcome Scale (GOS) was calculated at the last follow-up. Outcomes were dichotomized into favorable and unfavorable outcomes. The relationship of coagulopathy with GOS and unfavorable outcomes was assessed using Spearman's correlation and area under the ROC curve analysis. Results: 121 patients were included in the study. The incidence of coagulopathy was found to be 6%. aPTT was found to be significantly associated with unfavorable outcomes, with an AUC = 0.702 (95% CI = 0.602-0.802). The predictive value of platelets and INR was not found to be significant. Conclusion: The incidence of coagulopathy was found to be low in the current population compared to data from the West. aPTT was found to be a good predictor of unfavorable outcomes compared with other parameters of coagulation.
Keywords: aPTT, coagulopathy, unfavorable outcomes, parameters
Procedia PDF Downloads 480
4097 Nature Writing in Margaret Atwood’s 'The Testaments'
Authors: Natalia Fontes De Oliveira
Abstract:
Nature and women have a long-standing association that has persisted throughout history, cultures, literature, and the arts. Women’s physiological functions of reproduction and childbearing are viewed as closer to nature, in a binary opposition to men, who have metaphorically and historically been associated with culture. To liberate from the strictures of phallogocentric rhetoric, a radical critique of the categories of nature and culture must be undertaken. This paper proposes that nature writing in Margaret Atwood’s The Testaments is used subversively as a form of rebellion to disrupt the metaphorical relationship between women and nature. In tune with ecofeminist concerns, the imagery rewrites patriarchal paradigms of binary oppositions as the protagonists narrate a complex and plural relationship between nature and women.
Keywords: ecofeminism, Margaret Atwood, nature writing, women's writing
Procedia PDF Downloads 165
4096 A Physical Theory of Information vs. a Mathematical Theory of Communication
Authors: Manouchehr Amiri
Abstract:
This article introduces a general notion of physical bit information that is compatible with the basics of quantum mechanics and incorporates the Shannon entropy as a special case. This notion of physical information leads to the binary data matrix model (BDM), which predicts the basic results of quantum mechanics, general relativity, and black hole thermodynamics. The compatibility of the model with the holographic, information conservation, and Landauer’s principles is investigated. After deriving the “Bit Information principle” as a consequence of the BDM, the fundamental equations of Planck, De Broglie, Bekenstein, and mass-energy equivalence are derived.
Keywords: physical theory of information, binary data matrix model, Shannon information theory, bit information principle
Procedia PDF Downloads 171
4095 National Directorate of Employment Training and Agricultural-Small and Medium Enterprises Performance in Nigeria
Authors: Festus M. Epetimehin
Abstract:
This study was conducted to identify the effect of National Directorate of Employment (NDE) training on the profit of Agricultural Small and Medium Enterprises (Agri-SMEs) and to evaluate the factors that influenced farmers' participation in NDE training, as well as the type and frequency of training received by farmers and other agro-allied entrepreneurs in Nigeria. Using a multi-stage sampling procedure, a total of 384 respondents were sampled, including 192 beneficiaries and 192 non-beneficiaries in Oyo and Lagos States, respectively. Data were analysed using binary logit regression and propensity score matching techniques. According to the binary logit analysis, respondents’ gender, access to extension services, and the location of the respondent’s operation were determinant factors influencing NDE training enrolment. All identified factors were positively related to the probability of respondents’ involvement. Propensity score matching revealed that Agri-SMEs who participated in the NDE program boosted their profit by N341,072.18. This positive effect implies that NDE training enhances Agri-SME performance in Nigeria. The study concluded that greater funding should be provided to the NDE for performance-enhancing training of Agri-SMEs.
Keywords: PSM, binary logit model, Agri-SME
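Propensity score matching of the kind used here can be sketched with standard Python tools. The snippet below is a simplified illustration, not the authors' estimation: it fits a logistic regression for participation, matches each beneficiary to the nearest non-beneficiary on the propensity score, and averages the profit differences to approximate the treatment effect on the treated. The toy data and variable names are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(7)

# Toy covariates (stand-ins for gender, extension access, location) and NDE participation
n = 400
X = rng.normal(size=(n, 3))
p_treat = 1.0 / (1.0 + np.exp(-(0.5 * X[:, 0] + 0.8 * X[:, 1] - 0.3)))
treated = rng.binomial(1, p_treat).astype(bool)
profit = 50 + 10 * X[:, 0] + 5 * X[:, 2] + 20 * treated + rng.normal(0, 5, n)

# Step 1: estimate propensity scores with a logistic regression
ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

# Step 2: nearest-neighbour matching on the propensity score (1:1, with replacement)
nn = NearestNeighbors(n_neighbors=1).fit(ps[~treated].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
matched_control_profit = profit[~treated][idx.ravel()]

# Step 3: average treatment effect on the treated (ATT)
att = np.mean(profit[treated] - matched_control_profit)
print(f"Estimated ATT on profit: {att:.2f}")
```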
Procedia PDF Downloads 96
4094 Physical Activity Interventions and Maternal Health Outcomes in Nigeria: A Meta-Analysis of Randomized Controlled Trials
Authors: Jamilu Lawal Ajiya
Abstract:
Background: Physical activity is essential for improving maternal health outcomes, particularly in low- and middle-income countries like Nigeria. Objective: The aim is to evaluate the effectiveness of physical activity interventions on maternal health outcomes among Nigerian pregnant women. Methods: Systematic review and meta-analysis of randomized controlled trials (RCTs) conducted in Nigeria, published in English, and focusing on physical activity and maternal health outcomes. Results: Ten RCTs (N=1,200) were included. Physical activity interventions significantly reduced the risk of gestational diabetes, hypertension, and preterm birth. Also, the study found that brisk walking and aerobic exercise were more effective than yoga. Conclusion: Physical activity interventions improve maternal health outcomes among Nigerian pregnant women. Policy changes and public health programs should prioritize physical activity promotion during pregnancy. This study informs healthcare providers, policymakers, and researchers on the effectiveness of physical activity interventions in improving maternal health outcomes in Nigeria.
Keywords: physical activity, maternal health, Nigeria, randomized controlled trials
Procedia PDF Downloads 23
4093 Thermodynamic Behaviour of Binary Mixtures of 1, 2-Dichloroethane with Some Cyclic Ethers: Experimental Results and Modelling
Authors: Fouzia Amireche-Ziar, Ilham Mokbel, Jacques Jose
Abstract:
The vapour pressures of the three binary mixtures 1,2-dichloroethane + 1,3-dioxolane, + 1,4-dioxane, or + tetrahydropyrane were measured at ten temperatures ranging from 273 to 353.15 K. An accurate static device was employed for these measurements. The VLE data were reduced using the Redlich-Kister equation, taking into consideration the vapour pressure non-ideality in terms of the second molar virial coefficient. The experimental data were compared to the results predicted with the DISQUAC and Dortmund UNIFAC group contribution models for the total pressures P and the excess molar Gibbs energies GE.
Keywords: DISQUAC model, Dortmund UNIFAC model, excess molar Gibbs energies GE, VLE
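Data reduction with the Redlich-Kister expansion amounts to fitting GE/RT = x1*x2*[A0 + A1(x1 - x2) + A2(x1 - x2)^2 + ...] to the excess Gibbs energies obtained from the measured pressures. The snippet below is a generic least-squares illustration of that fitting step with made-up composition and GE/RT values; it is not the authors' reduction procedure, which additionally accounts for vapour-phase non-ideality via the second virial coefficient.

```python
import numpy as np

# Hypothetical isothermal data: liquid mole fractions x1 and reduced excess
# Gibbs energies GE/RT derived from total-pressure measurements (invented values)
x1 = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9])
ge_rt = np.array([0.030, 0.055, 0.074, 0.086, 0.090, 0.086, 0.073, 0.053, 0.029])

x2 = 1.0 - x1
order = 2  # number of Redlich-Kister coefficients beyond A0 (assumed)

# Design matrix: column k is x1*x2*(x1 - x2)**k
M = np.column_stack([x1 * x2 * (x1 - x2) ** k for k in range(order + 1)])
A, *_ = np.linalg.lstsq(M, ge_rt, rcond=None)
print("Fitted Redlich-Kister coefficients A_k:", A)

# Reconstruct the smoothed GE/RT curve from the fitted coefficients
x_grid = np.linspace(0.0, 1.0, 101)
ge_fit = x_grid * (1 - x_grid) * sum(a * (2 * x_grid - 1) ** k for k, a in enumerate(A))
print("Maximum fitted GE/RT:", float(ge_fit.max()))
```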
Procedia PDF Downloads 258
4092 Computer Simulation of Hydrogen Superfluidity through Binary Mixing
Authors: Sea Hoon Lim
Abstract:
A superfluid is a fluid of bosons that flows without resistance. In order to be a superfluid, a substance’s particles must behave like bosons, yet remain mobile enough to stay fluid. Bosons are particles that, at low temperature, can occupy the same quantum state at the same time. If bosons are cooled down, the particles all try to occupy the lowest energy state, which is called Bose-Einstein condensation. Boson statistics start to matter once the temperature has reached the critical temperature. For example, when helium is cooled below its critical temperature of 2.17 K, the liquid density drops and the liquid becomes a superfluid with zero viscosity. However, most materials will solidify, and thus not remain fluids, at temperatures well above the temperature at which they would otherwise become a superfluid. Only a few substances currently known to man are capable of at once remaining a fluid and manifesting boson statistics. The most well-known of these is helium and its isotopes. Because hydrogen is lighter than helium, and thus expected to manifest Bose statistics at higher temperatures than helium, one might expect hydrogen to also be a superfluid. As of today, however, no one has yet been able to produce a bulk hydrogen superfluid. The reason why hydrogen has not formed a superfluid so far is its intermolecular interactions. As a result, hydrogen molecules are much more likely to crystallize than their helium counterparts. The key to creating a hydrogen superfluid is therefore finding a way to reduce the effect of the interactions among hydrogen molecules, postponing solidification to lower temperatures. In this work, we attempt via computer simulation to produce bulk superfluid hydrogen through binary mixing. Binary mixing is a technique of mixing two pure substances in order to avoid crystallization and enhance superfluidity. Our mixture here is KALJ H2. We then sample the partition function using Path Integral Monte Carlo (PIMC), which is well-suited for the equilibrium properties of low-temperature bosons and captures not only the statistics but also the dynamics of hydrogen. Via this sampling, we then produce a time evolution of the substance and see if it exhibits superfluid properties.
Keywords: superfluidity, hydrogen, binary mixture, physics
Procedia PDF Downloads 316
4091 Team Workforce Diversity and Team Outcomes: A Meta-Analytic Review
Authors: Hyeondal Jeong, Yoonjung Baek
Abstract:
This study carried out a meta-analysis of team workforce diversity and team outcomes. Using data from 3,534 teams in 13 studies conducted in team-level settings, we examined whether contextual factors, namely research locale and team size, influenced the outcomes of team workforce diversity. The 13 studies included in the analysis were published from 2009 to 2014. We first examined the correlations between all types of diversity and team performance and found a significant result (Fisher's Z = .112, k = 32, 95% CI = 0.039 to 0.183). We then analyzed the moderating effects of research locale (Republic of Korea = 1, other area = 0) and team size. The moderating effect of research locale was significant, but that of team size was not supported. Based on these findings, implications and future research directions are suggested.
Keywords: team workforce diversity, team outcomes, meta-analysis, cross-cultural research
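The pooled effect reported here (a Fisher's Z with a confidence interval across k effect sizes) follows the standard r-to-z meta-analytic workflow. The sketch below is a generic fixed-effect illustration with invented study correlations and sample sizes, not the data analysed in this study.

```python
import numpy as np
from scipy import stats

# Hypothetical per-study correlations between diversity and team performance
r = np.array([0.05, 0.12, 0.20, -0.03, 0.15, 0.09])
n = np.array([40, 55, 32, 80, 47, 60])          # invented team-level sample sizes

z = np.arctanh(r)                # Fisher's r-to-z transform
w = n - 3                        # inverse-variance weights, since Var(z) = 1/(n - 3)

z_pooled = np.sum(w * z) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
ci_lo, ci_hi = z_pooled + np.array([-1, 1]) * stats.norm.ppf(0.975) * se

print(f"Pooled Fisher's Z = {z_pooled:.3f}, 95% CI = ({ci_lo:.3f}, {ci_hi:.3f})")
print(f"Back-transformed correlation r = {np.tanh(z_pooled):.3f}")
```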
Procedia PDF Downloads 311
4090 Restricted Boltzmann Machines and Deep Belief Nets for Market Basket Analysis: Statistical Performance and Managerial Implications
Authors: H. Hruschka
Abstract:
This paper presents the first comparison of the performance of the restricted Boltzmann machine and the deep belief net on binary market basket data relative to binary factor analysis and the two best-known topic models, namely Dirichlet allocation and the correlated topic model. This comparison shows that the restricted Boltzmann machine and the deep belief net are superior to both binary factor analysis and topic models. Managerial implications that differ between the investigated models are treated as well. The restricted Boltzmann machine is defined as a joint Boltzmann distribution of hidden variables and observed variables (purchases). It comprises one layer of observed variables and one layer of hidden variables. Note that variables of the same layer are not connected. The comparison also includes deep belief nets with three layers. The first layer is a restricted Boltzmann machine based on category purchases. Hidden variables of the first layer are used as input variables by the second-layer restricted Boltzmann machine, which then generates second-layer hidden variables. Finally, in the third layer, hidden variables are related to purchases. A public data set is analyzed which contains one month of real-world point-of-sale transactions in a typical local grocery outlet. It consists of 9,835 market baskets referring to 169 product categories. This data set is randomly split into two halves. One half is used for estimation; the other serves as holdout data. Each model is evaluated by the log likelihood for the holdout data. Performance of the topic models is disappointing, as the holdout log likelihood of the correlated topic model (which is better than Dirichlet allocation) is lower by more than 25,000 compared to the best binary factor analysis model. On the other hand, binary factor analysis on its own is clearly surpassed by both the restricted Boltzmann machine and the deep belief net, whose holdout log likelihoods are higher by more than 23,000. Overall, the deep belief net performs best. We also interpret hidden variables discovered by binary factor analysis, the restricted Boltzmann machine, and the deep belief net. Hidden variables, characterized by the product categories to which they are related, differ strongly between these three models. To derive managerial implications, we assess the effect of promoting each category on total basket size, i.e., the number of purchased product categories, due to each category's interdependence with all the other categories. The investigated models lead to very different implications, as they disagree about which categories are associated with higher basket size increases due to a promotion. Of course, recommendations based on better performing models should be preferred. The impressive performance advantages of the restricted Boltzmann machine and the deep belief net suggest continuing research with appropriate extensions. Including predictors, especially marketing variables such as price, seems to be an obvious next step. It might also be feasible to take a more detailed perspective by considering purchases of brands instead of purchases of product categories.
Keywords: binary factor analysis, deep belief net, market basket analysis, restricted Boltzmann machine, topic models
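A compact way to experiment with the first two layers described above is scikit-learn's BernoulliRBM, stacked greedily so that the hidden activations of the first machine feed the second. The snippet below is a minimal sketch on random binary "baskets"; it is not the authors' model (their deep belief net training and holdout likelihood evaluation are more involved), and the dimensions and hyperparameters are illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM

rng = np.random.default_rng(42)

# Toy binary market baskets: rows = baskets, columns = product categories
n_baskets, n_categories = 2000, 169
baskets = (rng.random((n_baskets, n_categories)) < 0.08).astype(float)

# First-layer RBM on category purchases
rbm1 = BernoulliRBM(n_components=30, learning_rate=0.05, n_iter=20, random_state=0)
h1 = rbm1.fit_transform(baskets)          # P(h = 1 | v) for each basket

# Second-layer RBM trained on the first layer's hidden activations (greedy stacking)
rbm2 = BernoulliRBM(n_components=10, learning_rate=0.05, n_iter=20, random_state=0)
h2 = rbm2.fit_transform(h1)

# Inspect which categories load most strongly on the first hidden unit
top = np.argsort(-rbm1.components_[0])[:5]
print("Categories most associated with hidden unit 0:", top)
print("Second-layer representation shape:", h2.shape)
```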
Procedia PDF Downloads 199
4089 Isothermal Vapour-Liquid Equilibria of Binary Mixtures of 1, 2-Dichloroethane with Some Cyclic Ethers: Experimental Results and Modelling
Authors: Fouzia Amireche-Ziar, Ilham Mokbel, Jacques Jose
Abstract:
The vapour pressures of the three binary mixtures 1,2-dichloroethane + 1,3-dioxolane, + 1,4-dioxane, or + tetrahydropyrane were measured at ten temperatures ranging from 273 to 353.15 K. An accurate static device was employed for these measurements. The VLE data were reduced using the Redlich-Kister equation, taking into consideration the vapour pressure non-ideality in terms of the second molar virial coefficient. The experimental data were compared to the results predicted with the DISQUAC and Dortmund UNIFAC group contribution models for the total pressures P and the excess molar Gibbs energies GE.
Keywords: DISQUAC model, Dortmund UNIFAC model, excess molar Gibbs energies GE, VLE
Procedia PDF Downloads 228
4088 Design of Lead-Lag Based Internal Model Controller for Binary Distillation Column
Authors: Rakesh Kumar Mishra, Tarun Kumar Dan
Abstract:
A lead-lag based controller design method is proposed based on the Internal Model Control (IMC) strategy. In this paper, we design a lead-lag based internal model controller for a binary distillation column treated as a SISO process (considering only the bottom product). The transfer function has been taken from the Wood and Berry model. We evaluate composition control and disturbance rejection using the lead-lag based IMC and compare them with the response of a simple internal model controller.
Keywords: SISO, lead-lag, internal model control, Wood and Berry, distillation column
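For a first-order-plus-dead-time loop, the IMC controller obtained by inverting the minimum-phase part of the plant and appending a first-order filter is itself a lead-lag element, Q(s) = (tau*s + 1) / (K*(lambda*s + 1)). The sketch below illustrates this on the commonly cited Wood-Berry bottoms transfer function using the python-control package; the filter constant, the Pade order, and the simulation horizon are tuning assumptions, and this is not the authors' exact design.

```python
import numpy as np
import control as ct

# Commonly cited Wood-Berry bottoms loop: G(s) = -19.4 e^{-3s} / (14.4 s + 1)
K, tau, theta = -19.4, 14.4, 3.0
lam = 4.0                                   # IMC filter time constant (tuning choice)

num_d, den_d = ct.pade(theta, 3)            # rational approximation of the dead time
Gp = ct.tf([K], [tau, 1]) * ct.tf(num_d, den_d)   # plant with approximated delay
Gm = Gp                                     # assume a perfect internal model

# Lead-lag IMC controller: invert the minimum-phase part, add the filter
Q = ct.tf([tau, 1], [K * lam, K])

# Classical equivalent controller C = Q / (1 - Gm Q); closed loop T for set-point steps
C = ct.feedback(Q, -Gm)
T = ct.feedback(Gp * C, 1)

t = np.linspace(0, 120, 1200)
t, y = ct.step_response(T, t)
print(f"Response value at t = 120 for a unit set-point step: {y[-1]:.3f}")
```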
Procedia PDF Downloads 646
4087 Cardiokey: A Binary and Multi-Class Machine Learning Approach to Identify Individuals Using Electrocardiographic Signals on Wearable Devices
Authors: S. Chami, J. Chauvin, T. Demarest, Stan Ng, M. Straus, W. Jahner
Abstract:
Biometric tools such as fingerprint and iris are widely used in industry to protect critical assets. However, their vulnerability and lack of robustness raise several concerns about the protection of highly critical assets. Biometrics based on electrocardiographic (ECG) signals is a robust identification tool. However, most of the state-of-the-art techniques have worked on clinical signals, which are of high quality and less noisy than signals extracted from wearable devices such as a smartwatch. In this paper, we present a complete machine learning pipeline that identifies people using ECG extracted from an off-person device. An off-person device is a wearable device that is not used in a medical context, such as a smartwatch. In addition, one of the main challenges of ECG biometrics is the variability of the ECG across different persons and situations. To address this issue, we propose two different approaches: a per-person classifier and a one-for-all classifier. The first approach builds a binary classifier to distinguish one person from all others. The second approach builds a multi-class classifier that distinguishes the selected set of individuals from non-selected individuals (others). In preliminary results, the binary classifier obtained 90% accuracy on balanced data, and the second approach reported a log loss of 0.05 as its multi-class score.
Keywords: biometrics, electrocardiographic, machine learning, signal processing
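The two strategies described here (a per-person binary classifier versus a single multi-class model for the enrolled set) can be contrasted in a few lines of scikit-learn. The sketch below uses random feature vectors as stand-ins for ECG-derived features from a wearable; the feature dimension, classifier choice, and evaluation split are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, log_loss
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(11)

# Stand-in data: 40-dimensional "ECG feature" vectors for 5 enrolled people
n_per_person, n_people, n_feat = 60, 5, 40
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(n_per_person, n_feat))
               for i in range(n_people)])
person_id = np.repeat(np.arange(n_people), n_per_person)

X_tr, X_te, y_tr, y_te = train_test_split(X, person_id, test_size=0.3,
                                          stratify=person_id, random_state=0)

# Approach 1: per-person binary classifier ("is this person 0 or not?")
target = 0
bin_clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr == target)
print("Binary accuracy for person 0:",
      accuracy_score(y_te == target, bin_clf.predict(X_te)))

# Approach 2: one-for-all multi-class classifier over the enrolled set
multi_clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print("Multi-class log loss:", log_loss(y_te, multi_clf.predict_proba(X_te)))
```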
Procedia PDF Downloads 142
4086 Exploring Students’ Self-Evaluation on Their Learning Outcomes through an Integrated Cumulative Grade Point Average Reporting Mechanism
Authors: Suriyani Ariffin, Nor Aziah Alias, Khairil Iskandar Othman, Haslinda Yusoff
Abstract:
An Integrated Cumulative Grade Point Average (iCGPA) is a mechanism and strategy to ensure that the curriculum of an academic programme is constructively aligned to the expected learning outcomes and that student performance, based on the attainment of those learning outcomes, is reported objectively in a spider web. Much effort and time has been spent developing a viable mechanism and training academics to utilize the platform for reporting. The question is: How well do learners conceive the idea of their achievement via iCGPA, and have quality learner attributes been nurtured through the iCGPA mechanism? This paper presents the architecture of an integrated CGPA mechanism purported to address a holistic evaluation, from the evaluation of course learning outcomes to the attainment of aligned programme learning outcomes. The paper then discusses the students’ understanding of the mechanism and their evaluation of their achievement from the generated spider web. A set of questionnaires was distributed to a group of students with iCGPA reporting, and frequency analysis was used to compare the perspectives of students on their performance. In addition, the questionnaire also explored how they conceive the idea of integrated, holistic reporting and how it generates their motivation to improve. The iCGPA group was found to be receptive to what they had achieved throughout their study period. They agreed that the achievement level generated from their spider web allows them to develop interventions and enhance the programme learning outcomes before they graduate.
Keywords: learning outcomes attainment, iCGPA, programme learning outcomes, spider web, iCGPA reporting skills
Procedia PDF Downloads 207
4085 Evaluation and Compression of Different Language Transformer Models for Semantic Textual Similarity Binary Task Using Minority Language Resources
Authors: Ma. Gracia Corazon Cayanan, Kai Yuen Cheong, Li Sha
Abstract:
Training a language model for a minority language has been a challenging task. The lack of available corpora to train and fine-tune state-of-the-art language models is still a challenge in the area of Natural Language Processing (NLP). Moreover, the need for high computational resources and bulk data limits the attainment of this task. In this paper, we present the following contributions: (1) we introduce and use a translation pair set of Tagalog and English (TL-EN) in pre-training a language model for a minority language resource; (2) we fine-tune and evaluate top-ranking, pre-trained semantic textual similarity binary task (STSB) models on both the TL-EN and STS dataset pairs; and (3) we reduce the size of the model to offset the need for high computational resources. Based on our results, the models that were pre-trained on translation pairs and STS pairs can perform well on the STSB task. Also, reducing the model to a smaller dimension has no negative effect on performance but rather yields a notable increase in the similarity scores. Moreover, models that were pre-trained on a similar dataset have a tremendous effect on the model’s performance scores.
Keywords: semantic matching, semantic textual similarity binary task, low resource minority language, fine-tuning, dimension reduction, transformer models
Procedia PDF Downloads 211
4084 Health Promotion Intervention to Enhance Health Outcomes for Older Adults
Authors: Elizabeth Waleola Afolabi-Soyemi
Abstract:
As the population of older adults continues to grow, improving health outcomes for this demographic has become an increasingly important public health goal. Health promotion interventions have been developed to address the unique health needs and challenges faced by older adults. This abstract reviews the literature on health promotion interventions for older adults and their effectiveness in improving health outcomes. Various interventions have been found to be effective, including physical activity programs, nutrition education, medication management, and social support programs. These interventions have been shown to improve outcomes such as functional status, quality of life, and disease management. Despite the success of these interventions, there are still barriers to their implementation, such as a lack of access to resources and inadequate funding. Further research is needed to identify effective strategies for overcoming these barriers and to develop more tailored interventions for specific populations of older adults. Overall, health promotion interventions have great potential to improve the health outcomes and quality of life of older adults and should be a priority for public health efforts.
Keywords: health, humanity, health promotion, older adults
Procedia PDF Downloads 98
4083 Breaking Sensitivity Barriers: Perovskite Based Gas Sensors With Dimethylacetamide-Dimethyl Sulfoxide Solvent Mixture Strategy
Authors: Endalamaw Ewnu Kassa, Ade Kurniawan, Ya-Fen Wu, Sajal Biring
Abstract:
Perovskite-based gas sensors represent a highly promising class of materials within the realm of gas sensing technology, with a particular focus on detecting ammonia (NH3) due to its potential hazards. Our work conducted a thorough comparison of various solvents, including dimethylformamide (DMF), DMF-dimethyl sulfoxide (DMSO), dimethylacetamide (DMAC), and DMAC-DMSO, for the preparation of our perovskite solution (MAPbI3). Significantly, we achieved an exceptional response at 10 ppm of ammonia gas by employing a binary solvent mixture of DMAC-DMSO. In contrast to prior reports that relied on single solvents for MAPbI3 precursor preparation, our approach using mixed solvents demonstrated a marked improvement in gas sensing performance. We attained enhanced surface coverage, a reduction in pinhole occurrences, and precise control over grain size in our perovskite films through the careful selection and mixing of appropriate solvents. This study demonstrates the promising potential of employing binary and multi-solvent mixture strategies to propel advancements in gas sensor technology, opening up new opportunities for practical applications in environmental monitoring and industrial safety.
Keywords: sensors, binary solvents, ammonia, sensitivity, grain size, pinholes, surface coverage
Procedia PDF Downloads 107
4082 Aggregation of Fractal Aggregates Inside Fractal Cages in Irreversible Diffusion Limited Cluster Aggregation Binary Systems
Authors: Zakiya Shireen, Sujin B. Babu
Abstract:
Irreversible diffusion-limited cluster aggregation (DLCA) of binary sticky spheres was simulated by modifying Brownian Cluster Dynamics (BCD). We randomly distribute N spheres in a 3D box of size L; the volume fraction is given by Φtot = (π/6)N/L³. We identify NA and NB spheres as species A and B, respectively, both having identical size. In these systems, both A and B particles undergo Brownian motion. Irreversible bond formation happens only between intra-species particles, while inter-species particles interact only through hard-core repulsion. As we perform simulations using BCD, we begin to observe binary gels. In our study, we have observed that species B always percolates (cluster size equal to L), as expected for the monomeric case, while species A does not percolate below a critical ratio, which differs for different volume fractions. We also show that the accessible volume of the system increases when compared to the monomeric case, which means that species A is aggregating inside the cage created by B. We have also observed that for moderate Φtot the system undergoes a transition from the flocculation regime to the percolation regime, indicated by the change in fractal dimension from 1.8 to 2.5. For a smaller ratio of A, it stays in the flocculation regime even though B has already crossed over to the percolation regime. Thus, we observe two fractal dimensions in the same system.
Keywords: BCD, fractals, percolation, sticky spheres
Procedia PDF Downloads 280
4081 Features Reduction Using Bat Algorithm for Identification and Recognition of Parkinson Disease
Authors: P. Shrivastava, A. Shukla, K. Verma, S. Rungta
Abstract:
Parkinson's disease is a chronic neurological disorder that directly affects human gait. It leads to slowness of movement and causes muscle rigidity and tremors. Gait serves as a primary outcome measure for studies aiming at early recognition of the disease. Using gait techniques, this paper implements an efficient binary bat algorithm for early detection of Parkinson's disease by selecting the optimal features required to classify affected patients from others. Data from 166 people, both fit and affected, are collected, and optimal feature selection is done using PSO and the bat algorithm. The reduced dataset is then classified using a neural network. The experiments indicate that the binary bat algorithm outperforms traditional PSO and the genetic algorithm and gives a fairly good recognition rate even with the reduced dataset.
Keywords: Parkinson, gait, feature selection, bat algorithm
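A binary bat algorithm for feature selection keeps a population of binary feature masks, updates real-valued velocities toward the best mask found so far, and maps velocities back to bits through a sigmoid transfer function, scoring each mask by classification accuracy. The sketch below is a generic, simplified implementation on a stand-in scikit-learn dataset rather than the gait data used in this study; the loudness/pulse-rate schedule and the fitness penalty are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer   # stand-in for the gait dataset
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
n_features = X.shape[1]

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(5), X[:, mask == 1], y, cv=3).mean()
    return acc - 0.01 * mask.sum() / n_features      # mild preference for small subsets

# Population of bats: binary positions, real velocities, loudness A, pulse rate r
n_bats, n_iter = 15, 25
pos = rng.integers(0, 2, (n_bats, n_features))
vel = np.zeros((n_bats, n_features))
A, r = np.full(n_bats, 1.0), np.full(n_bats, 0.5)
fit = np.array([fitness(p) for p in pos])
best, best_fit = pos[fit.argmax()].copy(), fit.max()

for t in range(n_iter):
    for i in range(n_bats):
        freq = rng.random()                           # frequency in [0, 1]
        vel[i] += (pos[i] - best) * freq
        prob = 1.0 / (1.0 + np.exp(-vel[i]))          # sigmoid transfer function
        cand = (rng.random(n_features) < prob).astype(int)
        if rng.random() > r[i]:                       # local search around the best bat
            cand = best.copy()
            flip = rng.integers(0, n_features)
            cand[flip] = 1 - cand[flip]
        f_new = fitness(cand)
        if f_new > fit[i] and rng.random() < A[i]:    # accept, then cool A and raise r
            pos[i], fit[i] = cand, f_new
            A[i] *= 0.9
            r[i] = 0.5 * (1 - np.exp(-0.9 * (t + 1)))
        if fit[i] > best_fit:
            best, best_fit = pos[i].copy(), fit[i]

print("Selected features:", np.flatnonzero(best))
print("Best fitness:", best_fit)
```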
Procedia PDF Downloads 545
4080 A Demonstration of How to Employ and Interpret Binary IRT Models Using the New IRT Procedure in SAS 9.4
Authors: Ryan A. Black, Stacey A. McCaffrey
Abstract:
Over the past few decades, great strides have been made towards improving the science in the measurement of psychological constructs. Item Response Theory (IRT) has been the foundation upon which statistical models have been derived to increase both precision and accuracy in psychological measurement. These models are now being used widely to develop and refine tests intended to measure an individual's level of academic achievement, aptitude, and intelligence. Recently, the field of clinical psychology has adopted IRT models to measure psychopathological phenomena such as depression, anxiety, and addiction. Because advances in IRT measurement models are being made so rapidly across various fields, it has become quite challenging for psychologists and other behavioral scientists to keep abreast of the most recent developments, much less learn how to employ them and decide which models are the most appropriate to use in their line of work. In the same vein, IRT measurement models vary greatly in complexity in several interrelated ways, including but not limited to the number of item-specific parameters estimated in a given model, the function which links the expected response and the predictor, response option formats, as well as dimensionality. As a result, inferior methods (a.k.a. Classical Test Theory methods) continue to be employed in efforts to measure psychological constructs, despite evidence showing that IRT methods yield more precise and accurate measurement. To increase the use of IRT methods, this study endeavors to provide a comprehensive overview of binary IRT models; that is, measurement models employed on test data consisting of binary response options (e.g., correct/incorrect, true/false, agree/disagree). Specifically, this study will cover the most basic binary IRT model, known as the 1-parameter logistic (1-PL) model, dating back over 50 years, up to the most recent complex 4-parameter logistic (4-PL) model. Binary IRT models will be defined mathematically and the interpretation of each parameter will be provided. Next, all four binary IRT models will be employed on two sets of data: 1. simulated data of N=500,000 subjects who responded to four dichotomous items, and 2. a pilot analysis of real-world data collected from a sample of approximately 770 subjects who responded to four self-report dichotomous items pertaining to emotional consequences of alcohol use. Real-world data were based on responses collected on items administered to subjects as part of a scale-development study (NIDA Grant No. R44 DA023322). IRT analyses conducted on both the simulated data and the real-world pilot data will provide a clear demonstration of how to construct, evaluate, and compare binary IRT measurement models. All analyses will be performed using the new IRT procedure in SAS 9.4. SAS code to generate simulated data and analyses will be available upon request to allow for replication of results.
Keywords: instrument development, item response theory, latent trait theory, psychometrics
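The four binary IRT models discussed here differ only in which item parameters are freed in the four-parameter logistic item response function, P(correct | θ) = c + (d - c) / (1 + exp(-a(θ - b))). The short Python sketch below (not the SAS PROC IRT syntax used in the study) evaluates that function and shows how the 1-PL, 2-PL, and 3-PL models arise as special cases; the parameter values are made up for illustration.

```python
import numpy as np

def irf(theta, a=1.0, b=0.0, c=0.0, d=1.0):
    """4-PL item response function; a = discrimination, b = difficulty,
    c = lower asymptote (guessing), d = upper asymptote (slipping)."""
    return c + (d - c) / (1.0 + np.exp(-a * (theta - b)))

theta = np.linspace(-3, 3, 7)          # latent trait grid

# Illustrative parameter values: fix parameters to recover the simpler models
p_1pl = irf(theta, a=1.0, b=0.5)                   # 1-PL: only b varies by item
p_2pl = irf(theta, a=1.7, b=0.5)                   # 2-PL: a and b vary
p_3pl = irf(theta, a=1.7, b=0.5, c=0.2)            # 3-PL: adds guessing
p_4pl = irf(theta, a=1.7, b=0.5, c=0.2, d=0.95)    # 4-PL: adds slipping

for name, p in [("1-PL", p_1pl), ("2-PL", p_2pl), ("3-PL", p_3pl), ("4-PL", p_4pl)]:
    print(name, np.round(p, 3))
```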
Procedia PDF Downloads 356
4079 Binary Programming for Manufacturing Material and Manufacturing Process Selection Using Genetic Algorithms
Authors: Saleem Z. Ramadan
Abstract:
The material selection problem is concerned with the determination of the right material for a certain product to optimize certain performance indices in that product, such as mass, energy density, and power-to-weight ratio. This paper is concerned with optimizing the selection of the manufacturing process along with the material used in the product under performance indices and availability constraints. In this paper, the material selection problem is formulated using binary programming and solved by a genetic algorithm. The objective function of the model is to minimize the total manufacturing cost under performance indices and material and manufacturing process availability constraints.
Keywords: optimization, material selection, process selection, genetic algorithm
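The formulation described here can be prototyped as a binary string in which each bit selects one material-process pair, with penalties enforcing that exactly one feasible pair is chosen and that performance thresholds are met, and a genetic algorithm searching over the strings. The sketch below is a generic toy implementation with invented costs, availabilities, and performance values; it is not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy problem: 4 candidate materials x 3 manufacturing processes (invented data)
cost = np.array([[12, 15, 20], [10, 14, 18], [16, 19, 25], [9, 13, 17]], float)
perf = np.array([[0.6, 0.7, 0.9], [0.5, 0.6, 0.8], [0.8, 0.9, 1.0], [0.4, 0.5, 0.7]])
avail = np.array([[1, 1, 0], [1, 0, 1], [1, 1, 1], [0, 1, 1]])   # feasible pairs
PERF_MIN = 0.65
n_bits = cost.size

def fitness(bits):
    x = bits.reshape(cost.shape)
    penalty = 1000 * abs(x.sum() - 1)                       # choose exactly one pair
    penalty += 1000 * np.sum(x * (1 - avail))               # availability constraint
    penalty += 1000 * np.sum(x * (perf < PERF_MIN))         # performance constraint
    return -(np.sum(x * cost) + penalty)                    # GA maximises fitness

def evolve(pop_size=40, n_gen=60, p_mut=0.05):
    pop = rng.integers(0, 2, (pop_size, n_bits))
    for _ in range(n_gen):
        fit = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(fit)[-pop_size // 2:]]      # truncation selection
        children = []
        while len(children) < pop_size:
            pa, pb = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_bits)                    # one-point crossover
            child = np.concatenate([pa[:cut], pb[cut:]])
            child ^= (rng.random(n_bits) < p_mut).astype(child.dtype)  # bit-flip mutation
            children.append(child)
        pop = np.array(children)
    fit = np.array([fitness(ind) for ind in pop])
    return pop[fit.argmax()]

best = evolve().reshape(cost.shape)
if best.sum() == 1:
    m, p = np.argwhere(best == 1)[0]
    print("Chosen material/process:", m, p, "cost:", cost[m, p])
else:
    print("No feasible single pair found")
```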
Procedia PDF Downloads 419
4078 A Dataset of Program Educational Objectives Mapped to ABET Outcomes: Data Cleansing, Exploratory Data Analysis and Modeling
Authors: Addin Osman, Anwar Ali Yahya, Mohammed Basit Kamal
Abstract:
Datasets or collections are becoming important assets by themselves, and now they can be accepted as a primary intellectual output of research. The quality and usage of datasets depend mainly on the context under which they have been collected, processed, analyzed, validated, and interpreted. This paper aims to present a collection of program educational objectives mapped to student outcomes, collected from self-study reports prepared by 32 engineering programs accredited by ABET. The manual mapping (classification) of these data is a notoriously tedious, time-consuming process. In addition, it requires experts in the area, who are mostly not available. The operational settings under which the collection has been produced are described. The collection has been cleansed and preprocessed, some features have been selected, and preliminary exploratory data analysis has been performed so as to illustrate the properties and usefulness of the collection. At the end, the collection has been benchmarked using nine of the most widely used supervised multi-label classification techniques (Binary Relevance, Label Powerset, Classifier Chains, Pruned Sets, Random k-label sets, Ensemble of Classifier Chains, Ensemble of Pruned Sets, Multi-Label k-Nearest Neighbors and Back-Propagation Multi-Label Learning). The techniques have been compared to each other using five well-known measurements (Accuracy, Hamming Loss, Micro-F, Macro-F, and Macro-F). The Ensemble of Classifier Chains and Ensemble of Pruned Sets have achieved encouraging performance compared to the other experimented multi-label classification methods. The Classifier Chains method has shown the worst performance. To recap, the benchmark has achieved promising results by utilizing preliminary exploratory data analysis performed on the collection, proposing new trends for research and providing a baseline for future studies.
Keywords: ABET, accreditation, benchmark collection, machine learning, program educational objectives, student outcomes, supervised multi-class classification, text mining
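The simplest of the benchmarked techniques, Binary Relevance, trains one independent binary classifier per student-outcome label. The sketch below shows the general pattern with scikit-learn on a few made-up objective statements; it uses TF-IDF features, a one-vs-rest logistic regression, and the same style of evaluation measures mentioned above (subset accuracy, Hamming loss, micro/macro F). It is not the authors' benchmark pipeline, and the toy texts and labels are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score, hamming_loss
from sklearn.multiclass import OneVsRestClassifier
from sklearn.preprocessing import MultiLabelBinarizer

# Invented program educational objectives and their ABET-style outcome labels
texts = [
    "Graduates will design systems that meet realistic constraints",
    "Graduates will communicate effectively in teams",
    "Graduates will apply mathematics and science to engineering problems",
    "Graduates will pursue lifelong learning and professional growth",
]
labels = [["design"], ["communication", "teamwork"], ["math_science"], ["lifelong_learning"]]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(labels)
X = TfidfVectorizer().fit_transform(texts)

# Binary Relevance: one independent binary classifier per label
clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)
Y_pred = clf.predict(X)    # on training data, just to show the evaluation measures

print("Subset accuracy:", accuracy_score(Y, Y_pred))
print("Hamming loss:   ", hamming_loss(Y, Y_pred))
print("Micro-F1:       ", f1_score(Y, Y_pred, average="micro", zero_division=0))
print("Macro-F1:       ", f1_score(Y, Y_pred, average="macro", zero_division=0))
```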
Procedia PDF Downloads 172
4077 A Fuzzy-Rough Feature Selection Based on Binary Shuffled Frog Leaping Algorithm
Authors: Javad Rahimipour Anaraki, Saeed Samet, Mahdi Eftekhari, Chang Wook Ahn
Abstract:
Feature selection and attribute reduction are crucial problems and widely used techniques in the fields of machine learning, data mining, and pattern recognition for overcoming the well-known phenomenon of the curse of dimensionality. This paper presents a feature selection method that efficiently carries out attribute reduction, thereby selecting the most informative features of a dataset. It consists of two components: 1) a measure for feature subset evaluation, and 2) a search strategy. For the evaluation measure, we have employed the fuzzy-rough dependency degree (FRDD) of the lower approximation-based fuzzy-rough feature selection (L-FRFS) due to its effectiveness in feature selection. As for the search strategy, a modified version of the binary shuffled frog leaping algorithm (B-SFLA) is proposed. The proposed feature selection method is obtained by hybridizing the B-SFLA with the FRDD. Nine classifiers have been employed to compare the proposed approach with several existing methods over twenty-two datasets, including nine high-dimensional and large ones, from the UCI repository. The experimental results demonstrate that the B-SFLA approach significantly outperforms other metaheuristic methods in terms of the number of selected features and the classification accuracy.
Keywords: binary shuffled frog leaping algorithm, feature selection, fuzzy-rough set, minimal reduct
Procedia PDF Downloads 225
4076 Comparative Performance of Artificial Bee Colony Based Algorithms for Wind-Thermal Unit Commitment
Authors: P. K. Singhal, R. Naresh, V. Sharma
Abstract:
This paper presents three optimization models, namely the New Binary Artificial Bee Colony (NBABC) algorithm, NBABC with Local Search (NBABC-LS), and NBABC with Genetic Crossover (NBABC-GC), for solving the Wind-Thermal Unit Commitment (WTUC) problem. The uncertain nature of the wind power is incorporated using the Weibull probability density function, which is used to calculate the overestimation and underestimation costs associated with wind power fluctuation. The NBABC algorithm utilizes a mechanism based on the dissimilarity measure between binary strings for generating the binary solutions in the WTUC problem. In the NBABC algorithm, an intelligent scout bee phase is proposed that replaces the abandoned solution with the global best solution. The local search operator exploits the neighboring region of the current solutions, whereas the integration of genetic crossover with the NBABC algorithm increases the diversity in the search space and thus avoids the problem of local trapping encountered with the NBABC algorithm. These models are then used to decide the units' on/off status, whereas the lambda iteration method is used to dispatch the hourly load demand among the committed units. The effectiveness of the proposed models is validated on an IEEE 10-unit thermal system combined with a wind farm over a planning period of 24 hours.
Keywords: artificial bee colony algorithm, economic dispatch, unit commitment, wind power
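Once a commitment decision fixes which thermal units are on, the hourly dispatch step mentioned above reduces to a classic lambda-iteration (equal incremental cost) problem for the residual demand after wind. The sketch below is a generic illustration with invented quadratic cost coefficients and a simple Weibull draw for wind power; it is not the IEEE 10-unit test data or the authors' cost formulation.

```python
import numpy as np

rng = np.random.default_rng(5)

# Invented committed thermal units: cost C(P) = a + b*P + c*P^2, with limits [Pmin, Pmax]
a = np.array([100.0, 120.0, 90.0])
b = np.array([20.0, 22.0, 25.0])
c = np.array([0.050, 0.040, 0.060])
p_min = np.array([50.0, 40.0, 30.0])
p_max = np.array([300.0, 250.0, 200.0])

# Simple wind model: Weibull-distributed wind speed mapped through a toy power curve
shape, scale, rated = 2.0, 8.0, 150.0                 # assumed Weibull and turbine data
v = scale * rng.weibull(shape)
wind_power = np.clip(rated * (v - 3.0) / (12.0 - 3.0), 0.0, rated)

demand = 600.0
residual = demand - wind_power                        # load left for the thermal units

def dispatch(load, tol=1e-6):
    """Lambda iteration: bisect on the incremental cost until outputs meet the load."""
    lo, hi = b.min(), b.max() + 2 * c.max() * p_max.sum()
    while hi - lo > tol:
        lam = 0.5 * (lo + hi)
        P = np.clip((lam - b) / (2 * c), p_min, p_max)  # equal incremental cost rule
        if P.sum() > load:
            hi = lam
        else:
            lo = lam
    return P

P = dispatch(residual)
total_cost = np.sum(a + b * P + c * P ** 2)
print(f"Wind power: {wind_power:.1f} MW, thermal dispatch: {np.round(P, 1)} MW")
print(f"Total thermal cost: {total_cost:.1f}")
```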
Procedia PDF Downloads 375