Search results for: computing paradigm
530 Experiences of Timing Analysis of Parallel Embedded Software
Authors: Muhammad Waqar Aziz, Syed Abdul Baqi Shah
Abstract:
Execution time analysis is fundamental to the successful design and execution of real-time embedded software. In such analysis, the Worst-Case Execution Time (WCET) of a program is a key measure, on the basis of which system tasks are scheduled. The WCET analysis of embedded software is also needed for system understanding and to guarantee its behavior. WCET analysis can be performed statically (without executing the program) or dynamically (through measurement). Traditionally, research on WCET analysis assumes sequential code running on single-core platforms. However, as computation is steadily moving towards using a combination of parallel programs and multi-core hardware, new challenges in WCET analysis need to be addressed. In this article, we report our experiences of performing the WCET analysis of Parallel Embedded Software (PES) running on a multi-core platform. The primary purpose was to investigate how WCET estimates of PES can be computed statically, and how they can be derived dynamically. Our experiences, as reported in this article, include the challenges we faced, possible solutions to these challenges and the workarounds that were developed. This article also provides observations on the benefits and drawbacks of deriving the WCET estimates using the said methods and provides useful recommendations for further research in this area. Keywords: embedded software, worst-case execution-time analysis, static flow analysis, measurement-based analysis, parallel computing
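For the measurement-based (dynamic) side of the analysis described above, a minimal sketch of high-water-mark WCET estimation is given below; the task, the number of runs and the safety margin are illustrative placeholders rather than values from the study, and measurements alone cannot prove a true worst case.

```python
import time
import random

def sample_task(n: int) -> int:
    """Stand-in for an embedded task under test; any callable would do."""
    return sum(i * i for i in range(n + random.randint(0, 1000)))

def measured_wcet(task, runs: int = 1000, margin: float = 1.2) -> float:
    """Return a measurement-based WCET estimate in seconds.

    The high-water mark over `runs` executions is scaled by a safety
    margin, since observed maxima alone do not bound the true worst case.
    """
    worst = 0.0
    for _ in range(runs):
        start = time.perf_counter()
        task(10_000)
        elapsed = time.perf_counter() - start
        worst = max(worst, elapsed)
    return worst * margin

if __name__ == "__main__":
    print(f"Estimated WCET: {measured_wcet(sample_task) * 1e3:.3f} ms")
```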
Procedia PDF Downloads 324
529 The Role of Knowledge Management in Global Software Engineering
Authors: Samina Khalid, Tehmina Khalil, Smeea Arshad
Abstract:
Knowledge management is an essential ingredient of successful coordination in globally distributed software engineering. Various frameworks, knowledge management systems (KMSs), and tools have been proposed to foster coordination and communication between virtual teams, but practical implementations of these solutions are rarely found. Organizations face several challenges when implementing a knowledge management system. For this purpose, a literature review was first carried out to investigate the challenges that prevent organizations from implementing a KMS; taking these challenges into account, the need for an integrated solution, in the form of a standardized KMS that can easily store tacit and explicit knowledge, was traced down to facilitate coordination and collaboration among virtual teams. The literature has already shown that knowledge is a complex concept with profound meanings, and one of the most important resources contributing to the competitive advantage of an organization. In order to meet the different challenges caused by poorly managed project knowledge among virtual teams in global software engineering (GSE), we suggest making use of the cloud computing model. In this research, a distributed architecture to support KM storage is proposed, called the conceptual framework of KM as a service in the cloud. The presented framework is enhanced, and the conceptual framework of KM is embedded into it to store project-related knowledge for future use. Keywords: knowledge management, global software development, global software engineering
Procedia PDF Downloads 527
528 Applicability of Fuzzy Logic for Intrusion Detection in Mobile Adhoc Networks
Authors: Ruchi Makani, B. V. R. Reddy
Abstract:
Mobile Adhoc Networks (MANETs) are gaining popularity due to their potential for providing low-cost mobile connectivity solutions to real-world communication problems. Integrating Intrusion Detection Systems (IDS) into MANETs is a tedious task because of their distinctive features, such as dynamic topology, decentralized authority and a highly controlled/limited resource environment. IDS primarily use automated soft-computing techniques to monitor the inflow/outflow of traffic packets in a given network to detect intrusions. The use of machine learning techniques in IDS enables the system to make decisions on intrusion while continuously learning about its dynamic environment. An appropriate IDS model must be selected to address these application challenges. This paper therefore focuses on a fuzzy-logic based machine learning IDS technique for MANETs and presents its applicability for achieving effectiveness in identifying intrusions. Further, the selection of appropriate protocol attributes and the generation of fuzzy rules, which play a significant role in the accuracy of a fuzzy-logic based IDS, are discussed. This paper also presents the critical attributes of MANET routing protocols and their applicability in a fuzzy-logic based IDS. Keywords: AODV, mobile adhoc networks, intrusion detection, anomaly detection, fuzzy logic, fuzzy membership function, fuzzy inference system
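A minimal sketch of the kind of fuzzy inference such an IDS relies on is given below. The two protocol attributes (packet drop rate and route-request rate), the membership ranges and the two rules are assumptions chosen only to illustrate fuzzification, rule firing and defuzzification; the paper's actual attribute set and rule base are not reproduced.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def intrusion_score(drop_rate, rreq_rate):
    """Crisp intrusion score in [0, 1] from two AODV-style traffic attributes.

    Rules (illustrative): high drop rate AND high RREQ rate -> high suspicion;
    low drop rate AND low RREQ rate -> low suspicion.
    """
    # Fuzzify inputs (ranges are assumptions for illustration only).
    drop_high = tri(drop_rate, 0.3, 0.7, 1.01)
    drop_low = tri(drop_rate, -0.01, 0.0, 0.4)
    rreq_high = tri(rreq_rate, 20, 60, 101)
    rreq_low = tri(rreq_rate, -1, 0, 30)

    # Rule evaluation with min as AND, then weighted defuzzification.
    r_high = min(drop_high, rreq_high)   # -> intrusion ~ 0.9
    r_low = min(drop_low, rreq_low)      # -> intrusion ~ 0.1
    weight = r_high + r_low
    return (0.9 * r_high + 0.1 * r_low) / weight if weight else 0.5

print(intrusion_score(drop_rate=0.8, rreq_rate=75))   # near 0.9 -> likely anomaly
print(intrusion_score(drop_rate=0.05, rreq_rate=5))   # near 0.1 -> likely normal
```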
Procedia PDF Downloads 178
527 Effect of Culture and Parenting Styles on Ambivalent Sexism in Mexican Population
Authors: Ilse Gonzalez-Rivera, Rolando Diaz-Loving
Abstract:
Family, and parents in particular, are the main agents of socialization of children since they transmit values, beliefs, and cultural norms based on their own guidelines, so that children acquire the knowledge on how to interact with others in terms of the interaction with their parents. One way to measure socialization parenting is through parenting styles. Parenting styles are the set of parental behaviors that have a direct effect on the development of specific behaviors of children. The ideal parenting style depends on the cultural characteristics where people develop. In Mexico, the hierarchical structure of the family is built on a model in which men are dominant over women and their power is legitimized. This research explores the effect of parenting styles and the culture of the ambivalent sexism in the Mexican population. 150 men and 150 women participated. The instrument of individualism-collectivism was used to measure culture; participants also answered the instrument of ambivalent sexism and the parenting styles questionnaire. Regression analyses were done using sexism as the dependent variable and individualism-collectivism and parenting styles as independent variables. In addition, an analysis of variance between parental styles and gender of the participants was performed. The results indicate that the permissive style and authoritarian style are predictors of ambivalent sexism and higher levels of collectivism predict higher levels of sexism in both men and women. It is also found that parents tend to use authoritarian parenting style with women and permissive style with males. These results confirm the findings of other studies that indicate that parenting is an important variable that influences the interaction of adults. On the other hand, the effect of collectivism on sexism may be related to the fact that gender Mexican rules are rigid and for people with higher levels of collectivism, the social rules are more important than individual interests. In conclusion, these results indicate that both culture and parenting styles contribute to the maintenance of the status quo and prejudice towards women. Therefore, it is necessary to create proposals that break with this cultural paradigm and to further develop democratic styles of parenting with the aim of reducing prejudice and the legitimization of gender roles.Keywords: culture, gender, parenting style, sexism
Procedia PDF Downloads 261
526 Principles to Design Urbanism in Cinema; An Aesthetic Study on Identity and Representation of a City in a Movie
Authors: Dorsa Moayedi
Abstract:
‘The Cities’ and Cinema have a history going as far back as silent films; however, the standards of picturing a city in a film are somewhat vague. ‘Genius Loci’ of a city can be easily described with parameters that architects have detected; nevertheless, the genius loci of an ‘urban movie’ is untouched. Cities have been among the provocative matters that pushed filmmakers to ponder upon them and to picture them along with their urban identity thoroughly in their artworks, though the impacts of the urban life on the plot and characters is neglected, and so a city in a movie is usually restricted to ‘the place where the story happens’. Cities and urban life are among those that are in constant change and ongoing expansion; therefore, they are always fresh and ready to challenge people with their existence. Thus, the relationship between the city and cinema is metamorphic, though it could be defined and explored. The dominant research on the idea of urbanism has been conducted by outstanding scholars of architecture, like Christian Norberg-Schulz, and the studies on Cinema have been done by theorists of cinema, like Christian Metz, who have mastered defining their own realm; still, the idea to mingle the domains to reach a unified theory which could be applied to ‘urban movies’ is barely worked on. In this research, we have sought mutual grounds to discuss ‘urbanism in cinema,’ the grounds that cinema could benefit from and get to a more accurate audio-visual representation of a city, in accordance with the ideas of Christopher Alexander and the term he coined ‘The Timeless Way of Building.’ We concentrate on movies that are dependent on urban life, mainly those that possess the names of cities, like ‘Nashville (1975), Manhattan (1979), Fargo (1996), Midnight in Paris (2011) or Roma (2018), according to the ideas of urban design and narratives of cinema. Contrary to what has often been assumed, cinema and architecture could be defined in line with similar parameters, and architectural terms could be applied to the research done on movies. Our findings indicate that the theories of Christopher Alexander can best fit the paradigm to study an ‘Urban Movie’, definitions of a timeless building, elaborate on the characteristics of a design that could be applied to definitions of an urban movie, and set a prototype for further filmmaking regarding the urban life.Keywords: city, urbanism, urban movies, identity, representation
Procedia PDF Downloads 66
525 Cloud Support for Scientific Workflow Execution: Prototyping Solutions for Remote Sensing Applications
Authors: Sofiane Bendoukha, Daniel Moldt, Hayat Bendoukha
Abstract:
Workflow concepts are essential for the development of remote sensing applications. They can help users to manage and process satellite data and execute scientific experiments on distributed resources. The objective of this paper is to introduce an approach for the specification and execution of complex scientific workflows in Cloud-like environments. The approach strives to support scientists during the modeling, deployment and monitoring of their workflows. This work takes advantage of Petri nets, and more specifically the so-called reference nets formalism, which provides a robust modeling/implementation technique. RENEWGRASS is a tool that we implemented and integrated into the Petri nets editor and simulator RENEW. It provides an easy way to support inexperienced scientists during the specification of their workflows. It allows both the modeling and the enactment of image processing workflows from the remote sensing domain. Our case study is related to the implementation of vegetation indices; we have implemented the Normalized Difference Vegetation Index (NDVI) workflow. Additionally, we explore the integration possibilities of Cloud technology as a supplementary layer for the deployment of the current implementation. For this purpose, we discuss migration patterns for data and applications and propose an architecture. Keywords: cloud computing, scientific workflows, Petri nets, RENEWGRASS
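Since the case study implements the NDVI workflow, a minimal sketch of the core NDVI computation is included below; the band arrays are synthetic stand-ins for satellite rasters, and the surrounding RENEWGRASS/Petri-net workflow machinery is not reproduced.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)  # eps guards against division by zero

# Synthetic 3x3 band values standing in for satellite raster tiles.
nir_band = np.array([[0.6, 0.7, 0.8], [0.5, 0.4, 0.9], [0.3, 0.6, 0.7]])
red_band = np.array([[0.2, 0.1, 0.3], [0.4, 0.2, 0.1], [0.3, 0.2, 0.2]])
print(ndvi(nir_band, red_band).round(2))  # values near +1 indicate dense vegetation
```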
Procedia PDF Downloads 447
524 Gender Justice and Feminist Self-Management Practices in the Solidarity Economy: A Quantitative Analysis of the Factors that Impact Enterprises Formed by Women in Brazil
Authors: Maria de Nazaré Moraes Soares, Silvia Maria Dias Pedro Rebouças, José Carlos Lázaro
Abstract:
The Solidarity Economy (SE) acts in the re-articulation of the economic field to the other spheres of social action. The significant participation of women in SE resulted in the formation of a national network of self-managed enterprises in Brazil: The Solidarity and Feminist Economy Network (SFEN). The objective of the research is to identify factors of gender justice and feminist self-management practices that adhere to the reality of women in SE enterprises. The conceptual apparatus related to feminist studies in this research covers Nancy Fraser approaches on gender justice, and Patricia Yancey Martin approaches on feminist management practices, and authors of postcolonial feminism such as Mohanty and Maria Lugones, who lead the discussion to peripheral contexts, a necessary perspective when observing the women’s movement in SE. The research has a quantitative nature in the phases of data collection and analysis. The data collection was performed through two data sources: the database mapped in Brazil in 2010-2013 by the National Information System in Solidary Economy and 150 questionnaires with women from 16 enterprises in SFEN, in a state of Brazilian northeast. The data were analyzed using the multivariate statistical technique of Factor Analysis. The results show that the factors that define gender justice and feminist self-management practices in SE are interrelated in several levels, proving statistically the intersectional condition of the issue of women. The evidence from the quantitative analysis allowed us to understand the dimensions of gender justice and feminist management practices intersectionality; in this sense, the non-distribution of domestic work interferes in non-representation of women in public spaces, especially in peripheral contexts. The study contributes with important reflections to the studies of this area and can be complemented in the future with a qualitative research that approaches the perspective of women in the context of the SE self-management paradigm.Keywords: feminist management practices, gender justice, self-management, solidarity economy
Procedia PDF Downloads 129
523 The Reproducibility and Repeatability of Modified Likelihood Ratio for Forensics Handwriting Examination
Authors: O. Abiodun Adeyinka, B. Adeyemo Adesesan
Abstract:
The forensic use of handwriting depends on the analysis, comparison, and evaluation decisions made by forensic document examiners. When using biometric technology in forensic applications, it is necessary to compute a Likelihood Ratio (LR) to quantify the strength of evidence under two competing hypotheses, namely the prosecution and the defense hypotheses, for which a set of assumptions and methods is adopted for a given data set. It is therefore important to know how repeatable and reproducible our estimated LR is. This paper evaluated the accuracy and reproducibility of examiners' decisions. Confidence intervals for the estimated LR were presented so as not to obtain an incorrect estimate that would be used to deliver a wrong judgment in a court of law. The estimate of the LR is fundamentally a Bayesian concept, and we used two LR estimators in this paper, namely Logistic Regression (LoR) and the Kernel Density Estimator (KDE). The repeatability evaluation was carried out by retesting the initial experiment after an interval of six months to observe whether examiners would repeat their decisions for the estimated LR. The experimental results, which are based on a handwriting dataset, show that the LR has different confidence intervals, which implies that the LR cannot be estimated with the same certainty everywhere. Although the LoR performed better than the KDE when tested on the same dataset, the two LR estimators investigated showed a consistent region in which the LR value can be estimated confidently. These two findings advance our understanding of the LR when used to compute the strength of evidence in forensic handwriting examination. Keywords: confidence interval, handwriting, kernel density estimator, KDE, logistic regression, LoR, repeatability, reproducibility
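A minimal sketch of the two LR estimators named above, applied to a one-dimensional comparison score, is given below. The score distributions are synthetic and the prior-odds correction is a simplification; the actual handwriting features and experimental protocol are not reproduced.

```python
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Synthetic similarity scores: same-writer (Hp) vs different-writer (Hd) comparisons.
same = rng.normal(0.75, 0.10, 500)
diff = rng.normal(0.45, 0.12, 500)

# --- KDE estimator: LR = f(score | Hp) / f(score | Hd) ---
kde_same, kde_diff = gaussian_kde(same), gaussian_kde(diff)
def lr_kde(score):
    return float(kde_same(score) / kde_diff(score))

# --- Logistic-regression estimator: posterior odds divided by prior odds ---
X = np.concatenate([same, diff]).reshape(-1, 1)
y = np.concatenate([np.ones_like(same), np.zeros_like(diff)])
clf = LogisticRegression().fit(X, y)
def lr_logreg(score):
    p = clf.predict_proba([[score]])[0, 1]
    prior_odds = len(same) / len(diff)      # 1 here, balanced training sets
    return (p / (1 - p)) / prior_odds

q = 0.68  # questioned-document score (illustrative)
print(f"KDE LR ~ {lr_kde(q):.2f},  LoR LR ~ {lr_logreg(q):.2f}")
```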
Procedia PDF Downloads 124
522 System Dietadhoc® - A Fusion of Human-Centred Design and Agile Development for the Explainability of AI Techniques Based on Nutritional and Clinical Data
Authors: Michelangelo Sofo, Giuseppe Labianca
Abstract:
In recent years, the scientific community's interest in the exploratory analysis of biomedical data has increased exponentially. In the field of nutritional biology, the curative process, based on the analysis of clinical data, is a very delicate operation, because there are multiple solutions for the management of pathologies in the food sector (for example, intolerances and allergies, management of cholesterol metabolism, diabetic pathologies, arterial hypertension, up to obesity and breathing and sleep problems). In this regard, in this research work a system was created that is capable of evaluating various dietary regimes for specific patient pathologies. The system is founded on a mathematical-numerical model and has been tailored to the real working needs of an expert in human nutrition using human-centred design (ISO 9241-210); it is therefore in step with continuous scientific progress in the field and evolves through the experience of managed clinical cases (machine learning process). DietAdhoc® is a decision support system for nutrition specialists and their patients of both sexes (from 18 years of age), developed with an agile methodology. Its task consists in drawing up the biomedical and clinical profile of the specific patient by applying two algorithmic optimization approaches to nutritional data, together with a symbolic solution obtained by transforming the relational database underlying the system into a deductive database. For all three solution approaches, particular emphasis has been given to the explainability of the suggested clinical decisions through flexible and customizable user interfaces. Furthermore, the system has multiple software modules based on time series and visual analytics techniques that allow the complete picture of the situation, and the evolution of the diet assigned for specific pathologies, to be evaluated. Keywords: medical decision support, physiological data extraction, data-driven diagnosis, human-centered AI, symbiotic AI paradigm
Procedia PDF Downloads 23
521 An Approximate Formula for Calculating the Fundamental Mode Period of Vibration of Practical Building
Authors: Abdul Hakim Chikho
Abstract:
Most international codes allow the use of an equivalent lateral load method for designing practical buildings to withstand earthquake actions. This method requires calculating an approximation to the fundamental mode period of vibration of these buildings. Several empirical equations have been suggested for calculating approximations to the fundamental periods of different types of structures. Most of these equations are known to provide only a crude approximation to the required fundamental periods, and repeating the calculation with a more accurate formula is usually required. In this paper, a new formula to calculate a satisfactory approximation of the fundamental period of a practical building is proposed. This formula takes into account the mass and the stiffness of the building and is therefore more logical than the conventional empirical equations. In order to verify the accuracy of the proposed formula, several examples have been solved, in which the fundamental mode periods of several framed buildings were calculated using both the proposed formula and the conventional empirical equations. Comparing the obtained results with those obtained from a dynamic computer analysis has shown that the proposed formula provides a more accurate estimation of the fundamental periods of practical buildings. Since the proposed method is still simple to use and requires only minimal computing effort, it is believed to be ideally suited for design purposes. Keywords: earthquake, fundamental mode period, design, building
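The proposed formula itself is not given in the abstract; as a stand-in, the sketch below implements the classical Rayleigh-quotient approximation, which, like the proposed approach, derives the period from storey masses and stiffnesses rather than from building height alone. The three-storey masses and stiffnesses are illustrative values only.

```python
import math

def rayleigh_period(masses, stiffnesses, g=9.81):
    """Approximate fundamental period (s) of a lumped-mass shear building.

    masses[i]      : storey mass (kg), first storey to roof
    stiffnesses[i] : lateral stiffness (N/m) of the columns below storey i
    Lateral forces equal to the storey weights are applied statically, the
    resulting displacements delta_i are accumulated, and Rayleigh's quotient
    gives T = 2*pi*sqrt( sum(m_i*delta_i^2) / (g*sum(m_i*delta_i)) ).
    """
    n = len(masses)
    forces = [m * g for m in masses]
    shears = [sum(forces[i:]) for i in range(n)]      # storey shears
    delta, drift_sum = [], 0.0
    for v, k in zip(shears, stiffnesses):
        drift_sum += v / k                            # inter-storey drift V/k
        delta.append(drift_sum)
    num = sum(m * d * d for m, d in zip(masses, delta))
    den = g * sum(m * d for m, d in zip(masses, delta))
    return 2.0 * math.pi * math.sqrt(num / den)

# Three-storey example: 200 t per floor, 2.5e8 N/m storey stiffness.
print(f"T1 ~ {rayleigh_period([2e5, 2e5, 2e5], [2.5e8, 2.5e8, 2.5e8]):.3f} s")
```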
Procedia PDF Downloads 284
520 On the Use of Machine Learning for Tamper Detection
Authors: Basel Halak, Christian Hall, Syed Abdul Father, Nelson Chow Wai Kit, Ruwaydah Widaad Raymode
Abstract:
The attack surface of computing devices is becoming very sophisticated, driven by the sheer increase in interconnected devices, expected to reach 50B in 2025, which makes it easier for adversaries to gain direct access and perform well-known physical attacks. The impact of the increased security vulnerability of electronic systems is exacerbated for devices that are part of critical infrastructure or those used in military applications, where the likelihood of being targeted is very high. This continuously evolving landscape of security threats calls for a new generation of defense methods that are equally effective and adaptive. This paper proposes an intelligent defense mechanism to protect against physical tampering. It consists of a tamper detection system enhanced with machine learning capabilities, which allows it to recognize normal operating conditions, classify known physical attacks and identify new types of malicious behaviors. A prototype of the proposed system has been implemented, and its functionality has been successfully verified for two types of normal operating conditions and a further four forms of physical attacks. In addition, a systematic threat modeling analysis and security validation was carried out, which indicated that the proposed solution provides better protection against threats including information leakage, loss of data, and disruption of operation. Keywords: anti-tamper, hardware, machine learning, physical security, embedded devices, IoT
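A minimal sketch of the classification stage of such a system is shown below: a supervised classifier trained on sensor feature vectors separates normal operating conditions from known attack classes. The feature names, class labels and data are invented for illustration, and the novelty-detection part (identifying new malicious behaviors) would need an additional one-class model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

def make_class(center, n=200):
    """Synthetic feature vectors, e.g. (light, temperature, vibration, current)."""
    return rng.normal(center, 0.15, size=(n, 4))

X = np.vstack([make_class([0.2, 0.5, 0.1, 0.4]),   # normal, idle
               make_class([0.3, 0.6, 0.2, 0.7]),   # normal, active
               make_class([0.9, 0.5, 0.1, 0.4]),   # enclosure opened (light spike)
               make_class([0.2, 0.5, 0.8, 0.4])])  # drilling (vibration spike)
y = np.repeat(["idle", "active", "opened", "drilling"], 200)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
print("held-out accuracy:", clf.score(Xte, yte))
print("prediction for a light spike:", clf.predict([[0.95, 0.5, 0.1, 0.4]])[0])
```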
Procedia PDF Downloads 153
519 Selective Effect of Occipital Alpha Transcranial Alternating Current Stimulation in Perception and Working Memory
Authors: Andreina Giustiniani, Massimiliano Oliveri
Abstract:
Rhythmic activity in different frequencies could subserve distinct functional roles during visual perception and visual mental imagery. In particular, alpha band activity is thought to play a role in active inhibition of both task-irrelevant regions and processing of non-relevant information. In the present blind placebo-controlled study we applied alpha transcranial alternating current stimulation (tACS) in the occipital cortex both during a basic visual perception and a visual working memory task. To understand if the role of alpha is more related to a general inhibition of distractors or to an inhibition of task-irrelevant regions, we added a non visual distraction to both the tasks.Sixteen adult volunteers performed both a simple perception and a working memory task during 10 Hz tACS. The electrodes were placed over the left and right occipital cortex, the current intensity was 1 mA peak-to-baseline. Sham stimulation was chosen as control condition and in order to elicit the skin sensation similar to the real stimulation, electrical stimulation was applied for short periods (30 s) at the beginning of the session and then turned off. The tasks were split in two sets, in one set distracters were included and in the other set, there were no distracters. Motor interference was added by changing the answer key after subjects completed the first set of trials.The results show that alpha tACS improves working memory only when no motor distracters are added, suggesting a role of alpha tACS in inhibiting non-relevant regions rather than in a general inhibition of distractors. Additionally, we found that alpha tACS does not affect accuracy and hit rates during the visual perception task. These results suggest that alpha activity in the occipital cortex plays a different role in perception and working memory and it could optimize performance in tasks in which attention is internally directed, as in this working memory paradigm, but only when there is not motor distraction. Moreover, alpha tACS improves working memory performance by means of inhibition of task-irrelevant regions while it does not affect perception.Keywords: alpha activity, interference, perception, working memory
Procedia PDF Downloads 256
518 Bridging Cultures in Distance Education: A Confluence of Critical Pedagogy of Place and Indigenous Education Philosophy (Case-Study Reference in Fiji and Vanuatu)
Authors: Dan Frederick Orcherton
Abstract:
This research explores the fusion of "Critical Pedagogy of Place" and "Indigenous Education Philosophy" to create a holistic pedagogical framework within Instructional Theory, focusing on its application in Distance Education, specifically within two Pacific Island cultures. The study's objectives included investigating culturally relevant instructional techniques, strategies, and technologies for the Itaukei (Indigenous Fijian) and ni-Vanuatu cultures, enhancing appreciation for culturally sensitive pedagogical methods. Methodologically, a qualitative inquiry phenomenological approach was employed within a constructivist paradigm, utilizing a comprehensive qualitative scoping review and online literature search. Key findings include the prioritization of cultural inclusivity and indigenous knowledge integration in both indigenous education philosophies and various instructional approaches. Learner-centered methods like constructivist andragogy and the learning cycle are applicable and effective in distance education within these cultures, aligning with indigenous learners' values and preferences. Place-based education and critical pedagogy of place are particularly pertinent, fostering a deeper connection between education, local environments, and social justice. Integrating digital technologies in culturally responsive education bridges geographical gaps and preserves cultural knowledge. Lastly, blending Western and Indigenous Science, influenced by a Two-Eyed Seeing approach, informs pedagogy by combining Western and Indigenous Science. This research underscores the importance of acknowledging cultural diversity and respecting indigenous knowledge in distance education. It highlights the value of learner-centered approaches, place-based education, and technology integration. The study enriches the educational experience within the Itaukei and niVanuatu cultures and provides insights for educators and policymakers aiming to bridge cultural gaps in distance education.Keywords: critical pedegory of place, Itaukei (indigenous Fijian) and ni-Vanuatu cultures, placed-based education, indigenous knowledge, distance education
Procedia PDF Downloads 18
517 Fault-Tolerant Control Study and Classification: Case Study of a Hydraulic-Press Model Simulated in Real-Time
Authors: Jorge Rodriguez-Guerra, Carlos Calleja, Aron Pujana, Iker Elorza, Ana Maria Macarulla
Abstract:
Society demands more reliable manufacturing processes capable of producing high quality products in shorter production cycles. New control algorithms have been studied to satisfy this paradigm, in which Fault-Tolerant Control (FTC) plays a significant role. It is suitable to detect, isolate and adapt a system when a harmful or faulty situation appears. In this paper, a general overview about FTC characteristics are exposed; highlighting the properties a system must ensure to be considered faultless. In addition, a research to identify which are the main FTC techniques and a classification based on their characteristics is presented in two main groups: Active Fault-Tolerant Controllers (AFTCs) and Passive Fault-Tolerant Controllers (PFTCs). AFTC encompasses the techniques capable of re-configuring the process control algorithm after the fault has been detected, while PFTC comprehends the algorithms robust enough to bypass the fault without further modifications. The mentioned re-configuration requires two stages, one focused on detection, isolation and identification of the fault source and the other one in charge of re-designing the control algorithm by two approaches: fault accommodation and control re-design. From the algorithms studied, one has been selected and applied to a case study based on an industrial hydraulic-press. The developed model has been embedded under a real-time validation platform, which allows testing the FTC algorithms and analyse how the system will respond when a fault arises in similar conditions as a machine will have on factory. One AFTC approach has been picked up as the methodology the system will follow in the fault recovery process. In a first instance, the fault will be detected, isolated and identified by means of a neural network. In a second instance, the control algorithm will be re-configured to overcome the fault and continue working without human interaction.Keywords: fault-tolerant control, electro-hydraulic actuator, fault detection and isolation, control re-design, real-time
Procedia PDF Downloads 177
516 Assessment of Students Skills in Error Detection in SQL Classes using Rubric Framework - An Empirical Study
Authors: Dirson Santos De Campos, Deller James Ferreira, Anderson Cavalcante Gonçalves, Uyara Ferreira Silva
Abstract:
Rubrics in learning research provide evaluation criteria and expected performance standards linked to defined student activities, learning goals and pedagogical objectives. Despite rubrics being used in education at all levels, academic literature on rubrics as a tool to support research in SQL education is quite rare. There is a large class of SQL queries that are syntactically correct but certainly not all semantically correct, and detecting and correcting errors is a recurring problem in SQL education. In this paper, we use the Rubric Abstract Framework (RAF), which consists of steps that allow us to map the information needed to measure student performance, guided by the didactic objectives defined by the teacher, as long as the domain modeling is contextualized by the rubric. An empirical study was carried out that demonstrates how rubrics can mitigate students' difficulties in finding logical errors and ease the teacher's workload in SQL education. Detecting and correcting logical errors is an important skill for students, and researchers have proposed several ways to improve SQL education, because the skills involved in understanding this paradigm are crucial in software engineering and computer science. The RAF instantiation was used in an empirical study conducted in a database course during the COVID-19 pandemic. The pandemic transformed face-to-face and remote education, leaving no presential classes. The lab activities were conducted remotely, which hinders the teaching-learning process and, in particular for this research, the verification of evidence or statements of students' knowledge, skills, and abilities (KSAs). Much research in academia and industry involves databases. The innovation proposed in this paper is the approach used, in which the results obtained when using rubrics to map logical errors in query formulation were analyzed and the gains obtained by students were verified empirically. The research approach can be used in the post-pandemic period in both classroom and distance learning. Keywords: rubric, logical error, structured query language (SQL), empirical study, SQL education
Procedia PDF Downloads 190
515 The Design of a Smartbrush Oral Health Installation for Aged Care Centres in Australia
Authors: Lukasz Grzegorz Broda, Taiwo Oseni, Andrew Stranieri, Rodrigo Marino, Ronelle Welton, Mark Yates
Abstract:
The oral health of residents in aged care centres in Australia is poor, contributing to infections, hospital admissions, and increased suffering. Although the use of electric toothbrushes has been deployed in many centres, smartbrushes that record and transmit information about brushing patterns and duration are not routinely deployed. Yet, the use of smartbrushes for aged care residents promises better oral care. Thus, a study aimed at investigating the appropriateness and suitability of a smartbrush for aged care residents is currently underway. Due to the peculiarity of the aged care setting, the incorporation of smartbrushes into residents’ care does require careful planning and design considerations. This paper describes an initial design process undertaken through the use of an actor to understand the important elements to be incorporated whilst installing a smartbrush for use in aged care settings. The design covers the configuration settings of the brush and app, including ergonomic factors related to brush and smartphone placement. A design science approach led to an installation re-design and a revised protocol for the planned study, the ultimate aim being to design installations to enhance perceived usefulness, ease of use, and attitudes towards the incorporation of smartbrushes for improving oral health care for aged care residents.Keywords: smartbrush, applied computing, life and medical sciences, health informatics
Procedia PDF Downloads 171
514 A Low Cost Non-Destructive Grain Moisture Embedded System for Food Safety and Quality
Authors: Ritula Thakur, Babankumar S. Bansod, Puneet Mehta, S. Chatterji
Abstract:
Moisture plays an important role in the storage, harvesting and processing of food grains and related agricultural products, and it is an important characteristic of most agricultural products for the maintenance of quality. Accurate knowledge of the moisture content can be of significant value in maintaining quality and preventing contamination of cereal grains. The present work reports the design and development of a microcontroller-based, low-cost, non-destructive moisture meter, which uses a complex impedance measurement method for moisture measurement of wheat using a parallel-plate capacitor arrangement. Moisture can conveniently be sensed by measuring the complex impedance using a small parallel-plate capacitor sensor filled with the kernels in between the two plates, exciting the sensor at 30 kHz and 100 kHz. The effects of density and temperature variations were compensated by providing suitable corrections in the developed algorithm. The results were compared with the standard dry-oven technique, and the developed method was found to be highly accurate, with less than 1% error. The developed moisture meter is a low-cost, highly accurate, non-destructive method for determining the moisture of grains that utilizes the fast computing capabilities of a microcontroller. Keywords: complex impedance, moisture content, electrical properties, safety of food
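A minimal sketch of the measurement principle is given below: the complex impedance of the grain-filled parallel-plate sensor at one excitation frequency, followed by a placeholder linear calibration from apparent permittivity to moisture content. The plate geometry, permittivity values and calibration coefficients are assumptions, not the meter's actual calibration.

```python
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def capacitor_impedance(eps_r, area, gap, freq):
    """Complex impedance of an ideal parallel-plate capacitor filled with grain."""
    c = eps_r * EPS0 * area / gap
    return 1.0 / (1j * 2 * np.pi * freq * c)

def moisture_from_eps(eps_r, slope=2.1, offset=3.5):
    """Placeholder linear calibration: % moisture from apparent permittivity."""
    return slope * eps_r + offset

area, gap = 25e-4, 5e-3          # 25 cm^2 plates, 5 mm spacing (illustrative)
for eps_r in (3.0, 4.5, 6.0):    # apparent permittivity rises with kernel moisture
    z = capacitor_impedance(eps_r, area, gap, freq=100e3)
    print(f"eps_r={eps_r}: |Z|={abs(z)/1e3:.1f} kOhm -> ~{moisture_from_eps(eps_r):.1f}% moisture")
```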
Procedia PDF Downloads 462
513 Nelder-Mead Parametric Optimization of Elastic Metamaterials with Artificial Neural Network Surrogate Model
Authors: Jiaqi Dong, Qing-Hua Qin, Yi Xiao
Abstract:
Some of the most fundamental challenges in the optimization of elastic metamaterials (EMMs) can be attributed to the high consumption of computational power resulting from the finite element analysis (FEA) simulations that render the optimization process inefficient. Furthermore, due to the inherent mesh dependence of FEA, minuscule geometry features, which often emerge during the later stages of optimization, induce very fine elements, resulting in enormously high time consumption, particularly when repetitive solutions are needed for computing the objective function. In this study, a surrogate modelling algorithm is developed to reduce the computational time in the structural optimization of EMMs. The surrogate model is constructed based on a multilayer feedforward artificial neural network (ANN) architecture, trained with eigenfrequency data prepopulated from FEA simulations and optimized through regime selection with a genetic algorithm (GA) to improve its accuracy in predicting the location and width of the primary elastic band gap. With the optimized ANN surrogate at the core, a Nelder-Mead (NM) algorithm is established and its performance inspected in comparison to the FEA solution. The ANN-NM model shows remarkable accuracy in predicting the band gap width and a reduction in time consumption of 47%. Keywords: artificial neural network, machine learning, mechanical metamaterials, Nelder-Mead optimization
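A minimal sketch of the surrogate-assisted loop is shown below: an ANN regressor trained on precomputed samples replaces the expensive FEA evaluation inside a Nelder-Mead search. A cheap analytic function stands in for the FEA band-gap objective, the network size and sampling are illustrative, and the GA-based regime selection is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def fea_stand_in(x):
    """Cheap analytic stand-in for an FEA band-gap-width evaluation."""
    return -np.sin(3 * x[0]) * np.cos(2 * x[1]) + 0.1 * (x[0] ** 2 + x[1] ** 2)

# 1) Pre-populate training data (in practice: eigenfrequency results from FEA runs).
X = rng.uniform(-2, 2, size=(400, 2))
y = np.array([fea_stand_in(x) for x in X])

# 2) Train the ANN surrogate.
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000,
                         random_state=0).fit(X, y)

# 3) Run Nelder-Mead on the surrogate instead of on FEA.
objective = lambda x: float(surrogate.predict(x.reshape(1, -1))[0])
result = minimize(objective, x0=np.array([0.5, 0.5]), method="Nelder-Mead")
print("surrogate optimum at", result.x.round(3), "value", round(result.fun, 3))
print("stand-in model value there:", round(fea_stand_in(result.x), 3))
```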
Procedia PDF Downloads 128
512 Towards a Deconstructive Text: Beyond Language and the Politics of Absences in Samuel Beckett’s Waiting for Godot
Authors: Afia Shahid
Abstract:
The writing of Samuel Beckett is associated with meaning in the meaninglessness and the production of what he calls ‘literature of unword’. The casual escape from the world of words in the form of silences and pauses, in his play Waiting for Godot, urges to ask question of their existence and ultimately leads to investigate the theory behind their use in the play. This paper proposes that these absences (silence and pause) in Beckett’s play force to think ‘beyond’ language. This paper asks how silence and pause in Beckett’s text speak for the emergence of poststructuralist text. It aims to identify the significant features of the philosophy of deconstruction in the play of Beckett to demystify the hostile complicity between literature and philosophy. With the interpretive paradigm of poststructuralism this research focuses on the text as a research data. It attempts to delineate the relationship between poststructuralist theoretical concerns and text of Beckett. Keeping in view the theoretical concerns of Poststructuralist theorist Jacques Derrida, the main concern of the discussion is directed towards the notion of ‘beyond’ language into the absences that are aimed at silencing the existing discourse with the ‘radical irony’ of this anti-formal art that contains its own denial and thus represents the idea of ceaseless questioning and radical contradiction in art and any text. This article asks how text of Beckett vibrates with loud silence and has disrupted language to demonstrate the emptiness of words and thus exploring the limitless void of absences. Beckett’s text resonates with silence and pause that is neither negation nor affirmation rather a poststructuralist’s suspension of reality that is ever changing with the undecidablity of all meanings. Within the theoretical notion of Derrida’s Différance this study interprets silence and pause in Beckett’s art. The silence and pause behave like Derrida’s Différance and have questioned their own existence in the text to deconstruct any definiteness and finality of reality to extend an undecidable threshold of poststructuralists that aims to evade the ‘labyrinth of language’.Keywords: Différance, language, pause, poststructuralism, silence, text
Procedia PDF Downloads 209
511 The Data-Driven Localized Wave Solution of the Fokas-Lenells Equation Using Physics-Informed Neural Network
Authors: Gautam Kumar Saharia, Sagardeep Talukdar, Riki Dutta, Sudipta Nandy
Abstract:
The physics-informed neural network (PINN) method opens up an approach for numerically solving nonlinear partial differential equations, leveraging the fast calculation speed and high precision of modern computing systems. We construct the PINN based on a strong universal approximation theorem, apply the initial-boundary value data and residual collocation points to weakly impose the initial and boundary conditions on the neural network, and choose the optimization algorithms adaptive moment estimation (ADAM) and limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) to optimize the learnable parameters of the neural network. Next, we improve the PINN with a weighted loss function to obtain both the bright and dark soliton solutions of the Fokas-Lenells equation (FLE). We find that the proposed scheme of adjustable weight coefficients in the PINN has a better convergence rate and generalizability than the basic PINN algorithm. We believe that the PINN approach to solving the partial differential equations appearing in nonlinear optics will be useful in studying various optical phenomena. Keywords: deep learning, optical soliton, physics informed neural network, partial differential equation
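A minimal sketch of the weighted PINN loss structure is given below, using PyTorch and a simple heat-equation residual as a stand-in; the Fokas-Lenells residual, the soliton initial data and the ADAM+L-BFGS schedule of the paper are not reproduced, and the weights, network size and sampling are illustrative.

```python
import math
import torch

torch.manual_seed(0)

# Small fully connected network u_theta(x, t) with tanh activations, as in standard PINNs.
net = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1))

def pde_residual(xt):
    """Residual of the stand-in PDE u_t - u_xx = 0 at collocation points xt = (x, t)."""
    xt = xt.clone().requires_grad_(True)
    u = net(xt)
    grads = torch.autograd.grad(u, xt, torch.ones_like(u), create_graph=True)[0]
    u_x, u_t = grads[:, :1], grads[:, 1:]
    u_xx = torch.autograd.grad(u_x, xt, torch.ones_like(u_x), create_graph=True)[0][:, :1]
    return u_t - u_xx

# Initial-condition data u(x, 0) = sin(pi*x); boundary points would be appended the same way.
x0 = torch.rand(200, 1)
data_in = torch.cat([x0, torch.zeros_like(x0)], dim=1)
data_out = torch.sin(math.pi * x0)
colloc = torch.rand(1000, 2)                 # interior collocation points in [0,1] x [0,1]

w_data, w_pde = 10.0, 1.0                    # adjustable loss weights
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(1000):
    opt.zero_grad()
    loss = (w_data * torch.mean((net(data_in) - data_out) ** 2)
            + w_pde * torch.mean(pde_residual(colloc) ** 2))
    loss.backward()
    opt.step()
print("final weighted loss:", float(loss))
```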
Procedia PDF Downloads 70
510 Compressed Sensing of Fetal Electrocardiogram Signals Based on Joint Block Multi-Orthogonal Least Squares Algorithm
Authors: Xiang Jianhong, Wang Cong, Wang Linyu
Abstract:
With the rise of medical IoT technologies, Wireless body area networks (WBANs) can collect fetal electrocardiogram (FECG) signals to support telemedicine analysis. The compressed sensing (CS)-based WBANs system can avoid the sampling of a large amount of redundant information and reduce the complexity and computing time of data processing, but the existing algorithms have poor signal compression and reconstruction performance. In this paper, a Joint block multi-orthogonal least squares (JBMOLS) algorithm is proposed. We apply the FECG signal to the Joint block sparse model (JBSM), and a comparative study of sparse transformation and measurement matrices is carried out. A FECG signal compression transmission mode based on Rbio5.5 wavelet, Bernoulli measurement matrix, and JBMOLS algorithm is proposed to improve the compression and reconstruction performance of FECG signal by CS-based WBANs. Experimental results show that the compression ratio (CR) required for accurate reconstruction of this transmission mode is increased by nearly 10%, and the runtime is saved by about 30%.Keywords: telemedicine, fetal ECG, compressed sensing, joint sparse reconstruction, block sparse signal
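A minimal sketch of the CS pipeline discussed above is given below, with a Bernoulli measurement matrix and plain Orthogonal Matching Pursuit in place of the proposed JBMOLS recovery; the test signal is synthetic and sparse in the canonical basis, whereas FECG would be sparse in a wavelet basis such as rbio5.5.

```python
import numpy as np

rng = np.random.default_rng(0)

def omp(A, y, k):
    """Orthogonal Matching Pursuit: recover a k-sparse x from y = A @ x."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))   # most correlated column
        if j not in support:
            support.append(j)
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_s
    x = np.zeros(A.shape[1])
    x[support] = x_s
    return x

n, m, k = 256, 96, 8                      # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)

A = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)   # Bernoulli measurement matrix
y = A @ x_true                                           # compressed measurements

x_hat = omp(A, y, k)
print("relative reconstruction error:",
      round(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true), 4))
```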
Procedia PDF Downloads 128
509 The Implications of Instrumental Animal Protection for the Legal and Moral Status of Animals
Authors: Ankita Shanker, Angus Nurse
Abstract:
The notion of animal rights is an emerging trend in various spaces, including judicial and societal discourse. But one of the key purposes of recognizing the fundamental rights of anyone is their de-objectification. Animals are a prime example of a group that has rights that are neither recognized nor protected in any meaningful way, and anything that purports differently fails to ameliorate this because it still objectifies animals. Animals are currently treated by law and society as commodities with primarily (though not exclusively) instrumental value to some other rights-holder, such as humans or nature. So most protections that are afforded to them are done so in furtherance of the interests that they allegedly further, be it social morality or environmental protection. Animal rights are thus often seen as an application or extension of the rights of humans or, more commonly, the rights of nature. What this means is that animal rights are not always protected or even recognized in their own regard, but as stemming from some other reason, or worse, instrumentally as means to some other ends. This has two identifiable effects from a legal perspective: animal rights are not seen as inherently justified and are not seen as inherently valuable. Which in turn means that there can be no fundamental protection of animal rights. In other words, judicial protection does not always entail protection of animal ‘rights’ qua animal rights, which is needed for any meaningful protections to be afforded to animals. But the effects of this legal paradigm do not end at the legal status of animals. Because this status, in turn, affects how persons and the societies of which they form part see animals as a part of the rights of others, such as humans or nature, or as valuable only insofar as they further these rights, as opposed to as individuals with inherent worth and value deserving of protection regardless of their instrumental usefulness to these other objectives. This does nothing to truly de-objectify animals. Because even though most people would agree that animals are not objects, they continue to treat them as such wherever it serves them. For individuals and society to resolve, this inconsistency between stance and actions is for them to believe that animals are more than objects on a psychological and societal level. In this paper, we examine the implications of this perception of animals and their rights on the legal protections afforded to them and on the minds of individuals and civil society. We also argue that a change in the legal and societal status of animals can be brought about only through judicial, psychological, and sociological acknowledgment that animals have inherent value and deserve protection on this basis. Animal rights derived in such a way would not need to place reliance on other justifications and would not be subject to subjugation to other rights should a conflict arise.Keywords: animal rights law, animal protection laws, psycho-socio-legal studies, animal rights, human rights, rights of nature
Procedia PDF Downloads 108
508 Breast Cancer Risk is Predicted Using Fuzzy Logic in MATLAB Environment
Authors: S. Valarmathi, P. B. Harathi, R. Sridhar, S. Balasubramanian
Abstract:
The use of machine learning tools in medical diagnosis is increasing due to the improved effectiveness of classification and recognition systems in helping medical experts diagnose breast cancer. In this study, ID3 chooses the splitting attribute with the highest information gain, where gain is defined as the difference in entropy before versus after the split. It is applied to age, location, taluk, stage, year, period, marital status, treatment, heredity, sex, and habitat against the classes Very Serious (VS), Very Serious Moderate (VSM), Serious (S) and Not Serious (NS) to calculate the information gain. The ranked histogram gives the gain of each field for the breast cancer data. The doctors use TNM staging, which decides the risk level of the breast cancer and constitutes an important decision-making field in the fuzzy logic used for perception-based measurement. The spatial risk area (taluk) of breast cancer is calculated. The results clearly state that Coimbatore (North and South) was found to be a higher-risk region for breast cancer than other areas at the 20% criterion. The weighted value of each taluk was compared with the criterion value and integrated with Map Object to visualize the results. The ID3 algorithm shows the high breast cancer risk regions in the study area. The study has outlined, discussed and resolved the algorithms, techniques and methods adopted through a soft computing methodology, such as the ID3 algorithm, for prognostic decision making on the seriousness of breast cancer. Keywords: ID3 algorithm, breast cancer, fuzzy logic, MATLAB
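A minimal sketch of the information-gain computation at the heart of ID3 is given below; the toy records and attribute values are invented for illustration and are not the study's registry data.

```python
import math
from collections import Counter

def entropy(labels):
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in Counter(labels).values())

def information_gain(rows, attribute, target="risk"):
    """Gain = H(target) - sum_v p(v) * H(target | attribute = v)."""
    base = entropy([r[target] for r in rows])
    remainder = 0.0
    for value in {r[attribute] for r in rows}:
        subset = [r[target] for r in rows if r[attribute] == value]
        remainder += len(subset) / len(rows) * entropy(subset)
    return base - remainder

# Tiny illustrative records (fields and values are made up, not the study data).
records = [
    {"stage": "III", "age": "50+", "risk": "VS"},
    {"stage": "II",  "age": "50+", "risk": "S"},
    {"stage": "I",   "age": "<50", "risk": "NS"},
    {"stage": "III", "age": "<50", "risk": "VS"},
    {"stage": "II",  "age": "<50", "risk": "VSM"},
    {"stage": "I",   "age": "50+", "risk": "NS"},
]
for attr in ("stage", "age"):
    print(attr, "gain =", round(information_gain(records, attr), 3))
# ID3 would split on the attribute with the highest gain ("stage" here).
```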
Procedia PDF Downloads 519
507 Effect of Birks Constant and Defocusing Parameter on Triple-to-Double Coincidence Ratio Parameter in Monte Carlo Simulation-GEANT4
Authors: Farmesk Abubaker, Francesco Tortorici, Marco Capogni, Concetta Sutera, Vincenzo Bellini
Abstract:
This project concerns with the detection efficiency of the portable triple-to-double coincidence ratio (TDCR) at the National Institute of Metrology of Ionizing Radiation (INMRI-ENEA) which allows direct activity measurement and radionuclide standardization for pure-beta emitter or pure electron capture radionuclides. The dependency of the simulated detection efficiency of the TDCR, by using Monte Carlo simulation Geant4 code, on the Birks factor (kB) and defocusing parameter has been examined especially for low energy beta-emitter radionuclides such as 3H and 14C, for which this dependency is relevant. The results achieved in this analysis can be used for selecting the best kB factor and the defocusing parameter for computing theoretical TDCR parameter value. The theoretical results were compared with the available ones, measured by the ENEA TDCR portable detector, for some pure-beta emitter radionuclides. This analysis allowed to improve the knowledge of the characteristics of the ENEA TDCR detector that can be used as a traveling instrument for in-situ measurements with particular benefits in many applications in the field of nuclear medicine and in the nuclear energy industry.Keywords: Birks constant, defocusing parameter, GEANT4 code, TDCR parameter
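A minimal sketch of Birks' law, the scintillation-quenching relation that the kB factor controls in such simulations, is given below; the stopping-power values and the kB range are illustrative only.

```python
def birks_light_yield(dedx, kb, s=1.0):
    """Scintillation yield per unit path length, dL/dx = S*(dE/dx)/(1 + kB*(dE/dx)).

    dedx : stopping power, MeV/cm
    kb   : Birks constant, cm/MeV
    s    : absolute scintillation efficiency (arbitrary units here)
    """
    return s * dedx / (1.0 + kb * dedx)

# Low-energy betas (3H, 14C) deposit energy at high dE/dx, so the choice of kB
# noticeably changes the computed light output and hence the detection efficiency.
for kb in (0.0075, 0.0100, 0.0125):             # illustrative kB values, cm/MeV
    for dedx in (10.0, 100.0):                  # MeV/cm
        q = birks_light_yield(dedx, kb) / dedx  # quenching factor relative to S*dE/dx
        print(f"kB={kb:.4f} cm/MeV, dE/dx={dedx:5.1f} MeV/cm -> quenching {q:.3f}")
```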
Procedia PDF Downloads 148
506 Effect of Classroom Acoustic Factors on Language and Cognition in Bilinguals and Children with Mild to Moderate Hearing Loss
Authors: Douglas MacCutcheon, Florian Pausch, Robert Ljung, Lorna Halliday, Stuart Rosen
Abstract:
Contemporary classrooms are increasingly inclusive of children with mild to moderate disabilities and children from different language backgrounds (bilinguals, multilinguals), but classroom environments and standards have not yet been adapted adequately to meet these challenges brought about by this inclusivity. Additionally, classrooms are becoming noisier as a learner-centered as opposed to teacher-centered teaching paradigm is adopted, which prioritizes group work and peer-to-peer learning. Challenging listening conditions with distracting sound sources and background noise are known to have potentially negative effects on children, particularly those that are prone to struggle with speech perception in noise. Therefore, this research investigates two groups vulnerable to these environmental effects, namely children with a mild to moderate hearing loss (MMHLs) and sequential bilinguals learning in their second language. In the MMHL study, this group was assessed on speech-in-noise perception, and a number of receptive language and cognitive measures (auditory working memory, auditory attention) and correlations were evaluated. Speech reception thresholds were found to be predictive of language and cognitive ability, and the nature of correlations is discussed. In the bilinguals study, sequential bilingual children’s listening comprehension, speech-in-noise perception, listening effort and release from masking was evaluated under a number of different ecologically valid acoustic scenarios in order to pinpoint the extent of the ‘native language benefit’ for Swedish children learning in English, their second language. Scene manipulations included target-to-distractor ratios and introducing spatially separated noise. This research will contribute to the body of findings from which educational institutions can draw when designing or adapting educational environments in inclusive schools.Keywords: sequential bilinguals, classroom acoustics, mild to moderate hearing loss, speech-in-noise, release from masking
Procedia PDF Downloads 326
505 Spirituality, Sense of Community and Economic Welfare: A Case of Mawlynnong Village, India
Authors: Ricky A. J. Syngkon, Santi Gopal Maji
Abstract:
Decent work and inclusive economic growth, social development, environmental protection, eradication of poverty and hunger as well as clean water and sanitation are the rudiments of 2030 agenda of sustainable development goals of the United Nations. On the other hand, spirituality is deeply entwined in the fabric of daily lives that helps in shaping attitudes, opinions, and behaviors of common people and ensuring quality of lives and overall sustainable development through protection of environment and natural resources. Mawlynnong, a small village in North-Eastern part of India, is a vivid example of how spirituality influences the development of sense of community leading to upliftment of the economic conditions of the people. Mawlynnong as a small hamlet has been in existence for a couple of centuries and it was acknowledged as the cleanest village of Asia in 2004 by BBC and National Geographic and subsequently endorsed by UNESCO in 2006. Consequently, it has attracted large number of tourists over the years from India and other parts of the world. This paper tries to explore how spirituality leads to a sense of community and the economic benefits for the people. Further, this paper also tries to find out the answer whether such an informal collective effort is sustainable or not for achieving solidarity economy. The study is based on both primary and secondary data collected from the local people and the State Government records. The findings of the study indicate that over the last one and a half decade the tourist footfall has increased to a great extent in Mawlynnong and this has brought about a paradigm shift in the occupational structure of its inhabitants from plantation to service sector particularly tourism and tourism related activities. As a result, from the economic standpoint, it is observed that life is much better off now as compared to before. But from the socio-cultural standpoint, the study finds a drift in terms of the cohesiveness and community bonding which was the hallmark of this village. This drift puts a question mark about the sustainability of such practices and consequently the development of solidarity economy.Keywords: spirituality, sense of community, economic welfare, solidarity economy, Mawlynnong village
Procedia PDF Downloads 140
504 Noise Source Identification on Urban Construction Sites Using Signal Time Delay Analysis
Authors: Balgaisha G. Mukanova, Yelbek B. Utepov, Aida G. Nazarova, Alisher Z. Imanov
Abstract:
The problem of identifying local noise sources on a construction site using a sensor system is considered. Mathematical modeling of the signals detected at the sensors was carried out, considering signal decay and the signal delay time between the source and detector. Recordings of noises produced by construction tools were used as the time-dependent source signals. Synthetic sensor data were constructed based on these recordings, and a model of the propagation of acoustic waves from a point source in three-dimensional space was applied. All sensors and sources are assumed to be located in the same plane. A source localization method is checked that is based on the signal time delay between two adjacent detectors and on plotting the direction towards the source; the noise source's position is then determined from the intersection of the two direction lines. Cases of one dominant source, and of two sources in the presence of several other sources of lower intensity, are considered. The number of detectors varies from three to eight. The intensity of the noise field in the assessed area is plotted. A signal of two seconds' duration is considered; the source is located for successive parts of the signal with a duration above 0.04 s, and the final result is obtained by computing the average value. Keywords: acoustic model, direction of arrival, inverse source problem, sound localization, urban noises
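A minimal sketch of the two core steps, delay estimation between adjacent sensors by cross-correlation and conversion of the delay into a direction of arrival, is given below; the sampling rate, sensor spacing and synthetic burst are illustrative, and intersecting the bearings from two sensor pairs (not shown) yields the source position in the plane.

```python
import numpy as np

C = 343.0  # speed of sound in air, m/s

def delay_by_xcorr(sig_a, sig_b, fs):
    """Time delay (s) of sig_b relative to sig_a via full cross-correlation."""
    corr = np.correlate(sig_b, sig_a, mode="full")
    lag = np.argmax(corr) - (len(sig_a) - 1)
    return lag / fs

def bearing_from_delay(delay, spacing):
    """Far-field direction of arrival (rad) from the delay between two sensors."""
    return np.arcsin(np.clip(C * delay / spacing, -1.0, 1.0))

# Synthetic test: a noise burst arriving at sensor B 0.5 ms after sensor A.
fs = 48_000
burst = np.random.default_rng(0).normal(0, 1, 2048)
sig_a = np.concatenate([burst, np.zeros(200)])
sig_b = np.concatenate([np.zeros(24), burst, np.zeros(176)])  # 24 samples = 0.5 ms

dt = delay_by_xcorr(sig_a, sig_b, fs)
theta = bearing_from_delay(dt, spacing=1.0)
print(f"estimated delay {dt*1e3:.2f} ms -> bearing {np.degrees(theta):.1f} deg")
```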
Procedia PDF Downloads 62
503 Insulin Resistance in Children and Adolescents in Relation to Body Mass Index, Waist Circumference and Body Fat Weight
Authors: E. Vlachopapadopoulou, E. Dikaiakou, E. Anagnostou, I. Panagiotopoulos, E. Kaloumenou, M. Kafetzi, A. Fotinou, S. Michalacos
Abstract:
Aim: To investigate the relation and impact of Body Mass Index (BMI), Waist Circumference (WC) and Body Fat Weight (BFW) on insulin resistance (MATSUDA INDEX < 2.5) in children and adolescents. Methods: Data from 95 overweight and obese children (47 boys and 48 girls) with mean age 10.7 ± 2.2 years were analyzed. ROC analysis was used to investigate the predictive ability of BMI, WC and BFW for insulin resistance and find the optimal cut-offs. The overall performance of the ROC analysis was quantified by computing area under the curve (AUC). Results: ROC curve analysis indicated that the optimal-cut off of WC for the prediction of insulin resistance was 97 cm with sensitivity equal to 75% and specificity equal to 73.1%. AUC was 0.78 (95% CI: 0.63-0.92, p=0.001). The sensitivity and specificity of obesity for the discrimination of participants with insulin resistance from those without insulin resistance were equal to 58.3% and 75%, respectively (AUC=0.67). BFW had a borderline predictive ability for insulin resistance (AUC=0.58, 95% CI: 0.43-0.74, p=0.101). The predictive ability of WC was equivalent with the correspondence predictive ability of BMI (p=0.891). Obese subjects had 4.2 times greater odds for having insulin resistance (95% CI: 1.71-10.30, p < 0.001), while subjects with WC more than 97 had 8.1 times greater odds for having insulin resistance (95% CI: 2.14-30.86, p=0.002). Conclusion: BMI and WC are important clinical factors that have significant clinical relation with insulin resistance in children and adolescents. The cut off of 97 cm for WC can identify children with greater likelihood for insulin resistance.Keywords: body fat weight, body mass index, insulin resistance, obese children, waist circumference
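A minimal sketch of how an ROC-based optimal cut-off such as the reported 97 cm is typically obtained (here via Youden's J statistic) is given below; the waist-circumference values are synthetic and do not reproduce the study data.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(3)
# Synthetic waist circumferences (cm): insulin-resistant group shifted upward.
wc_ir = rng.normal(103, 9, 40)      # insulin resistance (label 1)
wc_no = rng.normal(91, 9, 55)       # no insulin resistance (label 0)
wc = np.concatenate([wc_ir, wc_no])
label = np.concatenate([np.ones_like(wc_ir), np.zeros_like(wc_no)])

fpr, tpr, thresholds = roc_curve(label, wc)
youden = tpr - fpr                          # J = sensitivity + specificity - 1
best = int(np.argmax(youden))
print(f"AUC = {roc_auc_score(label, wc):.2f}")
print(f"optimal cut-off ~ {thresholds[best]:.1f} cm "
      f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")
```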
Procedia PDF Downloads 320
502 The Harmonious Blend of Digitalization and 3D Printing: Advancing Aerospace Jet Pump Development
Authors: Subrata Sarkar
Abstract:
The aerospace industry is experiencing a profound product development transformation driven by the powerful integration of digitalization and 3D printing technologies. This paper delves into the significant impact of this convergence on aerospace innovation, specifically focusing on developing jet pumps for fuel systems. This case study is a compelling example of the immense potential of these technologies. In response to the industry's increasing demand for lighter, more efficient, and customized components, the combined capabilities of digitalization and 3D printing are reshaping how we envision, design, and manufacture critical aircraft parts, offering a distinct paradigm in aerospace engineering. Consider the development of a jet pump for a fuel system, a task that presents unique and complex challenges. Despite its seemingly simple design, the jet pump's development is hindered by many demanding operating conditions. The qualification process for these pumps involves many analyses and tests, leading to substantial delays and increased costs in fuel system development. However, by harnessing the power of automated simulations and integrating legacy design, manufacturing, and test data through digitalization, we can optimize the jet pump's design and performance, thereby revolutionizing product development. Furthermore, 3D printing's ability to create intricate structures using various materials, from lightweight polymers to high-strength alloys, holds the promise of highly efficient and durable jet pumps. The combined impact of digitalization and 3D printing extends beyond design, as it also reduces material waste and advances sustainability goals, aligning with the industry's increasing commitment to environmental responsibility. In conclusion, the convergence of digitalization and 3D printing is not just a technological advancement but a gateway to a new era in aerospace product development, particularly in the design of jet pumps. This revolution promises to redefine how we create aerospace components, making them safer, more efficient, and environmentally responsible. As we stand at the forefront of this technological revolution, aerospace companies must embrace these technologies as a choice and a strategic imperative for those striving to lead in innovation and sustainability in the 21st century.Keywords: jet pump, digitalization, 3D printing, aircraft fuel system.
Procedia PDF Downloads 56
501 A Low-Latency Quadratic Extended Domain Modular Multiplier for Bilinear Pairing Based on Non-Least Positive Multiplication
Authors: Yulong Jia, Xiang Zhang, Ziyuan Wu, Shiji Hu
Abstract:
The calculation of bilinear pairings is the core of the SM9 algorithm, which relies on the underlying prime field arithmetic and quadratic extension field arithmetic. Among the field operations, modular multiplication is the most time-consuming part; therefore, the underlying modular multiplication algorithm is optimized to maximize the operation speed of the bilinear pairing. This paper uses a modular multiplication method based on the non-least-positive (NLP) representation, combined with Karatsuba and schoolbook multiplication, to improve the Montgomery algorithm. At the same time, according to the characteristics of multiplication in the quadratic extension field, a quadratic extension field Fp2-NLP modular multiplication algorithm for bilinear pairings is proposed, which effectively reduces the time of modular multiplication in the quadratic extension field. The multiplication unit in the quadratic extension field is implemented using the SMIC 55 nm process, and two different implementation architectures are designed to cope with different application scenarios. Compared with the existing related literature, the output latency of this design can reach a minimum of 15 cycles. The shortest time for calculating the (AB+CD)r⁻¹ mod form is 37.5 ns, and the overall area-time product (AT) is 11400. The final R-ate pairing hardware accelerator consumes 2670k equivalent logic gates and 1.8 ms of computing time in the 55 nm process. Keywords: SM9, hardware, NLP, Montgomery
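A minimal sketch of Montgomery reduction applied to the (AB+CD)r⁻¹ mod n form mentioned above is given below, in plain Python big-integer arithmetic; the modulus and operands are illustrative, and the NLP representation and Karatsuba splitting of the paper are not reproduced.

```python
def montgomery_params(n, r_bits):
    """Precompute R = 2**r_bits and n' = -n^{-1} mod R for an odd modulus n."""
    r = 1 << r_bits
    n_prime = (-pow(n, -1, r)) % r       # requires Python 3.8+ for pow(x, -1, m)
    return r, r_bits, n_prime

def mont_redc(t, n, r, r_bits, n_prime):
    """REDC: return t * R^{-1} mod n, assuming 0 <= t < n*R."""
    m = ((t & (r - 1)) * n_prime) & (r - 1)
    u = (t + m * n) >> r_bits
    return u - n if u >= n else u

def mont_mul_add(a, b, c, d, n, params):
    """Compute (a*b + c*d) * R^{-1} mod n, the form highlighted in the abstract."""
    r, r_bits, n_prime = params
    ab = mont_redc(a * b, n, r, r_bits, n_prime)
    cd = mont_redc(c * d, n, r, r_bits, n_prime)
    return (ab + cd) % n

n = 2**255 - 19                           # an odd 255-bit modulus (illustrative)
params = montgomery_params(n, 256)
a, b, c, d = 0x1234, 0xABCDEF, 0x9999, 0x7777

reference = (a * b + c * d) * pow(1 << 256, -1, n) % n
print(mont_mul_add(a, b, c, d, n, params) == reference)   # True
```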
Procedia PDF Downloads 7