Search results for: algorithmic bias
578 A Greedy Alignment Algorithm Supporting Medication Reconciliation
Authors: David Tresner-Kirsch
Abstract:
Reconciling patient medication lists from multiple sources is a critical task supporting the safe delivery of patient care. Manual reconciliation is a time-consuming and error-prone process, and recently attempts have been made to develop efficiency- and safety-oriented automated support for professionals performing the task. An important capability of any such support system is automated alignment – finding which medications from a list correspond to which medications from a different source, regardless of misspellings, naming differences (e.g. brand name vs. generic), or changes in treatment (e.g. switching a patient from one antidepressant class to another). This work describes a new algorithmic solution to this alignment task, using a greedy matching approach based on string similarity, edit distances, concept extraction and normalization, and synonym search derived from the RxNorm nomenclature. The accuracy of this algorithm was evaluated against a gold-standard corpus of 681 medication records; this evaluation found that the algorithm predicted alignments with 99% precision and 91% recall. This performance is sufficient to support decision support applications for medication reconciliation.
Keywords: clinical decision support, medication reconciliation, natural language processing, RxNorm
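As a rough illustration of the greedy, similarity-driven matching described above (using Python's stdlib `difflib` ratio as a stand-in for the paper's edit-distance, concept-normalization and RxNorm-synonym measures; the function names and threshold are illustrative):

```python
from difflib import SequenceMatcher

def similarity(a, b):
    # Normalized string similarity in [0, 1]; a stand-in for the paper's
    # combined edit-distance / RxNorm-based similarity measures.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def greedy_align(list_a, list_b, threshold=0.6):
    """Greedily pair the most similar medication strings first."""
    scored = [(similarity(a, b), i, j)
              for i, a in enumerate(list_a)
              for j, b in enumerate(list_b)]
    scored.sort(reverse=True)  # best candidate pairs first
    used_a, used_b, pairs = set(), set(), []
    for score, i, j in scored:
        if score < threshold:
            break  # remaining candidates are too dissimilar to align
        if i not in used_a and j not in used_b:
            used_a.add(i)
            used_b.add(j)
            pairs.append((list_a[i], list_b[j], round(score, 2)))
    return pairs
```

A real system would, as the abstract notes, also consult RxNorm so that brand-name/generic pairs score highly despite low string similarity.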
Procedia PDF Downloads 283
577 Current Drainage Attack Correction via Adjusting the Attacking Saw-Function Asymmetry
Authors: Yuri Boiko, Iluju Kiringa, Tet Yeap
Abstract:
The current drainage attack suggested previously is further studied in regular settings of a closed-loop controlled Brushless DC (BLDC) motor with a Kalman filter in the feedback loop. Modeling and simulation experiments are conducted in a Matlab environment, implementing the closed-loop control model of BLDC motor operation in position-sensorless mode under Kalman filter drive. The current increase in the motor windings is caused by the controller (a P-controller in our case) being affected by false data injection, which substitutes the angular velocity estimates with distorted values. The distortion is applied by multiplication with a distortion coefficient whose values are taken from a distortion function synchronized in its periodicity with the rotor’s position change. A saw function with a triangular tooth shape is studied herewith for the purpose of carrying out the bias injection with current drainage consequences. The specific focus here is on how the asymmetry of the tooth in the saw function affects the flow of current drainage. The purpose is two-fold: (i) to produce and collect the signature of an asymmetric saw in the attack for further pattern recognition processing, and (ii) to determine conditions for improving the stealthiness of such an attack via regulating the asymmetry of the saw function used. It is found that modification of the symmetry of the saw tooth affects the periodicity of current drainage modulation. Specifically, the modulation frequency of the drained current for a fully asymmetric tooth shape coincides with the saw function modulation frequency itself. Increasing the symmetry parameter for the triangular tooth shape leads to an increase in the modulation frequency of the drained current. Moreover, this frequency reaches the switching frequency of the motor windings for fully symmetric triangular shapes, thus becoming undetectable and improving the stealthiness of the attack.
Therefore, the collected signatures of the attack can serve for attack parameter identification via the pattern recognition route.
Keywords: bias injection attack, Kalman filter, BLDC motor, control system, closed loop, P-controller, PID-controller, current drainage, saw-function, asymmetry
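The asymmetric triangular saw tooth described above can be sketched as follows (a hypothetical parameterization; the paper's actual distortion function and coefficient values are not given in the abstract):

```python
def saw_tooth(phase, asymmetry):
    """Triangular tooth on one period [0, 1): rises over the 'asymmetry'
    fraction of the period, then falls over the remainder.
    asymmetry=0.5 gives a symmetric triangle; asymmetry -> 1.0 approaches
    the fully asymmetric (ramp) saw discussed in the abstract."""
    phase = phase % 1.0
    if phase < asymmetry:
        return phase / asymmetry
    return (1.0 - phase) / (1.0 - asymmetry)

def distorted_estimate(omega_est, phase, asymmetry, depth=0.2):
    # Bias injection: scale the angular-velocity estimate by a factor
    # synchronized with the rotor position (depth value is illustrative).
    return omega_est * (1.0 + depth * saw_tooth(phase, asymmetry))
```

Sweeping `asymmetry` toward 0.5 is the regime where, per the abstract, the drainage modulation frequency rises toward the winding switching frequency.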
Procedia PDF Downloads 78
576 Potential Contribution of Blue Oceans for Growth of Universities: Case of Faculties of Agriculture in Public Universities in Zimbabwe
Authors: Wonder Ngezimana, Benjamin Alex Madzivire
Abstract:
As new public universities are being applauded for being promulgated in Zimbabwe, there is a need for a comprehensive plan for ensuring sustainable competitive advantages in their niche mandated areas. Unhealthy competition between university faculties for enrolment hinders the growth of the newly established universities’ faculties, especially in the agricultural sciences related disciplines. The blue ocean metaphor is based on the creation of a competitor-free market, unlike 'red oceans', which are well explored and crowded with competitors. This study seeks to explore the potential contribution of the blue ocean strategy (BOS) to the growth of universities, with a bias towards faculties of agriculture in public universities in Zimbabwe. Case studies with agricultural sciences related disciplines were selected across three universities for interviewing. Data was collected through 10 open-ended questions put to academics in different management positions within university faculties of agriculture. Summative analysis was thereafter used during coding and interpretation of the data. Study findings show that there are several important elements for making offerings more comprehensible towards fostering faculty growth and performance, with a bias towards student enrolment. The results point towards the BOS form of value innovation, with various elements to consider in faculty offerings. To create value innovation beyond the red oceans, the cases in this study have to be modelled to foster changes in enrolment, modes of delivery, certification, being research oriented with excellence in teaching, ethics, service to the community and entrepreneurship. There is, therefore, a need to rethink strategy towards reshaping inclusive enrolment, industry relevance, affiliations, lifelong learning, sustainable student welfare, ubuntu, exchange programmes, research excellence, alumni support and entrepreneurship.
Innovative strategic collaborations and partnerships, anchored on technology, boost the strategic offerings, leveraging the various offerings identified in this study. Areas of further study include the amplitude of blue oceans shown in the university faculty offerings and implementation strategies of BOS.
Keywords: blue oceans strategy, collaborations, faculty offerings, value innovations
Procedia PDF Downloads 143
575 A Neuron Model of Facial Recognition and Detection of an Authorized Entity Using Machine Learning System
Authors: J. K. Adedeji, M. O. Oyekanmi
Abstract:
This paper critically examines the use of machine learning procedures in curbing unauthorized access to valuable areas of an organization. The use of passwords, PIN codes, and user identification has in recent times been only partially successful in curbing identity-related crimes, hence the need to design a system which incorporates biometric characteristics such as DNA and pattern recognition of variations in facial expressions. The facial model is built on the OpenCV library, which relies on certain physiological features; a Raspberry Pi 3 module is used to compile the OpenCV library, which extracts and stores the detected faces into the datasets directory through the use of a camera. The model is trained with a 50-epoch run on the database and recognized by the Local Binary Pattern Histogram (LBPH) recognizer contained in OpenCV. The training algorithm used by the neural network is backpropagation, coded in Python, with 200 epoch runs to identify specific resemblance in the exclusive OR (XOR) output neurons. The research confirmed that physiological parameters are more effective measures for curbing identity-related crimes.
Keywords: biometric characters, facial recognition, neural network, OpenCV
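The LBPH recognizer mentioned above summarizes a face image as a histogram of local binary pattern codes. A minimal pure-Python sketch of that core computation follows (OpenCV's implementation adds grid cells and configurable radius/neighbour parameters not shown here):

```python
def lbp_value(img, r, c):
    """8-neighbour local binary pattern code for pixel (r, c).
    Each neighbour >= the centre pixel contributes one set bit,
    walking the neighbours clockwise from the top-left."""
    centre = img[r][c]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if img[r + dr][c + dc] >= centre:
            code |= 1 << bit
    return code

def lbp_histogram(img):
    """256-bin histogram of LBP codes over interior pixels; LBPH-style
    recognizers compare such histograms to match faces."""
    hist = [0] * 256
    for r in range(1, len(img) - 1):
        for c in range(1, len(img[0]) - 1):
            hist[lbp_value(img, r, c)] += 1
    return hist
```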
Procedia PDF Downloads 255
574 A Corpus-Linguistic Analysis of Online Iranian News Coverage on Syrian Revolution
Authors: Amaal Ali Al-Gamde
Abstract:
The Syrian revolution is a major issue in the Middle East, which has drawn in world powers and received great focus in international mass media since 2011. The heavy global reliance on cyber news and digital sources plays a key role in conveying a sense of bias to a wide range of online readers. Thus, based on the assumption that media discourse possesses ideological implications, this study investigates the representation of the Syrian revolution in online media. The paper explores the discursive constructions of anti- and pro-government powers in the Syrian revolution in a 1,000,000-word corpus of Fars online reports (an Iranian news agency), issued between 2013 and 2015. Taking a corpus-assisted discourse analysis approach, the analysis investigates three types of lexicosemantic relations: the semantic macrostructures within which the two social actors are framed, the lexical collocations characterizing the news discourse, and the discourse prosodies they tell about the two sides of the conflict. The study utilizes computer-based tools, Sketch Engine and AntConc, to minimize the bias of subjective analysis. The analysis moves from the insights of lexical frequencies and keyness scores to examine themes and collocational patterns. The findings reveal the Fars agency’s ideological mode of representation in reporting events of the Syrian revolution in two ways. The first is by stereotyping the opposition groups under the umbrella of terrorism, using words such as 'law breakers', 'foreign-backed groups', 'militant groups' and 'terrorists' to legitimize the atrocities of security forces against protesters and enhance horror among civilians. The second is through emphasizing the power of the government and depicting it as the defender of the Arab land by foregrounding the discourse of an international conspiracy against Syria.
The paper concludes by discussing the potential importance of triangulating corpus linguistic tools with critical discourse analysis to elucidate more about discourses and reality.
Keywords: discourse prosody, ideology, keyness, semantic macrostructure
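Keyness scores of the kind the analysis relies on are commonly computed with Dunning's log-likelihood statistic; a minimal sketch follows (this assumes log-likelihood is the keyness measure used, which the abstract does not state):

```python
import math

def log_likelihood(freq_target, size_target, freq_ref, size_ref):
    """Log-likelihood (G2) keyness score for one word: how surprising the
    word's frequency in a target corpus is, relative to a reference corpus.
    Higher scores mean stronger keyness; 3.84 corresponds to p < 0.05."""
    total = freq_target + freq_ref
    expected_t = size_target * total / (size_target + size_ref)
    expected_r = size_ref * total / (size_target + size_ref)
    g2 = 0.0
    if freq_target:
        g2 += freq_target * math.log(freq_target / expected_t)
    if freq_ref:
        g2 += freq_ref * math.log(freq_ref / expected_r)
    return 2.0 * g2
```

Tools such as AntConc and Sketch Engine expose this (or chi-squared) as their keyword statistic.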
Procedia PDF Downloads 130
573 Multi-Objective Four-Dimensional Traveling Salesman Problem in an IoT-Based Transport System
Authors: Arindam Roy, Madhushree Das, Apurba Manna, Samir Maity
Abstract:
In this research paper, an algorithmic approach is developed to solve a novel multi-objective four-dimensional traveling salesman problem (MO4DTSP), where different paths with various numbers of conveyances are available to travel between two cities. NSGA-II and Decomposition algorithms are modified to solve the MO4DTSP in an IoT-based transport system. Such an IoT-based transport system can be widely observed, analyzed, and controlled by an extensive distribution of traffic networks consisting of various types of sensors and actuators. Due to urbanization, most cities are connected using an intelligent traffic management system. Practically, for a traveler, multiple routes and vehicles are available to travel between any two cities. Thus, the classical TSP is reformulated as a multi-route, multi-vehicle problem, i.e., the 4DTSP. The proposed MO4DTSP is designed with traveling cost, time, and customer satisfaction as objectives. In reality, customer satisfaction is an important parameter that depends on travel cost and time, and this dependence is reflected in the present model.
Keywords: multi-objective four-dimensional traveling salesman problem (MO4DTSP), decomposition, NSGA-II, IoT-based transport system, customer satisfaction
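The 4DTSP decision structure described above, a city order plus a (route, vehicle) choice per leg, can be evaluated against the cost and time objectives roughly as follows (the data layout is illustrative, not taken from the paper):

```python
def evaluate_tour(order, choices, cost, time):
    """Total (cost, time) of a closed 4D-TSP tour.
    order: visiting sequence of city indices;
    choices[k]: (route, vehicle) picked for leg k;
    cost/time: dicts keyed by (from_city, to_city, route, vehicle)."""
    total_cost = total_time = 0.0
    n = len(order)
    for k in range(n):
        i, j = order[k], order[(k + 1) % n]  # wrap around to close the tour
        route, veh = choices[k]
        total_cost += cost[(i, j, route, veh)]
        total_time += time[(i, j, route, veh)]
    return total_cost, total_time
```

An NSGA-II-style solver would evaluate many such (order, choices) chromosomes per generation and keep the non-dominated ones.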
Procedia PDF Downloads 109
572 The Use of Fractional Brownian Motion in the Generation of Bed Topography for Bodies of Water Coupled with the Lattice Boltzmann Method
Authors: Elysia Barker, Jian Guo Zhou, Ling Qian, Steve Decent
Abstract:
A method of modelling topography used in the simulation of riverbeds is proposed in this paper, which removes the need for datapoints and measurements of physical terrain. While complex scans of the contours of a surface can be achieved with other methods, this requires specialised tools, which the proposed method overcomes by using fractional Brownian motion (FBM) as a basis to estimate the real surface within a 15% margin of error while attempting to optimise algorithmic efficiency. This removes the need for complex, expensive equipment and reduces resources spent modelling bed topography. This method also accounts for the change in topography over time due to erosion, sediment transport, and other external factors which could affect the topography of the ground by updating its parameters and generating a new bed. The lattice Boltzmann method (LBM) is used to simulate both stationary and steady flow cases in a side-by-side comparison over the generated bed topography using the proposed method and a test case taken from an external source. The method, if successful, will be incorporated into the current LBM program used in the testing phase, which will allow an automatic generation of topography for the given situation in future research, removing the need for bed data to be specified.
Keywords: bed topography, FBM, LBM, shallow water, simulations
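A common way to generate an FBM-like bed profile of the kind described is midpoint displacement; a one-dimensional sketch follows (the paper's actual generation scheme and parameter values may differ):

```python
import random

def fbm_profile(n_levels, hurst, seed=0, sigma=1.0):
    """1-D fractional-Brownian-motion-like profile by midpoint displacement.
    Returns 2**n_levels + 1 elevations. hurst in (0, 1) controls roughness:
    the random displacement scale shrinks by 2**-hurst per refinement level,
    so lower hurst gives a rougher bed."""
    rng = random.Random(seed)  # seeded for reproducible beds
    profile = [0.0, rng.gauss(0.0, sigma)]
    scale = sigma
    for _ in range(n_levels):
        scale *= 2.0 ** (-hurst)
        refined = []
        for a, b in zip(profile, profile[1:]):
            refined.append(a)
            # displace the midpoint of each segment by scaled Gaussian noise
            refined.append((a + b) / 2.0 + rng.gauss(0.0, scale))
        refined.append(profile[-1])
        profile = refined
    return profile
```

Regenerating with updated parameters, as the abstract suggests for erosion and sediment transport, amounts to calling this with a new seed or Hurst exponent.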
Procedia PDF Downloads 98
571 Causal Estimation for the Left-Truncation Adjusted Time-Varying Covariates under the Semiparametric Transformation Models of a Survival Time
Authors: Yemane Hailu Fissuh, Zhongzhan Zhang
Abstract:
In biomedical research and randomized clinical trials, the outcomes of greatest interest are time-to-event, so-called survival, data. Robust models are important in this context for comparing the effects of randomized experimental groups in a causal sense. Causal estimation is the scientific concept of comparing the pragmatic effect of treatments conditional on the given covariates, rather than assessing the simple association of response and predictors. Hence, a causal-effect-based semiparametric transformation model was proposed to estimate the effect of treatment in the presence of possibly time-varying covariates. Due to its high flexibility and robustness, the semiparametric transformation model applied in this paper has received much attention for the estimation of causal effects in modeling left-truncated and right-censored survival data. Despite its wide applications and popularity in estimating unknown parameters, the maximum likelihood estimation technique is quite complex and burdensome for estimating the unknown parameters and the unspecified transformation function in the presence of possibly time-varying covariates. Thus, to ease this complexity, we propose modified estimating equations. After the estimation procedures, the consistency and asymptotic properties of the estimators were derived, and the finite-sample performance of the proposed model was illustrated via simulation studies and the Stanford heart transplant real data example. To sum up, the bias of covariates was adjusted by estimating the density function of the truncation variable, which was also incorporated in the model as a covariate in order to relax the independence assumption between failure time and truncation time. Moreover, the expectation-maximization (EM) algorithm was described for the iterative estimation of the unknown parameters and the unspecified transformation function.
In addition, the causal effect was derived as the ratio of the cumulative hazard functions of the active and passive experiments after adjusting for the bias introduced into the model by the truncation variable.
Keywords: causal estimation, EM algorithm, semiparametric transformation models, time-to-event outcomes, time-varying covariate
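The model and the causal-effect ratio described above can be sketched as follows (one common linear-transformation form of the semiparametric transformation model; the notation is illustrative, not taken from the paper):

```latex
% Semiparametric transformation model with unspecified monotone H:
H(T) = -\beta^{\top} Z(t) + \varepsilon
% Causal effect as the ratio of the cumulative hazard functions of the
% active (treated, subscript 1) and passive (control, subscript 0) arms:
\theta(t) = \frac{\Lambda_{1}(t)}{\Lambda_{0}(t)}
```

Different error distributions for the residual term recover the proportional hazards and proportional odds models as special cases.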
Procedia PDF Downloads 123
570 Tax Expenditures: A Review and Analysis
Authors: Khalid Javed
Abstract:
This study examines a feature of the budget process called the tax expenditure budget. The tax expenditure concept relies heavily on a normative notion that shielding certain taxpayer income from taxation deprives the government of its rightful revenues. This view is inconsistent with the proposition that income belongs to the taxpayers and that tax liability is determined through the democratic process, not through arbitrary, bureaucratic assumptions. Furthermore, the methodology of the tax expenditure budget is problematic, as its expansive tax base treats the multiple taxation of saving as the norm. By using an expansive view of income as the underlying assumption of the tax expenditure concept, this viewpoint institutionalizes a particular bias into the decision-making process.
Keywords: revenue, expenditure, tax budget, proposition
Procedia PDF Downloads 294
569 Intelligent Staff Scheduling: Optimizing the Solver with Tabu Search
Authors: Yu-Ping Chiu, Dung-Ying Lin
Abstract:
Traditional staff scheduling methods, relying on employee experience, often lead to inefficiencies and resource waste. The challenges of transferring scheduling expertise and adapting to changing labor regulations further complicate this process. Manual approaches become increasingly impractical as companies accumulate complex scheduling rules over time. This study proposes an algorithmic optimization approach to address these issues, aiming to expedite scheduling while ensuring strict compliance with labor regulations and company policies. The method focuses on generating optimal schedules that minimize weighted company objectives within a compressed timeframe. Recognizing the limitations of conventional commercial software in modeling and solving complex real-world scheduling problems efficiently, this research employs Tabu Search with both long-term and short-term memory structures. The study will present numerical results and managerial insights to demonstrate the effectiveness of this approach in achieving intelligent and efficient staff scheduling.
Keywords: intelligent memory structures, mixed integer programming, meta-heuristics, staff scheduling problem, tabu search
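A generic tabu-search skeleton with a short-term memory (the tabu list) of the kind described can be sketched as follows (the study's actual neighbourhood moves, long-term memory structure, and scheduling objectives are not shown):

```python
def tabu_search(initial, neighbours, cost, iterations=100, tenure=5):
    """Generic tabu search: move to the best non-tabu neighbour each step,
    even if it worsens cost, to escape local optima. The tabu list is the
    short-term memory of recently visited solutions."""
    current = best = initial
    tabu = [initial]
    for _ in range(iterations):
        candidates = [s for s in neighbours(current) if s not in tabu]
        if not candidates:
            break  # every neighbour is tabu; stop this simple variant
        current = min(candidates, key=cost)
        tabu.append(current)
        if len(tabu) > tenure:
            tabu.pop(0)  # expire the oldest tabu entry
        if cost(current) < cost(best):
            best = current  # aspiration: always remember the global best
    return best
```

For a staff roster, a "neighbour" would typically swap or reassign shifts, and `cost` would be the weighted sum of rule violations and company objectives.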
Procedia PDF Downloads 23
568 Impacts of Land Use and Land Cover Change on Stream Flow and Sediment Yield of Genale Dawa Dam III Watershed, Ethiopia
Authors: Aklilu Getahun Sulito
Abstract:
Land use and land cover change dynamics are a result of complex interactions between several biophysical and socio-economic conditions. The impacts of land cover change on stream flow and sediment yield were analyzed statistically using the hydrological model SWAT. The Genale Dawa Dam III watershed is highly affected by deforestation, overgrazing, and agricultural land expansion. This study aimed to use the SWAT model to assess the impacts of land use and land cover change on sediment yield, to evaluate stream flow in the wet and dry seasons, and to estimate the spatial distribution of sediment yield from sub-basins of the Genale Dawa Dam III watershed. Land use land cover (LULC) maps of 2000, 2008 and 2016 were used with the corresponding climate data. During the study period, most of the forest, dense evergreen forest and grassland changed to cultivated land. The cultivated land increased by 26.2%, but forest land, evergreen forest land and grassland decreased by 21.33%, 11.59% and 7.28% respectively; consequently, the mean annual sediment yield of the watershed increased by 7.37 ton/ha over the 16-year period (2000–2016). The analysis of stream flow for wet and dry seasons showed that the stream flow increased by 25.5% during the wet season but decreased by 29.6% in the dry season. The average annual spatial distribution of sediment yield increased by 7.73 ton/ha yr⁻¹ from 2000 to 2016. The calibration results for both stream flow and sediment yield showed good agreement between observed and simulated data, with coefficients of determination of 0.87 and 0.84, Nash-Sutcliffe efficiencies of 0.83 and 0.78, and percentage biases of -7.39% and -10.90% respectively. The validation results for both stream flow and sediment also showed good agreement, with coefficients of determination of 0.83 and 0.80, Nash-Sutcliffe efficiencies of 0.78 and 0.75, and percentage biases of 7.09% and 3.95%.
The result obtained from the model was that the mean annual sediment load in the Genale Dawa Dam III watershed increased from 2000 to 2016 because of land use change. Therefore, land use management practices are needed in the future to prevent a further increase in the sediment yield of the watershed.
Keywords: Genale Dawa Dam III watershed, land use land cover change, SWAT, spatial distribution, sediment yield, stream flow
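The calibration statistics quoted above, the Nash-Sutcliffe efficiency and percentage bias, can be computed as follows (sign conventions for PBIAS vary between tools; this sketch follows the common convention where negative values indicate overestimation):

```python
def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 minus the ratio of the simulation's
    squared error to the variance of the observations. 1.0 is a perfect
    fit; 0.0 means no better than predicting the observed mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst

def percent_bias(obs, sim):
    """PBIAS (%): average tendency of the simulation to under- or
    overestimate; negative when the simulation overestimates."""
    return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)
```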
Procedia PDF Downloads 54
567 A Brave New World of Privacy: Empirical Insights into the Metaverse’s Personalization Dynamics
Authors: Cheng Xu
Abstract:
As the metaverse emerges as a dynamic virtual simulacrum of reality, its implications on user privacy have become a focal point of interest. While previous discussions have ventured into metaverse privacy dynamics, a glaring empirical gap persists, especially concerning the effects of personalization in the context of news recommendation services. This study stands at the forefront of addressing this void, meticulously examining how users' privacy concerns shift within the metaverse's personalization context. Through a pre-registered randomized controlled experiment, participants engaged in a personalization task across both the metaverse and traditional online platforms. Upon completion of this task, a comprehensive news recommendation service provider offers personalized news recommendations to the users. Our empirical findings reveal that the metaverse inherently amplifies privacy concerns compared to traditional settings. However, these concerns are notably mitigated when users have a say in shaping the algorithms that drive these recommendations. This pioneering research not only fills a significant knowledge gap but also offers crucial insights for metaverse developers and policymakers, emphasizing the nuanced role of user input in shaping algorithm-driven privacy perceptions.
Keywords: metaverse, privacy concerns, personalization, digital interaction, algorithmic recommendations
Procedia PDF Downloads 116
566 Influence of Wavelengths on Photosensitivity of Copper Phthalocyanine Based Photodetectors
Authors: Lekshmi Vijayan, K. Shreekrishna Kumar
Abstract:
We demonstrated an organic field effect transistor based photodetector using phthalocyanine as the active material that exhibited high photosensitivity under varying light wavelengths. The thermally grown SiO₂ layer on a silicon wafer acts as the substrate. The critical parameters, such as photosensitivity, responsivity and detectivity, are comparatively high and were 3.09, 0.98 A W⁻¹ and 4.86 × 10¹⁰ Jones, respectively, under a bias of 5 V and a monochromatic illumination intensity of 4 mW cm⁻². The photodetector has a linear I-V curve with a low dark current. On comparing the photoresponse of copper phthalocyanine at four different wavelengths, 560 nm shows the best photoresponse, and the highest value of photosensitivity is also obtained at this wavelength.
Keywords: photodetector, responsivity, photosensitivity, detectivity
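The figures of merit quoted above are related by standard photodetector formulas; a sketch follows (assuming shot-noise-limited dark current for the specific detectivity, which the abstract does not state; the device's actual noise analysis may differ):

```python
import math

def responsivity(photocurrent_a, power_w):
    """R = I_ph / P, in A/W."""
    return photocurrent_a / power_w

def detectivity(resp_a_per_w, area_cm2, dark_current_a):
    """Specific detectivity D* = R * sqrt(A) / sqrt(2 q I_dark), in Jones
    (cm Hz^0.5 W^-1), under the shot-noise-limited-dark-current assumption."""
    q = 1.602e-19  # elementary charge, coulombs
    return resp_a_per_w * math.sqrt(area_cm2) / math.sqrt(2 * q * dark_current_a)
```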
Procedia PDF Downloads 176
565 Development of Tools for Multi Vehicles Simulation with Robot Operating System and ArduPilot
Authors: Pierre Kancir, Jean-Philippe Diguet, Marc Sevaux
Abstract:
One of the main difficulties in developing multi-robot systems (MRS) is related to the simulation and testing tools available. Indeed, if the differences between simulations and real robots are too significant, the transition from the simulation to the robot won’t be possible without another long development phase and won’t permit validation of the simulation. Moreover, the testing of different algorithmic solutions or modifications of robots requires a strong knowledge of current tools and a significant development time. Therefore, the availability of tools for MRS, mainly with flying drones, is crucial to enable the industrial emergence of these systems. This research aims to present the most commonly used tools for MRS simulations and their main shortcomings, and presents complementary tools to improve the productivity of designers in the development of multi-vehicle solutions, focused on a fast learning curve and a rapid transition from simulations to real usage. The proposed contributions are based on existing open-source tools such as the Gazebo simulator combined with ROS (Robot Operating System) and the open-source multi-platform autopilot ArduPilot, to bring them to a broad audience.
Keywords: ROS, ArduPilot, MRS, simulation, drones, Gazebo
Procedia PDF Downloads 209
564 A Novel Way to Create Qudit Quantum Error Correction Codes
Authors: Arun Moorthy
Abstract:
Quantum computing promises to provide algorithmic speedups for a number of tasks; however, similar to classical computing, effective error-correcting codes are needed. Current quantum computers require costly equipment to control each particle, so having fewer particles to control is ideal. Although traditional quantum computers are built using qubits (2-level systems), qudits (more than 2-levels) are appealing since they can have an equivalent computational space using fewer particles, meaning fewer particles need to be controlled. Currently, qudit quantum error-correction codes are available for different level qudit systems; however, these codes have sometimes overly specific constraints. When building a qudit system, it is important for researchers to have access to many codes to satisfy their requirements. This project addresses two methods to increase the number of quantum error correcting codes available to researchers. The first method is generating new codes for a given set of parameters. The second method is generating new error-correction codes by using existing codes as a starting point to generate codes for another level (i.e., a 5-level system code on a 2-level system). So, this project builds a website that researchers can use to generate new error-correction codes or codes based on existing codes.
Keywords: qudit, error correction, quantum, qubit
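Qudit stabilizer codes of the kind discussed are typically built from the generalized Pauli (shift and clock) operators; a small sketch of those d-level operators follows (illustrative background only, not the project's generation algorithm):

```python
import cmath

def shift_matrix(d):
    """Generalized Pauli X for a d-level qudit: X|j> = |(j+1) mod d>.
    For d=2 this reduces to the familiar qubit X (NOT) gate."""
    return [[1 if r == (c + 1) % d else 0 for c in range(d)]
            for r in range(d)]

def clock_matrix(d):
    """Generalized Pauli Z: Z|j> = omega**j |j>, omega = exp(2*pi*i/d).
    Together with X, these generate the qudit Pauli group used to
    define stabilizer error-correction codes."""
    omega = cmath.exp(2j * cmath.pi / d)
    return [[omega ** c if r == c else 0 for c in range(d)]
            for r in range(d)]
```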
Procedia PDF Downloads 159
563 Regulatory and Economic Challenges of AI Integration in Cyber Insurance
Authors: Shreyas Kumar, Mili Shangari
Abstract:
Integrating artificial intelligence (AI) in the cyber insurance sector represents a significant advancement, offering the potential to revolutionize risk assessment, fraud detection, and claims processing. However, this integration introduces a range of regulatory and economic challenges that must be addressed to ensure responsible and effective deployment of AI technologies. This paper examines the multifaceted regulatory landscape governing AI in cyber insurance and explores the economic implications of compliance, innovation, and market dynamics. AI's capabilities in processing vast amounts of data and identifying patterns make it an invaluable tool for insurers in managing cyber risks. Yet, the application of AI in this domain is subject to stringent regulatory scrutiny aimed at safeguarding data privacy, ensuring algorithmic transparency, and preventing biases. Regulatory bodies, such as the European Union with its General Data Protection Regulation (GDPR), mandate strict compliance requirements that can significantly impact the deployment of AI systems. These regulations necessitate robust data protection measures, ethical AI practices, and clear accountability frameworks, all of which entail substantial compliance costs for insurers. The economic implications of these regulatory requirements are profound. Insurers must invest heavily in upgrading their IT infrastructure, implementing robust data governance frameworks, and training personnel to handle AI systems ethically and effectively. These investments, while essential for regulatory compliance, can strain financial resources, particularly for smaller insurers, potentially leading to market consolidation. Furthermore, the cost of regulatory compliance can translate into higher premiums for policyholders, affecting the overall affordability and accessibility of cyber insurance. Despite these challenges, the potential economic benefits of AI integration in cyber insurance are significant. 
AI-enhanced risk assessment models can provide more accurate pricing, reduce the incidence of fraudulent claims, and expedite claims processing, leading to overall cost savings and increased efficiency. These efficiencies can improve the competitiveness of insurers and drive innovation in product offerings. However, balancing these benefits with regulatory compliance is crucial to avoid legal penalties and reputational damage. The paper also explores the potential risks associated with AI integration, such as algorithmic biases that could lead to unfair discrimination in policy underwriting and claims adjudication. Regulatory frameworks need to evolve to address these issues, promoting fairness and transparency in AI applications. Policymakers play a critical role in creating a balanced regulatory environment that fosters innovation while protecting consumer rights and ensuring market stability. In conclusion, the integration of AI in cyber insurance presents both regulatory and economic challenges that require a coordinated approach involving regulators, insurers, and other stakeholders. By navigating these challenges effectively, the industry can harness the transformative potential of AI, driving advancements in risk management and enhancing the resilience of the cyber insurance market. This paper provides insights and recommendations for policymakers and industry leaders to achieve a balanced and sustainable integration of AI technologies in cyber insurance.
Keywords: artificial intelligence (AI), cyber insurance, regulatory compliance, economic impact, risk assessment, fraud detection, cyber liability insurance, risk management, ransomware
Procedia PDF Downloads 32
562 Algorithmic Generation of Carbon Nanochimneys
Authors: Sorin Muraru
Abstract:
Computational generation of carbon nanostructures is still a very demanding process. This work provides an alternative to manual molecular modeling through an algorithm meant to automate the design of such structures. Specifically, carbon nanochimneys are obtained through the bonding of a carbon nanotube with the smaller edge of an open carbon nanocone. The methods of connection rely on mathematical, geometrical and chemical properties. Non-hexagonal rings are used in order to perform the correct bonding of dangling bonds. Once obtained, they are useful for thermal transport, gas storage or other applications such as gas separation. The carbon nanochimneys are meant to produce a less steep connection between structures such as the carbon nanotube and graphene sheet, as in the pillared graphene, but can also provide functionality on its own. The method relies on connecting dangling bonds at the edges of the two carbon nanostructures, employing the use of two different types of auxiliary structures on a case-by-case basis. The code is implemented in Python 3.7 and generates an output file in the .pdb format containing all the system’s coordinates. Acknowledgment: This work was supported by a grant of the Executive Agency for Higher Education, Research, Development and innovation funding (UEFISCDI), project number PN-III-P1-1.1-TE-2016-24-2, contract TE 122/2018.
Keywords: carbon nanochimneys, computational, carbon nanotube, carbon nanocone, molecular modeling, carbon nanostructures
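Writing the generated coordinates to a .pdb file, as the abstract describes, can be sketched with fixed-column ATOM records (a simplified record layout with placeholder residue fields; the project's actual output routine is not shown):

```python
def write_pdb(atoms, path):
    """Write carbon atom coordinates as fixed-column PDB ATOM records.
    atoms: iterable of (x, y, z) tuples in angstroms. Uses a dummy
    residue name (UNK) and unit occupancy, which generic molecular
    viewers accept for visualization."""
    with open(path, "w") as fh:
        for serial, (x, y, z) in enumerate(atoms, start=1):
            fh.write(
                "ATOM  {serial:5d}  C   UNK A   1    "
                "{x:8.3f}{y:8.3f}{z:8.3f}  1.00  0.00           C\n".format(
                    serial=serial, x=x, y=y, z=z))
        fh.write("END\n")
```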
Procedia PDF Downloads 169
561 Building in-Addition-School-Family Partnership
Authors: Lulu Sun
Abstract:
In Addition is an after-school mathematics program in which students and their parents build mathematical confidence and competence by solving problems they are curious about. It is a mixed-grade program of 10 to 20 students from 4th to 6th grade, comprising math problem solving and other activities. This partnership focuses on the relationship between In Addition and parents' engagement; it combines the program's teaching with family engagement. The partnership aims to build cooperation between the program and parents, strengthening the links between the program and families.
Keywords: program-family, family engagement, positive bias, partnership
Procedia PDF Downloads 192
560 A Study on Inference from Distance Variables in Hedonic Regression
Authors: Yan Wang, Yasushi Asami, Yukio Sadahiro
Abstract:
In urban areas, several landmarks may affect housing prices and rents, so hedonic analysis should employ distance variables corresponding to each landmark. Unfortunately, the estimated effects of distances to landmarks on housing prices are generally not consistent with the true prices. These distance variables may introduce magnitude errors into the regression, pointing to a problem of spatial multicollinearity. In this paper, we provide approaches for obtaining samples with less bias and a method for locating the specific sampling area so as to avoid the multicollinearity problem in the case of two specific landmarks.Keywords: landmarks, hedonic regression, distance variables, collinearity, multicollinearity
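To make the spatial multicollinearity problem concrete, the following sketch (illustrative coordinates and landmark positions, not the authors' data or method) builds distance variables to two nearby hypothetical landmarks and shows that they are almost perfectly correlated, which inflates the condition number of the hedonic design matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
xy = rng.uniform(0, 10, size=(n, 2))           # hypothetical property locations
landmark_a = np.array([2.0, 2.0])
landmark_b = np.array([2.5, 2.5])              # close to A, so distances nearly coincide

d_a = np.linalg.norm(xy - landmark_a, axis=1)  # distance variables for the hedonic model
d_b = np.linalg.norm(xy - landmark_b, axis=1)

corr = np.corrcoef(d_a, d_b)[0, 1]             # near 1: spatial multicollinearity
X = np.column_stack([np.ones(n), d_a, d_b])    # design matrix with both distances
cond = np.linalg.cond(X)                       # ill-conditioned when columns are collinear
```

With near-collinear columns, the regression coefficients on the two distances become unstable, which is exactly the magnitude-error problem the abstract describes.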
Procedia PDF Downloads 452
559 Consent and the Construction of Unlawfulness
Authors: Susanna Menis
Abstract:
The context of this study revolves around the theme of consent and the construction of unlawfulness in judicial decisions. It aims to explore the formation of societal perceptions of unlawfulness within the context of consensual sexual acts leading to harmful consequences. This study investigates how judges create legal rules that reflect social solidarity and protect against violence. Specifically, the research aims to understand the justification behind criminalising consensual sexual activity when categorised under different offences. The main question addressed in this study concerns how judges create legal rules that they believe reflect social solidarity and protect against violence. The study employs a historical genealogy approach as its methodology. This approach allows for tracing back the original formation of societal perspectives on unlawfulness, thus highlighting the socially constructed nature of the present understanding. The data for this study will be collected through an extensive literature review, examining historical legal cases and documents that shape the understanding of unlawfulness. This will provide a comprehensive view of how social attitudes toward private sexual relations influenced the creation of legal rules. The theoretical importance of this research lies in its contribution to socio-legal scholarship. This study adds to the existing knowledge on the topic by exploring questions of unconscious bias and its origins. The findings shed light on how and why individuals possess unconscious biases, particularly within the judicial system. In conclusion, this study investigates judicial decisions concerning consensual sexual acts and the construction of unlawfulness. By employing a historical genealogy approach, the research sheds light on how judges create legal rules that reflect social solidarity and aim to protect against violence.
The theoretical importance of this study lies in its contribution to understanding unconscious bias and its origins within the judicial system. Through data collection and analysis procedures, this study aims to provide valuable insights into the formation of social attitudes towards private sexual relations and their impact on legal rulings.Keywords: consent, sexual offences, offences against the person, legal genealogy, social construct
Procedia PDF Downloads 68
558 Digital Material Characterization Using the Quantum Fourier Transform
Authors: Felix Givois, Nicolas R. Gauger, Matthias Kabel
Abstract:
Efficient digital material characterization is of great interest to many fields of application. It consists of the following three steps. First, a 3D reconstruction of 2D scans must be performed. Then, the resulting gray-value image of the material sample is enhanced by image processing methods. Finally, partial differential equations (PDE) are solved on the segmented image, and by averaging the resulting solution fields, effective properties like stiffness or conductivity can be computed. Due to the high resolution of current CT images, the latter is typically performed with matrix-free solvers. Among them, a solver that uses the explicit formula of the Green-Eshelby operator in Fourier space has been proposed by Moulinec and Suquet. Its most complex algorithmic part is the Fast Fourier Transformation (FFT). In our talk, we will discuss the potential quantum advantage that can be obtained by replacing the FFT with the Quantum Fourier Transformation (QFT). We will especially show that the data transfer for noisy intermediate-scale quantum (NISQ) devices can be improved by using appropriate boundary conditions for the PDE, which also allows using semi-classical versions of the QFT. In the end, we will compare the results of the QFT-based algorithm for simple geometries with the results of the FFT-based homogenization method.Keywords: maximum likelihood quantum amplitude estimation (MLQAE), numerical homogenization, quantum Fourier transformation (QFT), NISQ devices
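As a purely classical sanity check of the FFT/QFT correspondence the talk relies on (a sketch with an assumed sign convention, omega = exp(+2*pi*i/N), not a quantum implementation), the dense QFT unitary can be built and compared against NumPy's inverse FFT, with which it agrees up to a sqrt(N) normalization:

```python
import numpy as np

def qft_matrix(n_qubits):
    """Dense QFT unitary on n_qubits; convention omega = exp(+2j*pi/N)."""
    N = 2 ** n_qubits
    j, k = np.meshgrid(np.arange(N), np.arange(N))
    return np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

F = qft_matrix(3)                 # 8x8 QFT on 3 qubits
v = np.arange(8, dtype=complex)   # arbitrary test vector
qft_v = F @ v                     # equals sqrt(N) * ifft(v) in NumPy's convention
```

On actual NISQ hardware the QFT is of course applied as a circuit of Hadamard and controlled-phase gates rather than as a dense matrix; the matrix form is only convenient for verifying conventions.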
Procedia PDF Downloads 75
557 Validating Condition-Based Maintenance Algorithms through Simulation
Authors: Marcel Chevalier, Léo Dupont, Sylvain Marié, Frédérique Roffet, Elena Stolyarova, William Templier, Costin Vasile
Abstract:
Industrial end-users are currently facing an increasing need to reduce the risk of unexpected failures and optimize their maintenance. This calls for both short-term analysis and long-term ageing anticipation. At Schneider Electric, we tackle those two issues using both machine learning and first-principles models. Machine learning models are incrementally trained from normal data to predict expected values and detect statistically significant short-term deviations. Ageing models are constructed by breaking down physical systems into sub-assemblies, then determining relevant degradation modes and associating each one with the right kinetic law. Validating such anomaly detection and maintenance models is challenging, both because actual incident and ageing data are rare and distorted by human interventions, and because incremental learning depends on human feedback. To overcome these difficulties, we propose to simulate physics, systems, and humans (including asset maintenance operations) in order to validate the overall approaches in accelerated time and possibly choose between algorithmic alternatives.Keywords: degradation models, ageing, anomaly detection, soft sensor, incremental learning
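A toy version of this validation-by-simulation idea, with entirely made-up signal and degradation parameters, can be sketched as follows: a sensor is simulated in its normal regime plus an injected linear ageing drift, a simple "normal behaviour" model is trained on healthy data only, and statistically significant deviations are then flagged:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(1000)

# Simulated sensor: unit-variance noise, plus an injected degradation after t=600
# (the linear kinetic law and its slope are illustrative assumptions)
signal = rng.normal(0.0, 1.0, t.size)
signal[600:] += 0.02 * (t[600:] - 600)

# Train a simple normal-behaviour model on the healthy regime only
mu, sigma = signal[:500].mean(), signal[:500].std()

# Flag statistically significant short-term deviations via a z-score threshold
z = np.abs(signal - mu) / sigma
alarms = np.where(z > 4.0)[0]
```

Because the ground-truth fault onset (t=600) is known by construction, detection latency and false-alarm rates can be measured exactly, which is precisely what real incident data rarely allows.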
Procedia PDF Downloads 125
556 SOI-Multi-FinFET: Impact of Fins Number Multiplicity on Corner Effect
Authors: A.N. Moulay Khatir, A. Guen-Bouazza, B. Bouazza
Abstract:
The SOI Multi-FinFET shows excellent transistor characteristics: ideal sub-threshold swing, low drain-induced barrier lowering (DIBL) without pocket implantation, and negligible body bias dependency. In this work, we analyzed this structure using a three-dimensional numerical device simulator to investigate the influence of the number of fins on the corner effect, analyzing the electrical characteristics and the potential distribution in the oxide and the silicon in the section perpendicular to the current flow for single-fin, three-fin, and five-fin SOI FinFETs, and we provide a comparison with a tri-gate SOI Multi-FinFET structure.Keywords: SOI, FinFET, corner effect, dual-gate, tri-gate, Multi-Fin FET
Procedia PDF Downloads 474
555 Influence of Temperature on Properties of MOSFETs
Authors: Azizi Cherifa, O. Benzaoui
Abstract:
Thermal aspects in the design of power circuits often deserve as much attention as purely electrical aspects, since the operating temperature has a direct influence on static and dynamic characteristics. The MOSFET is fundamental in such circuits; it is the most widely used device in the current production of semiconductor components owing to its excellent performance. This contribution is devoted to the effect of temperature on the properties of MOSFETs. The study enables us to calculate the drain current as a function of bias in both the linear and saturated modes. The effect of temperature is evaluated using a numerical simulation, using the laws of mobility and carrier saturation velocity as functions of temperature.Keywords: temperature, MOSFET, mobility, transistor
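The calculation described (drain current versus bias in the linear and saturation regimes, with temperature entering through the mobility law) can be sketched with the textbook square-law model; the parameter values and the mu(T) ~ (T/300)^-1.5 phonon-scattering exponent below are illustrative assumptions, not the authors' simulation:

```python
def drain_current(vgs, vds, temp, vt=1.0, mu0=0.06, cox=2e-3, w_over_l=10.0, t0=300.0):
    """Square-law MOSFET drain current [A].

    Mobility follows a power-law temperature dependence,
    mu(T) = mu0 * (T/T0)**-1.5, a common phonon-scattering approximation.
    All default parameter values are illustrative.
    """
    mu = mu0 * (temp / t0) ** -1.5
    k = mu * cox * w_over_l
    vov = vgs - vt                                 # overdrive voltage
    if vov <= 0:
        return 0.0                                 # cut-off
    if vds < vov:
        return k * (vov * vds - vds ** 2 / 2.0)    # linear (triode) regime
    return 0.5 * k * vov ** 2                      # saturation regime
```

In this simple model, raising the temperature lowers the mobility and therefore the drain current at a given bias point, reproducing the qualitative static behaviour the abstract refers to.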
Procedia PDF Downloads 345
554 A Stochastic Volatility Model for Optimal Market-Making
Authors: Zubier Arfan, Paul Johnson
Abstract:
The electronification of financial markets and the rise of algorithmic trading have sparked a lot of interest from the mathematical community, for the market-making problem in particular. The research presented in this short paper solves the classic stochastic control problem in order to derive the strategy for a market-maker. It also shows how to calibrate and simulate the strategy with real limit order book data for back-testing. The ambiguity of limit-order priority in back-testing is dealt with by considering optimistic and pessimistic priority scenarios. The model, although it does outperform a naive strategy, assumes constant volatility and is therefore not best suited to the LOB data. The Heston model is introduced to describe the price and variance process of the asset. The trader's constant absolute risk aversion utility function is optimised by numerically solving a 3-dimensional Hamilton-Jacobi-Bellman partial differential equation to find the optimal limit order quotes. The results show that the stochastic volatility market-making model is more suitable for a risk-averse trader and is also less sensitive to calibration error than the constant volatility model.Keywords: market-making, market microstructure, stochastic volatility, quantitative trading
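Simulating the Heston price/variance dynamics that underlie such a model can be sketched with a plain Euler scheme using full truncation of the variance; all parameter values below are illustrative placeholders, not the paper's calibration, and the sketch covers only path simulation, not the HJB quoting policy:

```python
import numpy as np

def heston_paths(s0=100.0, v0=0.04, kappa=1.5, theta=0.04, xi=0.5, rho=-0.7,
                 dt=1 / 252, n_steps=252, n_paths=1000, seed=0):
    """Euler (full-truncation) simulation of the Heston price/variance process.

    dS = S * sqrt(v) dW1,  dv = kappa*(theta - v) dt + xi*sqrt(v) dW2,
    with corr(dW1, dW2) = rho. Zero drift; parameters are illustrative.
    """
    rng = np.random.default_rng(seed)
    s = np.full(n_paths, s0)
    v = np.full(n_paths, v0)
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.standard_normal(n_paths)
        v_pos = np.maximum(v, 0.0)                # full truncation keeps sqrt real
        s = s * np.exp(-0.5 * v_pos * dt + np.sqrt(v_pos * dt) * z1)
        v = v + kappa * (theta - v_pos) * dt + xi * np.sqrt(v_pos * dt) * z2
    return s, v
```

In the market-making setting, paths like these would be combined with a fill model and the HJB-derived quotes to back-test the strategy under stochastic volatility.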
Procedia PDF Downloads 149
553 Evidence of a Negativity Bias in the Keywords of Scientific Papers
Authors: Kseniia Zviagintseva, Brett Buttliere
Abstract:
Science is fundamentally a problem-solving enterprise, and scientists pay more attention to negative things, which cause them dissonance and a negative affective state of uncertainty or contradiction. While this is agreed upon by philosophers of science, there are few empirical demonstrations. Here we examine the keywords from those papers published by PLoS in 2014 and show with several sentiment analyzers that negative keywords are studied more than positive keywords. Our dataset is the 927,406 keywords of 32,870 scientific articles in all fields published in 2014 by the journal PLOS ONE (collected from Altmetric.com). Counting how often the 47,415 unique keywords are used, we can examine whether negative topics are studied more than positive ones. In order to find the sentiment of the keywords, we utilized two sentiment analysis tools, Hu and Liu (2004) and SentiStrength (2014). The results below are for Hu and Liu, as these are the less convincing results. The average keyword was utilized 19.56 times, with half of the keywords being utilized only 1 time and the maximum number of uses being 18,589 times. The keywords identified as negative were utilized 37.39 times, on average, with the positive keywords being utilized 14.72 times and the neutral keywords 19.29 times, on average. This difference is only marginally significant, with an F value of 2.82 and a p of .05, but one must keep in mind that more than half of the keywords are utilized only 1 time, artificially increasing the variance and driving the effect size down. To examine more closely, we looked at the top 25 most utilized keywords that have a sentiment. Among the top 25, there are only two positive words, ‘care’ and ‘dynamics’, in positions 5 and 13 respectively, with all the rest being identified as negative. ‘Diseases’ is the most studied keyword with 8,790 uses, with ‘cancer’ and ‘infectious’ being the second and fourth most utilized sentiment-laden keywords.
The sentiment analysis is not perfect, though, as the words ‘diseases’ and ‘disease’ are counted separately, taking the 1st and 3rd positions. Combining them, they remain the most common sentiment-laden keyword, being utilized 13,236 times. Beyond the splitting of words, the sentiment analyzer logs ‘regression’ and ‘rat’ as negative, and these should probably be considered false positives. Despite these potential problems, the effect is apparent, as even positive keywords like ‘care’ could or should be considered negative, since this word is most commonly utilized as part of ‘health care’, ‘critical care’ or ‘quality of care’ and generally associated with how to improve it. All in all, the results suggest that negative concepts are studied more, also providing support for the notion that science is most generally a problem-solving enterprise. The results also provide evidence that negativity and contradiction are related to greater productivity and positive outcomes.Keywords: bibliometrics, keywords analysis, negativity bias, positive and negative words, scientific papers, scientometrics
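The counting procedure described can be illustrated with a toy sketch: a made-up four-paper corpus and a tiny hand-labelled lexicon stand in for the 927,406-keyword PLOS dataset and the Hu and Liu analyzer, and the mean usage counts of negative and positive keywords are compared:

```python
from collections import Counter
from statistics import mean

# Hypothetical corpus: one keyword list per paper (illustrative, not PLOS data)
papers = [
    ["cancer", "regression", "mice"],
    ["cancer", "infection", "care"],
    ["disease", "cancer", "care"],
    ["disease", "dynamics", "genome"],
]
# Tiny hand-labelled sentiment lexicon: -1 negative, +1 positive
lexicon = {"cancer": -1, "infection": -1, "disease": -1, "care": +1, "dynamics": +1}

# Count how often each unique keyword is used across the corpus
counts = Counter(kw for kws in papers for kw in kws)

# Average usage count per sentiment class
neg = [c for kw, c in counts.items() if lexicon.get(kw) == -1]
pos = [c for kw, c in counts.items() if lexicon.get(kw) == +1]
```

In this toy corpus the negative keywords are used 2.0 times on average versus 1.5 for the positive ones, mirroring (in miniature) the 37.39-versus-14.72 gap reported above.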
Procedia PDF Downloads 186
552 Fractional-Order Modeling of GaN High Electron Mobility Transistors for Switching Applications
Authors: Anwar H. Jarndal, Ahmed S. Elwakil
Abstract:
In this paper, a fractional-order model for the pad parasitic effect of a GaN HEMT on a Si substrate is developed and validated. An open de-embedding structure is used to characterize and de-embed substrate loading parasitic effects. Unbiased device measurements are implemented to extract parasitic inductances and resistances. The model shows very good agreement with S-parameter measurements under different bias conditions. It has been found that this approach can improve the simulation of the intrinsic part of the transistor, which is very important for the small- and large-signal modeling process.Keywords: fractional-order modeling, GaN HEMT, Si substrate, open de-embedding structure
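The open de-embedding step itself is standard: the open dummy's measured admittance represents the parallel pad parasitics and is subtracted from the raw measurement in the Y-parameter domain. A minimal numerical sketch of that step (2x2 networks, equal 50-ohm reference impedances assumed; this does not reproduce the paper's fractional-order fitting):

```python
import numpy as np

def s_to_y(s, z0=50.0):
    """Convert a 2x2 S-parameter matrix to Y-parameters.

    Standard bilinear transform for equal reference impedance z0 at both ports:
    Y = (1/z0) * (I + S)^-1 (I - S).
    """
    I = np.eye(2)
    return (1.0 / z0) * np.linalg.inv(I + s) @ (I - s)

def open_deembed(y_meas, y_open):
    """Open de-embedding: remove the parallel pad parasitics captured
    by the open dummy structure from the raw measurement."""
    return y_meas - y_open
```

In practice both matrices come from converting the measured S-parameters of the device and of the open dummy at each frequency point, and the series parasitics would then be handled separately (e.g. with a short dummy).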
Procedia PDF Downloads 355
551 On Parameter Estimation of Simultaneous Linear Functional Relationship Model for Circular Variables
Authors: N. A. Mokhtar, A. G. Hussin, Y. Z. Zubairi
Abstract:
This paper proposes a new simultaneous simple linear functional relationship model by assuming equal error variances. We derive the maximum likelihood estimates of the parameters in the simultaneous model and their covariance. We show by a simulation study that the small bias values of the parameter estimates suggest the suitability of the estimation method. As an illustration, the proposed simultaneous model is applied to real data of the wind direction and wave direction measured by two different instruments.Keywords: simultaneous linear functional relationship model, Fisher information matrix, parameter estimation, circular variables
Procedia PDF Downloads 365
550 Ethical Issues in AI: Analyzing the Gap Between Theory and Practice - A Case Study of AI and Robotics Researchers
Authors: Sylvie Michel, Emmanuelle Gagnou, Joanne Hamet
Abstract:
New major ethical dilemmas are posed by artificial intelligence. This article identifies an existing gap between the ethical questions that AI/robotics researchers grapple with in their research practice and those identified by the literature review. The objective is to understand which ethical dilemmas are identified by, or concern, AI researchers in order to compare them with the existing literature. This will make it possible to conduct training and awareness initiatives for AI researchers, encouraging them to consider these questions during the development of AI. Qualitative analyses were conducted based on direct observation of an AI/robotics research team focused on collaborative robotics over several months. Subsequently, semi-structured interviews were conducted with 16 members of the team. The entire process took place during the first semester of 2023. The observations were analyzed using an analytical framework, and the interviews were thematically analyzed using NVivo software. While the literature identifies three primary ethical concerns regarding AI (transparency, bias, and responsibility), the results firstly demonstrate that AI researchers are primarily concerned with the publication and valorization of their work, with the initial ethical concerns revolving around this matter. Questions arise regarding the extent to which to "market" publications and the usefulness of some publications. Research ethics are a central consideration for these teams. Secondly, another result shows that the researchers studied adopt a consequentialist ethics (though not explicitly formulated as such). They ponder the consequences of their developments in terms of safety (for humans in relation to robots/AI), worker autonomy in relation to the robot, and the role of work in society (can robots take over jobs?). Lastly, results indicate that the ethical dilemmas highlighted in the literature (responsibility, transparency, bias) do not explicitly appear in AI/robotics research.
AI/robotics researchers raise specific and pragmatic ethical questions, primarily concerning publications initially and consequentialist considerations afterward. Results demonstrate that these concerns are distant from the existing literature. However, the dilemmas highlighted in the literature also deserve to be explicitly contemplated by researchers. This article proposes that the journals these researchers target should mandate ethical reflection for all presented works. Furthermore, results suggest offering awareness programs in the form of short educational sessions for researchers.Keywords: ethics, artificial intelligence, research, robotics
Procedia PDF Downloads 79
549 Theoretical Reflections on Metaphor and Cohesion and the Coherence of Face-To-Face Interactions
Authors: Afef Badri
Abstract:
The role of metaphor in creating the coherence and cohesion of discourse in online interactive talk has received almost no attention. This paper intends to provide some theoretical reflections on metaphorical coherence as a jointly constructed process that evolves in online, face-to-face interactions. It suggests that the presence of a global conceptual structure in a conversation makes it conceptually cohesive. Yet, coherence remains a process largely determined by other variables (shared goals, communicative intentions, and frameworks of understanding). Metaphorical coherence created by these variables can be useful in detecting bias in media reporting.Keywords: coherence, cohesion, face-to-face interactions, metaphor
Procedia PDF Downloads 246