Search results for: development of code blue simulation module
22197 Good Faith and Accession in the New Civil Code
Authors: Adelina Vrancianu
Abstract:
The problem of artificial real accession is analyzed in this study in terms of both the old and the current Civil Code provisions and in terms of comparative law, the European and Canadian legal systems. The current Civil Code of 2009 has brought changes to the application of, and solutions regarding, artificial real accession. The hypothesis in which a person carries out works with his own materials on real estate belonging to another person is developed and analyzed in detail from a national and international point of view in relation to good faith. The scope of this analysis is to point out which changes issued from case law and which are new, inspired by other legal systems, with regard to good or bad faith. The new Civil Code has introduced a definition of this notion. Is this definition inspired by comparative law, or by case law? Is it spelled out for every scenario of accession, or is it a general notion? The study tries to answer these questions and to present the new aspects in the area. The new Civil Code has reserved a special place for the situation of execution of works with one's own materials beyond the property boundary, in violation of another's right of property, where the variety of solutions brings into discussion the case of expropriation for private interest. The new Civil Code is greatly influenced by the Civil Code of Quebec, in contrast with the old code of French influence. The civil reform was needed and has brought new solutions, inspired by the Canadian system, which have mitigated the permanent conflict between the builder and the immovable's owner.
Keywords: accession, good faith, new civil code, comparative law
Procedia PDF Downloads 462
22196 156 Vdc to 110 Vac Sinusoidal Inverter Simulation and Implementation
Authors: Phinyo Mueangmeesap
Abstract:
This paper describes the simulation and implementation of a pure sinusoidal inverter fed from a high-voltage DC source (156 Vdc). The simulation is used to study and improve the efficiency of the inverter by removing the power loss of the boost converter present in the current inverter design. The simulation combines an H-bridge circuit driven by a pulse-width modulation (PWM) signal with a low-pass filter circuit to convert the DC into AC, and was carried out in PSCAD. The simulation results were used to build a prototype inverter converting 156 Vdc to 110 Vac, whose output signal closely matches the simulated output.
Keywords: inverter simulation, PWM signal, single-phase inverter, sinusoidal inverter
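The sine-reference-versus-triangular-carrier comparison behind sinusoidal PWM, followed by low-pass filtering, can be sketched in a few lines. The Python fragment below is an illustrative toy, not the paper's PSCAD model: the reference frequency, carrier frequency, modulation index, and the first-order RC filter standing in for the real output filter are all assumptions.

```python
import math

def spwm_wave(f_ref=60.0, f_carrier=3000.0, fs=600_000, cycles=2, m=0.8):
    """Bipolar SPWM bit stream: output is +1 when the sinusoidal
    reference exceeds the triangular carrier, else -1."""
    n = int(fs * cycles / f_ref)
    out = []
    for i in range(n):
        t = i / fs
        ref = m * math.sin(2 * math.pi * f_ref * t)
        phase = (t * f_carrier) % 1.0
        # Triangular carrier sweeping [-1, 1]
        carrier = 4 * phase - 1 if phase < 0.5 else 3 - 4 * phase
        out.append(1.0 if ref > carrier else -1.0)
    return out

def lowpass(signal, fs, fc):
    """First-order RC low-pass, a crude stand-in for the LC filter stage."""
    alpha = (2 * math.pi * fc / fs) / (1 + 2 * math.pi * fc / fs)
    y, out = 0.0, []
    for x in signal:
        y += alpha * (x - y)
        out.append(y)
    return out
```

Filtering the switched waveform largely removes the carrier while preserving the 60 Hz fundamental, which is the mechanism the H-bridge plus filter chain relies on.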
Procedia PDF Downloads 412
22195 AI Ethical Values as Dependent on the Role and Perspective of the Ethical AI Code Founder- A Mapping Review
Authors: Moshe Davidian, Shlomo Mark, Yotam Lurie
Abstract:
With the rapid development of technology and the concomitant growth in the capability and power of Artificial Intelligence (AI) systems, the ethical challenges involved in these systems are also evolving and increasing. In recent years, various organizations, including governments, international institutions, professional societies, civic organizations, and commercial companies, have chosen to address these challenges by publishing ethical codes for AI systems. However, despite the apparent agreement that AI should be "ethical," there is debate about the definition of "ethical artificial intelligence." This study investigates the various AI ethical codes and their key ethical values. From the vast collection of codes that exist, it analyzes and compares 25 ethical codes found to be representative of different types of organizations. In addition, as part of its literature review, the study surveys data collected in three recent reviews of AI codes. The results of the analyses demonstrate a convergence around seven key ethical values. However, the key finding is that the different AI ethical codes ultimately reflect the type of organization that designed the code; i.e., the organization's role as regulator, user, or developer affects its view of what ethical AI is. The results show a relationship between the organization's role and the dominant values in its code. The main contribution of this study is the development of a list of the key values for all AI systems, and of specific values that need to shape the development and design of AI systems, while allowing for differences according to the organization for which the system is being developed. This will allow an analysis of AI values in relation to stakeholders.
Keywords: artificial intelligence, ethical codes, principles, values
Procedia PDF Downloads 107
22194 Exploration of Artificial Neural Network and Response Surface Methodology in Removal of Industrial Effluents
Authors: Rakesh Namdeti
Abstract:
Toxic dyes found in industrial effluent must be treated before disposal due to their harmful impact on human health and aquatic life. Thus, Musa acuminata (banana leaves) was employed as a biosorbent in this work to remove methylene blue from a synthetic solution. The effects of process parameters such as temperature, pH, biosorbent dosage, and initial methylene blue concentration on the percentage of dye clearance were investigated using a central composite design (CCD). The response was modelled using a quadratic model based on the CCD, and analysis of variance (ANOVA) revealed the most influential element of the experimental design response. The best conditions for Musa acuminata (banana leaf powder) were a temperature of 44.3 °C, pH of 7.1, biosorbent dose of 0.3 g, and initial methylene blue concentration of 48.4 mg/L, predicting 84.26 percent dye removal. At these ideal conditions, the experimental percentage of biosorption was 76.93. The agreement between the estimates of the developed ANN model and the experimental results confirmed the success of the ANN modeling; the study's experimental results were found to be quite close to the model's predicted outcomes.
Keywords: Musa acuminata, central composite design, methylene blue, artificial neural network
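For readers unfamiliar with a central composite design, the sketch below generates the coded design points for a hypothetical two-factor case (the study itself varied more factors, including temperature, pH, dosage, and initial concentration); the axial distance alpha = (2^k)^(1/4) is the standard rotatable choice, not a value taken from the paper.

```python
import itertools

def central_composite(k=2, n_center=5):
    """Coded-unit CCD: 2^k factorial corners, 2k axial (star) points at
    +/- alpha, and replicated center points; alpha = (2^k)^(1/4) gives
    rotatability."""
    alpha = (2 ** k) ** 0.25
    corners = [list(p) for p in itertools.product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = s
            axial.append(pt)
    centers = [[0.0] * k for _ in range(n_center)]
    return corners + axial + centers
```

Each coded row would then be mapped back to physical units (e.g., -1/+1 to the low/high temperature levels) before running the batch experiments and fitting the quadratic response model.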
Procedia PDF Downloads 76
22193 Mariculture Trials of the Philippine Blue Sponge Xestospongia sp.
Authors: Clairecynth Yu, Geminne Manzano
Abstract:
The mariculture potential of the Philippine blue sponge, Xestospongia sp., was assessed through pilot sponge culture in the open sea at two different biogeographic regions in the Philippines. Thirty explants were randomly allocated to the Puerto Galera, Oriental Mindoro culture setup, and another nine were transported to Lucero, Bolinao, Pangasinan. Two different culture methods for the sponge explants, the lantern and the wall method, were employed to assess the production of Renieramycin M. Both methods proved effective in growing the sponge explants, and thin-layer chromatography (TLC) results showed that Renieramycin M is present in the sponges. The effect of partial harvesting on the growth and survival rates of the blue sponge in the Puerto Galera setup was also determined. Results showed a higher growth rate in the partially harvested explants under both culture methods as compared to the unharvested explants.
Keywords: chemical ecology, porifera, sponge, Xestospongia sp.
Procedia PDF Downloads 273
22192 Statistical Analysis of Rainfall Change over the Blue Nile Basin
Authors: Hany Mustafa, Mahmoud Roushdi, Khaled Kheireldin
Abstract:
Rainfall variability is an important feature of semi-arid climates. Climate change is very likely to increase the frequency, magnitude, and variability of extreme weather events such as droughts, floods, and storms. The Blue Nile Basin is facing extreme climate change-related events such as floods and droughts, with possible impacts expected on ecosystems, livelihoods, agriculture, livestock, and biodiversity. Rainfall variability is a threat to food production in the Blue Nile Basin countries. This study investigates the long-term variations and trends of seasonal and annual precipitation over the Blue Nile Basin for a 102-year period (1901-2002). Statistical trend analysis of precipitation was performed with the nonparametric Mann-Kendall test and Sen's slope estimator. In addition, four absolute homogeneity tests, the Standard Normal Homogeneity Test, the Buishand range test, the Pettitt test, and the Von Neumann ratio test, were applied to the rainfall data using XLSTAT software; results with p-values less than alpha = 0.05 were considered significant. The percentages of significant trends obtained for each parameter in the different seasons are presented. The study recommends that adaptation strategies be streamlined into relevant policies, enhancing local farmers' adaptive capacity to face future climate change effects.
Keywords: Blue Nile basin, climate change, Mann-Kendall test, trend analysis
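The Mann-Kendall statistic and Sen's slope used above are straightforward to compute. The sketch below is a minimal pure-Python version without the tie correction that a production analysis (or XLSTAT) would apply:

```python
import math
from statistics import median

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction) with Sen's slope.
    Returns (S, Z, two-sided p-value, Sen's slope)."""
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    # Two-sided p-value from the standard normal CDF
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    slope = median((x[j] - x[i]) / (j - i)
                   for i in range(n - 1) for j in range(i + 1, n))
    return s, z, p, slope
```

For a monotonically increasing series the test returns a large positive Z and a p-value far below 0.05, matching the significance criterion used in the study.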
Procedia PDF Downloads 549
22191 Comparative Study of Dose Calculation Accuracy in Bone Marrow Using Monte Carlo Method
Authors: Marzieh Jafarzadeh, Fatemeh Rezaee
Abstract:
Introduction: Ionizing radiation can affect genomic integrity and cell viability and increases the risk of cancer and malignancy; X-ray behavior and absorbed dose calculation are therefore of interest. One applicable tool for calculating and evaluating the absorbed dose in human tissues is Monte Carlo simulation, which offers a straightforward way to simulate particle transport and integrate dose, and is simple to use. The Monte Carlo BEAMnrc code, one of the most common diagnostic X-ray simulation codes, was used in this study. Method: A number of CT scan images of previously imaged patients were extracted from the database of one of the hospitals under study. BEAMnrc software was used for the simulation. The head of the device was simulated at an energy of 0.09 MeV with 500 million particles, and the output data from the simulation were used for phantom construction with the ctcreate software. The percentage depth dose (PDD) was calculated using STATDOSE and then compared with international standard values. Results and Discussion: The ratio of surface dose to depth dose (Ds/D) at the measured energy was estimated to be about 4% to 8% for bone and 3% to 7% for bone marrow. Conclusion: MC simulation is an efficient and accurate method for simulating bone marrow and calculating the absorbed dose.
Keywords: Monte Carlo, absorption dose, BEAMnrc, bone marrow
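As a flavor of how a Monte Carlo depth-dose curve arises, the toy sketch below samples photon interaction depths from an exponential distribution and bins the deposited energy. The attenuation coefficient, slab depth, and the neglect of scatter and secondary electrons are all simplifying assumptions; this is in no way a substitute for BEAMnrc and STATDOSE.

```python
import random

def percent_depth_dose(n_photons=100_000, mu=0.25, depth_cm=10.0,
                       bin_cm=0.5, seed=1):
    """Toy Monte Carlo: each photon deposits all its energy at its first
    interaction site, sampled exponentially with coefficient mu (1/cm).
    Without scatter, the dose falls off monotonically with depth."""
    rng = random.Random(seed)
    n_bins = int(depth_cm / bin_cm)
    dose = [0] * n_bins
    for _ in range(n_photons):
        d = rng.expovariate(mu)        # depth of first interaction
        if d < depth_cm:
            dose[int(d / bin_cm)] += 1
    peak = max(dose)
    # Normalize to the maximum bin, as a PDD curve is conventionally scaled
    return [100.0 * c / peak for c in dose]
```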
Procedia PDF Downloads 212
22190 A Study of Agile Based Approaches to Improve Software Quality
Authors: Gurmeet Kaur
Abstract:
Agile software development methods are recognized as a popular and efficient approach to developing software systems with a short delivery period and high quality that meet customer requirements with zero defects. In agile software development, quality means code quality, which is maintained through methods or approaches such as refactoring, test-driven development, behavior-driven development, acceptance-test-driven development, and demand-driven development. Software quality is measured in terms of metrics such as the number of defects found during software development. Usage of the above-mentioned methods or approaches reduces the possibility of defects in the developed software and hence improves quality. This paper focuses on the study of agile-based quality methods and approaches for software development that ensure improved software quality as well as reduced cost and customer satisfaction.
Procedia PDF Downloads 172
22189 Synthesis and Spectrophotometric Study of Omeprazole Charge Transfer Complexes with Bromothymol Blue, Methyl Orange, and Picric Acid
Authors: Saeeda Nadir Ali, Najma Sultana, Muhammad Saeed Arayne
Abstract:
Charge transfer complexes of omeprazole with bromothymol blue, methyl orange, and picric acid have been studied in aqueous medium; they obey Beer's law over the ranges 7-56, 6-48, and 10-80 µg mL⁻¹, exhibit a 1:1 stoichiometric ratio, and show maximum wavelengths of 400, 420, and 373 nm, respectively. ICH guidelines were followed for the validation study. Spectroscopic parameters including oscillator strength, dipole moment, ionization potential, energy of the complexes, resonance energy, association constant, and Gibbs free energy change have also been investigated, and a Benesi-Hildebrand plot was obtained in each case. In addition, the methods were fruitfully employed for the determination of omeprazole in pharmaceutical formulations, with no obstruction from excipients during analysis. Solid omeprazole complexes with all the acceptors were synthesized, and their structures were elucidated by IR and 1H NMR spectroscopy.
Keywords: omeprazole, bromothymol blue, methyl orange and picric acid, charge transfer complexes
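The association constant from a Benesi-Hildebrand plot falls out of a simple linear regression. The sketch below runs the arithmetic on synthetic 1:1-complex data; the K and epsilon values are invented for illustration and are not the omeprazole results.

```python
def linear_fit(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def benesi_hildebrand_K(donor_conc, delta_A, acceptor_conc):
    """For a 1:1 complex, [A]0/dA = (1/(K*eps)) * (1/[D]) + 1/eps,
    so the association constant is K = intercept / slope."""
    xs = [1.0 / d for d in donor_conc]
    ys = [acceptor_conc / dA for dA in delta_A]
    slope, intercept = linear_fit(xs, ys)
    return intercept / slope
```

With exactly linear synthetic data the fit recovers the assumed K to floating-point precision; real absorbance data would scatter around the line.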
Procedia PDF Downloads 540
22188 Ecological impacts of Cage Farming: A Case Study of Lake Victoria, Kenya
Authors: Mercy Chepkirui, Reuben Omondi, Paul Orina, Albert Getabu, Lewis Sitoki, Jonathan Munguti
Abstract:
Globally, the decline in capture fisheries as a result of the growing population, together with increasing awareness of the nutritional benefits of white meat, has led to the development of aquaculture. This is anticipated to meet the increasing demand for food from a human population that is likely to increase further by 2050. Statistics show that more than 50% of the global future fish diet will come from aquaculture. Aquaculture began to be commercialized some decades ago, which is credited to technological advancement from traditional to modern culture systems, including cage farming. Cage farming technology has grown rapidly since its inception in Lake Victoria, Kenya. Currently, over 6,000 cages have been set up in Kenyan waters, and this offers an excellent opportunity for realizing Kenya's government strategy to eliminate food insecurity and malnutrition, create employment, and promote a Blue Economy. However, being an open farming enterprise, it is likely to emit a large bulk of waste, hence altering the ecosystem integrity of the lake through increased chlorophyll-a pigments, alteration of the plankton and macroinvertebrate communities, fish genetic pollution, and transmission of fish diseases and pathogens. Cage farming further increases nutrient loads, leading to the production of harmful algal blooms, thus negatively affecting aquatic and human life. Despite this ecological transformation, cage farming provides a platform for the achievement of the Sustainable Development Goals of 2030, especially food security and nutrition. Therefore, there is a need for Integrated Multitrophic Aquaculture, as part of Blue Transformation, for ecosystem monitoring.
Keywords: aquaculture, ecosystem, blue economy, food security
Procedia PDF Downloads 78
22187 Virtual Reality Based 3D Video Games and Speech-Lip Synchronization Superseding Algebraic Code Excited Linear Prediction
Authors: P. S. Jagadeesh Kumar, S. Meenakshi Sundaram, Wenli Hu, Yang Yung
Abstract:
In 3D video games, production continues to grow, with increasing affordability in terms of budget. At the same time, automating the speech-lip synchronization process is customarily onerous, and it has become a critical research subject in virtual reality based 3D video games. This paper presents one such automatic tool, focused precisely on the synchronization of the speech and the lip movement of the game characters. A robust and precise speech recognition stage built on the Algebraic Code Excited Linear Prediction method is developed, which delivers lip-sync results unconventionally. The Algebraic Code Excited Linear Prediction algorithm is constructed on that used in code-excited linear prediction, but Algebraic Code Excited Linear Prediction codebooks have an explicit algebraic structure imposed upon them. This affords a quicker substitute to software enactments of lip-sync algorithms and thus improves quality-of-service factors at an abridged production cost.
Keywords: algebraic code excited linear prediction, speech-lip synchronization, video games, virtual reality
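Code-excited linear prediction and its algebraic variant both rest on short-term linear prediction of the speech signal. The fragment below is a minimal Levinson-Durbin LPC sketch, not the paper's ACELP codec: it recovers the predictor coefficient of a synthetic first-order autoregressive signal, with all signal parameters assumed for illustration.

```python
import random

def autocorr(x, max_lag):
    """Biased sample autocorrelation for lags 0..max_lag."""
    n = len(x)
    return [sum(x[i] * x[i + k] for i in range(n - k)) / n
            for k in range(max_lag + 1)]

def levinson_durbin(r, order):
    """Solve the Yule-Walker equations by the Levinson-Durbin recursion.
    Returns predictor coefficients a such that x[n] is approximated by
    sum(a[k] * x[n - 1 - k]), plus the final prediction error power."""
    a = [0.0] * (order + 1)
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] - sum(a[j] * r[i - j] for j in range(1, i))
        k = acc / err                   # reflection coefficient
        new_a = a[:]
        new_a[i] = k
        for j in range(1, i):
            new_a[j] = a[j] - k * a[i - j]
        a = new_a
        err *= (1.0 - k * k)
    return a[1:], err
```

In a CELP-family codec these short-term coefficients shape the synthesis filter, while the (algebraic) codebook supplies the excitation.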
Procedia PDF Downloads 474
22186 Practical Challenges of Tunable Parameters in Matlab/Simulink Code Generation
Authors: Ebrahim Shayesteh, Nikolaos Styliaras, Alin George Raducu, Ozan Sahin, Daniel Pombo VáZquez, Jonas Funkquist, Sotirios Thanopoulos
Abstract:
One important requirement in many code generation projects is defining some of the model parameters as tunable. This makes it possible to update the model parameters without performing the code generation again. This paper studies the concept of embedded code generation by the MATLAB/Simulink coder targeting the TwinCAT Simulink system. The generated runtime modules are then tested and deployed to the TwinCAT 3 engineering environment. However, defining the parameters as tunable in MATLAB/Simulink code generation targeting TwinCAT is not very straightforward. This paper focuses on this subject and reviews some of the techniques tested here to make the parameters tunable in generated runtime modules. Three techniques are examined for this purpose: normal tunable parameters, callback functions, and mask subsystems. Moreover, several test Simulink models were developed and used to evaluate the proposed approaches. A brief summary of the study results follows. First of all, a parameter defined as tunable and used to set the values of other Simulink elements (e.g., the gain value of a gain block) can be changed after the code generation, and this update will affect the values of all elements defined based on the tunable parameter. For instance, if parameter K = 1 is defined as a tunable parameter in the code generation process and is used as the gain of a gain block in Simulink, the gain value of the block equals 1 in the TwinCAT environment after the code generation. However, the value of K can then be changed to a new value (e.g., K = 2) in TwinCAT, without any new code generation in MATLAB, and the gain value of the gain block will change to 2. Secondly, adding a callback function in the form of a "pre-load function," "post-load function," or "start function" will not help to make the parameters tunable without performing a new code generation.
This means that any MATLAB files should be run before performing the code generation. The parameters defined or calculated in such a file are used as fixed values in the generated code. Thus, adding these files as callback functions to the Simulink model will not make their parameters flexible, since the MATLAB files are not attached to the generated code. Therefore, to change the parameters defined or calculated in these files, the code generation must be done again. However, adding these files as callback functions forces MATLAB to run them before the code generation, so there is no need to define the parameters mentioned in these files separately. Finally, using a tunable parameter in defining or calculating the values of other parameters through a mask is an efficient method to change the value of the latter parameters after the code generation. For instance, if a tunable parameter K is used in calculating the value of two other parameters K1 and K2, and, after the code generation, the value of K is updated in the TwinCAT environment, the values of K1 and K2 will also be updated (without any new code generation).
Keywords: code generation, MATLAB, tunable parameters, TwinCAT
Procedia PDF Downloads 227
22185 Design and Performance Improvement of Three-Dimensional Optical Code Division Multiple Access Networks with NAND Detection Technique
Authors: Satyasen Panda, Urmila Bhanja
Abstract:
In this paper, we present and analyze three-dimensional (3-D) wavelength/time/space code matrices for optical code division multiple access (OCDMA) networks with a NAND subtraction detection technique. The 3-D codes are constructed by integrating a two-dimensional modified quadratic congruence (MQC) code with a one-dimensional modified prime (MP) code. The respective encoders and decoders were designed using fiber Bragg gratings and optical delay lines to minimize the bit error rate (BER). The performance analysis of the 3-D OCDMA system is based on measurements of signal-to-noise ratio (SNR), BER, and eye diagrams for different numbers of simultaneous users. Various types of noise and multiple access interference (MAI) effects were also considered in the analysis. The results obtained with the NAND detection technique were compared with those obtained with the OR and AND subtraction techniques. The comparison proved that the NAND detection technique with the 3-D MQC/MP code can accommodate a larger number of simultaneous users over longer fiber distances with minimum BER as compared to the OR and AND subtraction techniques. The received optical power was also measured at various levels of BER to analyze the effect of attenuation.
Keywords: cross correlation (CC), three-dimensional optical code division multiple access (3-D OCDMA), spectral amplitude coding optical code division multiple access (SAC-OCDMA), multiple access interference (MAI), phase induced intensity noise (PIIN), three-dimensional modified quadratic congruence/modified prime (3-D MQC/MP) code
Procedia PDF Downloads 412
22184 Estimating Interdependence of Social Statuses in Cooperative Breeding Birds through Mathematical Modelling
Authors: Sinchan Ghosh, Fahad Al Basir, Santanu Ray, Sabyasachi Bhattacharya
Abstract:
Cooperatively breeding birds have two major ranks among sexually mature individuals: breeders mate and produce offspring, while non-breeding helpers increase the chick production rate by helping with mate-finding and allo-parenting. The chicks also cooperate to raise their younger siblings through warming, defending, and food sharing. Although the existing literature describes the evolution of allo-parenting in birds, it does not differentiate the significance of allo-parenting by sexually immature and mature helpers separately. This study addresses the contributions of both immature and mature helpers to the total sustainable bird population in a breeding site, using the Blue-tailed bee-eater as a test-bed species. To serve this purpose, a mathematical model has been built that treats each social status and the chicks as separate but interacting compartments. Also, to observe the dynamics of each social status with changing prey abundance, a prey population has been introduced as an additional compartment. The model was analyzed for stability conditions and validated using field data. A simulation experiment was then performed to observe the change in equilibria under varying helping rates from both types of helpers. The results suggest that the cooperative breeding population changes its population sizes significantly with a change in the helping rate of the sexually immature helpers. The mature helpers, on the other hand, do not contribute to the stability of the population equilibrium as much as the immature helpers.
Keywords: Blue-tailed bee eater, altruism, mathematical ethology, behavioural modelling
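To illustrate what such a compartment model looks like, the sketch below integrates a hypothetical chick/helper/breeder system with Euler steps. The equations and every rate constant are invented for illustration (the prey compartment and the immature/mature helper split of the actual study are omitted); the point is only the mechanics of coupling the social statuses.

```python
def simulate(years=300.0, dt=0.01):
    """Euler integration of a toy chick (C) / helper (H) / breeder (B)
    model: chick production saturates in breeder density and is boosted
    by a bounded helper effect."""
    f0, K, a = 1.5, 20.0, 0.05     # max fecundity, saturation, helper boost
    g, mc = 0.5, 0.2               # fledging rate, chick mortality
    p, mh, mb = 0.3, 0.1, 0.4      # promotion rate, helper/breeder mortality
    C, H, B = 5.0, 5.0, 5.0
    for _ in range(int(years / dt)):
        births = f0 * B / (1 + B / K) * (1 + a * H / (1 + H))
        dC = births - (g + mc) * C
        dH = g * C - (p + mh) * H
        dB = p * H - mb * B
        C += dC * dt
        H += dH * dt
        B += dB * dt
    return C, H, B
```

At the (illustrative) steady state the compartments satisfy the balance relations H = gC/(p+mh) and B = pH/mb, which gives a quick internal consistency check on the integration.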
Procedia PDF Downloads 162
22183 Distinguishing Borrowings from Code Mixes: An Analysis of English Lexical Items Used in the Print Media in Sri Lanka
Authors: Chamindi Dilkushi Senaratne
Abstract:
Borrowing is the morphological, syntactic, and (usually) phonological integration of lexical items from one language into the structure of another language. Borrowings show complete linguistic integration and, due to their frequency of use, become fossilized in the recipient language, differentiating them from switches and mixes. Code mixes are different from borrowings: code mixing takes place when speakers use lexical items in casual conversation to serve a variety of functions. This study presents an analysis of lexical items used in English newspapers in Sri Lanka in 2017 that reveal characteristics of borrowings or code mixes. Both phenomena arise from language contact. The study also draws on data from social media websites that comment on newspaper articles available on the web. The study reiterates that borrowings are distinguishable from code mixes and that they are two different phenomena occurring in language contact situations. It also shows how existing morphological processes are used to create new vocabulary in language use, and how these processes allow the bilingual to be creative, innovative, and to convey a bilingual identity.
Keywords: borrowing, code mixing, morphological processes
Procedia PDF Downloads 219
22182 Using RASCAL Code to Analyze the Postulated UF6 Fire Accident
Authors: J. R. Wang, Y. Chiang, W. S. Hsu, S. H. Chen, J. H. Yang, S. W. Chen, C. Shih, Y. F. Chang, Y. H. Huang, B. R. Shen
Abstract:
In this research, the RASCAL code was used to simulate and analyze the postulated UF6 fire accident which may occur at the Institute of Nuclear Energy Research (INER). There are four main steps in this research. In the first step, the UF6 data of INER were collected. In the second step, the RASCAL analysis methodology and model were established using these data. Third, this RASCAL model was used to perform the simulation and analysis of the postulated UF6 fire accident; three cases were simulated and analyzed in this step. Finally, the analysis results of RASCAL were compared with the hazardous levels of the chemicals. According to the compared results of the three cases, Case 3 poses the greatest danger to human health.
Keywords: RASCAL, UF₆, safety, hydrogen fluoride
Procedia PDF Downloads 222
22181 Seismic Inversion to Improve the Reservoir Characterization: Case Study in Central Blue Nile Basin, Sudan
Authors: Safwat E. Musa, Nuha E. Mohamed, Nuha A. Bagi
Abstract:
In this study, several crossplots of P-impedance against lithology logs (gamma ray, neutron porosity, deep resistivity, water saturation, and Vp/Vs curves) were made in three available wells drilled in the central part of the Blue Nile basin at depths varying from 1460 m to 1600 m. These crossplots successfully discriminated between sand and shale using P-impedance values, and between wet sand and pay sand using P-impedance and Vp/Vs together. Some impedance sections were also converted to porosity sections using a linear formula to characterize the reservoir in terms of porosity. The crossplots were created at log resolution, while seismic resolution can identify only the reservoir; if 3D seismic angle stacks were available, it would be easier to identify the pay sand with great confidence through high-resolution seismic inversion and a geostatistical approach using P-impedance and Vp/Vs volumes.
Keywords: basin, Blue Nile, inversion, seismic
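The impedance-to-porosity step can be as simple as a two-point linear calibration. The sketch below illustrates that idea; the tie-point numbers are hypothetical, not values from the Blue Nile wells.

```python
def porosity_transform(z1, phi1, z2, phi2):
    """Build a linear P-impedance -> porosity mapping phi = a + b * Zp
    from two calibration points (e.g. well-log ties). Porosity normally
    decreases as impedance increases, so b comes out negative."""
    b = (phi2 - phi1) / (z2 - z1)
    a = phi1 - b * z1
    return lambda zp: a + b * zp

# Hypothetical calibration: impedance 6000 maps to 28% porosity,
# impedance 9000 maps to 10% porosity.
to_phi = porosity_transform(6000.0, 0.28, 9000.0, 0.10)
```

Applying `to_phi` sample-by-sample to an inverted impedance section yields the porosity section described in the abstract.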
Procedia PDF Downloads 430
22180 Insights on the Social-Economic Implications of the Blue Economy Concept on Coastal Tourism in Tonga
Authors: Amelia Faotusia
Abstract:
The blue economy concept was coined by Pacific nations in recognition of the importance of sustainably managing their extensive marine territories. This is especially important for major ocean-based economic sectors of Pacific economies, such as coastal tourism. There is an absence of research, however, on the key ways in which the blue economy concept has emerged in discourse and public policy in Pacific countries, as well as how it articulates with coastal tourism. This research helps to fill such a gap with a specific focus on Tonga through the application of a post-positivist research approach to conduct a desktop study of relevant national documents and qualitative interviews with relevant government staff, civil society organizations, and tourism operators. The findings of the research reflect the importance of institutional integration and partnerships for a successful blue economy transition and are presented in the form of two case studies corresponding to two sub-sectors of Tonga’s coastal tourism sector: (i) the whale-watching and swimming industry, and (ii) beach resorts and restaurants. A thematic analysis applied to the interview data of both cases then enabled the identification of key areas and issues for socio-economic policy intervention and recommendations in support of blue economy transitions in Tonga’s coastal tourism sector. Examples of the relevant areas and issues that emerged included the importance of foreign direct investment, local market access, community-based special management areas, as well as the need to address the anthropogenic impacts of tropical cyclones, whale tourism, plastic litter on coastal assets, and ecosystems. 
Policy and practical interventions in support of addressing such issues include a proposed restructuring of the whale-watching and swimming licensing system; integration of climate resilience, adaptation, and capacity building as priorities of local blue economy interventions; as well as strengthening of the economic sustainability dimension of blue economy policies. Finally, this research also revealed the need for further specificity and research on the influence and value of local Tongan culture and traditional knowledge, particularly within existing customary marine tenure systems, on Tonga's national and sectoral blue economy policies and transitions.
Keywords: blue economy, coastal tourism, integrated ocean management, ecosystem resilience
Procedia PDF Downloads 91
22179 An Electrochemical DNA Biosensor Based on Oracet Blue as a Label for Detection of Helicobacter pylori
Authors: Saeedeh Hajihosseini, Zahra Aghili, Navid Nasirizadeh
Abstract:
An innovative DNA electrochemical biosensor based on Oracet Blue (OB) as an electroactive label and a gold electrode (AuE), for the detection of Helicobacter pylori, is presented. A single-stranded DNA probe with a thiol modification was covalently immobilized on the surface of the AuE by forming an Au-S bond. Differential pulse voltammetry (DPV) was used to monitor DNA hybridization by measuring the electrochemical reduction signal of the OB bound to double-stranded DNA (ds-DNA). Our results show that the OB-based DNA biosensor has considerable potential for the detection of a single-base mismatch in target DNA. The selectivity of the proposed DNA biosensor was further confirmed in the presence of non-complementary and complementary DNA strands. Under optimum conditions, the electrochemical signal had a linear relationship with the concentration of the target DNA ranging from 0.3 nmol L⁻¹ to 240.0 nmol L⁻¹, and the detection limit was 0.17 nmol L⁻¹, with promising reproducibility and repeatability.
Keywords: DNA biosensor, Oracet Blue, Helicobacter pylori, gold electrode (AuE)
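Figures such as a linear range and a detection limit are conventionally obtained from a least-squares calibration line and the standard deviation of blank measurements. The sketch below shows that arithmetic on synthetic numbers; the slope, blank currents, and the k = 3.3 factor are illustrative assumptions, not the paper's data.

```python
from statistics import stdev

def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def detection_limit(blank_signals, slope, k=3.3):
    """IUPAC-style LOD: k * (standard deviation of blank) / slope."""
    return k * stdev(blank_signals) / slope

# Synthetic calibration: peak current (uA) vs target DNA (nmol/L)
conc = [0.3, 1, 5, 20, 60, 120, 240]
current = [0.9 + 0.05 * c for c in conc]     # ideal linear response
slope, intercept = fit_line(conc, current)
lod = detection_limit([0.90, 0.91, 0.89, 0.905, 0.895], slope)
```

On real voltammograms the blank replicates and calibration points carry noise, and the resulting LOD is reported alongside the linear range, as in the abstract.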
Procedia PDF Downloads 266
22178 Testing and Validation of Stochastic Models in Epidemiology
Authors: Snigdha Sahai, Devaki Chikkavenkatappa Yellappa
Abstract:
This study outlines approaches for testing and validating stochastic models used in epidemiology, focusing on the integration and functional testing of simulation code. It details methods for combining simple functions into comprehensive simulations, distinguishing between deterministic and stochastic components, and applying tests to ensure robustness. Techniques include isolating stochastic elements, utilizing large sample sizes for validation, and handling special cases. Practical examples are provided using R code to demonstrate integration testing, handling of incorrect inputs, and special cases. The study emphasizes the importance of both functional and defensive programming to enhance code reliability and user-friendliness.
Keywords: computational epidemiology, epidemiology, public health, infectious disease modeling, statistical analysis, health data analysis, disease transmission dynamics, predictive modeling in health, population health modeling, quantitative public health, random sampling simulations, randomized numerical analysis, simulation-based analysis, variance-based simulations, algorithmic disease simulation, computational public health strategies, epidemiological surveillance, disease pattern analysis, epidemic risk assessment, population-based health strategies, preventive healthcare models, infection dynamics in populations, contagion spread prediction models, survival analysis techniques, epidemiological data mining, host-pathogen interaction models, risk assessment algorithms for disease spread, decision-support systems in epidemiology, macro-level health impact simulations, socioeconomic determinants in disease spread, data-driven decision making in public health, quantitative impact assessment of health policies, biostatistical methods in population health, probability-driven health outcome predictions
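The abstract's examples are in R, but the same testing strategies translate directly. The Python sketch below shows the pattern on a hypothetical one-step infection function: a deterministic edge case, a seeded-reproducibility check, a large-sample statistical check, and a defensive input check.

```python
import math
import random

def new_infections(susceptible, infected, total, beta, rng):
    """One stochastic simulation step: each susceptible becomes infected
    with probability 1 - exp(-beta * infected / total)."""
    if susceptible < 0 or infected < 0 or total <= 0:
        raise ValueError("invalid population sizes")  # defensive programming
    p = 1.0 - math.exp(-beta * infected / total)
    return sum(1 for _ in range(susceptible) if rng.random() < p)

# 1) Deterministic edge case: beta = 0 can never infect anyone.
assert new_infections(100, 10, 110, 0.0, random.Random(1)) == 0

# 2) Stochastic element isolated behind an injected RNG: a fixed seed
#    makes the run exactly reproducible.
assert (new_infections(100, 10, 110, 0.5, random.Random(7))
        == new_infections(100, 10, 110, 0.5, random.Random(7)))

# 3) Large-sample validation: the empirical mean approaches the
#    analytic expectation 100 * p.
p = 1.0 - math.exp(-0.5 * 10 / 110)
draws = [new_infections(100, 10, 110, 0.5, random.Random(i)) for i in range(500)]
assert abs(sum(draws) / 500 - 100 * p) < 0.5
```

Injecting the RNG rather than using the global generator is what makes both the exact-seed test and the statistical test possible without touching the function under test.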
Procedia PDF Downloads 6
22177 A Real-Time Simulation Environment for Avionics Software Development and Qualification
Authors: Ferdinando Montemari, Antonio Vitale, Nicola Genito, Luca Garbarino, Urbano Tancredi, Domenico Accardo, Michele Grassi, Giancarmine Fasano, Anna Elena Tirri
Abstract:
The development of guidance, navigation and control algorithms and avionic procedures requires the availability of suitable analysis and verification tools, such as simulation environments, which support the design process and allow detecting potential problems prior to the flight test, in order to make new technologies available at reduced cost, time and risk. This paper presents a simulation environment for avionic software development and qualification, especially aimed at equipment for general aviation aircraft and unmanned aerial systems. The simulation environment includes models for short and medium-range radio-navigation aids, flight assistance systems, and ground control stations. All the software modules are able to simulate the modeled systems in both fast-time and real-time tests, and were implemented following component-oriented modeling techniques and a requirement-based approach. The paper describes the specific features of the models, the architectures of the implemented software systems and their validation process. The validation tests performed highlighted the capability of the simulation environment to guarantee in real time the required functionalities and performance of the simulated avionics systems, as well as to reproduce the interaction between these systems, thus permitting a realistic and reliable simulation of a complete mission scenario. Keywords: ADS-B, avionics, NAVAIDs, real-time simulation, TCAS, UAS ground control station
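The fast-time/real-time distinction mentioned above can be illustrated with a small scheduler sketch (a toy in Python, not the paper's component-oriented implementation): in fast-time mode the loop runs as quickly as possible, while in real-time mode each step is held to a wall-clock deadline.

```python
import time

def run_simulation(step_fn, dt, steps, real_time=True, state=0.0):
    """Fixed-step simulation loop with optional wall-clock pacing."""
    next_deadline = time.monotonic()
    for _ in range(steps):
        state = step_fn(state, dt)       # advance the model by one step
        if real_time:
            next_deadline += dt          # hold each step to its deadline
            delay = next_deadline - time.monotonic()
            if delay > 0:
                time.sleep(delay)
    return state

# Fast-time: integrate dx/dt = 1 over one simulated second, instantly.
x = run_simulation(lambda s, dt: s + dt, 0.01, 100, real_time=False)
```

A real environment would also monitor overruns (a step finishing after its deadline), which is what "guarantee in real time the required functionalities" implies.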
Procedia PDF Downloads 228
22176 Design and Implementation of Low-code Model-building Methods
Authors: Zhilin Wang, Zhihao Zheng, Linxin Liu
Abstract:
This study proposes a low-code model-building approach that aims to simplify the development and deployment of artificial intelligence (AI) models. With an intuitive drag-and-drop interface for connecting components, users can easily build complex models and integrate multiple algorithms for training. After training is completed, the system automatically generates a callable model service API. This method not only lowers the technical threshold of AI development and improves development efficiency but also enhances the flexibility of algorithm integration and simplifies the deployment process of models. The core strength of this method lies in its ease of use and efficiency. Users do not need a deep programming background and can complete the design and implementation of complex models with simple drag-and-drop operations. This feature greatly expands the reach of AI technology, allowing more non-technical people to participate in the development of AI models. At the same time, the method performs well in algorithm integration, supporting many different types of algorithms working together, which further improves the performance and applicability of the models. In the experimental part, we performed several performance tests on the method. The results show that, compared with traditional model construction methods, this method makes more efficient use of computing resources and greatly shortens model training time. In addition, the system-generated model service interface has been optimized for high availability and scalability and can adapt to the needs of different application scenarios. Keywords: low-code, model building, artificial intelligence, algorithm integration, model deployment
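As a rough illustration of the idea (not the authors' system), a declarative pipeline of named components can be "compiled" into a callable service, mirroring the drag-and-connect canvas and the auto-generated model API described above; the components here are hypothetical placeholders:

```python
# Each "component" is a named function; a model is a list of component
# names wired in sequence, mimicking a drag-and-drop canvas.
COMPONENTS = {
    "normalize": lambda xs: [(x - min(xs)) / (max(xs) - min(xs) or 1) for x in xs],
    "square": lambda xs: [x * x for x in xs],
    "sum": lambda xs: sum(xs),
}

def build_model(pipeline):
    """Compile a declarative pipeline into a callable 'model service'."""
    steps = [COMPONENTS[name] for name in pipeline]

    def service(inputs):
        out = inputs
        for step in steps:
            out = step(out)     # feed each component's output to the next
        return out

    return service

model = build_model(["normalize", "square", "sum"])
```

A production system would expose `service` behind an HTTP endpoint and support branching graphs rather than a linear chain.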
Procedia PDF Downloads 29
22175 Comparison of Efficient Production of Small Module Gears
Authors: Vaclav Musil, Robert Cep, Sarka Malotova, Jiri Hajnys, Frantisek Spalek
Abstract:
The new designs of satellite gears comprising a number of small gears pose high requirements on the precise production of small module gears. The objective of the experimental activity described in this article was to compare conventional rolling gear cutting technology with modern wire electrical discharge machining (WEDM) technology for the production of a small module gear, m = 0.6 mm (thickness of 2.5 mm, material 30CrMoV9). The WEDM technology consists in copying the gearing profile from a rendered trajectory, which is then transferred to the path of a wire electrode. During the experiment, we focused on the comparison of these production methods. The main measured parameters, which significantly influence lifetime and noise, were chosen. The first parameter was the precision of the gearing profile with respect to the mathematical model. The second monitored parameter was the roughness and surface topology of the gear tooth flank. The experiment demonstrated the high accuracy of the WEDM technology, but a low quality of the machined surface. Keywords: precision of gearing, small module gears, surface topology, WEDM technology
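The "rendered trajectory" copied by the wire electrode is essentially a sampled involute flank. A sketch of how such a trajectory could be generated for a gear of module 0.6 mm is shown below; the 20° pressure angle, tooth count and one-module addendum are illustrative assumptions, and the root fillet geometry is omitted:

```python
import math

def involute_points(module, teeth, pressure_angle_deg=20.0, steps=20):
    """Sample points on an involute tooth flank of a spur gear.

    Base-circle involute: x = rb*(cos t + t*sin t), y = rb*(sin t - t*cos t),
    whose radius at parameter t is rb*sqrt(1 + t^2).
    """
    rb = 0.5 * module * teeth * math.cos(math.radians(pressure_angle_deg))
    r_tip = 0.5 * module * teeth + module        # addendum = 1 module (assumed)
    t_max = math.sqrt((r_tip / rb) ** 2 - 1.0)   # parameter at the tip circle
    pts = []
    for i in range(steps + 1):
        t = t_max * i / steps
        pts.append((rb * (math.cos(t) + t * math.sin(t)),
                    rb * (math.sin(t) - t * math.cos(t))))
    return pts

flank = involute_points(module=0.6, teeth=20)    # trajectory for the wire path
```

The sampled points run from the base circle out to the tip circle; a CAM post-processor would offset them by the wire radius plus spark gap.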
Procedia PDF Downloads 232
22174 Application of Biopolymer for Adsorption of Methylene Blue Dye from Simulated Effluent: A Green Method for Textile Industry Wastewater Treatment
Authors: Rabiya, Ramkrishna Sen
Abstract:
The textile industry releases huge volumes of effluent containing reactive dyes into nearby water bodies. These effluents are a significant source of water pollution, since most of the dyes are toxic in nature. Moreover, they scavenge the dissolved oxygen essential to aquatic species. Therefore, it is necessary to treat the dye effluent before it is discharged into nearby water bodies. The present study focuses on removing the basic dye methylene blue from simulated wastewater using a biopolymer. The biopolymer was partially purified from a culture of Bacillus licheniformis by ultrafiltration. Based on the elution profile of the biopolymer from an ion exchange column, it was found to be a negatively charged molecule. Its net anionic nature allows the biopolymer to adsorb the positively charged dye methylene blue. The major factors which influence the removal of the dye, such as incubation time, pH and initial dye concentration, were evaluated. The methylene blue uptake by the biopolymer is higher near neutral pH (14.84 mg/g) than at acidic pH (12.05 mg/g). At low pH, the lower dissociation of the dye molecule as well as the lower negative charge available on the biopolymer reduce the interaction between the biopolymer and the dye. The optimum incubation time for maximum dye removal was found to be 60 min. The entire study was done with 25 mL of dye solution in a 100 mL flask at 25 °C with a biopolymer dosage of 11 g/L. To study the adsorption isotherm, the dye concentration was varied in the range of 25 mg/L to 205 mg/L, and the dye uptake by the biopolymer was plotted against the equilibrium concentration. The plot indicates that the adsorption of the dye by the biopolymer follows the Freundlich adsorption isotherm (R-squared 0.99). Hence, these studies indicate the potential of the biopolymer for the removal of basic dyes from textile wastewater in an eco-friendly and sustainable way. Keywords: biopolymer, methylene blue dye, textile industry, wastewater
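The Freundlich relationship q = K_F * C^(1/n) reported above (R-squared 0.99) is usually fitted on log-transformed data. A small sketch with hypothetical equilibrium data (the actual measurements are not given in the abstract):

```python
import math

def fit_freundlich(c_eq, q_eq):
    """Fit log q = log K_F + (1/n) log C by ordinary least squares."""
    xs = [math.log(c) for c in c_eq]
    ys = [math.log(q) for q in q_eq]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx                 # 1/n_F
    intercept = my - slope * mx       # log K_F
    # coefficient of determination on the linearized data
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1.0 - ss_res / ss_tot
    return math.exp(intercept), 1.0 / slope, r2   # K_F, n_F, R^2

# Hypothetical equilibrium data: C in mg/L, q in mg/g
c = [5.0, 20.0, 50.0, 90.0, 140.0]
q = [4.1, 6.9, 9.6, 11.8, 13.5]
k_f, n_f, r2 = fit_freundlich(c, q)
```

An n_F greater than 1 (as these illustrative data give) indicates favorable adsorption.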
Procedia PDF Downloads 141
22173 A Neurosymbolic Learning Method for Uplink LTE-A Channel Estimation
Authors: Lassaad Smirani
Abstract:
In this paper, we propose a Neurosymbolic Learning System (NLS) as a channel estimator for the Long Term Evolution Advanced (LTE-A) uplink. The main idea of the proposed system, based on a neural network, is a set of modules capable of performing bidirectional information transfer between a symbolic module and a connectionist module. We demonstrate various strengths of the NLS, especially its ability to integrate theoretical knowledge (rules) and experiential knowledge (examples), and to convert an initial knowledge base of rules into a connectionist network. Through learning, the system can use empirical knowledge to revise the theoretical knowledge, acquire and explain new knowledge, and finally improve the performance of symbolic or connectionist systems. The performance of the NLS in terms of complexity and quality, compared with conventional SC-FDMA channel estimation systems, is confirmed by theoretical analysis and simulation, which show that this system improves channel estimation accuracy and decreases the bit error rate. Keywords: channel estimation, SC-FDMA, neural network, hybrid system, BER, LTE-A
Procedia PDF Downloads 394
22172 Error Detection and Correction for Onboard Satellite Computers Using Hamming Code
Authors: Rafsan Al Mamun, Md. Motaharul Islam, Rabana Tajrin, Nabiha Noor, Shafinaz Qader
Abstract:
In an attempt to enrich the lives of billions of people by providing proper information, security and a way of communicating with others, the need for efficient and improved satellites is constantly growing. Thus, there is an increasing demand for better error detection and correction (EDAC) schemes capable of protecting the data onboard satellites. This paper is aimed at detecting and correcting such errors using a special algorithm called the Hamming code, which uses the concept of parity and parity bits to prevent single-bit errors onboard a satellite in Low Earth Orbit. The paper focuses on the study of Low Earth Orbit satellites and the process of generating the Hamming code matrix to be used for EDAC using computer programs. The most effective version of the Hamming code generated was the Hamming (16, 11, 4) version using MATLAB, and the paper compares this particular scheme with other EDAC mechanisms, including other versions of Hamming codes and the Cyclic Redundancy Check (CRC), along with the limitations of this scheme. This particular version of the Hamming code guarantees single-bit error correction as well as double-bit error detection. Furthermore, this version of the Hamming code has proved to be fast, with a checking time of 5.669 nanoseconds; it has a relatively higher code rate and lower bit overhead compared to the other versions and can detect a greater percentage of errors per length of code than other EDAC schemes with similar capabilities. In conclusion, with the proper implementation of the system, it is quite possible to ensure a relatively uncorrupted satellite storage system. Keywords: bit-flips, Hamming code, low earth orbit, parity bits, satellite, single error upset
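The Hamming (16, 11, 4) scheme the authors evaluated (their implementation used MATLAB) extends Hamming (15, 11) with an overall parity bit, giving single-error correction and double-error detection. A minimal Python sketch of the encode/decode logic:

```python
def hamming_encode(data_bits):
    """Encode 11 data bits into a 16-bit Hamming SECDED codeword."""
    assert len(data_bits) == 11
    code = [0] * 17                          # 1-indexed; index 0 unused
    bits = iter(data_bits)
    for pos in range(1, 16):
        if pos & (pos - 1):                  # not a power of two: data position
            code[pos] = next(bits)
    for p in (1, 2, 4, 8):                   # Hamming parity bits
        code[p] = sum(code[i] for i in range(1, 16) if i & p) % 2
    code[16] = sum(code[1:16]) % 2           # overall parity: distance 4
    return code[1:]

def hamming_decode(codeword):
    """Correct any single-bit error; raise on a detected double-bit error."""
    code = [0] + list(codeword)
    syndrome = 0
    for p in (1, 2, 4, 8):
        if sum(code[i] for i in range(1, 16) if i & p) % 2:
            syndrome |= p
    overall = sum(code[1:17]) % 2
    if syndrome and overall:                 # single error: syndrome = position
        code[syndrome] ^= 1
    elif syndrome:                           # syndrome set, parity clean: 2 flips
        raise ValueError("uncorrectable double-bit error")
    # syndrome == 0 with overall == 1 means only the parity bit itself flipped
    return [code[pos] for pos in range(1, 16) if pos & (pos - 1)]
```

Flipping any single bit of the 16-bit word is corrected; flipping any two bits is detected and reported rather than miscorrected.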
Procedia PDF Downloads 130
22171 Synthesis of Graphene Oxide/Chitosan Nanocomposite for Methylene Blue Adsorption
Authors: S. Melvin Samuel, Jayanta Bhattacharya
Abstract:
In the present study, a graphene oxide/chitosan (GO-CS) composite material was prepared and used as an adsorbent for the removal of methylene blue (MB) from aqueous solution. The synthesized GO-CS adsorbent was characterized by Fourier transform infrared spectroscopy (FT-IR), X-ray diffraction (XRD), scanning electron microscopy (SEM), transmission electron microscopy (TEM), Raman spectroscopy and thermogravimetric analysis (TGA). The removal of MB was conducted in batch mode. The effects of parameters influencing the adsorption of MB, such as pH of the solution, initial MB concentration, shaking speed, contact time and adsorbent dosage, were studied. The results showed that the GO-CS composite material has a high adsorption capacity of 196 mg/g of MB at pH 9.0. Further, the adsorption of MB on GO-CS followed pseudo-second-order kinetics, and the equilibrium adsorption data were well fitted by the Langmuir isotherm model. The study suggests that GO-CS is a favorable adsorbent for the removal of MB from aqueous solution. Keywords: methylene blue, graphene oxide-chitosan, isotherms, kinetics
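The Langmuir fit reported above is commonly done on the linearized form C/q = C/q_max + 1/(K_L * q_max). A sketch with synthetic data generated to match the reported 196 mg/g capacity (the K_L value here is an arbitrary illustrative assumption):

```python
def fit_langmuir(c_eq, q_eq):
    """Least-squares fit of the linearized Langmuir isotherm.

    C/q = C/q_max + 1/(K_L * q_max): the slope gives q_max and the
    intercept gives K_L.
    """
    xs = list(c_eq)
    ys = [c / q for c, q in zip(c_eq, q_eq)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return 1.0 / slope, slope / intercept    # q_max, K_L

# Synthetic data from q = q_max*K*C/(1 + K*C) with q_max = 196, K = 0.05
c = [10.0, 50.0, 100.0, 200.0]
q = [196 * 0.05 * ci / (1 + 0.05 * ci) for ci in c]
q_max, k_l = fit_langmuir(c, q)
```

Because the synthetic points lie exactly on a Langmuir curve, the fit recovers q_max = 196 mg/g and K_L = 0.05 to machine precision.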
Procedia PDF Downloads 189
22170 Engineering Thermal-Hydraulic Simulator Based on Complex Simulation Suite “Virtual Unit of Nuclear Power Plant”
Authors: Evgeny Obraztsov, Ilya Kremnev, Vitaly Sokolov, Maksim Gavrilov, Evgeny Tretyakov, Vladimir Kukhtevich, Vladimir Bezlepkin
Abstract:
Over the last decade, a specific set of connected software tools and calculation codes has been gradually developed. It allows simulating I&C systems and thermal-hydraulic, neutron-physical and electrical processes in the elements and systems of an NPP unit (initially with WWER, a pressurized water reactor). In 2012 it was named the complex simulation suite “Virtual Unit of NPP” (CSS “VEB” for short). Proper application of this complex tool results in a complex coupled mathematical computational model, which for a specific NPP design is called the Virtual Power Unit (VPU for short). A VPU can be used for comprehensive modelling of power unit operation, checking operator functions on a virtual main control room, and modelling complicated scenarios for normal modes and accidents. In addition, CSS “VEB” contains a combination of thermal-hydraulic codes: the best-estimate (two-fluid) calculation codes KORSAR and CORTES and a homogeneous calculation code TPP. Thus, to analyze a specific technological system, one can build thermal-hydraulic simulation models at different levels of detail, up to a nodalization scheme with real geometry. The result is in some respects similar to the notion of an “engineering/testing simulator” described by the European Utility Requirements (EUR) for LWR nuclear power plants. The paper is dedicated to the description of the tools mentioned above and to an example of the application of the engineering thermal-hydraulic simulator in the analysis of boric acid concentration in the primary coolant (changed by the make-up and boron control system). Keywords: best-estimate code, complex simulation suite, engineering simulator, power plant, thermal hydraulic, VEB, virtual power unit
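The boric acid analysis mentioned at the end has a simple limiting case worth keeping in mind: if the primary circuit is treated as a single perfectly mixed volume with make-up flow F and inlet concentration C_in, then dC/dt = (F/V)(C_in − C), which has a closed-form solution. This is of course far coarser than the KORSAR/CORTES nodalization models; the sketch below is only that limiting case, with illustrative numbers:

```python
import math

def boron_concentration(c0, c_in, flow, volume, t):
    """Perfectly mixed volume: dC/dt = (F/V) * (C_in - C).

    Closed form: C(t) = C_in + (C0 - C_in) * exp(-(F/V) * t).
    """
    return c_in + (c0 - c_in) * math.exp(-flow / volume * t)

# Illustrative dilution: boric acid at 10 g/kg flushed with clean make-up
# water (F = 2 m^3/h, V = 300 m^3), sampled every 6 hours over a day.
profile = [boron_concentration(10.0, 0.0, 2.0, 300.0, t) for t in range(0, 25, 6)]
```

Such a back-of-the-envelope curve is useful as a sanity check on the detailed simulator's output for slow boration/dilution transients.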
Procedia PDF Downloads 380
22169 The Implementation of Character Education in Code Riverbanks, Special Region of Yogyakarta, Indonesia
Authors: Ulil Afidah, Muhamad Fathan Mubin, Firdha Aulia
Abstract:
The Code riverbanks in Yogyakarta are a settlement area of middle- to lower-class society, and the socio-economic situation affects the behavior of the community. This research aimed to find and explain the implementation and assessment of character education carried out in elementary schools on the Code riverside in the Yogyakarta region of Indonesia. This is a qualitative study whose subjects were the children of the Code riverbanks, Yogyakarta. The data were collected through interviews and document studies and analyzed qualitatively using the interactive analysis model of Miles and Huberman. The results show that: (1) the learning process of character education was carried out by integrating all aspects, such as democratic and interactive learning sessions, and by introducing role models to the students; (2) the assessment of character education was done by the teacher based on the teaching and learning process and on activities outside the classroom, using criteria on three aspects: cognitive, affective and psychomotor. Keywords: character, Code riverbanks, education, Yogyakarta
Procedia PDF Downloads 248
22168 JaCoText: A Pretrained Model for Java Code-Text Generation
Authors: Jessica Lopez Espejel, Mahaman Sanoussi Yahaya Alassan, Walid Dahhane, El Hassane Ettifouri
Abstract:
Pretrained transformer-based models have shown high performance in natural language generation tasks. However, a new wave of interest has surged in automatic programming-language code generation. This task consists of translating natural language instructions into source code. Although well-known pre-trained models for language generation have achieved good performance in learning programming languages, effort is still needed in automatic code generation. In this paper, we introduce JaCoText, a model based on the Transformer neural network that aims to generate Java source code from natural language text. JaCoText leverages the advantages of both natural language and code generation models. More specifically, we study findings from the state of the art and use them to (1) initialize our model from powerful pre-trained models, (2) explore additional pretraining on our Java dataset, (3) conduct experiments combining unimodal and bimodal data in training, and (4) scale the input and output length during fine-tuning of the model. Experiments conducted on the CONCODE dataset show that JaCoText achieves new state-of-the-art results. Keywords: java code generation, natural language processing, sequence-to-sequence models, transformer neural networks
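Point (4) above, scaling the input and output length during fine-tuning, boils down in practice to truncating and padding the (text, code) token sequences to the chosen lengths. A toy preprocessing sketch (not JaCoText's actual pipeline; the token ids and pad id are placeholders):

```python
def make_batch(pairs, max_src_len, max_tgt_len, pad_id=0):
    """Truncate and pad (text, code) token-id pairs to fixed lengths."""
    batch_src, batch_tgt = [], []
    for src, tgt in pairs:
        src, tgt = src[:max_src_len], tgt[:max_tgt_len]   # scale the lengths
        batch_src.append(src + [pad_id] * (max_src_len - len(src)))
        batch_tgt.append(tgt + [pad_id] * (max_tgt_len - len(tgt)))
    return batch_src, batch_tgt

# Raising max_tgt_len lets longer Java snippets survive untruncated.
src_batch, tgt_batch = make_batch([([5, 6, 7], [8, 9])], max_src_len=4, max_tgt_len=3)
```

The trade-off is quadratic attention cost in sequence length versus losing the tail of long code targets, which is why length scaling is treated as a tunable fine-tuning knob.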
Procedia PDF Downloads 284