Search results for: Tool Integration
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7133

5693 Cost-Effective Mechatronic Gaming Device for Post-Stroke Hand Rehabilitation

Authors: A. Raj Kumar, S. Bilaloglu

Abstract:

Stroke is a leading cause of adult disability worldwide. We depend on our hands for our activities of daily living (ADL). Although many patients regain the ability to walk, they continue to experience long-term hand motor impairments. As the number of individuals with young stroke is increasing, there is a critical need for effective approaches to rehabilitation of hand function post-stroke. Motor relearning for dexterity requires task-specific kinesthetic, tactile, and visual feedback. However, when a stroke results in both sensory and motor impairment, it becomes difficult to ascertain when and what type of sensory substitution can facilitate motor relearning. Ideally, real-time task-specific data on the ability to learn, together with data-driven feedback to assist such learning, would greatly aid rehabilitation for dexterity. We have found that kinesthetic and tactile information from the unaffected hand can help patients relearn the use of optimal fingertip forces during a grasp-and-lift task. Measurement of fingertip grip force (GF), load force (LF), their corresponding rates (GFR and LFR), and other metrics can be used to gauge the impairment level and the progress during learning. Currently, ATI mini force-torque sensors are used in research settings to measure and compute the LF, GF, and their rates while grasping objects of different weights and textures. The ATI sensor is cost-prohibitive for deployment in clinical or at-home rehabilitation. We therefore developed a cost-effective mechatronic device that quantifies GF, LF, and their rates for stroke rehabilitation using off-the-shelf components such as load cells, flexi-force sensors, and an Arduino UNO microcontroller. A salient feature of the device is its integration with an interactive gaming environment to render a highly engaging user experience.
This paper elaborates on the integration of kinesthetic and tactile sensing through real-time computation of LF, GF, and their corresponding rates, information processing, and interactive interfacing through augmented reality for visual feedback.
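
The core computation the device performs can be sketched as follows. This is only an illustrative Python outline, not the authors' firmware: the function name, the 100 Hz sampling rate, and the summary metrics are assumptions; on the actual device the equivalent logic would run on the Arduino UNO.

```python
import numpy as np

def force_metrics(grip_n, load_n, fs=100.0):
    """Compute grip force (GF), load force (LF), and their rates
    (GFR, LFR) from uniformly sampled force signals.

    grip_n, load_n : arrays of force samples in newtons
    fs             : sampling rate in Hz (100 Hz is an assumption)
    """
    grip = np.asarray(grip_n, dtype=float)
    load = np.asarray(load_n, dtype=float)
    dt = 1.0 / fs
    gfr = np.gradient(grip, dt)   # dGF/dt in N/s
    lfr = np.gradient(load, dt)   # dLF/dt in N/s
    return {
        "GF_peak": grip.max(),
        "LF_peak": load.max(),
        "GFR_peak": gfr.max(),
        "LFR_peak": lfr.max(),
    }
```

Peak force rates computed this way are among the metrics commonly used to track grasp-and-lift performance during relearning.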

Keywords: feedback, gaming, kinesthetic, rehabilitation, tactile

Procedia PDF Downloads 232
5692 Comet Assay: A Promising Tool for the Risk Assessment and Clinical Management of Head and Neck Tumors

Authors: Sarim Ahmad

Abstract:

The single cell gel electrophoresis assay (SCGE, known as the comet assay) is a simple, sensitive, and state-of-the-art technique for quantifying DNA damage and repair at the level of individual cells in in vivo and in vitro samples of eukaryotic, and some prokaryotic, cells. It is widely used in areas including human biomonitoring, genotoxicology, and ecological monitoring, and as a research tool for studying DNA damage and repair in different cell types in response to a range of DNA-damaging agents, cancer risk, and therapy. The method involves encapsulating cells in a low-melting-point agarose suspension, lysing the cells under neutral or alkaline (pH > 13) conditions, and electrophoresing the suspended lysed cells, which produces structures resembling comets when observed by fluorescence microscopy; the intensity of the comet tail relative to the head reflects the number of DNA breaks. The likely basis for this is that loops containing a break lose their supercoiling and become free to extend towards the anode. Electrophoresis is followed by DNA staining and quantification of fluorescence to determine the extent of DNA damage, either by manual scoring or automatically with imaging software. The assay can therefore predict an individual's tumor sensitivity to radiation and various chemotherapeutic drugs, assess oxidative stress within tumors, and detect the extent of DNA damage in various cancerous and precancerous lesions of the oral cavity.
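
The scoring step described above reduces to a few intensity ratios. As a hedged illustration (exact formulas vary between scoring packages; the tail-moment definition below is one common convention, not necessarily the author's):

```python
def comet_metrics(head_intensity, tail_intensity, tail_length_um):
    """Common comet-assay damage metrics from integrated fluorescence.

    % tail DNA  = tail / (head + tail) * 100
    tail moment = tail length x fraction of DNA in the tail
    """
    total = head_intensity + tail_intensity
    if total == 0:
        raise ValueError("no fluorescence signal")
    pct_tail_dna = 100.0 * tail_intensity / total
    tail_moment = tail_length_um * (tail_intensity / total)
    return pct_tail_dna, tail_moment
```

For example, a comet with 25% of its fluorescence in a 40 µm tail would score 25% tail DNA and a tail moment of 10 µm.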

Keywords: comet assay, single cell gel electrophoresis, DNA damage, early detection test

Procedia PDF Downloads 280
5691 Clinch Process Simulation Using Diffuse Elements

Authors: Benzegaou Ali, Brani Benabderrahmane

Abstract:

This work describes a numerical study of the TOX clinching process using diffuse elements. A computer code named SEMA (Static Explicit Method Analysis) was developed to simulate the clinch joining process. The FE code is based on an updated Lagrangian scheme, and the resolution method is a static explicit approach. The integration of the elasto-plastic behavior law is performed with the algorithm of Simo and Taylor. The tools are represented by plane facets.
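
The cited Simo-Taylor integration scheme is the classic return-mapping (radial-return) algorithm. As a minimal sketch of the idea only, here is a one-dimensional version with linear isotropic hardening; the 1D setting and the material constants are illustrative assumptions, not the SEMA code:

```python
def radial_return_1d(strain_inc, stress, eps_p, E=210e3, H=1e3, sigma_y0=250.0):
    """One step of the return-mapping update for 1D elasto-plasticity
    with linear isotropic hardening (stresses in MPa, illustrative values).

    Returns the updated stress and accumulated plastic strain.
    """
    # elastic trial state: pretend the whole increment is elastic
    stress_trial = stress + E * strain_inc
    yield_trial = abs(stress_trial) - (sigma_y0 + H * eps_p)
    if yield_trial <= 0.0:                 # elastic step, trial state is final
        return stress_trial, eps_p
    # plastic correction: project the trial stress back onto the yield surface
    dgamma = yield_trial / (E + H)         # plastic multiplier
    sign = 1.0 if stress_trial > 0 else -1.0
    stress_new = stress_trial - E * dgamma * sign
    return stress_new, eps_p + dgamma
```

After a plastic step, the updated stress sits exactly on the hardened yield surface, which is the defining property of the return mapping.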

Keywords: diffuse elements, numerical simulation, clinching, contact, large deformation

Procedia PDF Downloads 348
5690 TACTICAL: RAM Image Retrieval in Linux Using Protected Mode Architecture’s Paging Technique

Authors: Sedat Aktas, Egemen Ulusoy, Remzi Yildirim

Abstract:

This article explains how to obtain a RAM image from a computer running the Linux operating system and which steps should be followed while acquiring it. Taking a RAM image means dumping the physical memory at an instant and writing it to a file, a process that can be likened to taking a picture of everything in the computer’s memory at that moment. This step is essential for tools that analyze RAM images, such as Volatility, since an image must be acquired before these tools can analyze it. Such tools are used extensively in digital forensics, the set of processes for examining the information on any computer or server on behalf of official authorities. In this article, the protected-mode architecture of the Linux operating system is examined, and a kernel driver is used to save an image of system memory to disk. The tables and access methods of the operating system are analyzed on the basis of its underlying architecture, and the most suitable acquisition methods are presented. Since the literature contains no article directly addressing this topic on Linux, this study aims to contribute to the literature on obtaining RAM images. LiME can be mentioned as a similar tool, but its memory-dumping method is not documented. Given how frequently such tools are used, contributing to the field of digital forensics has been the main motivation of this study.
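
As a small illustration of the addressing problem discussed above: before dumping physical memory, an acquisition tool must know which physical address ranges actually hold system RAM, and on Linux these ranges are listed in /proc/iomem. The sketch below only parses such a listing (the sample text is illustrative); the actual dumping step requires a kernel module such as the LiME driver mentioned above.

```python
import re

def system_ram_ranges(iomem_text):
    """Extract (start, end) physical-address ranges labeled 'System RAM'
    from /proc/iomem-style text. Acquisition tools iterate over exactly
    these ranges when dumping memory; this is only a parsing sketch."""
    ranges = []
    for line in iomem_text.splitlines():
        m = re.match(r"\s*([0-9a-f]+)-([0-9a-f]+)\s*:\s*System RAM\s*$", line)
        if m:
            ranges.append((int(m.group(1), 16), int(m.group(2), 16)))
    return ranges

# illustrative excerpt; real addresses require root to read
sample = """\
00001000-0009ffff : System RAM
000a0000-000fffff : Reserved
00100000-7fedffff : System RAM
"""
```

Note that reserved and device-mapped regions must be skipped: reading them as if they were RAM can hang or crash the machine.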

Keywords: linux, paging, addressing, ram-image, memory dumping, kernel modules, forensic

Procedia PDF Downloads 91
5689 River Offtake Management Using Mathematical Modelling Tool: A Case Study of the Gorai River, Bangladesh

Authors: Sarwat Jahan, Asker Rajin Rahman

Abstract:

Managing the offtake of a fluvial river is highly sensitive in terms of long-term sustainability, since water flow and sediment transport vary widely over a hydrological year. The Gorai River is a major distributary of the Ganges River in Bangladesh and a primary source of fresh water for the south-west of the country. Every year, significant siltation at the Gorai offtake disconnects it from the Ganges during the dry season. As a result, the socio-economic and environmental condition of the downstream areas has been deteriorating for decades. To improve the overall situation of the Gorai offtake and the areas that depend on it, a study was conducted by the Institute of Water Modelling, Bangladesh, in 2022. Simulations with the morphological modelling tool MIKE 21C of DHI Water & Environment, Denmark, revealed the need for dredging and river training structures at the Gorai offtake to ensure significant dry-season flow downstream. The dry-season flow is found to increase significantly with the proposed river interventions, which also improves environmental conditions, in terms of salinity, in the south-west zone of the country. This paper summarizes the primary findings of the developed mathematical model for improving the existing condition of the Gorai River.

Keywords: Gorai river, mathematical modelling, offtake, siltation, salinity

Procedia PDF Downloads 81
5688 Improving the Detection of Depression in Sri Lanka: Cross-Sectional Study Evaluating the Efficacy of a 2-Question Screen for Depression

Authors: Prasad Urvashi, Wynn Yezarni, Williams Shehan, Ravindran Arun

Abstract:

Introduction: Primary health services are often the first point of contact that patients with mental illness have with the healthcare system. A number of tools have been developed to increase the detection of depression in primary care. One challenge among many, however, is administering these tools within the limited primary care consultation timeframe. Short depression-screening questionnaires that are as effective as more comprehensive diagnostic tools may therefore improve detection rates among patients visiting a primary care setting. Objective: To develop a 2-Question Questionnaire (2-QQ) to screen for depression in a suburban primary care clinic in Ragama, Sri Lanka, and to determine its sensitivity and specificity. The purpose is to develop a short, culturally adapted screening tool for depression in order to increase its detection in the Sri Lankan patient population. Methods: This was a cross-sectional study involving two steps. Step one: verbal administration of the 2-QQ to patients by their primary care physician. Step two: completion of the Peradeniya Depression Scale (PDS), a validated diagnostic tool for depression, by the patient after the consultation. The PDS results were then correlated with the 2-QQ results for each patient to determine the sensitivity and specificity of the 2-QQ. Results: A score of 1 or above on the 2-QQ was most sensitive but least specific: setting the threshold at this level correctly identifies depressed patients, but also captures patients who are not depressed. A score of 6 was most specific but least sensitive: setting the threshold at this level correctly identifies patients without depression, but misses many patients with depression.
Discussion: In the context of primary care, it may be worthwhile to set the 2-QQ at a lower threshold for positivity, such as a score of 1 or above. This generates high test sensitivity and thus captures the majority of patients who have depression. On the other hand, a low threshold also falsely identifies non-depressed patients who score 1 or above as positive. However, the benefit of identifying patients who present with depression may outweigh the harm of falsely flagging a non-depressed patient. It is our hope that the 2-QQ will serve as a quick primary screen for depression in the primary care setting and as a catalyst to identify and treat individuals with depression.
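
The threshold trade-off described in the Results can be made concrete with a short calculation. The patient data below are invented for illustration; only the sensitivity and specificity definitions are standard:

```python
def screen_performance(scores, depressed, threshold):
    """Sensitivity and specificity of a screening score at a cut-off.

    scores    : screen totals per patient (illustrative values)
    depressed : reference diagnosis per patient (e.g. from the PDS)
    threshold : score at or above which the screen is 'positive'
    """
    tp = fn = tn = fp = 0
    for s, d in zip(scores, depressed):
        positive = s >= threshold
        if d:
            tp += positive       # depressed and flagged
            fn += not positive   # depressed but missed
        else:
            fp += positive       # not depressed but flagged
            tn += not positive   # not depressed and cleared
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    return sensitivity, specificity
```

Running this on any sample shows the pattern reported above: lowering the cut-off raises sensitivity at the cost of specificity, and vice versa.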

Keywords: depression, primary care, screening tool, Sri Lanka

Procedia PDF Downloads 235
5687 Shifting Paradigms for Micro, Small, and Medium Enterprises in the Global Construction Market: The Crucial Roles of Technology and Sustainability

Authors: Sohrab Donyavi

Abstract:

The global construction market is experiencing significant shifts, particularly for micro, small, and medium enterprises (MSMEs), driven by the dual imperatives of technological advancement and sustainability. MSMEs play a crucial role in the construction industry, often being the backbone of economic development and fostering entrepreneurial skills. However, their dominance has also led to industry fragmentation and challenges such as technological lag and declining profit margins, which threaten their global competitiveness. This paper explores the integration of technology and sustainability in reshaping the paradigms for MSMEs in the construction sector. The adoption of advanced technologies, such as building information modeling (BIM) and AI, is pivotal for promoting sustainable construction practices. These tools enable MSMEs to design and construct environmentally responsible buildings, thereby contributing to the industry's sustainability goals. The research highlights that achieving sustainability in construction requires significant efforts in conservation, recycling, and the development of new materials and technologies. This approach aligns with the broader goal of integrating economic, environmental, and social aims into firm objectives to create long-term value while protecting natural resources for future generations. Critical factors for implementing sustainability-oriented innovation (SOI) practices in MSMEs include top management support, government initiatives, and financial resources; these are essential for fostering an environment conducive to innovation and sustainability. Furthermore, empowering MSMEs through improved governance, market-oriented programs, sustainable productivity growth, and access to financing is vital. In developing regions such as Indonesia, these strategies are crucial for enabling MSMEs to thrive in the face of globalization.
The tendency of large firms to grow larger with the help of technology and globalization has led to the emergence of a high-technology oligopoly, posing a significant challenge to traditional construction practices. This shift requires MSMEs to adapt by leveraging technology and embracing sustainable practices to remain competitive. The research underscores the importance of integrating technology and sustainability not only as a competitive strategy but also as a contribution to the global effort of environmental conservation and sustainable development. The paper concludes that successfully integrating technology and sustainability in MSMEs requires a multifaceted approach: adoption of advanced technological tools, strong support from top management, proactive government policies, and access to financial resources. By addressing these factors, MSMEs can overcome the challenges of industry fragmentation, technological lag, and declining profit margins, and ultimately play a pivotal role in driving the construction industry towards a more sustainable and technologically advanced future. The findings and recommendations are based on a comprehensive case study utilizing semi-structured interviews, observations, questionnaires, and document reviews.

Keywords: MSMEs, construction, technology, sustainability, innovation

Procedia PDF Downloads 9
5686 Advancements in Mathematical Modeling and Optimization for Control, Signal Processing, and Energy Systems

Authors: Zahid Ullah, Atlas Khan

Abstract:

This abstract focuses on advancements in mathematical modeling and optimization techniques that enhance the efficiency, reliability, and performance of control, signal processing, and energy systems. In an era of rapidly evolving technology, mathematical modeling and optimization offer powerful tools for tackling the complex challenges these systems face. The abstract presents the latest research and developments in mathematical methodologies, encompassing control theory, system identification, signal processing algorithms, and energy optimization, and highlights their interdisciplinary nature, with applications in domains including power systems, communication networks, industrial automation, and renewable energy. Key mathematical techniques are explored, such as linear and nonlinear programming, convex optimization, stochastic modeling, and numerical algorithms, which enable the design, analysis, and optimization of complex control and signal processing systems. The abstract further emphasizes addressing real-world challenges through innovative mathematical approaches, discussing the integration of mathematical models with data-driven approaches, machine learning, and artificial intelligence to enhance system performance, adaptability, and decision-making capabilities. It also underscores the significance of bridging the gap between theoretical advances and practical applications, recognizing the need to implement mathematical models and optimization algorithms in real-world systems while considering scalability, computational efficiency, and robustness.
In summary, the abstract showcases advancements in mathematical modeling and optimization for control, signal processing, and energy systems; highlights their applications across various domains and their potential to address real-world challenges; and emphasizes practical implementation and integration with emerging technologies to drive innovation and improve the performance of control, signal processing, and energy systems.

Keywords: mathematical modeling, optimization, control systems, signal processing, energy systems, interdisciplinary applications, system identification, numerical algorithms

Procedia PDF Downloads 93
5685 Reasonableness to Strengthen Citizen Participation in Mexican Anti-Corruption Policies

Authors: Amós García Montaño

Abstract:

In a democracy, a public policy must be developed within the regulatory framework and with citizen participation in its planning, design, execution, and evaluation stages, factors necessary for the policy to have both legal support and sufficient legitimacy. However, the complexity and magnitude of certain public problems make it difficult to generate consensus among members of society, leading to unstable and unsuccessful scenarios for the exercise of the right to citizen participation and for the creation of effective and efficient public policies. This is the case for public policies against corruption, an issue that in Mexico is difficult to define and generates conflicting opinions. To offer a possible solution to this delicate reality, this paper analyzes the principle of reasonableness as a tool for identifying the basic elements that guarantee a fundamental level of the exercise of the right to citizen participation in the fight against corruption, adopting elements of human rights indicator methodologies. In this sense, the paper observes the relevance of a legal framework that establishes obligations to incorporate proactive and transversal citizen participation in this matter. It also notes the need to monitor the operation of the various citizen participation mechanisms in the decision-making processes of the institutions involved in fighting and preventing corruption, which would improve the perception of citizens as relevant actors in this field. It concludes that the principle of reasonableness is a very useful tool for identifying basic elements that facilitate the fulfillment of human rights commitments in the field of public policy.

Keywords: anticorruption, public participation, public policies, reasonableness

Procedia PDF Downloads 71
5684 Renewable Energy Integration in Cities of Developing Countries: The Case Study of Tema City, Ghana

Authors: Marriette Sakah, Christoph Kuhn, Samuel Gyamfi

Abstract:

Global household electricity demand, as estimated in 2005, is projected to double by 2025 and nearly double again by 2030. The residential sector promises considerable demand growth through infrastructure and equipment investments, the majority of which is projected to occur in developing countries. This lays bare the urgency of enhanced efficiency in all energy systems, combined with exploitation of the local potential for renewable energy. This study explores options for reducing energy consumption, particularly in residential buildings, and for providing robust, decentralized, and renewable energy supply for African cities. The potential of energy efficiency measures and of harnessing local resources for renewable energy supply is quantitatively assessed. The research specifically addresses the city level, which is regulated by local authorities. Local authorities can actively promote the transition to a renewable-based energy supply system by promoting energy efficiency and the use of renewable fuels in existing buildings, and particularly in the planning and development of new settlement areas, through incentives, regulations, and demonstration projects. They can also support more sustainable development by shaping local land use and development patterns in ways that reduce per capita energy consumption and are benign to the environment. The subject of the current case study, Tema, is Ghana's main industrial hub, a port city, and home to 77,000 families. Residential buildings in Tema consumed 112 GWh of electricity in 2013, or 1.45 MWh per household. If average household electricity demand were to decline at an annual rate of just 2%, by 2035 Tema would consume only 134 GWh of electricity despite an expected 84% increase in the number of households. The work is based on a ground survey of the city's residential sector.
The results show that efficient technologies and decentralized renewable energy systems have great potential for meeting the rapidly growing energy demand of cities in developing countries.
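
The 2035 projection quoted above follows from compounding the two stated assumptions. A quick sketch (the exact assumptions behind the published 134 GWh figure are not stated, so this reproduces it only approximately):

```python
def project_demand(base_gwh, household_growth, annual_decline, years):
    """Total residential demand after `years`, if the number of households
    grows by `household_growth` overall while per-household demand falls
    at `annual_decline` per year (compounded)."""
    return base_gwh * (1 + household_growth) * (1 - annual_decline) ** years

# Tema, 2013 -> 2035: 112 GWh base, +84% households, 2% annual decline
projected = project_demand(112.0, 0.84, 0.02, 22)   # roughly 132 GWh
```

The point of the calculation is that a modest 2% yearly efficiency gain almost exactly offsets an 84% growth in households over two decades.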

Keywords: energy efficiency, energy saving potential, renewable energy integration, residential buildings, urban Africa

Procedia PDF Downloads 271
5683 The Influence of Family of Origin on Children: A Comprehensive Model and Implications for Positive Psychology and Psychotherapy

Authors: Meichen He, Xuan Yang

Abstract:

Background: In the field of psychotherapy, the role of the family of origin is of utmost importance. Over the past few decades, both individual-oriented and family-oriented approaches to child therapy have shown moderate success in reducing children's psychological and behavioral issues. Objective: In exploring how the family of origin influences individuals, however, it has been noted that comprehensive measurement indicators are lacking and that no exact model exists to assess the impact of the family of origin on individual development. This study therefore aims to develop, on the basis of a literature review, a model of the influence of the family of origin on children. Specifically, it examines the effects of factors such as education level, economic status, maternal age, family integration, family violence, marital conflict, and parental substance and alcohol abuse on children's self-confidence and life satisfaction. Through this research, we aim to further investigate the impact of the family of origin on children and to provide directions for future research in positive psychology and psychotherapy. Methods: This study employs a literature review to gather and analyze relevant research on the influence of the family of origin on children, followed by quantitative analyses to establish a comprehensive model of how family-of-origin factors affect children's psychological and behavioral outcomes. Findings: The research reveals that family-of-origin factors, including education level, economic status, maternal age, family integration, family violence, marital conflict, and parental substance and alcohol abuse, have an impact on children's self-confidence and life satisfaction. These factors can affect children's psychological well-being and happiness through various pathways.
Implications: The results of this study contribute to a better understanding of the influence of the family of origin on children and provide valuable directions for future research in positive psychology and psychotherapy. This research will enhance awareness of children's psychological well-being and lay the foundation for improving psychotherapeutic methods.

Keywords: family of origin, positive psychology, developmental psychology, family education, social psychology, educational psychology

Procedia PDF Downloads 127
5682 Neural Network Models for Actual Cost and Actual Duration Estimation in Construction Projects: Findings from Greece

Authors: Panagiotis Karadimos, Leonidas Anthopoulos

Abstract:

Predicting the actual cost and duration of construction projects is a persistent problem for the construction sector. This paper addresses the problem with modern methods and data available from past public construction projects: 39 bridge projects constructed in Greece with a similar type of available data were examined. Considering each project's attributes together with the actual cost and the actual duration, correlation analysis is performed and the most appropriate predictive project variables are identified. Additionally, the most efficient subgroup of variables is selected with the attribute selection function of the WEKA application. The selected variables are then used as input neurons for neural network models, which are constructed with the FANN Tool application. The optimum neural network model for predicting the actual cost produced a mean squared error of 3.84886e-05 and was based on the budgeted cost and the quantity of deck concrete. The optimum neural network model for predicting the actual duration produced a mean squared error of 5.89463e-05 and was also based on the budgeted cost and the quantity of deck concrete.
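
The first screening step, ranking candidate variables by their correlation with the target, can be sketched as follows. The variable names and data layout are assumptions for illustration; the study's actual inputs come from the attributes of the 39 bridge projects:

```python
import numpy as np

def rank_by_correlation(X, y, names):
    """Rank candidate project variables by the absolute Pearson
    correlation of each with the target (actual cost or duration).

    X     : (projects x variables) matrix of attribute values
    y     : target vector, one value per project
    names : one label per column of X
    """
    ranked = []
    for j, name in enumerate(names):
        r = np.corrcoef(X[:, j], y)[0, 1]   # Pearson r for column j vs target
        ranked.append((name, abs(r)))
    return sorted(ranked, key=lambda t: t[1], reverse=True)
```

Variables ranked highest here would then be refined with WEKA's attribute selection before serving as input neurons.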

Keywords: actual cost and duration, attribute selection, bridge construction, neural networks, predicting models, FANN TOOL, WEKA

Procedia PDF Downloads 119
5681 The Impact of Electrospinning Parameters on Surface Morphology and Chemistry of PHBV Fibers

Authors: Lukasz Kaniuk, Mateusz M. Marzec, Andrzej Bernasik, Urszula Stachewicz

Abstract:

Electrospinning is one of the most commonly used methods to produce micro- or nano-fibers. The properties of electrospun fibers allow them to be used in tissue scaffolds, biodegradable bandages, and purification membranes. The morphology of the obtained fibers depends on the composition of the polymer solution as well as on the processing parameters. Interesting properties such as high fiber porosity can be achieved by changing the humidity during electrospinning. Moreover, by changing the voltage polarity in electrospinning, we are able to alter the functional groups at the surface of the fibers. In this study, electrospun fibers were made of a natural, thermoplastic polyester, PHBV (poly(3-hydroxybutyric acid-co-3-hydroxyvaleric acid)). The fibrous mats were obtained using both positive and negative voltage polarities, and their surface was characterized using X-ray photoelectron spectroscopy (XPS, Ulvac-Phi, Chigasaki, Japan). Furthermore, the effect of humidity on surface morphology was investigated using scanning electron microscopy (SEM, Merlin Gemini II, Zeiss, Germany). Electrospun PHBV fibers produced with positive and negative voltage polarity had similar morphology and average fiber diameters of 2.47 ± 0.21 µm and 2.44 ± 0.15 µm, respectively. The change of voltage polarity had a significant impact on the reorientation of the carbonyl groups, which consequently changed the surface potential of the electrospun PHBV fibers. Increased humidity during electrospinning introduces porosity into the surface structure of the fibers. In conclusion, our studies showed that process parameters such as humidity and voltage polarity have a great influence on fiber morphology and chemistry, changing their functionality. The surface properties of polymer fibers have a significant impact on cell integration and attachment, which is very important in tissue engineering.
The possibility of changing surface porosity allows the use of fibers in various tissue engineering and drug delivery systems. Acknowledgment: This study was conducted within the 'Nanofiber-based sponges for atopic skin treatment' project, carried out within the First TEAM programme of the Foundation for Polish Science, co-financed by the European Union under the European Regional Development Fund, project no. POIR.04.04.00-00-4571/18-00.

Keywords: cell integration, electrospun fiber, PHBV, surface characterization

Procedia PDF Downloads 106
5680 Technical, Environmental and Financial Assessment for Optimal Sizing of Run-of-River Small Hydropower Project: Case Study in Colombia

Authors: David Calderon Villegas, Thomas Kaltizky

Abstract:

Run-of-river (RoR) hydropower projects represent a viable, clean, and cost-effective alternative to dam-based plants and provide decentralized power production. However, the cost-effectiveness of RoR schemes depends on the proper selection of site and design flow, a challenging task that requires multivariate analysis. In this respect, this study presents an investment decision support tool for assessing the optimal size of an RoR scheme considering technical, environmental, and cost constraints. The net present value (NPV) from a project perspective is used as the objective function for supporting the investment decision. The tool has been tested by applying it to an actual RoR project recently proposed in Colombia. The results show that the optimum in financial terms does not match the flow that maximizes energy generation from the river's available flow. For the case study, the flow that maximizes energy corresponds to 5.1 m3/s, whereas 2.1 m3/s maximizes the investor's NPV. Finally, a sensitivity analysis is performed to determine the NPV as a function of changes in the debt rate, electricity prices, and CapEx. Even in the worst-case scenario, the optimal size represents a positive business case, with an NPV of USD 2.2 million and an IRR 1.5 times higher than the discount rate.
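
The objective function can be sketched as a plain NPV over the project lifetime. All figures below are placeholders, not the study's data; the actual tool additionally accounts for debt structure and the technical and environmental constraints:

```python
def npv(cashflows, rate):
    """Net present value: the year-0 flow is undiscounted (e.g. the CapEx,
    negative); later flows are discounted at `rate` per year."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def npv_for_design_flow(capex, annual_energy_mwh, price_per_mwh, opex, rate, years):
    """NPV of one candidate design flow: up-front CapEx followed by
    constant yearly net revenue over the plant lifetime."""
    flows = [-capex] + [annual_energy_mwh * price_per_mwh - opex] * years
    return npv(flows, rate)
```

Evaluating this for a sweep of candidate design flows, each with its own CapEx and energy yield, reproduces the kind of comparison in which 2.1 m3/s can beat the energy-maximizing 5.1 m3/s.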

Keywords: small hydropower, renewable energy, RoR schemes, optimal sizing, objective function

Procedia PDF Downloads 117
5679 Dynamic Environmental Impact Study during the Construction of the French Nuclear Power Plants

Authors: A. Er-Raki, D. Hartmann, J. P. Belaud, S. Negny

Abstract:

This paper has a double purpose: firstly, a literature review of life cycle analysis (LCA), and secondly, a comparison between conventional (static) LCA and multi-level dynamic LCA on the following items: (i) the evolution of inventories over time and (ii) the temporal evolution of the databases. The first part of the paper summarizes the state of the art of the static LCA approach. The limits of static LCA have been identified, especially the non-consideration of spatial and temporal evolution in the inventory, in the characterization factors (CFs), and in the databases. A description of the different levels of integration of the notion of temporality in life cycle analysis studies was then made. In the second part, the dynamic inventory was evaluated, firstly for a single nuclear plant and secondly for the entire French nuclear power fleet, taking into account the construction durations of all the plants. In addition, the databases were adapted by integrating the temporal variability of the French energy mix. Several iterations were used to converge towards the real environmental impact of the energy mix. The databases were further adapted to take into account the temporal evolution of market data for the raw materials. The energy mix of the period studied was identified based on an extrapolation of the reference production values of each means of production. An application to the construction of the French nuclear power plants from 1971 to 2000 was performed, in which a dynamic inventory of raw materials was evaluated. The impacts were then characterized with the ILCD 2011 characterization method. For comparison with a purely static approach, a static impact assessment was made with the Ecoinvent v3.4 data sheets without adaptation, using a static inventory that assumes all the power stations were built at the same time.
Finally, a comparison between the static and dynamic LCA approaches was set up to determine the gap between them for each of the two levels of integration. The results were analyzed to identify the contribution of the evolving construction of the nuclear power fleet to the total environmental impacts of the French energy mix over the same period. An equivalent dynamic strategy will further be applied to identify the environmental impacts of different energy transition scenarios, allowing the most environmentally favorable energy mix to be chosen.
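The static-versus-dynamic gap described above can be illustrated with a toy calculation; the yearly mix intensities and plant inventories below are invented for illustration and are not the paper's data.

```python
# Sketch of the static-vs-dynamic inventory idea (hypothetical numbers):
# each plant's material demand is characterized with the emission intensity
# of the energy mix of its actual construction year.

# Hypothetical emission intensity of the energy mix per year (kg CO2-eq per inventory unit)
mix_intensity = {1971: 0.90, 1980: 0.60, 1990: 0.40, 2000: 0.30}

# Hypothetical plants: (construction year, raw-material inventory in arbitrary units)
plants = [(1971, 100.0), (1980, 120.0), (1990, 110.0), (2000, 90.0)]

# Static LCA: assume all plants were built at once, using a single reference-year mix
static_impact = sum(inv for _, inv in plants) * mix_intensity[1971]

# Dynamic LCA: each inventory is characterized with the mix of its own year
dynamic_impact = sum(inv * mix_intensity[year] for year, inv in plants)

gap = (static_impact - dynamic_impact) / static_impact * 100
print(f"static={static_impact:.0f}, dynamic={dynamic_impact:.0f}, gap={gap:.1f}%")
```

With these invented numbers the static assumption overstates the impact, because it ignores the decarbonization of the mix over the construction period.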

Keywords: LCA, static, dynamic, inventory, construction, nuclear energy, energy mix, energy transition

Procedia PDF Downloads 91
5678 Geographic Information System Cloud for Sustainable Digital Water Management: A Case Study

Authors: Mohamed H. Khalil

Abstract:

Water is one of the most crucial elements influencing human lives and development. Notably, over the last few years, GIS has played a significant role in optimizing water management systems, especially after the exponential development of this sector. In this context, the Egyptian government initiated an advanced GIS web-based system. This system is designed to assist and optimize the consolidation and integration of data between the Call Center, Operation and Maintenance, and Laboratory departments. The core of this system is a unified data model for all the spatial and tabular data of the corresponding departments. The system provides advanced functionalities such as interactive data collection, dynamic monitoring, multi-user editing, enhanced data retrieval, an integrated workflow, different access levels, and correlative information records/tracking. Notably, this cost-effective system contributes significantly not only to the completeness of the base map (93%) and of the water network (87%) in highly detailed GIS format and to the performance of customer service, but also to reducing day-to-day operating costs (by roughly 5-10%). In addition, the proposed system facilitates data exchange between the different departments, which allows complex situations to be better understood and analyzed. Furthermore, the system has had a tangible effect on: (i) dynamic environmental monitoring of water quality indicators (ammonia, turbidity, TDS, sulfate, iron, pH, etc.), (ii) the effectiveness of the different water departments, (iii) efficient in-depth analysis, (iv) advanced web reporting tools (daily, weekly, monthly, quarterly, and annual), (v) planning that synthesizes spatial and tabular data, and finally, (vi) a scalable decision support system.
It is worth highlighting that the proposed future plan (second phase) of this system encompasses scalability that will extend to integration with the Billing and SCADA departments. This scalability will add advanced functionalities to the existing ones, allowing further sustainable contributions.

Keywords: GIS Web-Based, base-map, water network, decision support system

Procedia PDF Downloads 76
5677 Key Parameters Analysis of the Stirring Systems in the Optimization Procedures

Authors: T. Gomes, J. Manzi

Abstract:

The inclusion of stirring systems in calculation and optimization procedures has received little attention, which can distort the results, because such systems provide additional energy to the process and promote a better distribution of mass and energy. This is meaningful for reactive systems, particularly for the Continuous Stirred Tank Reactor (CSTR), for which the key variables and parameters, as well as the operating conditions of the stirring system, can play a pivotal role; it has been shown in the literature that neglecting these factors can lead to sub-optimal results. It is also well known that the sole use of the First Law of Thermodynamics as an optimization tool cannot yield satisfactory results, whereas the joint use of the First and Second Laws, condensed into a procedure called entropy generation minimization (EGM), has shown itself able to drive the system towards better results. Therefore, the main objective of this paper is to determine the effects of the key parameters of the stirring system in optimization procedures by means of EGM applied to reactive systems. These considerations were made possible by dimensional analysis according to the Rayleigh and Buckingham method, which takes into account the physical and geometric parameters and the variables of the reactive system. For a simulation based on the production of propylene glycol, the results have shown a significant increase in the conversion rate, from 36% (non-optimized system) to 95% (optimized system), with a consequent reduction of by-products. In addition, it has been possible to establish the influence of the work of the stirrer in the optimization procedure, which can be described as a function of the fluid viscosity and consequently of the temperature.
The conclusions to be drawn also indicate that entropic analysis as an optimization tool is simple, easy to apply, and requires low computational effort.
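The dimensional-analysis side of the abstract can be illustrated with the standard stirred-tank groups; the impeller Reynolds number and the power-number correlation below are generic mixing-handbook relations with invented values, not the authors' model.

```python
# Standard dimensionless groups for a stirred tank (illustrative values only).
rho = 1036.0   # kg/m^3, assumed density of the reaction mixture
mu = 0.052     # Pa*s, assumed viscosity (temperature-dependent in practice,
               # which is how stirrer work couples to temperature)
N = 3.0        # 1/s, impeller rotation speed
D = 0.5        # m, impeller diameter
Np = 5.0       # assumed power number (roughly constant in the turbulent regime)

Re = rho * N * D**2 / mu      # impeller Reynolds number (flow regime)
P = Np * rho * N**3 * D**5    # W, stirring power from the power-number correlation

print(f"Re = {Re:.0f}, stirring power = {P:.1f} W")
```

Because the viscosity falls as temperature rises, the same correlation shows the stirrer work decreasing with temperature, consistent with the dependence noted in the abstract.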

Keywords: stirring systems, entropy, reactive system, optimization

Procedia PDF Downloads 233
5676 Is Electricity Consumption Stationary in Turkey?

Authors: Eyup Dogan

Abstract:

The number of research articles analyzing the integration properties of energy variables has increased rapidly in the energy literature for about a decade. The stochastic behavior of energy variables is worth knowing for several reasons. For instance, national policies to conserve or promote energy consumption, which should be viewed as shocks to energy consumption, will have only transitory effects if energy consumption is found to be stationary in a country. Furthermore, knowing the order of integration is important for employing an appropriate econometric model. Despite being an important subject for applied energy (economics) and having generated a huge volume of studies, the existing literature still has several known limitations. For example, many studies use aggregate energy consumption and national-level data. In addition, a large part of the literature consists either of multi-country studies or of studies focusing solely on the U.S. This is the first study in the literature that considers a form of energy consumption by sector at the sub-national level. This research investigates the unit root properties of electricity consumption for 12 regions of Turkey by four sectors, in addition to total electricity consumption, with the aim of addressing the gaps noted above. In this regard, we analyze the stationarity properties of 60 cases. Because the use of multiple unit root tests makes the results robust and consistent, we apply the Dickey-Fuller unit root test based on Generalized Least Squares regression (DFGLS), the Phillips-Perron unit root test (PP), and the Zivot-Andrews unit root test with one endogenous structural break (ZA). The main finding of this study is that electricity consumption is trend stationary in 7 cases according to DFGLS and PP, whereas it is a stationary process in 12 cases when we take structural change into account by applying ZA.
Thus, shocks to electricity consumption have transitory effects in those cases; namely, agriculture in regions 1, 4, and 7; industry in regions 5, 8, 9, 10, and 11; business in regions 4, 7, and 9; and total electricity consumption in region 11. Regarding policy implications, policies to reduce or stimulate the use of electricity have a long-run impact on electricity consumption in 80% of cases in Turkey, given that 48 cases are non-stationary processes. On the other hand, the past behavior of electricity consumption can be used to predict its future behavior in only 12 cases.
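The logic behind a Dickey-Fuller-type unit root test (a strongly negative t-statistic on the lagged level rejects the unit root) can be sketched on simulated data; this is a plain DF regression in NumPy, not the DFGLS, PP, or ZA procedures actually used in the study.

```python
import numpy as np

def df_tstat(y):
    """t-statistic of the Dickey-Fuller regression dy_t = alpha + beta*y_{t-1} + e_t.
    Strongly negative values indicate mean reversion (stationarity)."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - 2)           # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)            # OLS covariance matrix
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(0)
e = rng.standard_normal(500)
random_walk = np.cumsum(e)        # unit root: shocks are permanent
ar1 = np.zeros(500)
for t in range(1, 500):
    ar1[t] = 0.5 * ar1[t - 1] + e[t]   # stationary AR(1): shocks die out

print(f"random walk t = {df_tstat(random_walk):.2f}, AR(1) t = {df_tstat(ar1):.2f}")
```

The stationary series yields a far more negative statistic than the random walk, which is exactly the distinction the study draws between transitory and permanent shocks.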

Keywords: unit root, electricity consumption, sectoral data, subnational data

Procedia PDF Downloads 398
5675 The Use of Artificial Intelligence in Diagnosis of Mastitis in Cows

Authors: Djeddi Khaled, Houssou Hind, Miloudi Abdellatif, Rabah Siham

Abstract:

In the field of veterinary medicine, there is a growing application of artificial intelligence (AI) for diagnosing bovine mastitis, a prevalent inflammatory disease in dairy cattle. AI technologies, such as automated milking systems, have streamlined the assessment of key metrics crucial for managing cow health during milking and identifying prevalent diseases, including mastitis. These automated milking systems empower farmers to implement automatic mastitis detection by analyzing indicators like milk yield, electrical conductivity, fat, protein, lactose, blood content in the milk, and milk flow rate. Furthermore, reports highlight the integration of somatic cell count (SCC), thermal infrared thermography, and diverse systems utilizing statistical models and machine learning techniques, including artificial neural networks, to enhance the overall efficiency and accuracy of mastitis detection. According to a review of 15 publications, machine learning technology can predict the risk and detect mastitis in cattle with an accuracy ranging from 87.62% to 98.10% and sensitivity and specificity ranging from 84.62% to 99.4% and 81.25% to 98.8%, respectively. Additionally, machine learning algorithms and microarray meta-analysis are utilized to identify mastitis genes in dairy cattle, providing insights into the underlying functional modules of mastitis disease. Moreover, AI applications can assist in developing predictive models that anticipate the likelihood of mastitis outbreaks based on factors such as environmental conditions, herd management practices, and animal health history. This proactive approach supports farmers in implementing preventive measures and optimizing herd health. By harnessing the power of artificial intelligence, the diagnosis of bovine mastitis can be significantly improved, enabling more effective management strategies and ultimately enhancing the health and productivity of dairy cattle. 
The integration of artificial intelligence presents valuable opportunities for the precise and early detection of mastitis, providing substantial benefits to the dairy industry.
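The reported sensitivity and specificity ranges are confusion-matrix ratios; a minimal sketch with invented counts for a hypothetical mastitis classifier:

```python
# Hypothetical confusion-matrix counts for a mastitis classifier (illustrative only).
tp, fn = 92, 8    # mastitic cows correctly / incorrectly flagged
tn, fp = 95, 5    # healthy cows correctly / incorrectly flagged

sensitivity = tp / (tp + fn)              # share of true mastitis cases detected
specificity = tn / (tn + fp)              # share of healthy cows correctly cleared
accuracy = (tp + tn) / (tp + tn + fp + fn)

print(f"sensitivity={sensitivity:.2%}, specificity={specificity:.2%}, accuracy={accuracy:.2%}")
```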

Keywords: artificial intelligence, automatic milking system, cattle, machine learning, mastitis

Procedia PDF Downloads 43
5674 Surface Roughness in the Incremental Forming of Drawing Quality Cold Rolled CR2 Steel Sheet

Authors: Zeradam Yeshiwas, A. Krishnaia

Abstract:

The aim of this study is to verify the resulting surface roughness of parts formed by the Single-Point Incremental Forming (SPIF) process for ISO 3574 Drawing Quality Cold Rolled CR2 steel. The chemical composition of drawing quality Cold Rolled CR2 steel is 0.12 percent carbon, 0.5 percent manganese, 0.035 percent sulfur, 0.04 percent phosphorus, and the remainder iron with negligible impurities. The experiments were performed on a 3-axis vertical CNC milling machining center equipped with a tool setup comprising a fixture and forming tools specifically designed and fabricated for the process. The CNC milling machine was used to transfer the tool path code generated in the Mastercam 2017 environment into three-dimensional motions through the linear incremental progress of the spindle. Blanks of Drawing Quality Cold Rolled CR2 steel sheet, 1 mm thick, were fixed along their periphery by a fixture, and hardened high-speed steel (HSS) tools with hemispherical tips of 8, 10, and 12 mm diameter were employed to fabricate sample parts. To investigate the surface roughness, hyperbolic-cone specimens were fabricated based on the chosen experimental design. The effect of process parameters on the surface roughness was studied using three important process parameters: tool diameter, feed rate, and step depth. The Taylor-Hobson Surtronic 3+ profilometer, in which a small tip is dragged across the surface while its deflection is recorded, was used to determine the surface roughness of the fabricated parts as the arithmetic mean deviation (Rₐ). Finally, the optimum process parameters and the main factor affecting surface roughness were found using the Taguchi design of experiments and ANOVA.
A Taguchi experimental design with three factors and three levels per factor was used; the standard orthogonal array L9 (3³) was selected using the array selection table. The arithmetic mean deviation (Rₐ) was measured for each combination of the control factors; four roughness measurements were taken per component and averaged. Since the lowest surface roughness is desired, the "smaller-the-better" equation was used for the calculation of the S/N ratio. The effect of each control factor on the surface roughness was analyzed with an S/N response table. Optimum surface roughness was obtained at a feed rate of 1500 mm/min, a tool radius of 12 mm, and a step depth of 0.5 mm. The ANOVA result shows that step depth is the most significant factor affecting surface roughness (91.1%).
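The "smaller-the-better" S/N ratio used above has the closed form S/N = -10·log10((1/n)·Σyᵢ²); a minimal sketch with hypothetical roughness readings:

```python
import math

def sn_smaller_the_better(measurements):
    """Taguchi 'smaller-the-better' signal-to-noise ratio:
    S/N = -10 * log10( (1/n) * sum(y_i^2) ).
    Maximizing S/N corresponds to minimizing the response (roughness)."""
    n = len(measurements)
    return -10.0 * math.log10(sum(y * y for y in measurements) / n)

# Four roughness readings (micrometres) for one trial; hypothetical values,
# mirroring the four measurements averaged per component in the study.
ra_trial = [0.8, 0.9, 0.85, 0.95]
print(f"S/N = {sn_smaller_the_better(ra_trial):.3f} dB")
```

The trial with the highest S/N value across the L9 array identifies the best parameter combination.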

Keywords: incremental forming, SPIF, drawing quality steel, surface roughness, roughness behavior

Procedia PDF Downloads 51
5673 The Dimensions of Culture in the Productive Internationalization Process: An Overview of Brazilian Companies in Bolivia

Authors: Renato Dias Baptista

Abstract:

The purpose of this paper is to analyze the elements of the cultural dimension in the internationalization process of Brazilian companies in Bolivia. The paper is based on research on two major Brazilian transnational companies with plants in Bolivia. To achieve this objective, the interconnective characteristics of culture in the process of productive internationalization were analyzed, aiming to highlight culture as a guiding element in light of the premises of Brazilian leadership in the integration and development of the continent. The analysis gives relevance to the culture of a country and its relations with internationalization.

Keywords: culture, transnational, internationalization, Bolivia, Brazil

Procedia PDF Downloads 413
5672 Design and Implementation of a Hardened Cryptographic Coprocessor with 128-bit RISC-V Core

Authors: Yashas Bedre Raghavendra, Pim Vullers

Abstract:

This study presents the design and implementation of an abstract cryptographic coprocessor, leveraging AMBA (Advanced Microcontroller Bus Architecture) protocols, namely APB (Advanced Peripheral Bus) and AHB (Advanced High-performance Bus), to enable seamless integration with the main CPU (central processing unit) and enhance the coprocessor's algorithm flexibility. The primary objective is to create a versatile coprocessor that can execute various cryptographic algorithms, including ECC (elliptic-curve cryptography), RSA (Rivest–Shamir–Adleman), and AES (Advanced Encryption Standard), while providing a robust and secure solution for modern secure embedded systems. To achieve this goal, the coprocessor is equipped with a tightly coupled memory (TCM) for rapid data access during cryptographic operations. The TCM is placed within the coprocessor, ensuring quick retrieval of critical data and optimizing overall performance. Additionally, the program memory is positioned outside the coprocessor, allowing for easy updates and reconfiguration, which enhances adaptability to future algorithm implementations. Direct links are employed instead of DMA (direct memory access) for data transfer, ensuring faster communication and reducing complexity. The AMBA-based communication architecture facilitates seamless interaction between the coprocessor and the main CPU, streamlining data flow and ensuring efficient utilization of system resources. The abstract nature of the coprocessor allows for easy integration of new cryptographic algorithms in the future. As the security landscape continues to evolve, the coprocessor can adapt and incorporate emerging algorithms, making it a future-proof solution for cryptographic processing. Furthermore, this study explores the addition of custom instructions to the RISC-V ISE (Instruction Set Extension) to enhance cryptographic operations.
By incorporating custom instructions specifically tailored for cryptographic algorithms, the coprocessor achieves higher efficiency and reduced cycles per instruction (CPI) compared to traditional instruction sets. The adoption of RISC-V 128-bit architecture significantly reduces the total number of instructions required for complex cryptographic tasks, leading to faster execution times and improved overall performance. Comparisons are made with 32-bit and 64-bit architectures, highlighting the advantages of the 128-bit architecture in terms of reduced instruction count and CPI. In conclusion, the abstract cryptographic coprocessor presented in this study offers significant advantages in terms of algorithm flexibility, security, and integration with the main CPU. By leveraging AMBA protocols and employing direct links for data transfer, the coprocessor achieves high-performance cryptographic operations without compromising system efficiency. With its TCM and external program memory, the coprocessor is capable of securely executing a wide range of cryptographic algorithms. This versatility and adaptability, coupled with the benefits of custom instructions and the 128-bit architecture, make it an invaluable asset for secure embedded systems, meeting the demands of modern cryptographic applications.
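The instruction-count argument can be illustrated with a back-of-envelope cycle model; every constant below (operand size, instructions per word, CPI) is a hypothetical assumption for illustration, not a measurement from the study.

```python
def cycles_for(word_bits, operand_bits=2048, insns_per_word_op=4, cpi=1.2):
    """Estimated cycles to process one large cryptographic operand.
    A wider register file means fewer machine words to touch, hence fewer
    instructions and fewer cycles. All constants are illustrative assumptions."""
    words = operand_bits // word_bits
    return words * insns_per_word_op * cpi

for name, width in [("RV32", 32), ("RV64", 64), ("RV128", 128)]:
    print(f"{name}: {cycles_for(width):.0f} cycles")
```

In this toy model a 128-bit datapath needs a quarter of the instructions of a 32-bit one for the same operand, the qualitative effect the abstract attributes to the 128-bit architecture.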

Keywords: abstract cryptographic coprocessor, AMBA protocols, ECC, RSA, AES, tightly coupled memory, secure embedded systems, RISC-V ISE, custom instructions, instruction count, cycles per instruction

Procedia PDF Downloads 56
5671 Multi-Objective Genetic Algorithm for Optimizing Machining Process Parameters

Authors: Dylan Santos De Pinho, Nabil Ouerhani

Abstract:

Energy consumption of machine-tools is becoming critical for machine-tool builders and end-users for economic, ecological, and legislative reasons. Many machine-tool builders are seeking solutions that reduce the energy consumption of machine-tools while preserving the same productivity rate and the same quality of machined parts. In this paper, we present the first results of a project conducted jointly by academic and industrial partners to reduce the energy consumption of a Swiss-type lathe. We employ genetic algorithms to find optimal machining parameters – the set of parameters that leads to the best trade-off between energy consumption, part quality and tool lifetime. Three main machining process parameters are considered in our optimization technique, namely depth of cut, spindle rotation speed, and material feed rate. These machining process parameters have been identified as the most influential ones in the configuration of the Swiss-type machining process. A state-of-the-art multi-objective genetic algorithm has been used. The algorithm combines three fitness functions – objective functions that evaluate a set of parameters against the three objectives: energy consumption, quality of the machined parts, and tool lifetime. In this paper, we focus on the fitness functions related to energy consumption. Four different energy-related fitness functions have been investigated and compared. The first fitness function refers to the Kienzle cutting force model. The second uses the Material Removal Rate (MRR) as an indicator of energy consumption. The two other fitness functions are non-deterministic, learning-based functions. One uses a simple neural network to learn the relation between the process parameters and the energy consumption from experimental data; the other uses Lasso regression to determine the same relation.
The goal is, then, to find out which fitness function best predicts the energy consumption of a Swiss-type machining process for a given set of machining process parameters. Once determined, these functions may be used for optimization purposes – determining the optimal machining process parameters that lead to minimum energy consumption. The performance of the four fitness functions has been evaluated. A Tornos DT13 Swiss-type lathe was used to carry out the experiments. A mechanical part including various Swiss-type machining operations was selected for the experiments. The evaluation process starts with generating a set of CNC (computer numerical control) programs for machining the part at hand, each considering a different set of machining process parameters. During the machining process, the power consumption of the spindle is measured. All collected data are assigned to the appropriate CNC program and thus to the corresponding set of machining process parameters. The evaluation consists of calculating the correlation between the normalized measured power consumption and the normalized power consumption predicted by each of the four fitness functions. The evaluation shows that the Lasso and neural network fitness functions have the highest correlation coefficient, at 97%. The Material Removal Rate (MRR) fitness function has a correlation coefficient of 90%, whereas the Kienzle-based fitness function has a correlation coefficient of 80%.
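The MRR fitness function and the correlation-based evaluation can be sketched as follows; the parameter sets and power readings are invented for illustration (MRR here is simply depth of cut × feed × cutting speed, a common simplification).

```python
import numpy as np

def mrr(depth_of_cut, feed_rate, cutting_speed):
    """Material removal rate used as a simple energy-consumption proxy."""
    return depth_of_cut * feed_rate * cutting_speed

def normalized_correlation(predicted, measured):
    """Pearson correlation between min-max normalized prediction and measurement,
    mirroring the paper's evaluation of candidate fitness functions."""
    p = (predicted - predicted.min()) / np.ptp(predicted)
    m = (measured - measured.min()) / np.ptp(measured)
    return np.corrcoef(p, m)[0, 1]

# Hypothetical parameter sets: depth of cut [mm], feed [mm/rev], cutting speed [m/min]
params = np.array([[0.5, 0.05, 80], [1.0, 0.10, 100], [1.5, 0.15, 120], [2.0, 0.20, 140]])
measured_power = np.array([210.0, 480.0, 950.0, 1600.0])  # invented spindle power [W]

predicted = mrr(params[:, 0], params[:, 1], params[:, 2])
print(f"correlation = {normalized_correlation(predicted, measured_power):.3f}")
```

A fitness function whose predictions correlate strongly with measured spindle power can stand in for direct power measurement inside the genetic algorithm.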

Keywords: adaptive machining, genetic algorithms, smart manufacturing, parameters optimization

Procedia PDF Downloads 135
5670 The Holistic Nursing WebQuest: An Interactive Teaching/Learning Strategy

Authors: Laura M. Schwarz

Abstract:

WebQuests are an internet-based interactive teaching/learning tool that uses a scaffolded methodology. WebQuests employ critical thinking, afford inquiry-based constructivist learning, and readily draw on Bloom's taxonomy. WebQuests have generally been used as instructional technology tools in primary and secondary education and have more recently grown in popularity in higher education. The study of the efficacy of WebQuests as an instructional approach, however, has been limited, particularly in nursing education. The purpose of this mixed-methods study was to determine nursing students' perceptions of the effectiveness of the Nursing WebQuest as a teaching/learning strategy for holistic nursing-related content. Quantitative findings (N=42) suggested that learners were active participants, used reflection, thought of new ideas, used analysis skills, discovered something new, and assessed the worth of something while taking part in the WebQuests. Qualitative findings indicated that participants saw as positives that WebQuests were easy to understand and navigate, clear and organized, interactive, a good alternative learning format, and drew on a variety of quality resources. Participants saw as drawbacks the additional time and work required and the occasional failed link, or a link causing them to lose their location in the WebQuest. Recommendations include using a larger sample size and more diverse populations from various programs and universities. In conclusion, WebQuests were found to be an effective teaching/learning tool, as positively assessed by study participants.

Keywords: holistic nursing, nursing education, teaching/learning strategy, WebQuests

Procedia PDF Downloads 116
5669 Integrating Service Learning into a Business Analytics Course: A Comparative Investigation

Authors: Gokhan Egilmez, Erika Hatfield, Julie Turner

Abstract:

In this study, we investigated the impact of service-learning integration on an undergraduate business analytics course from multiple perspectives, including academic proficiency, community awareness, engagement, social responsibility, and reflection. We assessed the impact of the service-learning experience using a survey developed primarily from the literature review and secondarily from input by an ad hoc group of researchers. We then administered the survey in two sections of the course, one of which served as a control group, and compared the results visually and statistically.
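A statistical comparison of the two sections could, for example, use Welch's t statistic for independent groups with unequal variances; the sketch below uses invented Likert scores, not the study's survey data.

```python
import math

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples with unequal variances,
    e.g. survey scores of a service-learning section vs. a control section."""
    na, nb = len(sample_a), len(sample_b)
    ma = sum(sample_a) / na
    mb = sum(sample_b) / nb
    va = sum((x - ma) ** 2 for x in sample_a) / (na - 1)  # sample variance
    vb = sum((x - mb) ** 2 for x in sample_b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

# Hypothetical 1-5 Likert responses (illustrative only)
service_learning = [4, 5, 4, 5, 5, 4, 5, 4]
control = [3, 4, 3, 3, 4, 3, 4, 3]

print(f"Welch t = {welch_t(service_learning, control):.2f}")
```

The statistic would then be compared against a t distribution with Welch-Satterthwaite degrees of freedom to judge significance.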

Keywords: business analytics, service learning, experiential education, statistical analysis, survey research

Procedia PDF Downloads 94
5668 Cheiloscopy: A Study on Predominant Lip Print Patterns among the Gujarati Population

Authors: Pooja Ahuja, Tejal Bhutani, M. S. Dahiya

Abstract:

Cheiloscopy, the study of lip prints, is a forensic investigation technique that deals with the identification of individuals based on lip patterns. The objective of this study is to determine the predominant lip print pattern found in the Gujarati population, to evaluate whether any sex difference exists, and to study the permanence of the pattern over a six-month period. The study comprised 100 healthy individuals (50 males and 50 females) in the age group of 18 to 25 years from the Gujarati population of the Gandhinagar region of Gujarat state, India. Using the Suzuki and Tsuchihashi classification, lip prints were divided into four quadrants and also classified on the basis of the peripheral shape of the lips. The materials used to record the lip prints were dark brown lipstick, cellophane tape, and white bond paper. Lipstick was applied uniformly, and lip prints were taken on the glued portion of the cellophane tape and then stuck onto white bond paper. The lip prints were analyzed with a magnifying lens and with a stereo microscope. On analysis of the subject population, the results showed the branched pattern, Type II (29.57%), to be the most predominant in the Gujarati population. The branched pattern Type II (35.60%) and the long vertical Type I (28.28%) were most prevalent in males and females, respectively, and large full lips were most predominant in both sexes. The study concludes that lip prints in any form can be an effective tool for the identification of an individual in closed or open groups.

Keywords: cheiloscopy, lip pattern, predominant, Gujarati population

Procedia PDF Downloads 283
5667 The Roman Fora in North Africa: Towards a Supportive Protocol for the Decision on Morphological Restitution

Authors: Dhouha Laribi Galalou, Najla Allani Bouhoula, Atef Hammouda

Abstract:

This research delves into the fundamental question of the morphological restitution of built archaeology, in order to place it in its paradigmatic context and to seek answers to it. Indeed, understanding the object of study, analyzing it, and devising a methodology for solving the morphological problem posed are manageable only by means of a thoughtful strategy that draws on well-defined epistemological scaffolding. In this stream, the crisis of natural reasoning in archaeology has generated multiple changes in the field, ranging from the use of new tools to the integration of archaeological information systems, where urbanization involves the interplay of several disciplines. The built archaeological object is also an architectural and morphological object: a set of articulated elementary data whose understanding can be approached from a logicist point of view. Morphological restitution is no exception to the rule, and the exchange between the different disciplines uses the capacity of each to frame reflection on the incomplete elements of a given architecture or on its different phases and multiple states of existence. The logicist sequence is furnished by the set of scattered or destroyed elements found, but also by what can be called a rule base, which contains the rules for the architectural construction of the object. The knowledge base, built from the archaeological literature, also provides a reference that enters into the search for forms and articulations. The choice of the Roman forum in North Africa is justified by the great urban and architectural characteristics of this entity. Research on the forum draws on a fairly large knowledge base and also provides the researcher with material to study, from a morphological and architectural point of view, from the scale of the city down to the architectural detail.
The experimentation of the knowledge deduced at the paradigmatic level, as well as the deduction of an analysis model, is then carried out on the basis of a well-defined context, which frames the experimentation from the elaboration of the morphological information container attached to the rule base and the knowledge base. The use of logicist analysis and artificial intelligence has allowed us first to question aspects already known, in order to measure the credibility of our system, which remains above all a decision support tool for the morphological restitution of Roman fora in North Africa. This paper presents a first experimentation of the model elaborated during this research, a model framed by a paradigmatic discussion that tries to position the research in relation to existing paradigmatic and experimental knowledge on the issue.
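The rule-base/knowledge-base idea can be reduced to a toy check: a restitution hypothesis is retained only if it satisfies every stored construction rule. The rule and the values below are hypothetical illustrations, not the paper's actual knowledge base.

```python
# Minimal sketch of a decision-support check for restitution hypotheses.
# Knowledge base: an assumed admissible range for a classical proportion rule
# (hypothetical numbers chosen for illustration only).
KNOWLEDGE_BASE = {"column_height_to_diameter": (8.0, 10.0)}

def check_hypothesis(column_height, column_diameter):
    """Accept a restitution hypothesis only if it satisfies every rule
    in the knowledge base (here, a single proportion rule)."""
    low, high = KNOWLEDGE_BASE["column_height_to_diameter"]
    ratio = column_height / column_diameter
    return low <= ratio <= high

# A surviving fragment fixes the diameter; two candidate restitution heights:
print(check_hypothesis(5.4, 0.6))   # ratio 9.0: plausible under the rule
print(check_hypothesis(3.0, 0.6))   # ratio 5.0: rejected by the rule base
```

In a real system, many such rules (and the literature-derived knowledge base) would be combined to rank candidate restitutions rather than simply accept or reject them.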

Keywords: classical reasoning, logicist reasoning, archaeology, architecture, Roman forum, morphology, calculation

Procedia PDF Downloads 132
5666 Modeling of the Biodegradation Performance of a Membrane Bioreactor to Enhance Water Reuse in the Agri-Food Industry: Poultry Slaughterhouse as an Example

Authors: Masmoudi Jabri Khaoula, Zitouni Hana, Bousselmi Latifa, Akrout Hanen

Abstract:

Mathematical modeling has become an essential tool for sustainable wastewater management, particularly for the simulation and the optimization of complex processes involved in activated sludge systems. In this context, the activated sludge model (ASM3h) was used for the simulation of a Biological Membrane Reactor (MBR) as it includes the integration of biological wastewater treatment and physical separation by membrane filtration. In this study, the MBR with a useful volume of 12.5 L was fed continuously with poultry slaughterhouse wastewater (PSWW) for 50 days at a feed rate of 2 L/h and for a hydraulic retention time (HRT) of 6.25h. Throughout its operation, High removal efficiency was observed for the removal of organic pollutants in terms of COD with 84% of efficiency. Moreover, the MBR has generated a treated effluent which fits with the limits of discharge into the public sewer according to the Tunisian standards which were set in March 2018. In fact, for the nitrogenous compounds, average concentrations of nitrate and nitrite in the permeat reached 0.26±0.3 mg. L-1 and 2.2±2.53 mg. L-1, respectively. The simulation of the MBR process was performed using SIMBA software v 5.0. The state variables employed in the steady state calibration of the ASM3h were determined using physical and respirometric methods. The model calibration was performed using experimental data obtained during the first 20 days of the MBR operation. Afterwards, kinetic parameters of the model were adjusted and the simulated values of COD, N-NH4+and N- NOx were compared with those reported from the experiment. A good prediction was observed for the COD, N-NH4+and N- NOx concentrations with 467 g COD/m³, 110.2 g N/m³, 3.2 g N/m³ compared to the experimental data which were 436.4 g COD/m³, 114.7 g N/m³ and 3 g N/m³, respectively. For the validation of the model under dynamic simulation, the results of the experiments obtained during the second treatment phase of 30 days were used. 
The model was shown to simulate the dynamic conditions accurately, yielding a similar pattern in the variation of the COD concentration. On the other hand, the simulation underestimated the N-NH4+ concentration compared to the experimental results, and the measured N-NO3 concentrations were lower than the predicted ones. This difference could be explained by the fact that the ASM models were mainly designed to simulate biological processes in conventional activated sludge systems. In addition, the autotrophic bacteria may require more treatment time to achieve complete and stable nitrification. Overall, this study demonstrated the effectiveness of mathematical modeling in predicting the performance of MBR systems with respect to organic pollution; the model can be further improved to simulate nutrient removal over a longer treatment period.
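As a rough illustration of the steady-state calibration results reported above, the agreement between simulated and measured values can be quantified with a relative-error check. The numbers below are taken from the abstract; the error metric itself is an illustrative choice, not part of the original study:

```python
# Simulated vs. experimental steady-state values from the ASM3h calibration
# (COD in g COD/m3, nitrogen species in g N/m3), as reported in the abstract.
simulated = {"COD": 467.0, "N-NH4+": 110.2, "N-NOx": 3.2}
measured = {"COD": 436.4, "N-NH4+": 114.7, "N-NOx": 3.0}

# Relative error of each simulated value against its measurement, in percent.
for species in simulated:
    rel_err = abs(simulated[species] - measured[species]) / measured[species]
    print(f"{species}: {100 * rel_err:.1f}% relative error")
```

All three deviations stay within roughly 7%, which is consistent with the "good prediction" the authors report.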

Keywords: activated sludge model (ASM3h), membrane bioreactor (MBR), poultry slaughterhouse wastewater (PSWW), reuse

Procedia PDF Downloads 41
5665 A Pilot Study on the Development and Validation of an Instrument to Evaluate Inpatient Beliefs, Expectations and Attitudes toward Reflexology (IBEAR)-16

Authors: Samuel Attias, Elad Schiff, Zahi Arnon, Eran Ben-Arye, Yael Keshet, Ibrahim Matter, Boker Lital Keinan

Abstract:

Background: Despite the extensive use of manual therapies, and of reflexology in particular, no validated tools have been developed to evaluate patients' beliefs, attitudes, and expectations regarding reflexology. Such tools are essential for improving the results of reflexology treatment by better adjusting it to the patients' attitudes and expectations; they also enable assessing correlations with the clinical results of interventional studies using reflexology. Methods: The IBEAR (Inpatient Beliefs, Expectations and Attitudes toward Reflexology) tool contains 25 questions (8 demographic and 17 specifically addressing reflexology) and was constructed in several stages: brainstorming by a multidisciplinary team of experts; evaluation of each proposed question by the expert team; and assessment of the experts' degree of agreement on each question, based on a 1-7 Likert scale (1 – do not agree at all; 7 – agree completely). Cronbach's alpha was computed to evaluate the questionnaire's reliability, and factor analysis was used for further validation (228 patients). The questionnaire was tested and re-tested (48 h apart) on a group of 199 patients to assure clarity and reliability, using the Pearson coefficient and the Kappa test, and was modified into its final form based on these results. Results: After its construction, the IBEAR questionnaire passed the expert group's preliminary consensus, evaluation of the questions' clarity (from 5.1 to 7.0), inner validation (from 5.5 to 7), and structural validation (from 5.5 to 6.75). Factor analysis pointed to two content domains: 4 questions discussing attitudes and expectations versus 5 questions on beliefs and attitudes. Of the 221 questionnaires collected, Cronbach's alpha was calculated on nine questions relating to beliefs, expectations, and attitudes regarding reflexology; it stood at 0.716 (satisfactory reliability).
At the test-retest stage, 199 research participants filled in the questionnaire a second time. The Pearson coefficient for all questions ranged between 0.73 and 0.94 (good to excellent reliability). For dichotomous answers, Kappa scores ranged between 0.66 and 1.0 (moderate to high). One question was removed from the IBEAR following questionnaire validation. Conclusions: The present study provides evidence that the proposed IBEAR-16 questionnaire is a valid and reliable tool for characterizing potential reflexology patients and may be effectively used in settings that include the evaluation of inpatients' beliefs, expectations, and attitudes toward reflexology.
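Cronbach's alpha, used above to assess the questionnaire's internal consistency, can be computed directly from the item scores via the standard formula alpha = k/(k-1) × (1 − Σ item variances / variance of total scores). A minimal sketch, with made-up 1-7 Likert responses for illustration (not IBEAR data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a questionnaire.

    items: one list of respondent scores per question (all equal length).
    """
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Each respondent's total score across all k items.
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(var(item) for item in items) / var(totals))

# Three hypothetical Likert items answered by five respondents.
q1 = [5, 6, 4, 7, 5]
q2 = [4, 6, 4, 6, 5]
q3 = [5, 7, 3, 7, 4]
print(f"alpha = {cronbach_alpha([q1, q2, q3]):.3f}")
```

Values above roughly 0.7, such as the 0.716 reported here, are conventionally read as satisfactory reliability.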

Keywords: reflexology, attitude, expectation, belief, CAM, inpatient

Procedia PDF Downloads 218
5664 Easymodel: Web-based Bioinformatics Software for Protein Modeling Based on Modeller

Authors: Alireza Dantism

Abstract:

Presently, describing the function of a protein sequence is one of the most common problems in biology. This problem can often be approached by studying the three-dimensional structure of the protein. In the absence of an experimentally determined structure, comparative modeling can provide a useful three-dimensional model of the protein based on at least one known protein structure. Comparative modeling predicts the three-dimensional structure of a given protein sequence (the target) mainly from its alignment with one or more proteins of known structure (the templates). It consists of five main steps: 1. identification of similarity between the target sequence and at least one known template structure; 2. alignment of the target sequence and the template(s); 3. building a model based on the alignment with the selected template(s); 4. prediction of model errors; 5. optimization of the built model. Many computer programs and web servers automate the comparative modeling process. One of their most important advantages is that they make comparative modeling available to both experts and non-experts, who can easily perform their own modeling without programming knowledge; some experts, however, prefer to carry out the modeling manually through programming, since this allows them to maximize the accuracy of their models. In this study, a web-based tool, called EasyModel, has been designed to predict the tertiary structure of proteins using the PHP and Python programming languages. According to the user's inputs, EasyModel receives the unknown sequence of interest (the target), a protein structure file (the template) that shares a percentage of sequence similarity with the target, and related settings; it then predicts the tertiary structure of the target and presents the results in the form of graphs and constructed protein files.
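The first step of the pipeline above, assessing similarity between target and template, commonly reduces to computing percent sequence identity over an alignment. A small self-contained sketch of that calculation follows; it is a simplification (tools such as EasyModel would rely on dedicated alignment software), and the sequences shown are invented:

```python
def percent_identity(target, template):
    """Percent identity between two pre-aligned sequences.

    target, template: equal-length strings in which '-' marks a gap;
    columns containing a gap are excluded from the calculation.
    """
    pairs = [(a, b) for a, b in zip(target, template)
             if a != "-" and b != "-"]
    if not pairs:
        return 0.0
    matches = sum(1 for a, b in pairs if a == b)
    return 100.0 * matches / len(pairs)

# Toy aligned fragment: 5 of the 6 ungapped columns match.
print(percent_identity("MKV-LSA", "MKVALTA"))
```

In practice, a target-template identity well above the so-called twilight zone (roughly 30%) is what makes a template usable for comparative modeling.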

Keywords: structural bioinformatics, protein tertiary structure prediction, modeling, comparative modeling, modeller

Procedia PDF Downloads 76