Search results for: biotechnological processes
4891 Virtual Conciliation in Colombia: Evaluation of Maturity Level within the Framework of E-Government
Authors: Jenny Paola Forero Pachón, Sonia Cristina Gamboa Sarmiento, Luis Carlos Gómez Flórez
Abstract:
The Colombian government has defined an e-government strategy to take advantage of Information Technologies (IT) in order to contribute to the building of a more efficient, transparent and participative State that provides better services to citizens and businesses. In this regard, the Justice sector is one of the government sectors where IT has generated the greatest expectations, considering the country's backlog of judicial processes. This situation has led to the search for alternative forms of access to justice that speed up the process at a low cost for citizens. To this end, the Colombian government has authorized the use of Alternative Dispute Resolution methods (ADR), a remedy where disputes can be resolved more quickly compared to judicial processes while facilitating greater communication between the parties, without recourse to judicial authority. One of these methods is conciliation, which includes a special modality, known as virtual conciliation, that takes advantage of IT for its development. In this modality, the conciliation is supported by information systems, applications or platforms, and communication between the parties is carried out through them. This paper evaluates the maturity level of the virtual conciliation service within the framework of this strategy. The evaluation is carried out considering Shahkooh's 5-phase model for e-government. As a result, it is evident that in the context of conciliation, maturity does not reach the level required by the model for the service to be considered virtual conciliation; therefore, it is necessary to define strategies to maximize the potential of IT in this context. Keywords: alternative dispute resolution, e-government, evaluation of maturity, Shahkooh model, virtual conciliation
Procedia PDF Downloads 253
4890 Insight2OSC: Using Electroencephalography (EEG) Rhythms from the Emotiv Insight for Musical Composition via Open Sound Control (OSC)
Authors: Constanza Levicán, Andrés Aparicio, Rodrigo F. Cádiz
Abstract:
The artistic usage of Brain-computer interfaces (BCI), initially intended for medical purposes, has increased in the past few years as they have become more affordable and available to the general population. One interesting question that arises from this practice is whether it is possible to compose or perform music by using only the brain as a musical instrument. In order to approach this question, we propose a BCI for musical composition, based on the representation of some mental states as the musician thinks about sounds. We developed software, called Insight2OSC, that allows the usage of the Emotiv Insight device as a musical instrument, by sending the EEG data to audio processing software such as MaxMSP through the OSC protocol. We provide two compositional applications bundled with the software, which we call Mapping your Mental State and Thinking On. The signals produced by the brain have different frequencies (or rhythms) depending on the level of activity, and they are classified as one of the following waves: delta (0.5-4 Hz), theta (4-8 Hz), alpha (8-13 Hz), beta (13-30 Hz), gamma (30-50 Hz). These rhythms have been found to be related to some recognizable mental states. For example, the delta rhythm is predominant in deep sleep, while beta and gamma rhythms have higher amplitudes when the person is awake and very concentrated. Our first application (Mapping your Mental State) produces different sounds representing the mental state of the person: focused, active, relaxed or in a state similar to deep sleep, based on the selection of the dominant rhythms provided by the EEG device. The second application relies on the physiology of the brain, which is divided into several lobes: frontal, temporal, parietal and occipital. The frontal lobe is related to abstract thinking and high-level functions, the parietal lobe conveys the stimuli of the body senses, the occipital lobe contains the primary visual cortex and processes visual stimuli, and the temporal lobe processes auditory information and is important for memory tasks. In consequence, our second application (Thinking On) processes the audio output depending on the user's brain activity as it activates a specific area of the brain that can be measured using the Insight device. Keywords: BCI, music composition, emotiv insight, OSC
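As an illustration of the data path described above, the following is a minimal sketch, not the authors' implementation, of how per-band EEG powers could be forwarded to audio software such as MaxMSP over OSC. The library (python-osc), the OSC addresses, the port number and the example band powers are all assumptions made for this sketch.

```python
# Minimal sketch (not the authors' implementation) of forwarding EEG band
# powers to an audio environment such as MaxMSP over OSC. The OSC addresses,
# port and the band-power values are illustrative assumptions.
from pythonosc import udp_client  # pip install python-osc

BANDS = ["delta", "theta", "alpha", "beta", "gamma"]  # 0.5-4, 4-8, 8-13, 13-30, 30-50 Hz

client = udp_client.SimpleUDPClient("127.0.0.1", 7400)  # hypothetical MaxMSP port

def send_band_powers(powers):
    """Send one value per EEG rhythm, e.g. powers = {'alpha': 0.42, ...}."""
    for band in BANDS:
        client.send_message(f"/eeg/{band}", float(powers.get(band, 0.0)))

def dominant_rhythm(powers):
    """Pick the dominant rhythm, as used to select a 'mental state' sound."""
    return max(BANDS, key=lambda b: powers.get(b, 0.0))

if __name__ == "__main__":
    example = {"delta": 0.1, "theta": 0.2, "alpha": 0.5, "beta": 0.15, "gamma": 0.05}
    send_band_powers(example)
    print("dominant:", dominant_rhythm(example))  # -> alpha (relaxed state)
```

The dominant-rhythm selection mirrors the idea behind Mapping your Mental State: the band with the highest power is taken as a proxy for the current mental state.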
Procedia PDF Downloads 322
4889 Data-Driven Performance Evaluation of Surgical Doctors Based on Fuzzy Analytic Hierarchy Processes
Authors: Yuguang Gao, Qiang Yang, Yanpeng Zhang, Mingtao Deng
Abstract:
To enhance the safety, quality and efficiency of healthcare services provided by surgical doctors, we propose a comprehensive approach to the performance evaluation of individual doctors by incorporating insights from performance data as well as views of different stakeholders in the hospital. Exploratory factor analysis was first performed on collective multidimensional performance data of surgical doctors, where key factors were extracted that encompass assessment of professional experience and service performance. A two-level indicator system was then constructed, for which we developed a weighted interval-valued spherical fuzzy analytic hierarchy process to analyze the relative importance of the indicators while handling subjectivity and disparity in the decision-making of multiple parties involved. Our analytical results reveal that, for the key factors identified as instrumental for evaluating surgical doctors’ performance, the overall importance of clinical workload and complexity of service are valued more than capacity of service and professional experience, while the efficiency of resource consumption ranks comparatively the lowest in importance. We also provide a retrospective case study to illustrate the effectiveness and robustness of our quantitative evaluation model by assigning meaningful performance ratings to individual doctors based on the weights developed through our approach.Keywords: analytic hierarchy processes, factor analysis, fuzzy logic, performance evaluation
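For readers unfamiliar with the underlying method, the following is a small sketch of the classical (crisp) AHP weight derivation via the principal eigenvector, with an invented pairwise comparison matrix. It only illustrates the basic idea; the study itself uses a weighted interval-valued spherical fuzzy AHP, which is not reproduced here.

```python
# Sketch of classical AHP weight derivation (principal eigenvector method).
# The paper uses a weighted interval-valued spherical fuzzy AHP; this crisp
# version only illustrates the underlying idea. Pairwise values are invented.
import numpy as np

# Hypothetical pairwise comparison of four indicators:
# [workload, complexity of service, capacity of service, resource efficiency]
A = np.array([
    [1,   2,   3,   5],
    [1/2, 1,   2,   4],
    [1/3, 1/2, 1,   3],
    [1/5, 1/4, 1/3, 1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()                  # normalised priority weights

# Consistency ratio (random index RI = 0.90 for n = 4)
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
CR = CI / 0.90
print("weights:", np.round(weights, 3), "CR:", round(CR, 3))
```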
Procedia PDF Downloads 58
4888 Haiti and Power Symbolic: An Analysis Understanding of the Impact of the Presidential Political Speeches
Authors: Marc Arthur Bien Aimé, Julio da Silveira Moreira
Abstract:
This study examines political speech in Haiti over the course of the decade 2011-2021, focusing on the speeches of presidents Michel J. Martelly and Jovenel Moïse and their impact on the collective consciousness. Using a qualitative approach, we analyzed the presidential speeches delivered in response to the country's political instability, as well as interviews with a group of 20 Haitians living in Port-au-Prince. Our results highlight the complex relationship between politics, collective consciousness, and the influence of imperialist powers. We show that Haiti's disastrous social and political situation is driven by personal political interests and the absence of a state political project. Moreover, the presidential speeches analyzed are largely empty of meaning, reducing concepts such as social progress and justice to mere words. This political rhetoric contributes to the symbolic domination of the Haitian population. This study is also linked to the theme "Constitutions, democratic processes and critique of the state in Latin America," emphasizing the importance of the analysis of political speech for understanding the complexities of democratic processes and critiques of the State in the Latin American region. We suggest future research to deepen our understanding of these political dynamics and their impact on public policies and on constitutional developments throughout Latin America. Keywords: political discourse, collective consciousness, social inequality, democratic processes, constitutions, Haiti
Procedia PDF Downloads 61
4887 Comparative Study for Biodiesel Production Using a Batch and a Semi-Continuous Flow Reactor
Authors: S. S. L. Andrade, E. A. Souza, L. C. L. Santos, C. Moraes, A. K. C. L. Lobato
Abstract:
Biodiesel may be produced through the transesterification reaction (or alcoholysis), that is, the transformation of a long-chain fatty acid into an alkyl ester. This reaction can occur in the presence of acid, alkaline, or enzymatic catalysts. Currently, for industrial processes, biodiesel is produced by the alkaline route. The alkalis most commonly used in these processes are the hydroxides and methoxides of sodium and potassium. In this work, biodiesel production was conducted in two different systems. The first consisted of a batch reactor operating with a traditional washing system, and the second consisted of a semi-continuous flow reactor operating with a membrane separation system. Potassium hydroxide was used as the catalyst at a concentration of 1% by weight, the oil/alcohol molar ratio was 1/9, and the temperature was 55 °C. Tests were performed using soybean and palm oils, and the ester conversion results were compared for both systems. It can be seen that the results for both oils are similar when using the batch reactor or the semi-continuous flow reactor. The use of the semi-continuous flow reactor allows the removal of the formed products. Thus, in the case of a reversible reaction, with the removal of reaction products, the concentration of the reagents becomes higher and the equilibrium is shifted towards the formation of more products. The highest conversion to esters with soybean and palm oil using the batch reactor was approximately 98%. In comparison, a conversion of 99% was observed when using the same operating conditions in the semi-continuous flow reactor. Keywords: biodiesel, batch reactor, semi-continuous flow reactor, transesterification
Procedia PDF Downloads 384
4886 Evaluation of Sloshing in Process Equipment for Floating Cryogenic Application
Authors: Bo Jin
Abstract:
A variety of process equipment having flow in and out is widely used in industrial land-based cryogenic facilities. In some of this equipment, such as vapor-liquid separator, a liquid level is established during the steady operation. As the implementation of such industrial processes extends to off-shore floating facilities, it is important to investigate the effect of sea motion on the process equipment partially filled with liquid. One important aspect to consider is the occurrence of sloshing therein. The flow characteristics are different from the classical study of sloshing, where the fluid is enclosed inside a vessel (e.g., storage tank) with no flow in or out. Liquid inside process equipment continuously flows in and out of the system. To understand this key difference, a Computational Fluid Dynamics (CFD) model is developed to simulate the liquid motion inside a partially filled cylinder with and without continuous flow in and out. For a partially filled vertical cylinder without any continuous flow in and out, the CFD model is found to be able to capture the well-known sloshing behavior documented in the literature. For the cylinder with a continuous steady flow in and out, the CFD simulation results demonstrate that the continuous flow suppresses sloshing. Given typical cryogenic fluid has very low viscosity, an analysis based on potential flow theory is developed to explain why flow into and out of the cylinder changes the natural frequency of the system and thereby suppresses sloshing. This analysis further validates the CFD results.Keywords: computational fluid dynamics, CFD, cryogenic process equipment, off-shore floating processes, sloshing
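As background to the analysis mentioned above, the following sketch evaluates the standard linear potential-flow sloshing frequencies of a partially filled vertical cylinder. It is not the paper's CFD or through-flow analysis, and the radius and liquid depth used here are illustrative assumptions.

```python
# Sketch: linear potential-flow sloshing frequencies of a partially filled
# vertical cylinder (standard result; the paper's analysis additionally
# accounts for flow in and out). Radius and fill depth are illustrative.
import numpy as np
from scipy.special import jnp_zeros  # zeros of J_m'(x)

g = 9.81          # m/s^2
R = 0.5           # m, cylinder radius (assumed)
h = 0.8           # m, liquid depth (assumed)

def sloshing_frequencies(m=1, n_modes=3):
    """Natural frequencies f_mn [Hz] for azimuthal order m."""
    xi = jnp_zeros(m, n_modes)                    # roots xi_mn of J_m'
    omega2 = (g * xi / R) * np.tanh(xi * h / R)   # dispersion relation
    return np.sqrt(omega2) / (2 * np.pi)

print("first antisymmetric (m=1) modes [Hz]:", np.round(sloshing_frequencies(), 3))
```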
Procedia PDF Downloads 137
4885 Leachate Discharges: Review Treatment Techniques
Authors: Abdelkader Anouzla, Soukaina Bouaouda, Roukaya Bouyakhsass, Salah Souabi, Abdeslam Taleb
Abstract:
During storage and under the combined action of rainwater and natural fermentation, stored wastes produce over 800,000 m3 of landfill leachates. Due to population growth and changing global economic activities, the amount of waste generated constantly increases, producing ever more significant volumes of leachate. Leachate, when percolating into the soil, can negatively impact soil, surface water, groundwater, and the overall environment and human life. Because of its high pollutant load, the leachate must be treated before being released into the environment. This article reviews the different leachate treatment techniques reported up to September 2022. Different techniques can be used for this purpose, such as biological, physical-chemical, and membrane methods. Young leachate is biodegradable; in contrast, biological processes lose their effectiveness with leachate aging, as aged leachates are characterized by high ammonia nitrogen concentrations that inhibit biological activity. Most physical-chemical treatments serve as pre-treatment or post-treatment to complement conventional treatment processes or to remove specific contaminants. After the introduction, the different types of pollutants present in leachates and their impacts are presented, followed by a discussion highlighting the advantages and disadvantages of the various treatments, whether biological, physicochemical, or membrane-based. From this work, due to their simplicity and reasonable cost compared to other treatment procedures, biological treatments offer the most suitable alternative for limiting the effects produced by the pollutants in landfill leachates. Keywords: landfill leachate, landfill pollution, impact, wastewater
Procedia PDF Downloads 89
4884 UV-Vis Spectroscopy as a Tool for Online Tar Measurements in Wood Gasification Processes
Authors: Philip Edinger, Christian Ludwig
Abstract:
The formation and control of tars remain one of the major challenges in the implementation of biomass gasification technologies. Robust, on-line analytical methods are needed to investigate the fate of tar compounds when different measures for their reduction are applied. This work establishes an on-line UV-Vis method, based on a liquid quench sampling system, to monitor tar compounds in biomass gasification processes. Recorded spectra from the liquid phase were analyzed for their tar composition by means of a classical least squares (CLS) and partial least squares (PLS) approach. This allowed for the detection of UV-Vis active tar compounds with detection limits in the low part per million by volume (ppmV) region. The developed method was then applied to two case studies. The first involved a lab-scale reactor, intended to investigate the decomposition of a limited number of tar compounds across a catalyst. The second study involved a gas scrubber as part of a pilot scale wood gasification plant. Tar compound quantification results showed good agreement with off-line based reference methods (GC-FID) when the complexity of tar composition was limited. The two case studies show that the developed method can provide rapid, qualitative information on the tar composition for the purpose of process monitoring. In cases with a limited number of tar species, quantitative information about the individual tar compound concentrations provides an additional benefit of the analytical method.Keywords: biomass gasification, on-line, tar, UV-Vis
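To illustrate the CLS step described above, the following is a minimal sketch in which a measured spectrum is modelled as a linear combination of reference spectra and solved by least squares. The reference spectra, noise level and concentrations are synthetic and only stand in for calibrated tar references.

```python
# Sketch of the classical least squares (CLS) step: a measured UV-Vis
# spectrum is modelled as a linear mix of reference tar spectra. Spectra
# here are synthetic Gaussians; real work would use calibrated references.
import numpy as np

wavelengths = np.linspace(200, 400, 201)         # nm

def peak(center, width):
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

# Reference spectra (columns) for three hypothetical tar compounds
K = np.column_stack([peak(255, 10), peak(290, 12), peak(330, 15)])

true_c = np.array([12.0, 5.0, 2.0])              # "true" concentrations (ppmV)
measured = K @ true_c + np.random.normal(0, 0.02, wavelengths.size)

# CLS: solve K c ~= measured in the least-squares sense
c_hat, *_ = np.linalg.lstsq(K, measured, rcond=None)
print("estimated concentrations:", np.round(c_hat, 2))
```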
Procedia PDF Downloads 259
4883 Embodied Communication - Examining Multimodal Actions in a Digital Primary School Project
Authors: Anne Öman
Abstract:
Today in Sweden and in other countries, a variety of digital artefacts, such as laptops, tablets and interactive whiteboards, are being used at all school levels. From an educational perspective, digital artefacts challenge traditional teaching because they provide a range of modes for expression and communication and are not limited to the traditional medium of paper. Digital technologies offer new opportunities for representations and physical interactions with objects, which put forward the role of the body in interaction and learning. From a multimodal perspective, the emphasis is on the use of multiple semiotic resources for meaning-making, and the study presented here has examined the differential use of semiotic resources by pupils interacting in a digitally designed task in a primary school context. The instances analyzed in this paper come from a case study where the learning task was to create an advertising film using film software. The study in focus involves the analysis of a single case with the emphasis on the examination of the classroom setting. The research design used in this paper was based on a micro-ethnographic perspective, and the empirical material was collected through video recordings of small-group work in order to explore pupils' communication within the group activity. The designed task described here allowed students to build, share, collaborate upon and publish the redesigned products. The analysis illustrates the variety of communicative modes such as body position, gestures, visualizations and speech, and the interaction between these modes and the representations made by the pupils. The findings pointed out the importance of embodied communication during the small-group processes from a learning perspective, as well as a pedagogical understanding of pupils' representations, which were similar from a cultural literacy perspective. These findings open up discussions with further implications for school practice concerning the small-group processes as well as the redesigned products. More broadly, the findings point to how multimodal interactions shape the learning experience in meaning-making processes, taking into account that language in a globalized society is more than reading and writing skills. Keywords: communicative learning, interactive learning environments, pedagogical issues, primary school education
Procedia PDF Downloads 408
4882 Effects of Surface Roughness on a Unimorph Piezoelectric Micro-Electro-Mechanical Systems Vibrational Energy Harvester Using Finite Element Method Modeling
Authors: Jean Marriz M. Manzano, Marc D. Rosales, Magdaleno R. Vasquez Jr., Maria Theresa G. De Leon
Abstract:
This paper discusses the effects of surface roughness on a cantilever beam vibrational energy harvester. A silicon sample was fabricated using MEMS fabrication processes. When etching silicon using deep reactive ion etching (DRIE) at large etch depths, rougher surfaces are observed as a result of increased response in process pressure, amount of coil power and increased helium backside cooling readings. To account for the effects of surface roughness on the characteristics of the cantilever beam, finite element method (FEM) modeling was performed using actual roughness data from fabricated samples. It was found that when etching about 550um of silicon, root mean square roughness parameter, Sq, varies by 1 to 3 um (at 100um thick) across a 6-inch wafer. Given this Sq variation, FEM simulations predict an 8 to148 Hz shift in the resonant frequency while having no significant effect on the output power. The significant shift in the resonant frequency implies that careful consideration of surface roughness from fabrication processes must be done when designing energy harvesters.Keywords: deep reactive ion etching, finite element method, microelectromechanical systems, multiphysics analysis, surface roughness, vibrational energy harvester
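As a rough illustration of why a few microns of geometric variation shift the resonant frequency, the following sketch applies the analytical Euler-Bernoulli formula for the fundamental mode of a uniform cantilever and varies the effective thickness. It is a stand-in for the paper's FEM treatment of measured roughness, and all dimensions are assumed.

```python
# Sketch: first resonant frequency of a uniform silicon cantilever from
# Euler-Bernoulli beam theory, and its sensitivity to a few-micron change
# in effective thickness (a crude stand-in for the paper's FEM treatment
# of DRIE surface roughness). Dimensions are illustrative.
import numpy as np

E, rho = 169e9, 2330.0        # Si Young's modulus [Pa], density [kg/m^3]
L, w = 1000e-6, 500e-6        # beam length and width [m] (assumed)
lam1 = 1.8751                 # first-mode eigenvalue of a cantilever

def f1(t):
    """Fundamental frequency [Hz] for thickness t [m]."""
    I = w * t**3 / 12.0        # second moment of area
    A = w * t                  # cross-sectional area
    return (lam1**2 / (2 * np.pi)) * np.sqrt(E * I / (rho * A * L**4))

for t_um in (97.0, 100.0, 103.0):                 # +/- 3 um thickness variation
    print(f"t = {t_um:5.1f} um -> f1 = {f1(t_um * 1e-6):9.1f} Hz")
```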
Procedia PDF Downloads 121
4881 R-Killer: An Email-Based Ransomware Protection Tool
Authors: B. Lokuketagoda, M. Weerakoon, U. Madushan, A. N. Senaratne, K. Y. Abeywardena
Abstract:
Ransomware has become a common threat in the past few years, and recent threat reports show continued growth in ransomware infections. Researchers have identified different variants of ransomware families since 2015. Users' lack of knowledge about the threat is a major concern. Ransomware detection methodologies are still maturing across the industry. Email is the easiest method of delivering ransomware to its victims. Uninformed users tend to click on links and attachments without much consideration, assuming the emails are genuine. As a solution, this paper introduces the R-Killer ransomware detection tool. The tool can be integrated with existing email services. The Core Detection Engine (CDE) discussed in the paper focuses on separating suspicious samples from emails and handling them until a decision is made regarding the suspicious mail. It has the capability of preventing the execution of identified ransomware processes. In addition, the sandboxing and URL analysis system has the capability of communicating with public threat intelligence services to gather known threat intelligence. R-Killer has its own mechanism, developed in its Proactive Monitoring System (PMS), which can monitor the processes created by downloaded email attachments and identify potential ransomware activities. R-Killer is capable of gathering threat intelligence without exposing the user's data to public threat intelligence services, hence protecting the confidentiality of user data. Keywords: ransomware, deep learning, recurrent neural networks, email, core detection engine
Procedia PDF Downloads 213
4880 Evaluation of Information Technology Governance Frameworks for Better Governance in South Africa
Authors: Memory Ranga, Phillip Pretorious
Abstract:
The South African Government has invested a lot of money in Information Technology Governance (ITG) within its government departments. The ITG framework was spearheaded by the Department of Public Service and Administration (DPSA). This led to the development of a governing DPSA ITG framework and later the Government Wide Enterprise Architecture (GWEA) Framework for assisting departments in implementing ITG. In addition, government departments have adopted the Information Systems Audit and Control Association (ISACA) Control Objectives for Information and Related Technology (COBIT) for ITG processes. Despite all these available frameworks, departments fail to fully capitalise on and improve their ITG processes, mainly because the frameworks are too generic and difficult to apply to specific governance needs. Little research has been done to evaluate the progress of ITG initiatives within government departments. This paper aims to evaluate the existing ITG frameworks within selected government departments in South Africa. A quantitative research approach was used in this study. Data was collected through an online questionnaire targeting ICT Managers and Directors from government departments. The study is undertaken as a case study, with only the Eastern Cape Province selected for the research. Document review, mainly of ITG frameworks and best practices, was also used. Data was analysed using Google Analytics tools and SPSS. A one-sample chi-squared test was used to verify the evaluation findings. Findings show evidence that the current guiding national governance framework (DPSA) is outdated and does not accommodate the changes introduced in other governance frameworks. The Eastern Cape government departments have spent a huge amount of money on ITG but are not yet able to identify the benefits of their ITG initiatives. The guiding framework is rigid and does not address some of the departmental needs, making it difficult to apply the DPSA framework flexibly. Furthermore, despite the large budget for ITG, the departments still face many challenges and are unable to improve some of their processes and services. All the engaged Eastern Cape departments have adopted the COBIT framework, but none has conducted a COBIT maturity assessment, which is a core functionality of COBIT. There is evidence of too many ITG frameworks and underutilisation of these frameworks. The study provides a comprehensive evaluation of the ITG frameworks that have been adopted by South African government departments in the Eastern Cape Province. The evaluation guides and recommends that government departments rethink and adopt ITG frameworks that can be customised to accommodate their needs. The adoption and application of ITG by government departments should assist in better governance and service delivery to citizens. Keywords: information technology governance, COBIT, evaluate, framework, governance, DPSA framework
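For clarity on the verification step mentioned above, the following is a minimal sketch of a one-sample chi-squared goodness-of-fit test in SciPy. The response counts and the survey question are invented for illustration; they are not the study's data.

```python
# Sketch of a one-sample chi-squared test used to verify survey findings.
# The observed response counts below are invented for illustration; the
# null hypothesis is that responses are evenly spread across categories.
from scipy.stats import chisquare

# Hypothetical counts for "Does the DPSA framework meet departmental needs?"
# categories: [strongly disagree, disagree, neutral, agree, strongly agree]
observed = [14, 11, 6, 4, 3]

stat, p_value = chisquare(observed)   # expected defaults to a uniform split
print(f"chi2 = {stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject H0: responses are not uniformly distributed.")
```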
Procedia PDF Downloads 123
4879 Continuous Production of Prebiotic Pectic Oligosaccharides from Sugar Beet Pulp in a Continuous Cross Flow Membrane Bioreactor
Authors: Neha Babbar, S. Van Roy, W. Dejonghe, S. Sforza, K. Elst
Abstract:
Pectic oligosaccharides (a class of prebiotics) are non-digestible carbohydrates which benefit the host by stimulating the growth of healthy gut microflora. Production of prebiotic pectic oligosaccharides (POS) from pectin-rich agricultural residues involves cutting the long-chain pectin polymer into pectin oligomers while avoiding the formation of monosaccharides. The objective of the present study is to develop a two-step continuous biocatalytic membrane reactor (MER) for the continuous production of POS (from sugar beet pulp) in which conversion is combined with separation. Optimization of the POS/monosaccharide ratio, stability and productivities of the process was done by testing various residence times (RT) in the reactor vessel with diluted (10 RT, 20 RT, and 30 RT) and undiluted (30 RT, 40 RT and 60 RT) substrate. The results show that the most stable processes (steady state) were 20 RT and 30 RT for diluted substrate and 40 RT and 60 RT for undiluted substrate. The highest volumetric and specific productivities, of 20 g/L/h and 11 g/gE/h and of 17 g/L/h and 9 g/gE/h, were obtained with 20 RT (diluted substrate) and 40 RT (undiluted substrate), respectively. Under these conditions, the permeate of the reactor test with 20 RT (diluted substrate) consisted of 80% POS fractions, while that of 40 RT (undiluted substrate) resulted in 70% POS fractions. A two-step continuous biocatalytic MER looks very promising for the continuous production of tailor-made POS. Although both processes, i.e., 20 RT (diluted substrate) and 40 RT (undiluted substrate), gave the best results, for industrial application it is preferable to use an undiluted substrate. Keywords: pectic oligosaccharides, membrane reactor, residence time, specific productivity, volumetric productivity
Procedia PDF Downloads 440
4878 The Latency-Amplitude Binomial of Waves Resulting from the Application of Evoked Potentials for the Diagnosis of Dyscalculia
Authors: Maria Isabel Garcia-Planas, Maria Victoria Garcia-Camba
Abstract:
Recent advances in cognitive neuroscience have allowed a step forward in understanding the processes involved in learning, from the point of view of the acquisition of new information or the modification of existing mental content. The evoked potentials technique reveals how basic brain processes interact to achieve adequate and flexible behaviours. The objective of this work, using evoked potentials, is to study whether it is possible to distinguish if a patient suffers from a specific type of learning disorder in order to decide on the possible therapies to follow. The methodology used is the analysis of the dynamics of different areas of the brain during a cognitive activity, to find the relationships between the different areas analyzed in order to better understand the functioning of neural networks. The latest advances in neuroscience have also revealed the existence of different brain activity in the learning process that can be highlighted through the use of non-invasive, innocuous, low-cost and easy-access techniques such as, among others, evoked potentials, which can help in the early detection of possible neurodevelopmental difficulties for their subsequent assessment and treatment. From the study of the amplitudes and latencies of the evoked potentials, it is possible to detect brain alterations in the learning process, specifically in dyscalculia, and to define specific corrective measures for the application of personalized psychopedagogical plans that allow an optimal integral development of the affected people. Keywords: dyscalculia, neurodevelopment, evoked potentials, learning disabilities, neural networks
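As a simple illustration of the latency-amplitude measurement underlying the study, the following sketch finds the peak amplitude and its latency within a search window on a synthetic averaged evoked potential. The waveform, window and sampling rate are illustrative assumptions.

```python
# Sketch of the basic latency-amplitude measurement on an averaged evoked
# potential: find the largest positive deflection inside a search window.
# The synthetic waveform and window are illustrative only.
import numpy as np

fs = 1000.0                                    # sampling rate [Hz]
t = np.arange(-0.1, 0.6, 1 / fs)               # epoch: -100 ms to 600 ms
erp = 4e-6 * np.exp(-((t - 0.3) / 0.05) ** 2)  # a fake late positive component [V]
erp += np.random.normal(0, 2e-7, t.size)       # residual noise after averaging

def peak_latency_amplitude(signal, times, window=(0.25, 0.5)):
    """Return (latency [s], amplitude [V]) of the maximum within the window."""
    mask = (times >= window[0]) & (times <= window[1])
    idx = np.argmax(signal[mask])
    return times[mask][idx], signal[mask][idx]

lat, amp = peak_latency_amplitude(erp, t)
print(f"latency = {lat*1000:.0f} ms, amplitude = {amp*1e6:.2f} uV")
```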
Procedia PDF Downloads 140
4877 Biographical Learning and Its Impact on the Democratization Processes of Post War Societies
Authors: Rudolf Egger
Abstract:
This article shows some results of an ongoing project in Kosova. The project deals with the meaning of social transformation processes in the life courses of people in Kosova. One goal is to create an oral history archive in the country. Over the last seven years, we have carried out interpretative work (using narrative interviews) concerning the experiences and meanings of social change from the perspective of the life course. We want to reconstruct the individual possibilities of creating one's life within new social structures. After the terrible massacres of ethnically-territorially defined nationalism in the former Yugoslavia, the main focus is to find out something about the many small daily steps that must be taken to build up a kind of "normality" in this country. These steps can be reconstructed very well through narrations, through life stories, because personal experiences are naturally linked with social orders. Each individual story is connected with further stories, in which collective history is negotiated and reflected. The view of biographical narration opens the possibility of analyzing the concreteness of the "individual case" within the complexity of collective history. Life stories thereby have a kind of transitional character, which is why they can be used for the reconstruction of periods of political transformation. For example, in the individual stories we can very clearly find the national or mythological character of the Albanian people in Kosova. The narrations presented can also be read as narrative lines relating to the (re-)interpretation of the past, in which lived life is fixed into history in the so-called collective memory of Kosova. Keywords: biographical learning, adult education, social change, post war societies
Procedia PDF Downloads 419
4876 Reverse Engineering of a Secondary Structure of a Helicopter: A Study Case
Authors: Jose Daniel Giraldo Arias, Camilo Rojas Gomez, David Villegas Delgado, Gullermo Idarraga Alarcon, Juan Meza Meza
Abstract:
Reverse engineering processes are widely used in industry with the main goal of determining the materials and the manufacturing processes used to produce a component. Many characterization techniques and computational tools are used in order to obtain this information. A case study of reverse engineering applied to a secondary sandwich-hybrid type structure used in a helicopter is presented. The methodology used consists of five main steps, which can be applied to any other similar component: collection of information about the service conditions of the part, disassembly and dimensional characterization, functional characterization, material properties characterization, and manufacturing process characterization, allowing all the supporting evidence to be obtained for the traceability of the materials and processes of aeronautical products that ensures their airworthiness. A detailed explanation of each step is given. The criticality and functionality of each part, state-of-the-art information, and information obtained from interviews with the technical groups of the helicopter's operators were analyzed; 3D optical scanning, standard and advanced materials characterization techniques, and finite element simulation made it possible to obtain all the characteristics of the materials used in the manufacture of the component. It was found that most of the materials are quite common in the aeronautical industry, including Kevlar, carbon, and glass fibers, aluminum honeycomb core, epoxy resin and epoxy adhesive. The stacking sequence and fiber volume fraction are critical for the mechanical behavior; an acid digestion method was used to determine them. This also helps in the determination of the manufacturing technique, which in this case was vacuum bagging. Samples of the material were manufactured and submitted to mechanical and environmental tests. These results were compared with those obtained during reverse engineering, which allows the conclusion that the materials and manufacturing process were correctly determined. Tooling for the manufacture was designed and built according to the geometry and manufacturing process requirements. The part was manufactured, and the required mechanical and environmental tests were also performed. Finally, geometric characterization and non-destructive techniques allowed the quality of the part to be verified. Keywords: reverse engineering, sandwich-structured composite parts, helicopter, mechanical properties, prototype
Procedia PDF Downloads 418
4875 Integration of Two Thermodynamic Cycles by Absorption for Simultaneous Production of Fresh Water and Cooling
Authors: Javier Delgado-Gonzaga, Wilfrido Rivera, David Juárez-Romero
Abstract:
Cooling and water purification are processes that have contributed to the economic and social development of the modern world. However, these processes require a significant amount of energy globally. Nowadays, absorption heat pumps have been studied with great interest since they are capable of producing cooling and/or purifying water from low-temperature energy sources such as industrial waste heat or renewable energy. In addition, absorption heat pumps require negligible amounts of electricity for their operation and generally use working fluids that do not represent a risk to the environment. The objective of this work is to evaluate a system that integrates an absorption heat transformer and an absorption cooling system to produce fresh water and cooling from a low-temperature heat source. Both cycles operate with the working pair LiBr-H2O. The integration is possible through the interaction of the LiBr-H2O solution streams between both cycles and also by recycling heat from the absorption heat transformer to the absorption cooling system. Mathematical models were developed to compare the performance of four different configurations. The results showed that the configuration in which the hottest streams of LiBr-H2O solution preheated the coldest streams in the economizers of both cycles was one that achieved the best performance. The interaction of the solution currents and the heat recycling analyzed in this work serves as a record of the possibilities of integration between absorption cycles for cogeneration.Keywords: absorption heat transformer, absorption cooling system, water desalination, integrated system
Procedia PDF Downloads 78
4874 Coding and Decoding versus Space Diversity for Rayleigh Fading Radio Frequency Channels
Authors: Ahmed Mahmoud Ahmed Abouelmagd
Abstract:
Diversity is the usual remedy for transmitted signal level variations (fading phenomena) in radio frequency channels. Diversity techniques utilize two or more copies of a signal and combine those signals to combat fading. The basic concept of diversity is to transmit the signal via several independent diversity branches to obtain independent signal replicas in the time, frequency, space, and polarization diversity domains. Coding and decoding processes can be an alternative remedy for fading phenomena; they cannot increase the channel capacity, but they can improve the error performance. In this paper, we propose the use of replication decoding with the BCH code class and the Viterbi decoding algorithm with convolutional coding as examples of coding and decoding processes. The results are compared to those obtained from two optimized selection space diversity techniques. The performance of the Rayleigh fading channel, the model considered for radio frequency channels, is evaluated for each case. The evaluation results show that the coding and decoding approaches, especially the BCH coding approach with the replication decoding scheme, give better performance compared to that of the selection space diversity optimization approaches. Also, an approach combining coding and decoding diversity with space diversity is considered; the main disadvantage of this approach is its complexity, but it yields good performance results. Keywords: Rayleigh fading, diversity, BCH codes, replication decoding, convolution coding, Viterbi decoding, space diversity
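To make the diversity side of the comparison concrete, the following is a Monte Carlo sketch of BPSK over flat Rayleigh fading with L-branch selection combining. The coded cases evaluated in the paper (BCH with replication decoding, convolutional coding with Viterbi decoding) are not reproduced here.

```python
# Monte Carlo sketch: BER of BPSK over flat Rayleigh fading with L-branch
# selection diversity (select the branch with the strongest channel gain).
import numpy as np

rng = np.random.default_rng(0)

def ber_selection(snr_db, L, n_bits=200_000):
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, n_bits)
    s = 2 * bits - 1                                    # BPSK symbols
    h = (rng.normal(size=(L, n_bits)) + 1j * rng.normal(size=(L, n_bits))) / np.sqrt(2)
    noise = (rng.normal(size=(L, n_bits)) + 1j * rng.normal(size=(L, n_bits))) / np.sqrt(2 * snr)
    r = h * s + noise
    best = np.argmax(np.abs(h), axis=0)                 # selection combining
    cols = np.arange(n_bits)
    y = r[best, cols] * np.conj(h[best, cols])          # coherent detection
    return np.mean((y.real > 0).astype(int) != bits)

for L in (1, 2, 4):
    print(f"L = {L}: BER @ 10 dB = {ber_selection(10, L):.4f}")
```

Increasing the number of branches L sharply reduces the error rate, which is the baseline against which the coding and decoding approaches are compared.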
Procedia PDF Downloads 442
4873 Educational Leadership and Artificial Intelligence
Authors: Sultan Ghaleb Aldaihani
Abstract:
- The environment in which educational leadership takes place is becoming increasingly complex due to factors like globalization and rapid technological change.
- This is creating a "leadership gap" where the complexity of the environment outpaces the ability of leaders to effectively respond.
- Educational leadership involves guiding teachers and the broader school system towards improved student learning and achievement.
2. Implications of Artificial Intelligence (AI) in Educational Leadership:
- AI has great potential to enhance education, such as through intelligent tutoring systems and automating routine tasks to free up teachers.
- AI can also have significant implications for educational leadership by providing better information and data-driven decision-making capabilities.
- Computer-adaptive testing can provide detailed, individualized data on student learning that leaders can use for instructional decisions and accountability.
3. Enhancing Decision-Making Processes:
- Statistical models and data mining techniques can help identify at-risk students earlier, allowing for targeted interventions.
- Probability-based models can diagnose students likely to drop out, enabling proactive support.
- These data-driven approaches can make resource allocation and decision-making more effective.
4. Improving Efficiency and Productivity:
- AI systems can automate tasks and change processes to improve the efficiency of educational leadership and administration.
- Integrating AI can free up leaders to focus more on their role's human, interactive elements.
Keywords: Education, Leadership, Technology, Artificial Intelligence
Procedia PDF Downloads 43
4872 Determination of Tide Height Using Global Navigation Satellite Systems (GNSS)
Authors: Faisal Alsaaq
Abstract:
Hydrographic surveys have traditionally relied on the availability of tide information for the reduction of sounding observations to a common datum. In most cases, tide information is obtained from tide gauge observations and/or tide predictions over space and time using local, regional or global tide models. While the latter often provide a rather crude approximation, the former rely on tide gauge stations that are spatially restricted and often have a sparse and limited distribution. A more recent method that is increasingly being used is Global Navigation Satellite System (GNSS) positioning, which can be utilised to monitor height variations of a vessel or buoy, thus providing information on sea level variations during the time of a hydrographic survey. However, GNSS heights obtained under the dynamic environment of a survey vessel are affected by "non-tidal" processes such as wave activity and the attitude of the vessel (roll, pitch, heave and dynamic draft). This research seeks to examine techniques that separate the tide signal from other non-tidal signals that may be contained in GNSS heights. This requires an investigation of the processes involved and their temporal, spectral and stochastic properties in order to apply suitable techniques for recovering tide information. In addition, different post-mission and near real-time GNSS positioning techniques will be investigated, with a focus on height estimation at sea. Furthermore, the study will investigate the possibility of transferring chart datums to the locations of tide gauges. Keywords: hydrography, GNSS, datum, tide gauge
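As an illustration of separating the slow tide signal from wave- and heave-induced motion in a GNSS height series, the following sketch applies a zero-phase low-pass filter to synthetic data. The sampling rate, wave period and cutoff are assumptions made for the sketch and do not represent the study's processing chain.

```python
# Sketch: separating the slow tide signal from wave-induced heave in a GNSS
# height time series with a zero-phase low-pass filter. The synthetic data,
# sampling rate and cutoff period are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1.0                                   # GNSS height solutions at 1 Hz
t = np.arange(0, 6 * 3600, 1 / fs)         # six hours of data [s]

tide = 0.8 * np.sin(2 * np.pi * t / (12.42 * 3600))   # semi-diurnal tide [m]
heave = 0.3 * np.sin(2 * np.pi * t / 8.0)              # 8 s wave-induced heave
noise = np.random.normal(0, 0.03, t.size)              # GNSS noise
height = tide + heave + noise

# A 10-minute cutoff removes wave and heave motion but keeps the tide
b, a = butter(4, (1 / 600.0) / (fs / 2), btype="low")
tide_estimate = filtfilt(b, a, height)

rmse = np.sqrt(np.mean((tide_estimate - tide) ** 2))
print(f"RMSE of recovered tide: {rmse*100:.1f} cm")
```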
Procedia PDF Downloads 264
4871 The Environmental Impacts of Textiles Reuse and Recycling: A Review on Life-Cycle-Assessment Publications
Authors: Samuele Abagnato, Lucia Rigamonti
Abstract:
Life-Cycle-Assessment (LCA) is an effective tool to quantify the environmental impacts of reuse models and recycling technologies for textiles. In this work, publications from the last ten years about LCA of textile waste are classified according to location, goal and scope, functional unit, waste composition, impact assessment method, impact categories, and sensitivity analysis. Twenty papers were selected: 50% focus only on recycling, 30% only on reuse, and 15% on both, while one paper considers only the final disposal of the waste. It is found that reuse is generally the best way to decrease the environmental impacts of textile waste management because of the avoided impacts of manufacturing a new item. In the comparison between a product made with recycled yarns and a product from virgin materials, the first option generally has lower impacts, especially for the categories of climate change, water depletion, and land occupation, while for other categories, such as eutrophication or ecotoxicity, the impacts of the recycled fibres can be higher under certain conditions. Cultivation seems to have quite high impacts when natural fibres are involved, especially in the land use and water depletion categories, while manufacturing requires a remarkable amount of electricity, with its associated impact on climate change. In the analysis of the reuse processes, the laundry phase is particularly important, with water consumption and impacts related to the use of detergents. Regarding the sensitivity analysis, one of the main variables that influences the LCA results, and that needs to be further investigated in LCA modeling on this topic, is the substitution rate between recycled and virgin fibres, that is, the amount of recycled material that can be used in place of virgin material. Related to this, the yield of the recycling processes also has a strong influence on the impact results. The substitution rate is also important in the modeling of the reuse processes because it represents the number of purchases of new items avoided thanks to the reused ones. Another aspect that appears to have a large influence on the impacts is consumer behaviour during the use phase (for example, the number of uses between two laundry cycles). In conclusion, to gain a deeper knowledge of the impacts of textile waste from a life-cycle perspective, further data and research are needed on the modeling of the substitution rate and of consumers' use-phase habits. Keywords: environmental impacts, life-cycle-assessment, textiles recycling, textiles reuse, textiles waste management
Procedia PDF Downloads 88
4870 Reducing Friction Associated with Commercial Use of Biomimetics While Increasing the Potential for Using Eco Materials and Design in Industry
Authors: John P. Ulhøi
Abstract:
Firms are faced with pressure to stay innovative and entrepreneurial while at the same time leaving lighter ecological footprints. Traditionally, inspiration for new product development (NPD) has come from creative in-house staff and from the marketplace. NPD resulting from this approach has often proven to be far from optimal for its purpose and far from resource- and energy-efficient. More recently, a bio-inspired NPD approach has surfaced under the banner of biomimetics. Biomimetics refers to inspiration from and translations of designs, systems, processes, and/or specific properties that exist in nature. The principles and structures working in nature have evolved over a long period of time, which enables them to be optimized for their purpose as well as resource- and energy-efficient. These characteristics reflect the raison d'être behind the field of biomimetics. While biological expertise is required to understand and explain such natural and biological principles and structures, engineers are needed to translate biological designs and processes into synthetic applications. It can therefore hardly be surprising that biomimetics has long gained a solid foothold in both biology and engineering. The commercial adoption of biomimetic applications in new product development (NPD) in industry, however, does not quite reflect a similar growth. Put differently, this situation suggests that something is missing in the biomimetics-NPD equation, acting as a brake on the wider commercial application of biomimetics and thus on the use of eco-materials and design in industry. This paper closes some of that gap. Before concluding, avenues for future research and implications for practice will be briefly sketched out. Keywords: biomimetics, eco-materials, NPD, commercialization
Procedia PDF Downloads 163
4869 Nanomechanical Devices Vibrating at Microwave Frequencies in Simple Liquids
Authors: Debadi Chakraborty, John E. Sader
Abstract:
Nanomechanical devices have emerged as a versatile platform for a host of applications due to their extreme sensitivity to environmental conditions. For example, mass measurements with sensitivity at the atomic level have recently been demonstrated. Ultrafast laser spectroscopy coherently excites the vibrational modes of metal nanoparticles and permits precise measurement of the vibration characteristics as a function of nanoparticle shape, size and surrounding environment. This study reports that the vibration of metal nanoparticles in simple liquids, like water and glycerol, is not described by conventional fluid mechanics, i.e., the Navier-Stokes equations. The intrinsic molecular relaxation processes in the surrounding liquid are found to have a profound effect on the fluid-structure interaction of mechanical devices at nanometre scales. Theoretical models have been developed based on non-Newtonian viscoelastic fluid-structure interaction theory to investigate the vibration of nanoparticles immersed in simple fluids. The utility of this theoretical framework is demonstrated by comparison to measurements on single nanowires and ensembles of metal rods. This study provides a rigorous foundation for the use of metal nanoparticles as ultrasensitive mechanical sensors in fluid and opens a new paradigm for understanding extremely high frequency fluid mechanics, nanoscale sensing technologies, and biophysical processes. Keywords: fluid-structure interaction, nanoparticle vibration, ultrafast laser spectroscopy, viscoelastic damping
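A single-relaxation-time Maxwell model is one simple way to picture the molecular relaxation effect invoked above: the effective viscosity becomes complex and frequency dependent, so the liquid no longer behaves as a simple Newtonian fluid at GHz nanoparticle frequencies. The sketch below uses an order-of-magnitude relaxation time and is not the authors' model.

```python
# Sketch: frequency-dependent complex viscosity of a single-relaxation-time
# Maxwell fluid, eta*(w) = eta0 / (1 + i*w*tau). At GHz nanoparticle
# frequencies w*tau is no longer << 1, so the simple Newtonian picture
# breaks down. The relaxation time below is an order-of-magnitude guess.
import numpy as np

eta0 = 1.0e-3        # Pa s, water at room temperature
tau = 1.0e-12        # s, assumed molecular relaxation time (~1 ps)

def maxwell_viscosity(freq_hz):
    w = 2 * np.pi * freq_hz
    return eta0 / (1 + 1j * w * tau)

for f in (1e6, 1e9, 20e9):                       # MHz to tens of GHz
    eta = maxwell_viscosity(f)
    print(f"f = {f:8.1e} Hz  |eta*|/eta0 = {abs(eta)/eta0:5.3f}  "
          f"elastic fraction = {abs(eta.imag)/abs(eta):4.2f}")
```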
Procedia PDF Downloads 274
4868 Preliminary Conceptions of 3D Prototyping Model to Experimental Investigation in Hypersonic Shock Tunnels
Authors: Thiago Victor Cordeiro Marcos, Joao Felipe de Araujo Martos, Ronaldo de Lima Cardoso, David Romanelli Pinto, Paulo Gilberto de Paula Toro, Israel da Silveira Rego, Antonio Carlos de Oliveira
Abstract:
Currently, the use of 3D rapid prototyping, also known as 3D printing, has been investigated by some universities around the world as an innovative, fast, flexible and cheap technique for directly manufacturing plastic models that are lighter and have more complex geometries for testing in hypersonic shock tunnels. Initially, the purpose is to integrate prototyped parts with metal models that are currently manufactured through conventional machining, and thereafter to replace them with completely prototyped models. The mechanical design of models to be tested in hypersonic shock tunnels is based on conventional manufacturing processes and is therefore limited to standard forms and geometries. The use of 3D rapid prototyping offers a range of options that enables innovation in the geometries and approaches used for the design of new models. The conception and design of a prototyped model for a hypersonic shock tunnel should be rethought and adapted in comparison with conventional manufacturing processes, in order to fully exploit the creativity and flexibility allowed by the 3D prototyping process. The objective of this paper is to compare the conception and design of a 3D rapid prototyping model and a conventionally machined model, showing the advantages and disadvantages of each process and the benefits that 3D prototyping can bring to the manufacture of models to be tested in hypersonic shock tunnels. Keywords: 3D printing, 3D prototyping, experimental research, hypersonic shock tunnel
Procedia PDF Downloads 469
4867 Public Participation Best Practices in Environmental Decision-making in Newfoundland and Labrador: Analyzing the Forestry Management Planning Process
Authors: Kimberley K. Whyte-Jones
Abstract:
Public participation may improve the quality of environmental management decisions. However, the quality of such a decision is strongly dependent on the quality of the process that leads to it. In order to ensure an effective and efficient process, key features of best practice in participation should be carefully observed; this would also combat disillusionment of citizens, decision-makers and practitioners. The overarching aim of this study is to determine what constitutes an effective public participation process relevant to the Newfoundland and Labrador, Canada context, and to discover whether the public participation process that led to the 2014-2024 Provincial Sustainable Forest Management Strategy (PSFMS) met best practices criteria. The research design uses an exploratory case study strategy to consider a specific participatory process in environmental decision-making in Newfoundland and Labrador. Data collection methods include formal semi-structured interviews and the review of secondary data sources. The results of this study will determine the validity of a specific public participation best practice framework. The findings will be useful for informing citizen participation processes in general and will deduce best practices in public participation in environmental management in the province. The study is, therefore, meaningful for guiding future policies and practices in the management of forest resources in the province of Newfoundland and Labrador, and will help in filling a noticeable gap in research compiling best practices for environmentally related public participation processes.Keywords: best practices, environmental decision-making, forest management, public participation
Procedia PDF Downloads 320
4866 Advanced Data Visualization Techniques for Effective Decision-making in Oil and Gas Exploration and Production
Authors: Deepak Singh, Rail Kuliev
Abstract:
This research article explores the significance of advanced data visualization techniques in enhancing decision-making processes within the oil and gas exploration and production domain. With the oil and gas industry facing numerous challenges, effective interpretation and analysis of vast and diverse datasets are crucial for optimizing exploration strategies, production operations, and risk assessment. The article highlights the importance of data visualization in managing big data, aiding the decision-making process, and facilitating communication with stakeholders. Various advanced data visualization techniques, including 3D visualization, augmented reality (AR), virtual reality (VR), interactive dashboards, and geospatial visualization, are discussed in detail, showcasing their applications and benefits in the oil and gas sector. The article presents case studies demonstrating the successful use of these techniques in optimizing well placement, real-time operations monitoring, and virtual reality training. Additionally, the article addresses the challenges of data integration and scalability, emphasizing the need for future developments in AI-driven visualization. In conclusion, this research emphasizes the immense potential of advanced data visualization in revolutionizing decision-making processes, fostering data-driven strategies, and promoting sustainable growth and improved operational efficiency within the oil and gas exploration and production industry.Keywords: augmented reality (AR), virtual reality (VR), interactive dashboards, real-time operations monitoring
Procedia PDF Downloads 86
4865 Performance Analysis of High Temperature Heat Pump Cycle for Industrial Process
Authors: Seon Tae Kim, Robert Hegner, Goksel Ozuylasi, Panagiotis Stathopoulos, Eberhard Nicke
Abstract:
High-temperature heat pumps (HTHP) that can supply heat at temperatures above 200°C can enhance the energy efficiency of industrial processes and reduce the CO₂ emissions connected with the heat supply of these processes. In the current work, the thermodynamic performance of 3 different vapor compression cycles, which use R-718 (water) as a working medium, has been evaluated using a commercial process simulation tool (EBSILON Professional). All considered cycles use two-stage vapor compression with intercooling between stages. The main aim of the study is to compare different intercooling strategies and to study possible heat recovery scenarios within the intercooling process. This comparison has been carried out by computing the coefficient of performance (COP), the heat supply temperature level, and the respective mass flow rate of water for all cycle architectures. With increasing temperature difference between the heat source and heat sink, ∆T, the COP values decreased as expected, and the highest COP value was found for the cycle configurations where both compressors have the same pressure ratio (PR). An investigation of the HTHP capacities with optimized PR, together with an exergy analysis, has also been carried out. The internal heat exchanger cycle with the inward direction of secondary flow (IHX-in) showed a higher temperature level and exergy efficiency compared to the other cycles. Moreover, the available operating range was estimated by considering mechanical limitations. Keywords: high temperature heat pump, industrial process, vapor compression cycle, R-718 (water), thermodynamic analysis
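As background to the equal pressure ratio result mentioned above, the following sketch shows the classical rule that, with ideal intercooling, splitting the overall pressure ratio equally between two stages minimises ideal compression work. It is a crude ideal-gas estimate with assumed steam properties, not the EBSILON R-718 cycle model.

```python
# Sketch: for two-stage compression with perfect intercooling back to the
# inlet temperature, an equal split of the overall pressure ratio
# (PR_stage = sqrt(PR_total)) minimises ideal compression work. This rough
# ideal-gas estimate only illustrates the trend for an R-718 (steam) cycle.
def ideal_two_stage_work(p_in, p_out, T_in, pr_first, kappa=1.33, R=461.5):
    """Specific isentropic work [J/kg] of two ideal-gas stages with
    intercooling back to T_in between them (rough steam approximation)."""
    cp = kappa * R / (kappa - 1)
    pr_second = (p_out / p_in) / pr_first
    work = 0.0
    for pr in (pr_first, pr_second):
        work += cp * T_in * (pr ** ((kappa - 1) / kappa) - 1)
    return work

p_in, p_out, T_in = 1.0e5, 16.0e5, 400.0   # Pa, Pa, K (assumed)
for pr1 in (2.0, 4.0, 8.0):                # 4.0 = sqrt(16) -> equal split
    w = ideal_two_stage_work(p_in, p_out, T_in, pr1)
    print(f"PR of first stage = {pr1:3.1f} -> specific work = {w/1e3:6.1f} kJ/kg")
```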
Procedia PDF Downloads 149
4864 Experimental and Theoretical Investigation of Slow Reversible Deformation of Concrete in Surface-Active Media
Authors: Nika Botchorishvili, Olgha Giorgishvili
Abstract:
Many years of investigation into the nature of damping creep in rigid bodies and materials have led to the discovery of the fundamental character of this phenomenon. It occurs only when a rigid body comes into contact with a surface-active medium (liquid or gaseous), which brings about a decrease in the free surface energy of the rigid body as a result of adsorption, chemosorption or wetting. The reversibility of the process consists in the gradual disappearance of creep deformation when the action of the surface-active medium stops. To clarify the essence of these processes, a physical model is constructed using Griffith's scheme and the well-known representations of deformation origination and failure processes. The total creep deformation is caused by the formation and opening of microcracks throughout the material volume under the action of load. This supposedly happens in macroscopically homogeneous silicate and organic glasses, while in polycrystals (tuff, gypsum, steel) in contact with a surface-active medium, microcracks are formed mainly on the grain boundaries. The creep of rubber is due to its swelling activated by stress. Acknowledgment: All experiments are financially supported by the Shota Rustaveli National Science Foundation of Georgia. Study of Properties of Concretes (Both Ordinary and Compacted) Made of Local Building Materials and Containing Admixtures, and Their Further Introduction in Construction Operations and Road Building. DP2016_26. 22.12.2016. Keywords: process reversibility, surface-active medium, Rebinder's effect, micro crack, creep
Procedia PDF Downloads 135
4863 General Architecture for Automation of Machine Learning Practices
Authors: U. Borasi, Amit Kr. Jain, Rakesh, Piyush Jain
Abstract:
Data collection, data preparation, model training, model evaluation, and deployment are all processes in a typical machine learning workflow. Training data needs to be gathered and organised. This often entails collecting a sizable dataset and cleaning it to remove or correct any inaccurate or missing information. Preparing the data for use in the machine learning model requires pre-processing it after it has been acquired. This often entails actions like scaling or normalising the data, handling outliers, selecting appropriate features, reducing dimensionality, etc. This pre-processed data is then used to train a model with some machine learning algorithm. After the model has been trained, it needs to be assessed by determining metrics like accuracy, precision, and recall, utilising a test dataset. Every time a new model is built, both data pre-processing and model training, two crucial processes in the machine learning (ML) workflow, must be carried out. Thus, there are various machine learning algorithms that can be employed for every single approach to data pre-processing, generating a large set of combinations to choose from. For example: for every method of handling missing values (dropping records, replacing with the mean, etc.), for every scaling technique, and for every combination of features selected, a different algorithm can be used. As a result, in order to get the optimum outcomes, these tasks are frequently repeated in different combinations. This paper suggests a simple architecture for organizing this large "combination set of pre-processing steps and algorithms" into an automated workflow, which simplifies the task of carrying out all possibilities. Keywords: machine learning, automation, AUTOML, architecture, operator pool, configuration, scheduler
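The following is a minimal sketch of the proposed idea: an operator pool of pre-processing steps and algorithms is expanded into a combination set, and each configuration is evaluated automatically. The specific dataset and estimators are illustrative choices, not part of the proposed architecture.

```python
# Sketch of the idea: enumerate an operator pool of pre-processing steps and
# learning algorithms into a combination set, then evaluate every
# configuration automatically. The estimators chosen here are illustrative.
from itertools import product

from sklearn.datasets import load_breast_cancer
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler, MinMaxScaler
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Operator pool (each entry is one configurable workflow step)
imputers = [("mean", SimpleImputer(strategy="mean")),
            ("median", SimpleImputer(strategy="median"))]
scalers = [("standard", StandardScaler()), ("minmax", MinMaxScaler())]
models = [("logreg", LogisticRegression(max_iter=2000)),
          ("forest", RandomForestClassifier(n_estimators=100, random_state=0))]

# Scheduler: run every configuration in the combination set
results = {}
for (i_name, imp), (s_name, sc), (m_name, mdl) in product(imputers, scalers, models):
    pipe = Pipeline([("impute", imp), ("scale", sc), ("model", mdl)])
    score = cross_val_score(pipe, X, y, cv=5).mean()
    results[(i_name, s_name, m_name)] = score

best = max(results, key=results.get)
print("best configuration:", best, "accuracy:", round(results[best], 3))
```

In a full implementation, the operator pool and scheduler would be configurable components rather than hard-coded lists, but the enumeration-and-evaluate pattern is the same.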
Procedia PDF Downloads 57
4862 Study of Oxidative Processes in Blood Serum in Patients with Arterial Hypertension
Authors: Laura M. Hovsepyan, Gayane S. Ghazaryan, Hasmik V. Zanginyan
Abstract:
Hypertension (HD) is the most common cardiovascular pathology that causes disability and mortality in the working population. Most often, heart failure (HF), which is based on myocardial remodeling, leads to death in hypertension. Recently, endothelial dysfunction (EDF), i.e., an impairment of the functional state of the vascular endothelium, has been assigned a significant role in the structural changes in the myocardium and the occurrence of heart failure in patients with hypertension. It has now been established that tissues affected by inflammation form increased amounts of superoxide radical and NO, which play a significant role in the development and pathogenesis of various pathologies. They mediate inflammation, modify proteins and damage nucleic acids. The aim of this work was to study the processes of oxidative modification of proteins (OMP) and the production of nitric oxide in hypertension. In the experimental work, the blood of 30 donors and 33 patients with hypertension was used. For the quantitative determination of OMP products, a method based on the reaction of oxidized amino acid residues of proteins with 2,4-dinitrophenylhydrazine (DNPH), with the formation of 2,4-dinitrophenylhydrazones, was used; the amount of hydrazones was determined spectrophotometrically. The optical density of the formed carbonyl derivatives of dinitrophenylhydrazones was recorded at different wavelengths: 356 nm - aliphatic ketone dinitrophenylhydrazones (KDNPH) of neutral character; 370 nm - aliphatic aldehyde dinitrophenylhydrazones (ADNPH) of neutral character; 430 nm - aliphatic KDNPH of basic character; 530 nm - basic aliphatic ADNPH. Nitric oxide was determined by photometry using Grace's solution. Absorbance was measured on a Thermo Scientific Evolution 201 SF at a wavelength of 546 nm. Thus, the results of the studies showed that in patients with arterial hypertension, an increased level of nitric oxide in the blood serum is observed, and there is also a tendency towards an increase in the intensity of oxidative modification of proteins at wavelengths of 270 nm and 363 nm, which indicates a statistically significant increase in aliphatic aldehyde and ketone dinitrophenylhydrazones. The increase in the intensity of oxidative modification of blood plasma proteins revealed in the studied patients reflects the general direction of free radical processes and, in particular, the oxidation of proteins throughout the body. A decrease in the activity of the antioxidant system also leads to a disturbance of protein metabolism. The most important consequence of the oxidative modification of proteins is the inactivation of enzymes. Keywords: hypertension (HD), oxidative modification of proteins (OMP), nitric oxide (NO), oxidative stress
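To illustrate the spectrophotometric quantification step, the following sketch applies the Beer-Lambert relation commonly used in DNPH carbonyl assays (molar absorption coefficient of about 22,000 M^-1 cm^-1) and normalises the result to protein content. The input values are illustrative, and the study's multi-wavelength classification of hydrazone types is not reproduced here.

```python
# Sketch of the Beer-Lambert step commonly used in DNPH carbonyl assays:
# carbonyl concentration from a blank-corrected absorbance with
# eps ~ 22,000 M^-1 cm^-1, normalised to protein content. Inputs are
# illustrative, not the study's measurements.
EPSILON = 22_000.0      # M^-1 cm^-1, molar absorption of DNPH hydrazones
PATH_CM = 1.0           # cuvette path length [cm]

def carbonyl_nmol_per_mg(absorbance, protein_mg_per_ml):
    """nmol carbonyl per mg protein from a blank-corrected absorbance."""
    conc_m = absorbance / (EPSILON * PATH_CM)        # mol/L
    conc_nmol_per_ml = conc_m * 1e9 / 1000.0         # nmol per mL of sample
    return conc_nmol_per_ml / protein_mg_per_ml

print(round(carbonyl_nmol_per_mg(absorbance=0.085, protein_mg_per_ml=4.0), 2), "nmol/mg")
```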
Procedia PDF Downloads 108