Search results for: Bruno de Almeida Vilela
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 189

99 Statistical Model of Water Quality in Estero El Macho, Machala-El Oro

Authors: Rafael Zhindon Almeida

Abstract:

Surface water quality is an important concern for the evaluation and prediction of water quality conditions. The objective of this study is to develop a statistical model that can accurately predict the water quality of the El Macho estuary in the city of Machala, El Oro province. The methodology is basic research, involving a thorough review of the theoretical foundations of statistical modeling for water quality analysis. The research design is correlational, using a multivariate statistical model involving multiple linear regression and principal component analysis. The results indicate that water quality parameters such as fecal coliforms, biochemical oxygen demand, chemical oxygen demand, iron, and dissolved oxygen exceed the allowable limits. The water of the El Macho estuary is therefore below the required water quality criteria. The multiple linear regression model, based on chemical oxygen demand and total dissolved solids, explains 99.9% of the variance of the dependent variable. In addition, principal component analysis shows that the model has an explanatory power of 86.242%. The study successfully developed a statistical model to evaluate the water quality of the El Macho estuary. The estuary did not meet the water quality criteria, with several parameters exceeding the allowable limits. The multiple linear regression model and principal component analysis provide valuable information on the relationships between the various water quality parameters. The findings of the study emphasize the need for immediate action to improve the water quality of the El Macho estuary to ensure the preservation and protection of this valuable natural resource.
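
The abstract reports a multiple linear regression on chemical oxygen demand and total dissolved solids (R² ≈ 0.999) and a principal component analysis with 86.242% explained variance. A minimal Python sketch of that kind of analysis follows; the file name, column names, and dependent variable are hypothetical placeholders, since the study's data are not reproduced here.

```python
# Illustrative sketch only: column names and the input file are assumed, not the study's data.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("el_macho_water_quality.csv")   # hypothetical monitoring table
predictors = ["COD", "TDS"]                      # chemical oxygen demand, total dissolved solids
target = "water_quality_index"                   # assumed dependent variable

# Multiple linear regression on the two predictors named in the abstract.
mlr = LinearRegression().fit(df[predictors], df[target])
print("R^2 of the MLR model:", mlr.score(df[predictors], df[target]))

# Principal component analysis on all standardized parameters.
params = StandardScaler().fit_transform(df.drop(columns=[target]))
pca = PCA().fit(params)
print("Cumulative explained variance:", pca.explained_variance_ratio_.cumsum())
```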

Keywords: statistical modeling, water quality, multiple linear regression, principal components, statistical models

Procedia PDF Downloads 78
98 Organic Matter Removal in Urban and Agroindustry Wastewater by Chemical Precipitation Process

Authors: Karina Santos Silvério, Fátima Carvalho, Maria Adelaide Almeida

Abstract:

The impacts caused by anthropogenic actions on the water environment have been one of the main challenges of modern society. Population growth, added to water scarcity and climate change, points to the need to increase the resilience of production systems and the efficiency of managing the wastewater generated in the different processes. In this context, the study developed under the NETA project (New Strategies in Wastewater Treatment) aimed to evaluate the efficiency of the Chemical Precipitation Process (CPP), using hydrated lime (Ca(OH)₂) as a reagent, on wastewater from the agroindustry sector, namely swine wastewater, slaughterhouse wastewater, and urban wastewater, in order to make the production chain 100% circular and create a direct positive impact on the environment. The purpose of CPP is to innovate in the field of effluent treatment technologies, as it allows rapid application and is economically profitable. In summary, the study was divided into four main stages: 1) application of the reagent in a single step, raising the pH to 12.5; 2) obtaining sludge and treated effluent; 3) natural neutralization of the effluent through carbonation using atmospheric CO₂; and 4) characterization and evaluation of the feasibility of the chemical precipitation technique in the treatment of the different wastewaters through the determination of chemical oxygen demand (COD) and other supporting physico-chemical parameters. The results showed an average removal efficiency above 80% for all effluents, with the swine effluent standing out at 90% removal, followed by the urban effluent at 88% and the slaughterhouse effluent at 81% on average. Significant improvement was also obtained with regard to color and odor removal after carbonation to pH 8.00.

Keywords: agroindustry wastewater, urban wastewater, natural carbonatation, chemical precipitation technique

Procedia PDF Downloads 66
97 Analysis of the Presence of Alkylglycerols by Gas Chromatography in Ostrich Oil

Authors: Luana N. Cardozo, Debora A. S. Coutinho, Fabiola Lagher, Bruno J. G. Silva, Ivonilce Venture, Mainara Tesser, Graciela Venera

Abstract:

Ostrich oil is used as food in Brazil, and it has been the subject of scientific research because it contains essential fatty acids (Omega 3, 6, 7, and 9), which provide benefits to human health. Alkylglycerols are lipid ethers consisting of a saturated or unsaturated hydrocarbon chain joined by an ether-type bond to one of the glycerol hydroxyls. It is known that supplementation with alkylglycerols can act significantly on the functioning of immune system cells, both in pathological situations and in homeostasis. Objective: Analyze the presence of alkylglycerols in ostrich oil. Methods: The ostrich oil was bought from a company that manufactures the product for sale as food, located in Mirante da Serra, northern Brazil. The samples were sent for analysis to the chemistry department of the Federal University of Paraná, where they were analyzed by gas chromatography. Results: The analysis of the ostrich oil showed alkylglycerols with a peak area of 514505154. In comparison, shark liver oil shows a peak area of 26190196, and the difference between the two is highly significant. Conclusion: The importance of alkylglycerol supplementation for the immune system is known. The analysis of the results made it possible to verify the presence of alkylglycerols in the ostrich oil, at a level five times higher than in shark liver oil, which had been considered the largest food source but has been surpassed by ostrich oil to date. The present study emphasizes that ostrich oil can be considered a food source of alkylglycerols and may play a promising role in the immune system because it contains this substance, but further studies are needed to prove its action in the body.

Keywords: ostrich oil, nutritional composition, alkylglycerols, food

Procedia PDF Downloads 123
96 Quality of Life and Renal Biomarkers in Feline Chronic Kidney Disease

Authors: Bárbara Durão, Pedro Almeida, David Ramilo, André Meneses, Rute Canejo-Teixeira

Abstract:

Quality of life (QoL) assessment is an integral part of patient care in veterinary medicine. This is especially true in cases of chronic diseases, such as chronic kidney disease (CKD), where ever more advanced treatment options prolong the patient's life. Whether this prolonged life comes with an acceptable quality of life has been called into question. The aim of this study was to evaluate the relationship between CKD biomarkers and QoL in cats. Thirty-seven cats diagnosed with CKD and with no known concurrent illness were enrolled in an observational study. Through the course of several evaluations, renal biomarkers were assessed in blood and urine samples, and owners retrospectively described their cat's quality of life using a validated instrument for this disease. Correlations between QoL scores (AWIS) and the biomarkers were assessed using Spearman's rank test. Statistical significance was set at p-value < 0.05, and every serial sample was considered independent. Thirty-seven cats met the inclusion criteria, and all owners completed the questionnaire every time their pet was evaluated, giving a total of eighty-four questionnaires; the average weighted impact score was –0.5. Results showed a statistically significant correlation between quality of life and most of the 17 studied biomarkers and confirmed that CKD has a negative impact on QoL in cats, especially due to the management of the disease and secondary appetite disorders. To our knowledge, this is the first attempt to assess the correlation between renal biomarkers and QoL in cats. Our results reveal the strong potential of this type of approach in clinical management, mainly in situations where it is not possible to measure biomarkers. Since health-related QoL is a reliable predictor of mortality and morbidity in humans, our findings can help improve clinical practice in cats with CKD.
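
The correlation step described above (Spearman's rank test between AWIS scores and renal biomarkers, with significance at p < 0.05) can be sketched in Python as below; the values are placeholders standing in for the 84 serial samples, not the study's data.

```python
# Hedged illustration: placeholder data, not the study's measurements.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
awis = rng.normal(-0.5, 1.0, size=84)           # hypothetical average weighted impact scores
creatinine = rng.lognormal(0.5, 0.3, size=84)   # one hypothetical renal biomarker (mg/dL)

rho, p_value = spearmanr(awis, creatinine)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")  # significance threshold: p < 0.05
```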

Keywords: chronic kidney disease, biomarkers, quality of life, feline

Procedia PDF Downloads 164
95 Local Differential Privacy-Based Data-Sharing Scheme for Smart Utilities

Authors: Veniamin Boiarkin, Bruno Bogaz Zarpelão, Muttukrishnan Rajarajan

Abstract:

The manufacturing sector is a vital component of most economies, which makes it the target of a large number of cyberattacks on organisations, where disruption of operations may lead to significant economic consequences. Adversaries aim to disrupt the production processes of manufacturing companies, gain financial advantages, and steal intellectual property by getting unauthorised access to sensitive data. Access to sensitive data helps organisations to enhance their production and management processes. However, the majority of the existing data-sharing mechanisms are either susceptible to different cyber attacks or heavy in terms of computation overhead. In this paper, a privacy-preserving data-sharing scheme for smart utilities is proposed. First, a customer's privacy adjustment mechanism is proposed to make sure that end-users have control over their privacy, as required by the latest government regulations, such as the General Data Protection Regulation. Secondly, a local differential privacy-based mechanism is proposed to ensure the privacy of the end-users by hiding real data based on the end-user preferences. The proposed scheme may be applied to different industrial control systems, while in this study it is validated for energy utility use cases consisting of smart, intelligent devices. The results show that the proposed scheme can guarantee the required level of privacy with an expected relative error in utility.
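
To make the local differential privacy idea concrete, below is a minimal Python sketch of one common LDP primitive, the Laplace mechanism applied to a bounded meter reading. The mechanism, range, and epsilon values are illustrative assumptions; the paper's actual perturbation mechanism and parameters are not reproduced here.

```python
# Minimal sketch of a local differential privacy step (Laplace mechanism on a bounded value).
# The privacy budget epsilon would be derived from the end-user's privacy preference.
import numpy as np

def ldp_perturb(reading, epsilon, lower=0.0, upper=10.0, rng=np.random.default_rng()):
    """Clip a smart-meter reading to [lower, upper] and add Laplace noise calibrated
    to the range, which is the sensitivity of a single bounded value."""
    clipped = min(max(reading, lower), upper)
    sensitivity = upper - lower
    return clipped + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Smaller epsilon = stronger privacy but larger expected relative error in utility.
for eps in (0.5, 1.0, 5.0):
    noisy = np.array([ldp_perturb(3.2, eps) for _ in range(10_000)])
    print(f"epsilon={eps}: mean absolute error = {np.mean(np.abs(noisy - 3.2)):.2f}")
```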

Keywords: data-sharing, local differential privacy, manufacturing, privacy-preserving mechanism, smart utility

Procedia PDF Downloads 62
94 Evaluation of the Mechanical Behavior of a Retaining Wall Structure on a Weathered Soil through Probabilistic Methods

Authors: P. V. S. Mascarenhas, B. C. P. Albuquerque, D. J. F. Campos, L. L. Almeida, V. R. Domingues, L. C. S. M. Ozelim

Abstract:

Slope retaining structures are increasingly considered in geotechnical engineering projects due to extensive urban growth. These kinds of engineering constructions may present instabilities over time and may require reinforcement or even rebuilding of the structure. In this context, statistical analysis is an important tool for decision making regarding retaining structures. This study approaches the failure probability of the construction of a retaining wall over the debris of an old, collapsed one. The new solution will be approximately 350 m long and will be located on the margins of Lake Paranoá, in Brasília, the capital of Brazil. The building process must also account for the utilization of the ruins as a caisson. A series of in situ and laboratory experiments defined the local soil strength parameters. A Standard Penetration Test (SPT) defined the in situ soil stratigraphy. The parameters obtained were also verified against soil data, similar to the local soil, from a collection of master's and doctoral works from the University of Brasília. Initial studies show that the concrete wall is the proper solution for this case, taking into account the technical, economic, and deterministic analyses. On the other hand, in order to better analyze the statistical significance of the factors of safety obtained, a Monte Carlo analysis was performed for the concrete wall and two other initial solutions. A comparison between the statistical and risk results generated for the different solutions indicated that a gabion solution would better fit the financial and technical feasibility of the project.
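
As an illustration of the Monte Carlo step mentioned above, the sketch below samples soil strength parameters from assumed distributions and estimates a probability of failure from a simplified factor-of-safety expression. The distributions, loads, and FoS function are placeholders, not the study's actual model.

```python
# Illustrative Monte Carlo estimate of P(FoS < 1); all numbers are assumed placeholders.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

cohesion = rng.lognormal(mean=np.log(20.0), sigma=0.25, size=n)  # hypothetical cohesion, kPa
phi = rng.normal(loc=28.0, scale=2.0, size=n)                    # hypothetical friction angle, deg

def factor_of_safety(c, phi_deg):
    """Placeholder resisting/driving ratio for a gravity retaining wall."""
    driving = 300.0                                              # assumed driving force, kN/m
    resisting = 4.0 * c + 50.0 * np.tan(np.radians(phi_deg)) * 10.0
    return resisting / driving

fos = factor_of_safety(cohesion, phi)
print("Mean factor of safety:", round(fos.mean(), 2))
print("Probability of failure P(FoS < 1):", (fos < 1.0).mean())
```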

Keywords: economical analysis, probability of failure, retaining walls, statistical analysis

Procedia PDF Downloads 399
93 Haematological Responses on Amateur Cycling Stages Race

Authors: Renato André S. Silva, Nana L. F. Sampaio, Carlos J. G. Cruz, Bruno Vianna, Flávio O. Pires

Abstract:

Multiple stage bicycle races require high physiological loads from professional cyclists. Such demands can lead to immunosuppression and health problems. However, in this type of competition, little is known about the physiological effects on amateur athletes, who generally receive less medical support. Thus, this study analyzes the hematological effects of a multiple stage bicycle race on amateur cyclists. Seven Brazilian national amateur cyclists (34 ± 4.21 years) underwent a laboratory test to evaluate VO₂max (69.89 ± 7.43 ml·kg⁻¹·min⁻¹). Six days later, these volunteers raced in the Tour of Goiás, participating in five races over four days (435 km) of competition. Arterial blood samples were collected one day before and one day after the competition. The Kolmogorov-Smirnov test was used to evaluate the data distribution and the Wilcoxon test to compare the two moments of data collection (p < 0.05). The results show: red cells ↓ 7.8% (5.1 ± 0.28 vs 4.7 ± 0.37 10⁶/mm³, p = 0.01); hemoglobin ↓ 7.9% (15.1 ± 0.31 vs 13.9 ± 0.27 g/dL, p = 0.01); leukocytes ↑ 9.5% (4946 ± 553 vs 5416 ± 1075 /mm³, p = 0.17); platelets ↓ 7.0% (200.2 ± 51.5 vs 186.1 ± 39.5 /mm³, p = 0.01); LDH ↑ 11% (164.4 ± 28.5 vs 182.5 ± 20.5 U/L, p = 0.17); CK ↑ 13.5% (290.7 ± 206.1 vs 330.1 ± 90.5 U/L, p = 0.39); CK-MB ↑ 2% (15.7 ± 3.9 vs 20.1 ± 2.9 U/L, p = 0.06); cortisol ↓ 13.5% (12.1 ± 2.4 vs 9.9 ± 1.9 μg/dL, p = 0.01); total testosterone ↓ 7% (453.6 ± 120.1 vs 421.7 ± 74.3 ng/dL, p = 0.12); IGF-1 ↓ 15.1% (213.8 ± 18.8 vs 181.5 ± 34.7 ng/mL, p = 0.04). This means that there were significant reductions in O₂ allocation/transport capacity and in the vascular injury response, and a reduction of skeletal muscle anabolism, along with maintenance and/or slight elevation of immune function, glucose and lipid energy supply, and myocardial damage markers. Therefore, the results suggest that no abnormal health effect was identified among the athletes after participating in the Tour of Goiás.
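
The pre/post comparison described above (Wilcoxon test on paired samples from the seven cyclists, p < 0.05) can be sketched as below; the hemoglobin values are hypothetical placeholders, not the study's data.

```python
# Hedged illustration of the paired pre/post comparison; placeholder values only.
import numpy as np
from scipy.stats import wilcoxon

hemoglobin_pre = np.array([15.0, 15.3, 14.8, 15.2, 15.5, 14.9, 15.1])   # g/dL, hypothetical
hemoglobin_post = np.array([13.8, 14.1, 13.6, 14.0, 14.2, 13.7, 13.9])  # g/dL, hypothetical

stat, p_value = wilcoxon(hemoglobin_pre, hemoglobin_post)
print(f"Wilcoxon statistic = {stat}, p = {p_value:.3f}")  # significant if p < 0.05
```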

Keywords: cycling, health effects, cycling stages races, haematology

Procedia PDF Downloads 191
92 Experimental Set-up for the Thermo-Hydric Study of a Wood Chips Bed Crossed by an Air Flow

Authors: Dimitri Bigot, Bruno Malet-Damour, Jérôme Vigneron

Abstract:

Many studies have addressed the use of bio-based materials in buildings. The goal is to reduce a building's environmental footprint over its life cycle, which can help minimize carbon emissions or energy consumption. A previous work proposed to numerically study the feasibility of using wood chips to regulate relative humidity inside a building. It showed the capability of a wood chips bed to regulate humidity inside the building and to improve thermal comfort, and so potentially reduce building energy consumption. However, it also showed that some physical parameters of the wood chips must be identified to validate the proposed model and the associated results. This paper presents an experimental set-up able to study such a wood chips bed under different solicitations. It consists of a simple duct filled with wood chips and crossed by an air flow with variable temperature and relative humidity. Its main objective is to study the thermal behavior of the wood chips bed by controlling the temperature and relative humidity of the air entering it and by observing the same parameters at the output. First, the experimental set-up is described according to previous results. A focus is made on the particular properties that have to be characterized. Then some case studies are presented in relation to the previous results in order to identify the key physical properties. Finally, the feasibility of the proposed technology is discussed, and some model validation paths are given.

Keywords: wood chips bed, experimental set-up, bio-based material, desiccant, relative humidity, water content, thermal behaviour, air treatment

Procedia PDF Downloads 109
91 Study and Simulation of a Dynamic System Using Digital Twin

Authors: J.P. Henriques, E. R. Neto, G. Almeida, G. Ribeiro, J.V. Coutinho, A.B. Lugli

Abstract:

Industry 4.0, or the Fourth Industrial Revolution, is transforming the relationship between people and machines. In this scenario, technologies such as Cloud Computing, the Internet of Things, Augmented Reality, Artificial Intelligence, and Additive Manufacturing, among others, are making industries and devices increasingly intelligent. One of the most powerful technologies of this new revolution is the Digital Twin, which allows the virtualization of a real system or process. In this context, the present paper addresses the linear and nonlinear dynamic study of a didactic level plant using a Digital Twin. In the first part of the work, the level plant is identified at a fixed operating point by means of the least squares method. The linearized model is embedded in a Digital Twin using Automation Studio® from Famic Technologies. Then, in order to validate the usage of the Digital Twin in the linearized study of the plant, the dynamic response of the real system is compared to that of the Digital Twin. Furthermore, in order to develop the nonlinear model on a Digital Twin, the didactic level plant is identified using the Hammerstein method. Different steps are applied to the plant, and from the Hammerstein algorithm, the nonlinear model is obtained for all operating ranges of the plant. As for the linear approach, the nonlinear model is embedded in the Digital Twin, and the dynamic response is compared to the real system at different operating points. Finally, from the practical results obtained, one can conclude that using a Digital Twin to study dynamic systems is extremely useful in the industrial environment, taking into account that it is possible to develop and tune controllers by using the virtual model of the real system.
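
A minimal Python sketch of the least-squares identification step described above is given below; the first-order model structure, synthetic data, and parameter values are assumptions for illustration, not the plant's actual identification.

```python
# Hedged sketch: identify a first-order discrete model y[k+1] = a*y[k] + b*u[k] by least squares.
import numpy as np

def identify_first_order(u, y):
    """Return (a, b) minimizing the squared one-step-ahead prediction error."""
    phi = np.column_stack([y[:-1], u[:-1]])            # regressor matrix
    theta, *_ = np.linalg.lstsq(phi, y[1:], rcond=None)
    return theta                                       # [a, b]

# Synthetic step-response data standing in for the didactic level plant.
rng = np.random.default_rng(1)
u = np.ones(200)                                       # step input
y = np.zeros(200)
for k in range(199):                                   # "true" plant used only to generate data
    y[k + 1] = 0.95 * y[k] + 0.05 * u[k] + rng.normal(0.0, 0.002)

a_hat, b_hat = identify_first_order(u, y)
print(f"a = {a_hat:.3f}, b = {b_hat:.3f}")             # should recover roughly 0.95 and 0.05
```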

Keywords: industry 4.0, digital twin, system identification, linear and nonlinear models

Procedia PDF Downloads 130
90 Nutriscience Project: A Web-Based Intervention to Improve Nutritional Literacy among Families and Educators of Pre-School Children

Authors: R. Barros, J. Azevedo, P. Padrão, M. Gregório, I. Pádua, C. Almeida, C. Rodrigues, P. Fontes, A. Coelho

Abstract:

Recent evidence shows a positive association between nutritional literacy and healthy eating. Traditional nutrition education strategies for childhood obesity prevention have shown weak effects. The Nutriscience project aims to create and evaluate an innovative and multidisciplinary strategy for promoting effective and accessible nutritional information to children, their families, and educators. Nutriscience is a one-year prospective follow-up evaluation study including pre-school children (3-5 y) attending the national schools' network (29). The project is structured around a web-based intervention, using an on-line interactive platform, and focuses on increasing fruit and vegetable consumption and reducing sugar and salt intake. The platform acts as a social network where educational materials, games, and nutritional challenges are proposed in a gamification approach that promotes family and community social ties. A nutrition Massive Open Online Course is developed for educators, and a national healthy culinary contest will be promoted on a TV channel. A parental self-reported questionnaire assessing sociodemographic characteristics and nutritional literacy (knowledge, attitudes, skills) is administered at baseline and at the end of the intervention. We expect that the results on nutritional literacy from the presented intervention strategy will give us important information about best practices for health interventions with kindergarten families. This intervention program using a digital interactive platform could be an educational tool easily adapted and disseminated for childhood obesity prevention.

Keywords: childhood obesity, educational tool, nutritional literacy, web-based intervention

Procedia PDF Downloads 326
89 Normalized Enterprises Architectures: Portugal's Public Procurement System Application

Authors: Tiago Sampaio, André Vasconcelos, Bruno Fragoso

Abstract:

The Normalized Systems Theory, which is designed to be applied to software architectures, provides a set of theorems, elements, and rules with the purpose of enabling evolution in Information Systems, as well as ensuring that they are ready for change. In order to make that possible, this work's solution is to apply the Normalized Systems Theory to the domain of enterprise architectures, using ArchiMate. This application is achieved through the adaptation of the elements of the theory, making them artifacts of the modeling language. The theorems are applied through the identification of the viewpoints to be used in the architectures, as well as the transformation of the theory's encapsulation rules into architectural rules. This way, it is possible to create normalized enterprise architectures, thus fulfilling the needs and requirements of the business. This solution was demonstrated using the Portuguese Public Procurement System. The Portuguese government aims to make this system as fair as possible, allowing every organization to have the same business opportunities. The aim is for every economic operator to have access to all public tenders, which are published in any of the 6 existing platforms, independently of where they are registered. In order to make this possible, we applied our solution to the construction of two different architectures, which are able to fulfill the requirements of the Portuguese government. One of those architectures, TO-BE A, has a Message Broker that performs the communication between the platforms. The other, TO-BE B, represents the scenario in which the platforms communicate with each other directly. Apart from these two architectures, we also represent the AS-IS architecture, which demonstrates the current behavior of the Public Procurement System. Our evaluation is based on a comparison between the AS-IS and the TO-BE architectures regarding the fulfillment of the rules and theorems of the Normalized Systems Theory and some quality metrics.

Keywords: archimate, architecture, broker, enterprise, evolvable systems, interoperability, normalized architectures, normalized systems, normalized systems theory, platforms

Procedia PDF Downloads 341
88 Tailoring of ECSS Standard for Space Qualification Test of CubeSat Nano-Satellite

Authors: B. Tiseo, V. Quaranta, G. Bruno, G. Sisinni

Abstract:

There is an increasing demand for nano-satellite development among universities, small companies, and emerging countries. Low cost and fast delivery are the main advantages of this class of satellites, achieved through the extensive use of commercial-off-the-shelf components. On the other hand, the loss of reliability and the poor success rate are limiting the use of nano-satellites to educational and technology-demonstration missions rather than commercial purposes. Standardization of nano-satellite environmental testing, by tailoring the existing test standards for medium/large satellites, is therefore a crucial step for their market growth. Thus, it is fundamental to find the right trade-off between improving reliability and the need to keep their low-cost/fast-delivery advantages. This is even more essential for satellites of the CubeSat family. Such miniaturized and standardized satellites have a 10 cm cubic form and a mass of no more than 1.33 kilograms per unit (1U). For this class of nano-satellites, the qualification process is mandatory to reduce the risk of failure during a space mission. This paper reports the description and results of the space qualification test campaign performed on Endurosat's CubeSat nano-satellite and modules. Mechanical and environmental tests have been carried out step by step: from the testing of the single subsystems up to the assembled CubeSat nano-satellite. Functional tests have been performed throughout the test campaign to verify the functionalities of the systems. The test durations and levels have been selected by tailoring the European Space Agency standard ECSS-E-ST-10-03C and GEVS: GSFC-STD-7000A.

Keywords: CubeSat, nano-satellite, shock, testing, vibration

Procedia PDF Downloads 167
87 Co-payment Strategies for Chronic Medications: A Qualitative and Comparative Analysis at European Level

Authors: Pedro M. Abreu, Bruno R. Mendes

Abstract:

The management of pharmacotherapy and the process of dispensing medicines are becoming critical in clinical pharmacy due to the increase in the incidence and prevalence of chronic diseases, the complexity and customization of therapeutic regimens, the introduction of innovative and more expensive medicines, the unbalanced relation between expenditure and revenue, as well as the lack of rationalization associated with medication use. For these reasons, co-payments emerged in Europe in the 1970s and have been applied over the past few years in healthcare. Co-payments lead to a rationing and rationalization of users' access to healthcare services and products and, simultaneously, to a qualification and improvement of the services and products for the end-user. This analysis, of hospital practices in particular and co-payment strategies in general, was carried out across all European regions and identified four reference countries that apply this tool repeatedly and with different approaches. The structure, content, and adaptation of European co-payments were analyzed through 7 qualitative attributes and 19 performance indicators, and the results were expressed in a scorecard, allowing the conclusion that the German models (total scores of 68.2% and 63.6% for the two elected co-payments) achieve greater compliance and effectiveness, the English models (total score of 50%) can be more accessible, and the French models (total score of 50%) can be more adequate to the socio-economic and legal framework. Other European models did not show the same quality and/or performance, so they were not taken as a standard for the future design of co-payment strategies. In this sense, co-payments can be seen as a strategy not only to moderate the consumption of healthcare products and services, but especially to improve them, as well as a strategy to increase the value that the end-user assigns to these services and products, such as medicines.

Keywords: clinical pharmacy, co-payments, healthcare, medicines

Procedia PDF Downloads 238
86 The High Potential and the Little Use of Brazilian Class Actions for Prevention and Penalization Due to Workplace Accidents in Brazil

Authors: Sandra Regina Cavalcante, Rodolfo A. G. Vilela

Abstract:

Introduction: Work accidents and occupational diseases are a major public health problem around the world and the main health problem of workers, with high social and economic costs. Brazil has shown progress over the last years, with the development of a regulatory system to improve safety and quality of life in the workplace. However, the situation is far from acceptable, because occurrences remain high and there is a great gap between legislation and reality, generated by the low level of voluntary compliance with the law. Brazilian law provides procedural legal instruments both to compensate the damage caused to the worker's health and to prevent future injuries. In the Judiciary, the prevention idea is embodied in collective action, effected through Brazilian class actions. Inhibitory guardianships may both impose improvements to the working environment and determine the interruption of activity or a ban on the machine that puts workers at risk. Both the Labor Prosecution and trade unions have standing to promote this type of action, which can provide payment of compensation for collective moral damage. Objectives: To verify how class actions (known as 'public civil actions'), regulated in the Brazilian legal system to protect diffuse, collective and homogeneous rights, are being used to protect workers' health and safety. Methods: The author identified and evaluated decisions of the Brazilian Superior Labor Court involving collective actions and work accidents. The timeframe chosen was December 2015. The online jurisprudence database was consulted on the page available for public consultation on the court website. The categorization of the data considered the result (court application rejected or accepted), the request type, the amount of compensation and the author of the lawsuit, besides examining the reasoning used by the judges. Results: The High Court issued 21,948 decisions in December 2015, with 1,448 judgments (6.6%) about work accidents and only 20 (0.09%) on collective action. After analyzing these 20 decisions, it was found that the judgments granted compensation for collective moral damage (85%) and/or an obligation to act, that is, changes to improve prevention and safety (71%). The lawsuits were filed mainly by the Labor Prosecution (83%), with some filed by unions (17%). The compensation for collective moral damage averaged 250,000 reais (about US$65,000), but it should be noted that there is a great range of values found, and several kinds of situations are repaired by this compensation. This is the last-instance resource for this kind of lawsuit, and all decisions were well founded and partially granted the requests made for working environment protection. Conclusions: When triggered, the labor court system provides the requested collective protection in class actions. The values of convictions arbitrated in collective actions are significant and indicate social and economic repercussions, stimulating employers to improve the working environment conditions of their companies. It is necessary to intensify the use of collective actions, because they are more efficient for prevention than reparatory individual lawsuits, yet they have been underutilized, mainly by unions.

Keywords: Brazilian Class Action, collective action, work accident penalization, workplace accident prevention, workplace protection law

Procedia PDF Downloads 262
85 Influences of Separation of the Boundary Layer in the Reservoir Pressure in the Shock Tube

Authors: Bruno Coelho Lima, Joao F.A. Martos, Paulo G. P. Toro, Israel S. Rego

Abstract:

The shock tube is a ground facility widely used in aerospace and aeronautics science and technology for studies of gas-dynamic and chemical-physical processes in gases at high temperature, explosions, and the dynamic calibration of pressure sensors. A shock tube in its simplest form comprises two tubes of equal cross-section separated by a diaphragm. The diaphragm's function is to separate the two reservoirs at different pressures. The reservoir containing high pressure is called the Driver; the low-pressure reservoir is called the Driven. When the diaphragm is broken by the pressure difference, a normal, non-stationary shock wave (the incident shock wave) forms at the diaphragm location and travels toward the closed end of the Driven section. When this shock wave reaches the closed end of the Driven section, it is completely reflected. The reflected shock wave then interacts with the boundary layer created by the flow induced by the passage of the incident shock wave. The interaction between the boundary layer and the shock wave forces the separation of the boundary layer. The aim of this paper is to analyze the influence of boundary layer separation on the reservoir pressure in the shock tube. A comparison among CFD (Computational Fluid Dynamics), experimental tests, and analytical analysis was performed. For the analytical analysis, routines in Python were created; for the numerical simulations (CFD), Ansys Fluent was used; and for the experimental tests, the T1 shock tube located at IEAv (Institute of Advanced Studies) was used.
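
In the spirit of the Python routines mentioned for the analytical analysis, the sketch below evaluates the standard normal-shock (Rankine-Hugoniot) ratios for a calorically perfect gas; the incident shock Mach number and gamma are arbitrary example values, and the paper's actual routines are not reproduced here.

```python
# Standard normal-shock relations for a calorically perfect gas (illustrative values only).
def normal_shock_ratios(Ms, gamma=1.4):
    """Pressure, density and temperature ratios across a normal shock of Mach number Ms."""
    p_ratio = 1.0 + 2.0 * gamma / (gamma + 1.0) * (Ms**2 - 1.0)
    rho_ratio = (gamma + 1.0) * Ms**2 / ((gamma - 1.0) * Ms**2 + 2.0)
    T_ratio = p_ratio / rho_ratio
    return p_ratio, rho_ratio, T_ratio

p21, rho21, T21 = normal_shock_ratios(Ms=2.5)
print(f"p2/p1 = {p21:.2f}, rho2/rho1 = {rho21:.2f}, T2/T1 = {T21:.2f}")
```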

Keywords: boundary layer separation, moving shock wave, shock tube, transient simulation

Procedia PDF Downloads 302
84 Enhancing Solar Fuel Production by CO₂ Photoreduction Using Transition Metal Oxide Catalysts in Reactors Prepared by Additive Manufacturing

Authors: Renata De Toledo Cintra, Bruno Ramos, Douglas Gouvêa

Abstract:

There is huge global concern about the emission of greenhouse gases, the resulting environmental problems, and the increase in the planet's average temperature, caused mainly by fossil fuels, of which petroleum derivatives represent a large part. One of the main greenhouse gases, in terms of volume, is CO₂. Recovering a part of this gas through chemical reactions that use sunlight as an energy source, and even producing renewable fuels (such as ethane, methane, and ethanol, among others), is a great opportunity. The process of artificial photosynthesis, through the conversion of CO₂ and H₂O into organic products and oxygen using a metal oxide catalyst and the incidence of sunlight, is one of the promising solutions. Therefore, this research is of great relevance. For this reaction to take place efficiently, an optimized reactor was developed through simulation and prior analysis so that the geometry of the internal channel provides an efficient route and allows the reaction to happen in a controlled and optimized way, in continuous flow, while offering the least possible resistance. The prototype of this reactor can be made of different materials, such as polymers, ceramics, and metals, and produced through different processes, such as additive manufacturing (3D printing) and CNC machining, among others. To carry out the photocatalysis in the reactors, different types of catalysts will be used, such as ZnO deposited by spray pyrolysis on the lighting window, possibly modified ZnO, TiO₂, and modified TiO₂, among others, aiming to increase the production of organic molecules with the lowest possible energy.

Keywords: artificial photosynthesis, CO₂ reduction, photocatalysis, photoreactor design, 3D printed reactors, solar fuels

Procedia PDF Downloads 65
83 In-Silico Fusion of Bacillus Licheniformis Chitin Deacetylase with Chitin Binding Domains from Chitinases

Authors: Keyur Raval, Steffen Krohn, Bruno Moerschbacher

Abstract:

Chitin, the biopolymer of N-acetylglucosamine, is the most abundant biopolymer on the planet after cellulose. Industrially, chitin is isolated and purified from the shell residues of shrimps. A deacetylated derivative of chitin, i.e., chitosan, has more market value and applications owing to its solubility and overall cationic charge compared to the parent polymer. On an industrial scale, this deacetylation is performed chemically using alkalis like sodium hydroxide. This reaction is hazardous to the environment owing to its negative impact on the marine ecosystem. A greener option is the enzymatic process. In nature, native chitin is converted to chitosan by chitin deacetylase (CDA). This enzymatic conversion on the industrial scale is, however, hampered by the crystallinity of chitin. Thus, the enzymatic action requires the substrate, i.e., chitin, to be soluble, which is technically difficult and energy-consuming. In this project, we wanted to address this shortcoming of CDA. To this end, we have modeled a fusion protein combining CDA and an auxiliary protein, the main interest being to increase the accessibility of the enzyme towards crystalline chitin. Similar fusion work with chitinases had improved their catalytic ability towards insoluble chitin. In the first step, suitable fusion partners were searched in the Protein Data Bank (PDB), where suitable domain architectures were sought. The next step was to create models of the fused product using various in silico techniques. The models were created with MODELLER and evaluated for properties such as the energy and the impairment of the binding sites. A fusion PCR has been designed based on the linker sequences generated by MODELLER and will be tested for its activity towards insoluble chitin.

Keywords: chitin deacetylase, modeling, chitin binding domain, chitinases

Procedia PDF Downloads 234
82 Evaluation of Surface Roughness Condition Using App Roadroid

Authors: Diego de Almeida Pereira

Abstract:

The roughness index of a road is considered the most important parameter for pavement quality, as it is closely related to the comfort and safety of road users. This condition can be established by means of a functional evaluation of pavement surface deviations, measured by the International Roughness Index (IRI), an index that came out of the international evaluation of pavements coordinated by the World Bank and that currently has, as a limit value for road acceptance purposes in Brazil, 2.7 m/km. This work makes use of the e.IRI parameter, obtained with the Roadroid app for smartphones running the Android operating system. This application was chosen for its practicality of user interaction, its own cloud data storage, and the support given to universities around the world. Data were collected for six months, once each month. The studies began in March 2018, the rainy season that worsens road conditions, which also provided the opportunity to follow the damage and the quality of the interventions performed. About 350 kilometers of sections of four federal highways were analyzed, BR-020, BR-040, BR-060 and BR-070, which connect the Federal District (where Brasília is located) and its surroundings, chosen for their economic and touristic importance, two of them under federal administration and two under private concession. Like much of the road network, the analyzed stretches are surfaced with Hot Mix Asphalt (HMA). Thus, the present research develops a contrastive discussion between the comfort and safety conditions of the roads under private concession, on which users pay a fee to the concessionaires to travel on a road that meets minimum usage requirements, and the quality of service offered on the roads under Federal Government jurisdiction. Finally, the data collected by the National Department of Transport Infrastructure (DNIT) with a laser profilometer are contrasted with the data obtained with Roadroid, checking the applicability, practicality, and cost-effectiveness of the app, considering its limitations.

Keywords: roadroid, international roughness index, Brazilian roads, pavement

Procedia PDF Downloads 74
81 New Photosensitizers Encapsulated within Arene-Ruthenium Complexes Active in Photodynamic Therapy: Intracellular Signaling and Evaluation in Colorectal Cancer Models

Authors: Suzan Ghaddar, Aline Pinon, Manuel Gallardo-villagran, Mona Diab-assaf, Bruno Therrien, Bertrand Liagre

Abstract:

Colorectal cancer (CRC) is the third most common cancer and exhibits a consistently rising incidence worldwide. Despite notable advancements in CRC treatment, frequent side effects and the development of therapy resistance persistently challenge current approaches. Consequently, innovations in focal therapies remain imperative to enhance patients' overall quality of life. Photodynamic therapy (PDT) emerges as a promising treatment modality, clinically used for the treatment of various cancer types. It relies on the use of photosensitive molecules called photosensitizers (PS), which are photoactivated after accumulation in cancer cells to induce the production of reactive oxygen species (ROS) that cause cancer cell death. Among commonly used metal-based drugs in cancer therapy, ruthenium (Ru) possesses favorable attributes that demonstrate its selectivity towards cancer cells and render it suitable for anti-cancer drug design. In vitro studies using distinct arene-Ru complexes encapsulating porphin PS are conducted on human HCT116 and HT-29 colorectal cancer cell lines. These studies encompass the evaluation of the antiproliferative effect, ROS production, apoptosis, cell cycle progression, molecular localization, and protein expression. Preliminary results indicated that these complexes exert significant photocytotoxicity on the studied colorectal cancer cell lines, making them promising candidates as anti-cancer agents.

Keywords: colorectal cancer, photodynamic therapy, photosensitizers, arene-ruthenium complexes, apoptosis

Procedia PDF Downloads 77
80 Experimental Simulation Set-Up for Validating Out-Of-The-Loop Mitigation when Monitoring High Levels of Automation in Air Traffic Control

Authors: Oliver Ohneiser, Francesca De Crescenzio, Gianluca Di Flumeri, Jan Kraemer, Bruno Berberian, Sara Bagassi, Nicolina Sciaraffa, Pietro Aricò, Gianluca Borghini, Fabio Babiloni

Abstract:

An increasing degree of automation in air traffic will also change the role of the air traffic controller (ATCO). ATCOs will fulfill significantly more monitoring tasks compared to today. However, this rather passive role may lead to Out-Of-The-Loop (OOTL) effects comprising vigilance decrement and reduced situation awareness. The project MINIMA (Mitigating Negative Impacts of Monitoring high levels of Automation) has conceived a system to control and mitigate such OOTL phenomena. In order to demonstrate the MINIMA concept, an experimental simulation set-up has been designed. This set-up consists of two parts: 1) a Task Environment (TE) comprising a Terminal Maneuvering Area (TMA) simulator and 2) a Vigilance and Attention Controller (VAC) based on neurophysiological data recording, such as electroencephalography (EEG) and eye-tracking devices. The current vigilance level and the attention focus of the controller are measured during the ATCO's active work in front of the human machine interface (HMI). The derived vigilance level and attention focus trigger adaptive automation functionalities in the TE to avoid OOTL effects. This paper describes the full-scale experimental set-up and the component development work towards it. Hence, it encompasses a pre-test whose results influenced the development of the VAC as well as the functionalities of the final TE and the VAC's two sub-components.

Keywords: automation, human factors, air traffic controller, MINIMA, OOTL (Out-Of-The-Loop), EEG (Electroencephalography), HMI (Human Machine Interface)

Procedia PDF Downloads 369
79 Uptake of Copper by Dead Biomass of Burkholderia cenocepacia Isolated from a Metal Mine in Pará, Brazil

Authors: Ingrid R. Avanzi, Marcela dos P. G. Baltazar, Louise H. Gracioso, Luciana J. Gimenes, Bruno Karolski, Elen A. Perpetuo, Claudio Augusto Oller do Nascimento

Abstract:

In this study, a natural process using a biological system was developed for the uptake of copper and its possible removal from wastewater by dead biomass of the strain Burkholderia cenocepacia. Dead and live biomass of Burkholderia cenocepacia were used to analyze the equilibrium and kinetics of copper biosorption by this strain as a function of pH. Living biomass exhibited the highest biosorption capacity of copper, 50 mg g⁻¹, which was achieved within 5 hours of contact, at pH 7.0, a temperature of 30°C, and an agitation speed of 150 rpm. The dead biomass of Burkholderia cenocepacia may be considered an efficient bioprocess, being fast and low-cost for copper recovery, and also a probable nano-adsorbent of this metal ion in wastewater in bioremediation processes.
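
The abstract reports equilibrium and kinetics of copper biosorption but does not name the model used; as a purely hypothetical illustration, the sketch below fits a Langmuir isotherm, a common choice for such equilibrium data. The model choice and the data points are assumptions, not the study's results.

```python
# Hypothetical sketch: Langmuir isotherm fit to placeholder Cu(II) biosorption data.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, q_max, K_L):
    """Equilibrium uptake q_e (mg/g) versus equilibrium concentration Ce (mg/L)."""
    return q_max * K_L * Ce / (1.0 + K_L * Ce)

Ce = np.array([5.0, 10.0, 25.0, 50.0, 100.0])   # placeholder equilibrium concentrations, mg/L
qe = np.array([18.0, 28.0, 40.0, 46.0, 49.0])   # placeholder uptakes, mg/g (max around 50 mg/g)

(q_max, K_L), _ = curve_fit(langmuir, Ce, qe, p0=[50.0, 0.1])
print(f"q_max = {q_max:.1f} mg/g, K_L = {K_L:.3f} L/mg")
```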

Keywords: biosorption, dead biomass, biotechnology, copper recovery

Procedia PDF Downloads 328
78 Optimization of Quercus cerris Bark Liquefaction

Authors: Luísa P. Cruz-Lopes, Hugo Costa e Silva, Idalina Domingos, José Ferreira, Luís Teixeira de Lemos, Bruno Esteves

Abstract:

The liquefaction of cork-based tree barks has attracted increasing interest due to its potential for innovation in the lumber and wood industries. In this particular study, the bark of Quercus cerris (Turkish oak) is used due to its appreciable amount of cork tissue, although of inferior quality compared to the cork provided by other Quercus trees. This study aims to optimize the conditions of alkaline-catalyzed liquefaction with respect to several parameters. To better understand the chemical characteristics of Quercus cerris bark, a complete chemical analysis was performed. The liquefaction was performed in a double-jacketed reactor heated with oil, using glycerol and a mixture of glycerol/ethylene glycol as solvents and potassium hydroxide as a catalyst, and varying the temperature, liquefaction time, and granulometry. Due to the low liquefaction efficiency of the first experimental procedures, a study was made of different washing techniques after the filtration process, using methanol and methanol/water. The chemical analysis showed that the bark of Quercus cerris is mostly composed of suberin (ca. 30%) and lignin (ca. 24%), as well as hemicelluloses insoluble in hot water (ca. 23%). In the liquefaction stage, the conditions that led to higher yields were a glycerol/ethylene glycol mixture as reagent and a time and temperature of 120 minutes and 200 ºC, respectively. It is concluded that using a granulometry of <80 mesh leads to better results, even though this parameter barely influences the liquefaction efficiency. Regarding the filtration stage, washing the residue with methanol and then distilled water leads to a considerable increase in the final liquefaction percentages, which shows that this procedure is effective at liquefying the suberin content and the lignocellulosic fraction.

Keywords: liquefaction, Quercus cerris, polyalcohol liquefaction, temperature

Procedia PDF Downloads 324
77 Developing an Edutainment Game for Children with ADHD Based on SAwD and VCIA Model

Authors: Bruno Gontijo Batista

Abstract:

This paper analyzes how the Socially Aware Design (SAwD) and the Value-oriented and Culturally Informed Approach (VCIA) design model can be used to develop an edutainment game for children with Attention Deficit Hyperactivity Disorder (ADHD). The SAwD approach seeks a design that considers new dimensions in human-computer interaction, such as culture, aesthetics, and the emotional and social aspects of the user's everyday experience. From this perspective, the game development was based on the VCIA model, including the users in the design process through participatory methodologies and considering their behavioral patterns, culture, and values. This is because values, beliefs, and behavioral patterns influence how technology is understood and used and the way it impacts people's lives. This model can be applied at different stages of design, from explaining the problem and organizing the requirements to the evaluation of the prototype and the final solution. Thus, this paper aims to understand how this model can be used in the development of an edutainment game for children with ADHD. In the area of education and learning, children with ADHD have difficulties both in behavior and in school performance, as they are easily distracted, which is reflected both in classes and in tests. Therefore, they must perform tasks that are exciting or interesting for them; once the pleasure center in the brain is activated, it reinforces the attention center, leaving the child more relaxed and focused. In this context, serious games have been used as part of the treatment of ADHD in children, aiming to improve focus and attention, stimulate concentration, and serve as a tool for improving learning in areas such as math and reading, combining education and entertainment (edutainment). As a result of the research, an edutainment game prototype for a mobile platform, aimed at children between 8 and 12 years old, was developed in a participatory way by applying the VCIA model.

Keywords: ADHD, edutainment, SAwD, VCIA

Procedia PDF Downloads 169
76 Molecular Mechanisms of Lipid Metabolism and Obesity Modulation by Caspase-1/11 and nlrp3 Inflammasome in Mice

Authors: Lívia Pimentel Sant'ana Dourado, Raquel Das Neves Almeida, Luís Henrique Costa Corrêa Neto, Nayara Soares, Kelly Grace Magalhães

Abstract:

Introduction: Obesity and high-fat diet intake have a crucial impact on immune cells and the inflammatory profile, highlighting an emerging realization that obesity is an inflammatory disease. In the present work, we aimed to characterize the role of caspase-1/11 and the NLRP3 inflammasome in the establishment of obesity in mice and in the modulation of the inflammatory lipid metabolism induced by high-fat diet intake. Methods and results: Wild type, caspase-1/11 knockout, and NLRP3 knockout mice were fed a standard fat diet (SFD) or a high-fat diet (HFD) for 90 days. The animals were weighed weekly to monitor weight gain. After 90 days, blood, peritoneal lavage cells, heart, and liver were collected from the mice. Cytokines in serum were measured by ELISA and analyzed by spectrophotometry. Expression of the lipid antigen presentation molecule CD1d, reactive oxygen species (ROS) generation, and lipid droplet biogenesis were analyzed in cells from the mice's peritoneal cavity by flow cytometry. Liver histopathology was performed for morphological evaluation of the organ. The absence of caspase-1/11, but not of NLRP3, in mice fed the HFD favored weight gain, increased liver size, and induced the development of hepatic steatosis and IL-12 secretion compared to mice fed the SFD. In addition, caspase-1/11 knockout mice fed the HFD presented increased CD1d expression, as well as higher levels of lipid droplet biogenesis and ROS generation, compared to wild type mice also fed the HFD. Conclusion: Our data suggest that caspase-1/11 knockout mice have a greater susceptibility to obesity as well as increased activation of lipid metabolism and inflammatory markers.

Keywords: caspase 1, caspase 11, inflammasome, obesity, lipids

Procedia PDF Downloads 301
75 Predicting Stem Borer Density in Maize Using RapidEye Data and Generalized Linear Models

Authors: Elfatih M. Abdel-Rahman, Tobias Landmann, Richard Kyalo, George Ong’amo, Bruno Le Ru

Abstract:

Maize (Zea mays L.) is a major staple food crop in Africa, particularly in the eastern region of the continent. The maize growing area in Africa spans over 25 million ha, and 84% of rural households in Africa cultivate maize, mainly as a means to generate food and income. Average maize yields in Sub-Saharan Africa are 1.4 t/ha, compared to a global average of 2.5–3.9 t/ha, due to biotic and abiotic constraints. Amongst the biotic production constraints in Africa, stem borers are the most injurious. In East Africa, yield losses due to stem borers are currently estimated at between 12% and 40% of the total production. The objective of the present study was therefore to predict stem borer larvae density in maize fields using RapidEye reflectance data and generalized linear models (GLMs). RapidEye images were captured for a test site in Kenya (Machakos) in January and in February 2015. Stem borer larva numbers were modeled using GLMs assuming Poisson (Po) and negative binomial (NB) error distributions with a logarithmic link. Root mean square error (RMSE) and ratio of prediction to deviation (RPD) statistics were employed to assess model performance using a leave-one-out cross-validation approach. Results showed that NB models outperformed Po ones in all study sites. RMSE and RPD ranged between 0.95 and 2.70, and between 2.39 and 6.81, respectively. Overall, all models performed similarly when using the January and the February image data. We conclude that reflectance data from RapidEye imagery can be used to estimate stem borer larvae density. The developed models could improve decision making regarding the control of maize stem borers using various integrated pest management (IPM) protocols.
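
A minimal Python sketch of the modelling step described above is given below: Poisson and negative-binomial GLMs with a log link fitted to larvae counts against RapidEye bands and scored by leave-one-out cross-validation. The file name, band columns, and response column are hypothetical placeholders for the study's data.

```python
# Illustrative GLM comparison (Poisson vs negative binomial) with leave-one-out CV.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("machakos_plots.csv")                 # assumed: one row per sampled maize plot
bands = ["blue", "green", "red", "red_edge", "nir"]    # hypothetical RapidEye band columns
X = sm.add_constant(df[bands])
y = df["larvae_count"]                                 # assumed response column

def loo_rmse(family):
    """Leave-one-out RMSE for a GLM with the given error family (log link by default)."""
    preds = []
    for i in range(len(df)):
        train = np.arange(len(df)) != i
        fit = sm.GLM(y[train], X[train], family=family).fit()
        preds.append(float(np.asarray(fit.predict(X.iloc[[i]]))[0]))
    return float(np.sqrt(np.mean((y.to_numpy() - np.array(preds)) ** 2)))

print("Poisson LOO RMSE:", loo_rmse(sm.families.Poisson()))
print("Negative binomial LOO RMSE:", loo_rmse(sm.families.NegativeBinomial()))
```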

Keywords: maize, stem borers, density, RapidEye, GLM

Procedia PDF Downloads 484
74 Greenhouse Controlled with Graphical Plotting in Matlab

Authors: Bruno R. A. Oliveira, Italo V. V. Braga, Jonas P. Reges, Luiz P. O. Santos, Sidney C. Duarte, Emilson R. R. Melo, Auzuir R. Alexandria

Abstract:

This project aims to build a controlled greenhouse, or, for a better understanding, a structure in which one can maintain a given range of temperature values (°C) produced by the radiation emitted by an incandescent lamp, as previously defined, characterizing a kind of on-off control, with the differential being the plotting of temperature-versus-time graphs in MATLAB via serial communication. In that way, it is possible to connect the greenhouse to a computer and monitor its parameters. The control was implemented with a PIC16F877A microcontroller, which converts analog signals to digital, performs serial communication through the MAX232 IC, and drives signal transistors. The language used to program the PIC is Basic. There is also a cooling system consisting of two 12 V fans on the lateral structure, one used for ventilation and the other for exhausting air. The internal temperature is measured with an LM35DZ sensor. Another mechanism used in the greenhouse construction comprises a reed switch and a magnet; their function is to detect the door position, and a signal is sent to a buzzer when the door is open. There are also LEDs that help identify the operating state of the greenhouse. To facilitate human-machine communication, an LCD display shows the real-time temperature and other information. The average operating range of the design without any major problems, taking into account the limitations of the construction materials and of the current-conducting structure, is approximately 65 to 70 °C. The project is efficient under these conditions, that is, when one wishes to obtain information from a given material tested at temperatures that are not very high. The implementation of the greenhouse automation facilitates temperature control and provides a structure that ensures the correct environment for the most diverse applications.
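
The actual firmware runs in Basic on the PIC16F877A and the plots are produced in MATLAB over the serial link; purely for illustration, the on-off decision rule described above can be sketched in Python as follows, with the hysteresis band as an assumed value.

```python
# Illustrative on-off (hysteresis) rule mirroring the greenhouse lamp control; Python stands in
# for the PIC firmware only as a sketch, and the 2 degC band is an assumption.
def lamp_command(temperature_c, setpoint_c, lamp_is_on, band_c=2.0):
    """Return True to keep/turn the incandescent lamp on, False to switch it off."""
    if temperature_c <= setpoint_c - band_c:
        return True                  # too cold: heat
    if temperature_c >= setpoint_c + band_c:
        return False                 # too hot: stop heating, the fans take over
    return lamp_is_on                # inside the band: keep the previous state

state = False
for reading in (60.0, 63.0, 66.0, 71.0, 69.0, 64.0):   # example LM35 readings in degC
    state = lamp_command(reading, setpoint_c=67.0, lamp_is_on=state)
    print(f"{reading:.1f} degC -> lamp {'ON' if state else 'OFF'}")
```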

Keywords: greenhouse, microcontroller, temperature, control, MATLAB

Procedia PDF Downloads 393
73 Beak Size and Asynchronous Hatch in Broiler Chicks

Authors: Mariana Thimotheo, Gabriel Carvalho Ripamonte, Marina De Almeida Nogueira, Silvia Camila Da Costa Aguiar, Marcelo Henrique Santana Ulian, Euclides Braga Malheiros, Isabel Cristina Boleli

Abstract:

The beak plays a fundamental role in the hatching process of chicks, since it is used for internal and external pipping. The present study examined whether the size of the beak influences the birth period of broiler chicks within the hatching window. The beak size (length, height, and width) was analyzed in one hundred twenty-nine newly hatched chicks from light eggs (56.22-61.05 g) and one hundred twenty-six chicks from heavy eggs (64.95-70.90 g), produced by 38- and 45-week-old broiler breeders (Cobb 500®), respectively. Egg incubation occurred at 37.5°C and 60% RH, with egg turning every hour. Length, height, and width of the beaks were measured using a digital caliper (Zaas precision digital caliper, 6", 0.01 mm), and the data were expressed in millimeters. The beak length corresponded to the distance between the tip of the beak and the rictus. The height of the beak was measured in the region of the culmen and its width in the region of the nostrils. Data were analyzed following a 3x2 factorial experimental design, with three birth periods within the hatching window (early: 471.78 to 485.42 h, intermediate: 485.43 to 512.27 h, and late: 512.28 to 528.72 h) and two egg weights (light and heavy). There was a significant interaction between birth period and egg weight for beak height (P < 0.05), which was higher in intermediate chicks from heavy eggs than in the other chicks from the same egg weight and in chicks from light eggs (P < 0.05), which did not differ among themselves (P > 0.05). The beak length was influenced only by the birth period and decreased through the hatching window (early < intermediate < late) (P < 0.05). The width of the beaks was influenced by both main factors, birth period and egg weight (P < 0.05). Early and intermediate chicks had similar beak widths, but greater than those of late chicks, and chicks from heavy eggs presented greater beak width than chicks from light eggs (P < 0.05). In sum, the results show that chicks with longer beaks hatch first and that beak length is an important variable for determining the hatching period, mainly for light eggs.

Keywords: beak dimensions, egg weight, hatching period, hatching window

Procedia PDF Downloads 157
72 Simultaneous Interpreting and Meditation: An Experimental Study on the Effects of Qigong Meditation on Simultaneous Interpreting Performance

Authors: Lara Bruno, Ilaria Tipà, Franco Delogu

Abstract:

Simultaneous interpreting (SI) is a demanding language task that requires the concurrent activation of different cognitive processes. This complex activity requires interpreters not only to be proficient in their working languages but also to be highly able to focus attention and control anxiety during their performance. Qigong meditation techniques have a positive impact on several cognitive functions, including attention and anxiety control. This study aims at exploring the influence of Qigong meditation on the quality of simultaneous interpreting. Twenty interpreting students, divided into two groups, were trained for 8 days in Qigong meditation practice. Before and after the training, a brief simultaneous interpreting task was performed. The language combinations of group A and group B were English-Italian and Chinese-Italian, respectively. The students' performances were recorded and rated by independent evaluators on 12 parameters grouped into four macro-categories: content, form, delivery, and anxiety control. To determine whether SI performance varied significantly between the pre-training and post-training sessions, ANOVA analyses were conducted on the ratings provided by the independent evaluators. The main results indicate a significant improvement in interpreting performance after the meditation training for both groups, with group A showing a greater improvement than group B. Positive effects of meditation were found in all the observed macro-categories: meditation was beneficial not only for speech delivery and anxiety control but also for cognitive and attentional abilities. From a cognitive and pedagogical point of view, the present results open new paths of research on the practice of meditation as a tool to improve SI performance.
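A simplified way to look at the pre/post comparison described above is through improvement scores (post-training minus pre-training ratings); the snippet below is a hypothetical illustration of that idea, not the authors' analysis script (which used ANOVA on the evaluators' ratings), and the data frame columns are assumed names.

```python
# Simplified illustration: compare pre/post improvement overall and between the two groups.
# Hypothetical file and column names; the study itself used ANOVA on the ratings.
import pandas as pd
from scipy import stats

df = pd.read_csv("si_ratings.csv")   # columns: student, group, pre_score, post_score
df["improvement"] = df["post_score"] - df["pre_score"]

# Did performance improve overall after the meditation training?
t_all, p_all = stats.ttest_rel(df["post_score"], df["pre_score"])

# Did group A (English-Italian) improve more than group B (Chinese-Italian)?
gain_a = df.loc[df["group"] == "A", "improvement"]
gain_b = df.loc[df["group"] == "B", "improvement"]
t_ab, p_ab = stats.ttest_ind(gain_a, gain_b)

print(f"pre vs post: t={t_all:.2f}, p={p_all:.3f}")
print(f"group A vs B improvement: t={t_ab:.2f}, p={p_ab:.3f}")
```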

Keywords: cognitive science, interpreting studies, Qigong meditation, simultaneous interpreting, training

Procedia PDF Downloads 151
71 The Impact of Study Abroad Experience on Interpreting Performance

Authors: Ruiyuan Wang, Jing Han, Bruno Di Biase, Mark Antoniou

Abstract:

The purpose of this study is to explore the relationship between working memory (WM) capacity and Chinese-English consecutive interpreting (CI) performance in interpreting learners with different study abroad experience (SAE). This relationship is not well understood. The study also examines whether Chinese interpreting learners with SAE in English-speaking countries demonstrate better performance in their interpreting output than learners without SAE on inflectional morphology and agreement, which are notoriously unstable in Chinese speakers of L2 English. Fifty Chinese university students majoring in Chinese-English interpreting were recruited in Australia (n=25) and China (n=25). The two groups were matched in age, language proficiency, and interpreting training period. The study abroad (SA) group had been studying in an English-speaking country (Australia) for over 12 months, while none of the students recruited in China (the no study abroad, NSA, group) had ever studied or lived in an English-speaking country. Data on language proficiency and training background were collected via a questionnaire. Lexical retrieval performance and working memory (WM) capacity data were collected experimentally, and interpreting data were elicited via a direct CI task. The main results show that WM correlated significantly with participants' CI performance independently of learning context. Moreover, SA learners outperformed NSA learners in subject-verb number agreement, and WM capacity also correlated significantly with their morphosyntactic accuracy. This paper sheds some light on the relationship between study abroad experience, WM capacity, and CI performance. Exploring the effect of study abroad on interpreting trainees and how these factors correlate may help interpreting educators develop more targeted teaching paradigms for learners with different learning experiences.
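The reported correlations and group comparison can be illustrated with a simple Pearson correlation and an independent-samples test; the sketch below uses assumed file and column names (wm_span, ci_score, agreement_accuracy) and is not the authors' analysis.

```python
# Illustrative analysis: WM capacity vs. CI performance and vs. morphosyntactic
# (subject-verb agreement) accuracy, plus an SA vs. NSA group comparison.
# Hypothetical file and column names.
import pandas as pd
from scipy import stats

df = pd.read_csv("ci_data.csv")   # columns: group, wm_span, ci_score, agreement_accuracy

r_ci, p_ci = stats.pearsonr(df["wm_span"], df["ci_score"])
r_agr, p_agr = stats.pearsonr(df["wm_span"], df["agreement_accuracy"])
print(f"WM vs CI performance: r={r_ci:.2f}, p={p_ci:.3f}")
print(f"WM vs agreement accuracy: r={r_agr:.2f}, p={p_agr:.3f}")

# SA vs NSA comparison on subject-verb number agreement accuracy
sa = df.loc[df["group"] == "SA", "agreement_accuracy"]
nsa = df.loc[df["group"] == "NSA", "agreement_accuracy"]
t, p = stats.ttest_ind(sa, nsa)
print(f"SA vs NSA agreement accuracy: t={t:.2f}, p={p:.3f}")
```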

Keywords: study abroad experience, consecutive interpreting, working memory, inflectional agreement

Procedia PDF Downloads 88
70 Experimental Analysis for the Inlet of the Brazilian Aerospace Vehicle 14-X B

Authors: João F. A. Martos, Felipe J. Costa, Sergio N. P. Laiton, Bruno C. Lima, Israel S. Rêgo, Paulo P. G. Toro

Abstract:

Nowadays, the scramjet has attracted the attention of several scientific communities (USA, Australia, Germany, France, Japan, India, China, Russia), which have invested in this type of propulsion system because of its potential to facilitate access to space and to reach hypersonic speeds. The Brazilian hypersonic aerospace vehicle 14-X B is a technological demonstrator of a hypersonic airbreathing propulsion system based on supersonic combustion (scramjet), intended to be flight-tested in the Earth's atmosphere at 30 km altitude and Mach number 7. The 14-X B has been designed at the Prof. Henry T. Nagamatsu Laboratory of Aerothermodynamics and Hypersonics of the Institute for Advanced Studies (IEAv) in Brazil. The IEAv Hypersonic Shock Tunnel, named T3, is a ground-test facility able to reproduce flight conditions, namely the Mach number as well as the pressure and temperature in the test section, close to those encountered during the flight test of the 14-X B at its design conditions. A 1-m-long stainless steel 14-X B model was experimentally investigated in the T3 Hypersonic Shock Tunnel at a freestream Mach number of 7. Static pressure measurements along the lower surface of the model, together with high-speed schlieren photographs of the 5.5° leading edge and the 14.5° deflection compression ramp, provided experimental data that were compared with analytical-theoretical solutions and computational fluid dynamics (CFD) simulations. The results show good qualitative agreement, demonstrating the importance of these methods in the design of the 14-X B hypersonic aerospace vehicle.
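Analytical-theoretical solutions for compression surfaces of this kind are typically obtained from oblique-shock relations. As a purely illustrative sketch (assuming calorically perfect air with γ = 1.4, which is an assumption of this example and not a statement of the authors' method), the Python snippet below solves the θ-β-M relation numerically for the 5.5° leading-edge shock at freestream Mach 7 and reports the static pressure ratio and post-shock Mach number; the ramp shock can be treated the same way using the post-shock conditions as the new freestream.

```python
# Illustrative oblique-shock estimate for the 5.5 deg leading edge at Mach 7
# (calorically perfect air, gamma = 1.4); the weak-shock angle beta is found
# from the theta-beta-M relation and used for the pressure ratio and M2.
import numpy as np
from scipy.optimize import brentq

GAMMA = 1.4

def theta_from_beta(beta, M1):
    """Flow deflection angle implied by shock angle beta (theta-beta-M relation)."""
    return np.arctan(2.0 / np.tan(beta) * (M1**2 * np.sin(beta)**2 - 1.0)
                     / (M1**2 * (GAMMA + np.cos(2.0 * beta)) + 2.0))

def weak_shock(M1, theta_deg):
    theta = np.radians(theta_deg)
    mu = np.arcsin(1.0 / M1)                      # Mach angle, lower bound for beta
    beta = brentq(lambda b: theta_from_beta(b, M1) - theta, mu + 1e-6, np.radians(60.0))
    Mn1 = M1 * np.sin(beta)
    p_ratio = 1.0 + 2.0 * GAMMA / (GAMMA + 1.0) * (Mn1**2 - 1.0)       # p2/p1
    Mn2 = np.sqrt((1.0 + 0.5 * (GAMMA - 1.0) * Mn1**2)
                  / (GAMMA * Mn1**2 - 0.5 * (GAMMA - 1.0)))
    M2 = Mn2 / np.sin(beta - theta)
    return np.degrees(beta), p_ratio, M2

beta, p21, M2 = weak_shock(7.0, 5.5)
print(f"leading edge: beta = {beta:.1f} deg, p2/p1 = {p21:.2f}, M2 = {M2:.2f}")
```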

Keywords: 14-X, CFD, hypersonic, hypersonic shock tunnel, scramjet

Procedia PDF Downloads 345