Search results for: Diego Cosmelli
64 A Worldwide Assessment of Geothermal Energy Policy: Systematic, Qualitative and Critical Literature Review
Authors: Diego Moya, Juan Paredes, Clay Aldas, Ramiro Tite, Prasad Kaparaju
Abstract:
Globally, energy policy for geothermal development is addressed in different forms, depending on the economy, resources, stage of national development, environmental aspects and access to technology. Although some countries have established strong regulations and standards for geothermal exploration, exploitation and sustainable use at the policy level (government departments and institutions), others have debated geothermal laws at the legal level (congress, a country's national legislative body). Appropriate regulations are needed not only to meet local and international funding requirements but also to avoid speculation in the use of the geothermal resource. In this regard, this paper presents the results of a systematic, qualitative and critical literature review of geothermal energy policy worldwide, addressing two scenarios: the policy and legal levels. First, literature is collected and classified from scientific and government sources regarding the geothermal energy policy of the most advanced geothermal-producing countries, including Iceland, New Zealand, Mexico, the USA, Central America, Italy, Japan, the Philippines, Indonesia, Kenya, and Australia. This is followed by a systematic review of the literature aiming to identify best geothermal practices and what remains uncertain in geothermal policy implementation. This analysis considers the stages of geothermal production. Furthermore, a qualitative analysis compares the findings across the geothermal policies of the countries mentioned above. A critical review then identifies significant items in the field to be applied in countries with geothermal potential but with no or weak geothermal policies. Finally, patterns and relationships are detected, and conclusions are drawn.
Keywords: assessment, geothermal, energy policy, worldwide
Procedia PDF Downloads 385
63 Endoscopic Ultrasound-Guided Choledochoduodenostomy in an Advanced Extrahepatic Cholangiocarcinoma
Authors: Diego Carrasco, Catarina Freitas, Hugo Rio Tinto, Ricardo Rio Tinto, Nuno Couto, Joaquim Gago, Carlos Carvalho
Abstract:
Introduction: Endoscopic ultrasound-guided choledochoduodenostomy (EUS-CD) to drain the gallbladder can serve as a palliative care procedure for non-surgical oncologic patients with cholelithiasis and cholangitis. Case description: A 59-year-old Caucasian male diagnosed with extrahepatic cholangiocarcinoma with multiple liver, lung and peritoneal metastases, unresponsive to treatment with gemcitabine/cisplatin, presented at the institution with fever, hypotension, and severe upper right abdominal pain secondary to cholelithiasis and cholangitis. The patient was admitted and started on broad-spectrum antibiotics plus a fluid challenge. Afterward, percutaneous transhepatic biliary drainage (PTBD) was performed to drain the gallbladder. This procedure temporarily stabilized the patient. However, the definitive solution required gallbladder removal. Since the patient exhibited advanced oncologic disease and a poor response to chemotherapy, he was not a candidate for surgical intervention. Diagnostic pathways: A self-expanding metal stent was placed from the duodenum into the bile duct under endoscopic ultrasound guidance. Conclusion and discussion: The stent allowed efficient drainage of the contrast from the gallbladder at the end of the endoscopic procedure and successfully reversed the cholangitis. EUS-CD is an effective and safe technique and can be used as a palliative care procedure for non-surgical oncologic patients.
Keywords: palliative care, cholangiocarcinoma, choledochoduodenostomy, endoscopic ultrasound-guided
Procedia PDF Downloads 185
62 Design and Construction of a Device to Facilitate the Stretching of the Plantiflexor Muscles in Rehabilitation Therapy for Patients with Spastic Hemiplegia
Authors: Nathalia Andrea Calderon Lesmes, Eduardo Barragan Parada, Diego Fernando Villegas Bermudez
Abstract:
Spasticity in the plantiflexor muscles resulting from stroke (CVA, cerebrovascular accident) restricts the mobility and independence of the affected people. Commonly, physiotherapists manually perform the rehabilitation therapy known as sustained mechanical stretching, rotating the patient's affected foot in the sagittal plane. However, this causes physical wear on the professional because it is a fatiguing movement. In this article, a mechanical device is developed to implement this rehabilitation therapy more efficiently. The device consists of a worm-and-wheel mechanism driven by a crank to gradually rotate a platform in the sagittal plane of the affected foot, in order to achieve dorsiflexion. The device has a sagittal rotation range of up to 150° and Velcro straps on the footplate that secure the foot. The device was modeled using CAD software and checked structurally with general-purpose finite element software to ensure it is safe for human use. As a measurement system, a goniometer is used on the lateral part of the device, and load cells measure the force in order to determine the opposing torque exerted by the muscle. The load cells have a sensitivity of 1.8 ± 0.002 and a repeatability of 0.03. The effectiveness of the device is validated by the reduction in opposing torque and the increase in mobility for a given patient. In this way, with a more efficient therapy, an improvement in the recovery of the patient's mobility, and therefore in their quality of life, can be achieved.
Keywords: biomechanics, mechanical device, plantiflexor muscles, rehabilitation, spastic hemiplegia, sustained mechanical stretching
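The torque measurement described in this abstract reduces to the load-cell force times its lever arm about the rotation axis. A minimal sketch follows; the lever arm and the force readings are invented for illustration and are not the device's actual calibration values.

```python
# Illustrative computation of the opposing muscle torque: torque equals the
# load-cell force multiplied by its lever arm about the platform's rotation
# axis. LEVER_ARM_M and the readings are hypothetical, not device data.

LEVER_ARM_M = 0.15   # assumed distance from rotation axis to load cell, m

def opposing_torque(force_n):
    """Torque in N·m for a force reading in newtons."""
    return force_n * LEVER_ARM_M

readings = [20.0, 35.0, 50.0]      # hypothetical load-cell forces, N
torques = [opposing_torque(f) for f in readings]
print(torques)
```

A decreasing torque over successive therapy sessions, at the same dorsiflexion angle, would be the signal of effectiveness the abstract describes.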
Procedia PDF Downloads 165
61 Microbioreactor System for Cell Behavior Analysis Focused on Nerve Tissue Engineering
Authors: Yusser Olguín, Diego Benavente, Fernando Dorta, Nicole Orellana, Cristian Acevedo
Abstract:
One of the greatest challenges of tissue engineering is the generation of materials into which the highest possible number of conditions can be incorporated to stimulate the proliferation and differentiation of cells, which will be transformed, together with the material, into new functional tissue. Considering the properties of microfluidics and its relationship with cellular micro-environments, the ability to control flow patterns and the freedom to design diverse patterns in the chips, a microfluidic cell culture system can serve as a means to evaluate the effect of different parameters in a controlled and precise manner. Specifically, for the study and development of alternatives in peripheral nerve tissue engineering, different physical and chemical neurotrophic stimuli that promote cell growth and differentiation must be considered. Chemical stimuli include certain vitamins, glucocorticoids, gangliosides, and growth factors, while physical stimuli include topological stimuli, mechanical forces of the cellular environment and electrical stimulation. In this context, the present investigation shows the results of cell stimulation in a microbioreactor using electrical and chemical stimuli, where the differentiation of PC12 cells as a neuronal model is evidenced by neurite expression, dependent on the stimuli and their combination. The results were analysed with a multi-factor statistical approach, showing several relationships and dependencies between parameters. Chip design, operating parameters and concentrations of neurotrophic chemical factors were found to be preponderant, based on the characteristics of the electrical stimuli.
Keywords: microfluidics, nerve tissue engineering, microbioreactor, electrical stimuli
Procedia PDF Downloads 85
60 The Performance of Natural Light by Roof Systems in Cultural Buildings
Authors: Ana Paula Esteves, Diego S. Caetano, Louise L. B. Lomardo
Abstract:
This paper presents an approach to the performance of natural lighting when appropriate roof-mounted solar lighting systems are applied in cultural buildings such as museums and foundations. Roofs, as the building's interface with the external environment, require special attention in projects that aim at energy efficiency: they are an important element for capturing natural light in greater quantity, and also the most important location for photovoltaic solar generation, including semitransparent panels that allow the partial passage of light. Transparent elements in roofs, in addition to providing the building's upper protection, can also play other roles: meeting the need for natural light to accomplish internal tasks, attending to visual comfort, and benefiting human perception and the interior experience of a building. When these resources are well dimensioned, they also contribute to the energy efficiency and, consequently, the sustainability of the building. Therefore, when properly designed and executed, a roof lighting system can bring higher-quality natural light to the interior of the building, which relates to the dimension of human health and well-being. Furthermore, it can meet technological, economic and environmental aspirations, making possible the more efficient use of that primordial resource, the light of the Sun. The article presents the analysis of buildings that used zenithal lighting systems in search of better lighting performance in museums and foundations: the Solomon R. Guggenheim Museum in the United States, the Iberê Camargo Foundation in Brazil, the Museum of Fine Arts of Castellón in Spain and the Pinacoteca of São Paulo.
Keywords: natural lighting, roof lighting systems, natural lighting in museums, comfort lighting
Procedia PDF Downloads 210
59 Cirrhosis Mortality Prediction as Classification Using Frequent Subgraph Mining
Authors: Abdolghani Ebrahimi, Diego Klabjan, Chenxi Ge, Daniela Ladner, Parker Stride
Abstract:
In this work, we use machine learning and novel data analysis techniques to predict the one-year mortality of cirrhotic patients. Data from 2,322 patients with liver cirrhosis were collected at a single medical center. Different machine learning models are applied to predict one-year mortality. A comprehensive feature space including demographic information, comorbidities, clinical procedures and laboratory tests is analyzed. A temporal pattern mining technique called Frequent Subgraph Mining (FSM) is used. The Model for End-Stage Liver Disease (MELD) prediction of mortality is used as a comparator. All of our models statistically significantly outperform the MELD-score model, showing an average 10% improvement in the area under the curve (AUC). The FSM technique on its own does not improve the model significantly, but FSM combined with a machine learning technique called an ensemble further improves model performance. With the abundance of data available in healthcare through electronic health records (EHR), existing predictive models can be refined to identify and treat patients at risk of higher mortality. However, due to the sparsity of the temporal information needed by FSM, the FSM model alone does not yield significant improvements. To the best of our knowledge, this is the first work to apply modern machine learning algorithms and data analysis methods to predicting the one-year mortality of cirrhotic patients, and it builds a model that predicts one-year mortality significantly more accurately than the MELD score. We have also tested the potential of FSM and provided a new perspective on the importance of clinical features.
Keywords: machine learning, liver cirrhosis, subgraph mining, supervised learning
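The headline comparison in this abstract — a richer-feature model beating a baseline risk score on AUC — can be sketched with a rank-based AUC computed in pure Python. The labels and both score vectors below are toy values, not the study's data; "meld_like" merely stands in for a MELD-style baseline.

```python
# Sketch of an AUC comparison between two risk scores on toy data, using
# the rank-sum (Mann-Whitney) formulation of the area under the ROC curve.

def auc(scores, labels):
    """AUC = fraction of (positive, negative) pairs ranked correctly,
    counting ties as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels    = [1, 1, 1, 0, 0, 0, 0]                   # 1 = died within a year
meld_like = [0.9, 0.4, 0.35, 0.6, 0.3, 0.2, 0.1]    # baseline risk score
ensemble  = [0.95, 0.8, 0.7, 0.5, 0.3, 0.2, 0.1]    # richer-feature model

print(auc(meld_like, labels), auc(ensemble, labels))
```

On this toy set the ensemble scores rank every deceased patient above every survivor, so its AUC is higher than the baseline's, which mis-ranks one pair of patients.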
Procedia PDF Downloads 134
58 Unsupervised Echocardiogram View Detection via Autoencoder-Based Representation Learning
Authors: Andrea Treviño Gavito, Diego Klabjan, Sanjiv J. Shah
Abstract:
Echocardiograms serve as pivotal resources for clinicians in diagnosing cardiac conditions, offering non-invasive insights into the heart's structure and function. When echocardiographic studies are conducted, no standardized labeling of the acquired views is performed. Employing machine learning algorithms for automated echocardiogram view detection has emerged as a promising solution to enhance the efficiency of echocardiogram use in diagnosis. However, existing approaches predominantly rely on supervised learning, necessitating labor-intensive expert labeling. In this paper, we introduce a fully unsupervised echocardiographic view detection framework that leverages convolutional autoencoders to obtain lower-dimensional representations and the K-means algorithm to cluster them into view-related groups. Our approach focuses on discriminative patches from echocardiographic frames. Additionally, we propose a trainable inverse average layer to optimize the decoding of average operations. By integrating both public and proprietary datasets, we obtain a marked improvement in model performance compared to using a proprietary dataset alone. Our experiments show boosts of 15.5% in accuracy and 9.0% in F-1 score for frame-based clustering, and 25.9% in accuracy and 19.8% in F-1 score for view-based clustering. Our research highlights the potential of unsupervised learning methodologies and the utilization of open-sourced data in addressing the complexities of echocardiogram interpretation, paving the way for more accurate and efficient cardiac diagnoses.
Keywords: artificial intelligence, echocardiographic view detection, echocardiography, machine learning, self-supervised representation learning, unsupervised learning
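The clustering stage of the pipeline described above can be illustrated with a pure-Python Lloyd's k-means on hand-made 2-D points. The points stand in for autoencoder embeddings of frames (the autoencoder itself is omitted), and the two made-up "views" are assumptions of this sketch, not the paper's data.

```python
# Minimal k-means (Lloyd's algorithm) grouping pretend frame embeddings
# into view-related clusters, as in the unsupervised pipeline above.

def kmeans(points, centers, iters=20):
    for _ in range(iters):
        # assignment step: each point joins its nearest center
        groups = [[] for _ in centers]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            groups[d.index(min(d))].append(p)
        # update step: each center moves to the mean of its group
        centers = [
            tuple(sum(x) / len(g) for x in zip(*g)) if g else c
            for g, c in zip(groups, centers)
        ]
    return centers, groups

embeddings = [(0.1, 0.2), (0.0, 0.1), (0.2, 0.0),   # "view A" frames
              (5.0, 5.1), (5.2, 4.9), (4.8, 5.0)]   # "view B" frames
centers, groups = kmeans(embeddings, centers=[(0.0, 0.0), (1.0, 1.0)])
print([len(g) for g in groups])
```

In the real framework each cluster would then be inspected and mapped to a named echocardiographic view; here the two clusters simply recover the two planted groups.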
Procedia PDF Downloads 32
57 Optimal Concentration of Fluorescent Nanodiamonds in Aqueous Media for Bioimaging and Thermometry Applications
Authors: Francisco Pedroza-Montero, Jesús Naín Pedroza-Montero, Diego Soto-Puebla, Osiris Alvarez-Bajo, Beatriz Castaneda, Sofía Navarro-Espinoza, Martín Pedroza-Montero
Abstract:
Nanodiamonds have been widely studied for their physical properties, including chemical inertness, biocompatibility, optical transparency from the ultraviolet to the infrared region, high thermal conductivity, and mechanical strength. In this work, we systematically studied how the fluorescence spectrum of nanodiamonds quenches as a function of concentration in aqueous solutions ranging from 0.1 to 10 mg/mL. Our results demonstrate non-linear fluorescence quenching with increasing concentration for both NV zero-phonon lines; the 5 mg/mL concentration shows the maximum fluorescence emission. Furthermore, this behaviour is theoretically explained as an electronic recombination process that modulates the intensity of the NV centres. Finally, to gain more insight, the FRET methodology is used to determine the fluorescence efficiency in terms of the fluorophores' separation distance. The concentration level is thus simulated as follows: a small distance between nanodiamonds is considered a highly concentrated system, whereas a large distance means a low-concentration one. Although the 5 mg/mL concentration shows the maximum intensity, our main interest is the 0.5 mg/mL concentration, for which our studies demonstrate optimal human cell viability (99%). In this respect, this concentration has the feature of being as biocompatible as water, giving the possibility of internalizing the nanodiamonds in cells without harming the living medium. To this end, not only can we track nanodiamonds on the surface or inside the cell with excellent precision due to their fluorescence intensity, but we can also perform thermometry tests, transforming a fluorescence contrast image into a temperature contrast image.
Keywords: nanodiamonds, fluorescence spectroscopy, concentration, bioimaging, thermometry
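The distance dependence underlying the FRET analysis mentioned above follows the standard Förster relation E = 1 / (1 + (r/R0)^6). A short sketch of that relation follows; the Förster radius R0 used here is an assumed value for illustration, not one reported by the study.

```python
# Förster transfer efficiency versus donor-acceptor separation. Small
# separations model a highly concentrated system; large separations a
# dilute one, mirroring the abstract's distance-concentration mapping.

def fret_efficiency(r_nm, r0_nm=5.0):
    """E = 1 / (1 + (r/R0)^6); R0 = 5 nm is an assumed Förster radius."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

for r in (2.5, 5.0, 10.0):
    print(r, round(fret_efficiency(r), 3))
```

By definition the efficiency is exactly 0.5 at r = R0, and the sixth-power law makes it fall off sharply beyond that, which is why the separation distance is such a sensitive proxy for concentration.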
Procedia PDF Downloads 404
56 Thermal Simulation for Urban Planning in Early Design Phases
Authors: Diego A. Romero Espinosa
Abstract:
Thermal simulations are used to evaluate the comfort and energy consumption of buildings. However, the performance of different urban forms cannot be assessed in isolation if an environmental control system and user schedules are considered, since the outcome of such an analysis combines the building's use, operation, services, envelope, orientation and the density of the urban fabric. The influence of these factors varies during the life cycle of a building. The orientation, as well as the surroundings, can be considered constant during the lifetime of a building. The structure affects the thermal inertia and has the longest lifespan of all the building components. On the other hand, the building envelope is the most frequently renovated component of a building, since it has a great impact on energy performance and comfort. Building services have a shorter lifespan and are replaced regularly. To address the performance of an urban form at a specific orientation and density, a thermal simulation method was developed. Solar irradiation is taken into consideration depending on the outdoor temperature: incoming irradiation at low temperatures has a positive impact, increasing the indoor temperature; consequently, overheating is the combination of high outdoor temperature and high irradiation at the façade. On this basis, the indoor temperature is simulated for a specific orientation of the evaluated urban form. Thermal inertia and building envelope performance are additionally considered as the materiality of the building. The results of the different thermal zones are summarized using the 'degree day method' for cooling and heating. During the early phase of a design process, such as a masterplan, conclusions regarding urban form, density and materiality can be drawn by means of this analysis.
Keywords: building envelope, density, masterplanning, urban form
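The 'degree day method' used above to summarize the thermal zones is a simple accumulation: each day's mean temperature is compared against a base temperature, and the shortfall or excess is summed into heating and cooling demand respectively. A minimal sketch, assuming a base temperature of 18 °C and invented daily means:

```python
# Heating and cooling degree days from daily mean temperatures.
# The 18 °C base and the temperature list are illustrative assumptions.

def degree_days(daily_mean_temps, base=18.0):
    heating = sum(max(0.0, base - t) for t in daily_mean_temps)
    cooling = sum(max(0.0, t - base) for t in daily_mean_temps)
    return heating, cooling

temps = [15.0, 20.0, 25.0]          # illustrative daily means, °C
hdd, cdd = degree_days(temps)
print(hdd, cdd)                     # 3.0 heating, 9.0 cooling degree days
```

Comparing these two totals across orientations and urban forms is what allows early-phase conclusions without modelling an environmental control system.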
Procedia PDF Downloads 145
55 Crickets as Social Business Model for Rural Women in Colombia
Authors: Diego Cruz, Helbert Arevalo, Diana Vernot
Abstract:
In 2013, the Food and Agriculture Organization of the United Nations (FAO) stated that insect production for food and feed could become an economic opportunity for rural women in developing countries. Since then, however, only a few initiatives worldwide have tried to implement this kind of project in zones of tropical countries without previous experience in cricket production and human consumption of insects, such as Colombia. In this project, the ArthroFood company and the University of La Sabana join efforts to make a holistic, multi-perspective analysis of Gryllodes sigillatus production by rural women of the municipality of La Mesa, Cundinamarca, Colombia, from the biological, economic, culinary, and social perspectives. From a biological and economic perspective, G. sigillatus production in a 60 m² greenhouse was evaluated considering the effect of rearing density and substrates on final weight and length, development time, survival rate, and proximate composition. Additionally, production costs and labor hours were recorded for five months. On the socio-economic side, rural women's intention to establish cricket farms or micro-entrepreneurship around insect production was evaluated after ethnographies and workshops on empowerment, entrepreneurship, and cricket production. Finally, the results of developing culinary recipes with cricket powder, incorporating cultural aspects of the context of La Mesa, Cundinamarca, will be presented. This project represents Colombia's first attempt to create a social business model of cricket production involving rural women, academia, the private sector, and local authorities.
Keywords: cricket production, developing country, edible insects, entrepreneurship, insect culinary recipes
Procedia PDF Downloads 104
54 Biomechanical Performance of the Synovial Capsule of the Glenohumeral Joint with a Bankart Lesion through Finite Element Analysis
Authors: Duvert A. Puentes T., Javier A. Maldonado E., Ivan Quintero., Diego F. Villegas
Abstract:
Computational mechanics is a powerful tool for studying the performance of complex models, an example being the structure of the human body. This paper took advantage of different types of software to make a 3D model of the glenohumeral joint and apply finite element analysis. The main objective was to study the change in the biomechanical properties of the joint when it presents an injury, specifically a Bankart lesion, which consists of the detachment of the anteroinferior labrum from the glenoid. The stress and strain distributions of the soft tissues were the focus of this study. First, a 3D model was made of a joint without any pathology as a control sample, using segmentation software for the bones, with the support of medical imagery, and a cadaveric model to represent the soft tissue. The joint was built to simulate a compression and external rotation test, using CAD to prepare the model in the adequate position. When the healthy model was finished, it was submitted to finite element analysis, and the results were validated against experimental model data. A mesh sensitivity analysis was then performed on the validated model to obtain the best mesh size. Finally, the geometry of the 3D model was changed to imitate a Bankart lesion: the contact zone of the glenoid with the labrum was slightly separated, simulating tissue detachment. With this new geometry, the finite element analysis was applied again, and the results were compared with the control sample created initially. The data gathered in this study can be used to improve the understanding of labrum tears. Nevertheless, it is important to remember that computational analyses are approximations and that the initial data were taken from an in vitro assay.
Keywords: biomechanics, computational model, finite elements, glenohumeral joint, Bankart lesion, labrum
Procedia PDF Downloads 161
53 A Multicriteria Analysis of Energy Poverty Index: A Case Study of Non-Interconnected Zones in Colombia
Authors: Angelica Gonzalez O, Leonardo Rivera Cadavid, Diego Fernando Manotas
Abstract:
Energy poverty refers to a population that does not have access to modern energy services. In particular, an area of a country that is not connected to the national electricity grid is known as a Non-Interconnected Zone (NIZ). Access to electricity has a significant impact on the welfare and development opportunities of the population. Different studies have shown an empirical cause-and-effect relationship between multidimensional energy poverty and many health problems. Likewise, research reviewing the consequences of lacking access to electricity has concluded that there is a statistically significant relationship between energy poverty and sources of drinking water, access to clean water, risk of mosquito bites, obesity, sterilization, marital status, occupation, and residence. Therefore, extensive research has been conducted on the construction of index-based measures of energy poverty. Some of these studies introduce a Multidimensional Energy Poverty Index (MEPI), a Composite Energy Poverty Index (CEPI), and a Low Income High Costs (LIHC) indicator, among others. To this end, this study analyzes an energy poverty index using a multicriteria analysis, determining the set of feasible alternatives to be considered in the problem (for which Colombia's NIZ are used as a case study) and the set of relevant criteria for characterizing the NIZ, from which a prioritization is obtained that determines how well each alternative performs on each criterion. Additionally, this study considers the installation of microgrids (MG), a straightforward solution to this problem because an MG is a local electrical grid able to operate both grid-connected and in island mode. Drawing on those insights, this study compares an energy poverty index that considers MG installation and calculates the impact of different criteria on an energy poverty index in the NIZ.
Keywords: multicriteria, energy poverty, rural, microgrids, non-interconnected zones
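A multicriteria prioritization of the kind described can be sketched as a simple weighted sum over normalized criterion scores. Everything below — the zone names, the three criteria, their weights, and the score values — is invented for illustration; the study's actual criteria and method are not specified here.

```python
# Weighted-sum multicriteria scoring: each hypothetical Non-Interconnected
# Zone gets a score from normalized criterion values times weights, and
# zones are ranked by that score.

criteria_weights = {"population": 0.4, "grid_distance": 0.3, "solar": 0.3}

zones = {                     # normalized scores in [0, 1] per criterion
    "zone_a": {"population": 0.9, "grid_distance": 0.8, "solar": 0.6},
    "zone_b": {"population": 0.5, "grid_distance": 0.4, "solar": 0.9},
    "zone_c": {"population": 0.2, "grid_distance": 0.9, "solar": 0.5},
}

def score(zone):
    return sum(criteria_weights[c] * v for c, v in zones[zone].items())

ranking = sorted(zones, key=score, reverse=True)
print(ranking)
```

Recomputing the ranking with and without a microgrid-related criterion is one simple way to expose the impact of MG installation on the index, as the abstract proposes.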
Procedia PDF Downloads 117
52 TutorBot+: Automatic Programming Assistant with Positive Feedback Based on LLMs
Authors: Claudia Martínez-Araneda, Mariella Gutiérrez, Pedro Gómez, Diego Maldonado, Alejandra Segura, Christian Vidal-Castro
Abstract:
The purpose of this document is to showcase preliminary work on developing an EduChatbot-type tool and measuring the effects of its use, aimed at providing effective feedback to students in programming courses. This bot, hereinafter referred to as tutorBot+, was built on ChatGPT and is tasked with assisting and delivering timely positive feedback to students in the field of computer science at the Universidad Católica de Concepción. The proposed working method consists of four stages: (1) immersion in the domain of Large Language Models (LLMs), (2) development and integration of the tutorBot+ prototype, (3) experiment design, and (4) intervention. The first stage involves a literature review on the use of artificial intelligence in education and the evaluation of intelligent tutors, as well as research on types of feedback for learning and the ChatGPT domain. The second stage encompasses the development of tutorBot+, and the final stage involves a quasi-experimental study with students from the Programming and Database labs, where the learning outcome is the development of computational thinking skills, enabling the use of the tool and the measurement of its effects. The preliminary results of this work are promising: a functional chatbot prototype has been developed, in both conversational and non-conversational versions, integrated into an open-source online judge and programming contest platform. We are also exploring the possibility of generating a custom model, based on a pre-trained one, tailored to the programming domain. This includes the integration of the created tool and the design of the experiment to measure its utility.
Keywords: assessment, ChatGPT, learning strategies, LLMs, timely feedback
Procedia PDF Downloads 68
51 Visualization Tool for EEG Signal Segmentation
Authors: Sweeti, Anoop Kant Godiyal, Neha Singh, Sneh Anand, B. K. Panigrahi, Jayasree Santhosh
Abstract:
This work is about developing a tool for the visualization and segmentation of electroencephalogram (EEG) signals based on frequency-domain features. Changes in the frequency-domain characteristics are correlated with changes in the mental state of the subject under study. The proposed algorithm provides a way to represent changes in mental state using the powers of the different frequency bands, in the form of a segmented EEG signal. Many segmentation algorithms with applications in brain-computer interfaces, epilepsy and cognition studies have been suggested in the literature and used for data classification, but the proposed method focuses mainly on better presentation of the signal, which makes it a useful visualization tool for clinicians. The algorithm performs basic filtering using band-pass and notch filters in the range of 0.1-45 Hz. Advanced filtering is then performed by principal component analysis and a wavelet-transform-based de-noising method. Frequency-domain features are used for segmentation, given that the spectral power of the different frequency bands describes the mental state of the subject. Two sliding windows are further used for segmentation: one provides the time scale and the other assigns the segmentation rule. The segmented data are displayed second by second, successively, with different color codes. The segment length can be selected according to the needs of the objective. The proposed algorithm has been tested on an EEG data set obtained from the University of California, San Diego's online data repository. The proposed tool gives a better visualization of the signal in the form of segmented epochs of the desired length, representing the power spectrum variation in the data. The algorithm is designed in such a way that it takes the data points with respect to the sampling frequency for each time frame, so it can be extended to real-time visualization with the desired epoch length.
Keywords: de-noising, multi-channel data, PCA, power spectra, segmentation
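The band-power segmentation rule described above can be sketched by summing a window's spectrum into the standard EEG bands and labeling the window with the dominant band. The signal below is a synthetic 10 Hz oscillation, the sampling rate is assumed, and the integer-Hz DFT evaluation is valid here only because the window is exactly one second long.

```python
import math

# Label a one-second window by its dominant EEG band. Band powers are
# computed by direct DFT at integer-Hz bins (exact for a 1 s window).

FS = 128                      # assumed sampling frequency, Hz
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(window):
    n = len(window)
    powers = {}
    for name, (lo, hi) in BANDS.items():
        total = 0.0
        for k in range(lo, hi):          # DFT bin k corresponds to k Hz
            re = sum(x * math.cos(2 * math.pi * k * i / n)
                     for i, x in enumerate(window))
            im = sum(x * math.sin(2 * math.pi * k * i / n)
                     for i, x in enumerate(window))
            total += re * re + im * im
        powers[name] = total
    return powers

# one-second window dominated by a 10 Hz ("alpha") oscillation
window = [math.sin(2 * math.pi * 10 * i / FS) for i in range(FS)]
powers = band_powers(window)
print(max(powers, key=powers.get))
```

In the tool, each successive window would get a color code from this dominant-band label; sliding the window along the recording yields the second-by-second segmented display.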
Procedia PDF Downloads 397
50 Low-Density Lipoproteins Mediated Delivery of Paclitaxel and MRI Imaging Probes for Personalized Medicine Applications
Authors: Sahar Rakhshan, Simonetta Geninatti Crich, Diego Alberti, Rachele Stefania
Abstract:
The combination of imaging and therapeutic agents in the same smart nanoparticle is a promising option for performing minimally invasive, imaging-guided therapy. In this study, low-density lipoproteins (LDL), among the most attractive biodegradable and biocompatible nanoparticles, were used for the simultaneous delivery of paclitaxel (PTX), a hydrophobic antitumour drug, and an amphiphilic contrast agent, Gd-AAZTA-C17, to the B16-F10 melanoma cell line. These cells overexpress LDL receptors, as assessed by flow cytometry analysis. PTX- and Gd-AAZTA-C17-loaded LDLs (LDL-PTX-Gd) were prepared and characterized, and their stability was assessed over 72 h of incubation at 37 °C and compared to LDL loaded with Gd-AAZTA-C17 alone (LDL-Gd) and to LDL-PTX. The cytotoxic effect of LDL-PTX-Gd was evaluated by MTT assay. The anti-tumour drug loaded into LDLs showed significantly higher toxicity to B16-F10 cells than the commercially available formulation Paclitaxel Kabi (PTX Kabi) used in clinical applications. A high uptake of LDL-Gd into B16-F10 cells was demonstrated. As a consequence of the high cell uptake, melanoma cells showed a significantly higher cytotoxic effect when incubated in the presence of PTX-loaded particles (LDL-PTX-Gd). Furthermore, B16-F10 cells were used to perform magnetic resonance imaging: by analyzing the image signal intensity, it was possible to estimate the amount of internalized PTX indirectly from the decrease in relaxation times caused by Gd, which is proportional to its concentration. Finally, treatment of B16-F10 tumour-bearing mice with PTX-loaded LDL resulted in a marked reduction of tumour growth compared to the administration of PTX Kabi alone. In conclusion, LDLs are selectively taken up by tumour cells and can be successfully exploited for the selective delivery of paclitaxel and imaging agents.
Keywords: low-density lipoprotein, melanoma cell lines, MRI, paclitaxel, personalized medicine application, theranostic system
Procedia PDF Downloads 125
49 Evaluation of Surface Roughness Condition Using App Roadroid
Authors: Diego de Almeida Pereira
Abstract:
The roughness index of a road is considered the most important parameter describing pavement quality, as it is closely related to the comfort and safety of road users. This condition can be established by means of a functional evaluation of pavement surface deviations, measured by the International Roughness Index (IRI), an index that emerged from the international evaluation of pavements coordinated by the World Bank; the current limit value for the acceptance of roads in Brazil is 2.7 m/km. This work makes use of the e.IRI parameter, obtained with the Roadroid app for Android smartphones. This application was chosen for its practical user interaction, its own cloud data storage, and the support it gives to universities around the world. Data were collected once a month for six months. The studies began in March 2018, during the rainy season that worsens road conditions, which also provided the opportunity to follow the damage and the quality of the interventions performed. About 350 kilometers of sections of four federal highways were analyzed, BR-020, BR-040, BR-060 and BR-070, which connect the Federal District (the area where Brasília is located) and its surroundings; they were chosen for their economic and tourist importance, two being under federal administration and two under private concession. Like much of the road network, the analyzed stretches are surfaced with Hot Mix Asphalt (HMA). Thus, this research develops a contrastive discussion between the comfort and safety conditions of the roads under private concession, on which users pay a fee to the concessionaires to travel on a road that meets minimum usage requirements, and the quality of service offered on the roads under Federal Government jurisdiction.
Finally, data collected by the National Department of Transport Infrastructure (DNIT) with a laser profilometer are contrasted with data obtained by Roadroid, checking the app's applicability, practicality and cost-effectiveness in light of its limitations.Keywords: roadroid, international roughness index, Brazilian roads, pavement
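As a minimal sketch of the acceptance check described above, the snippet below compares hypothetical per-segment e.IRI readings against the 2.7 m/km limit cited for Brazilian roads; the highway names are taken from the abstract, but all numeric readings are invented for illustration.

```python
# Hypothetical e.IRI readings (m/km) per segment for two of the surveyed
# highways; the highway names come from the abstract, the values do not.
IRI_LIMIT = 2.7  # acceptance limit for Brazilian roads cited in the abstract

readings = {
    "BR-020": [2.1, 2.4, 3.0, 2.6],
    "BR-040": [1.8, 2.0, 2.2, 1.9],
}

verdicts = {}
for road, segments in readings.items():
    mean_iri = sum(segments) / len(segments)  # mean roughness of the stretch
    verdicts[road] = (round(mean_iri, 3), mean_iri <= IRI_LIMIT)

print(verdicts)
```

A real evaluation would of course aggregate over many more segments and compare app readings against the laser-profilometer reference.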
Procedia PDF Downloads 8548 Finite Element Modelling for the Development of a Planar Ultrasonic Dental Scaler for Prophylactic and Periodontal Care
Authors: Martin Hofmann, Diego Stutzer, Thomas Niederhauser, Juergen Burger
Abstract:
Dental biofilm is the main etiologic factor for caries and for periodontal and peri-implant infections. In addition to the risk of tooth loss, periodontitis is also associated with an increased risk of systemic diseases such as atherosclerotic cardiovascular disease and diabetes. For this reason, dental hygienists use ultrasonic scalers for prophylactic and periodontal care of the teeth. However, current instruments are limited in their dimensions and operating frequencies. The innovative design of a planar ultrasonic transducer introduces a new type of dental scaler. The flat titanium-based design allows the mass to be significantly reduced compared to a conventional screw-mounted Langevin transducer, resulting in a more efficient and controllable scaler. For the development of the novel device, multi-physics finite element analysis was used to simulate and optimise various design concepts. This process was supported by prototyping and electromechanical characterisation. The feasibility and potential of a planar ultrasonic transducer have already been confirmed by our current prototypes, which achieve higher performance than commercial devices. Operating at the desired resonance frequency of 28 kHz with a driving voltage of 40 Vrms results in an in-plane tip oscillation with a displacement amplitude of up to 75 μm, with less than 8% out-of-plane movement and an energy transformation factor of 1.07 μm/mA. In a further step, we will adapt the design to two additional resonance frequencies (20 and 40 kHz) to determine the most suitable mode of operation. In addition to the already integrated characterisation methods, we will evaluate the clinical efficiency of the different devices in an in vitro setup with an artificial biofilm pocket model.Keywords: ultrasonic instrumentation, ultrasonic scaling, piezoelectric transducer, finite element simulation, dental biofilm, dental calculus
Procedia PDF Downloads 12247 Failure Analysis: Solid Rocket Motor Type “Candy” - Explosion in a Static Test
Authors: Diego Romero, Fabio Rojas, J. Alejandro Urrego
Abstract:
Sounding rockets are aerospace vehicles developed in the mid-20th century, and Colombia has been involved in research aimed at innovating with this technology. The rockets are university research programs carried out with the collaboration of the local government, following a simple strategy: develop the technology while reducing the high costs associated with producing it. In this context, this document presents the failure analysis of a solid rocket motor designed with the real capability of reaching the thermosphere using a low-cost fuel. This solid rocket motor is the latest development of the Uniandes Aerospace Project (PUA for its Spanish acronym), an undergraduate and postgraduate research group at Universidad de los Andes (Bogotá, Colombia) dedicated to advancing this type of technology. The motor runs on candy-type solid fuel, a compound of potassium nitrate and sorbitol, and the investigation has allowed the production of solid motors powerful enough to reach space, which represents a unique technological advance in Latin America and an important development in experimental rocketry. The explosion of the motor, rated at 30 kN of thrust, during a static test is an important opportunity to explore and demonstrate how technology, methodologies, production and manufacturing are developed. In conclusion, this analysis explores different fields, such as design, manufacture, materials, production and first fire, with different engineering tools, with the principal objective of finding the root cause of the failure. Following this engineering analysis methodology, it was possible to design a new version of the motor incorporating the lessons learned and new manufacturing specifications. Therefore, in publishing this project, the intention is for it to serve as a reference for future research in this field and to benefit the industry.Keywords: candy propellant, candy rockets, explosion, failure analysis, static test, solid rocket motor
Procedia PDF Downloads 16146 Scoping Review of Biological Age Measurement Composed of Biomarkers
Authors: Diego Alejandro Espíndola-Fernández, Ana María Posada-Cano, Dagnóvar Aristizábal-Ocampo, Jaime Alberto Gallo-Villegas
Abstract:
Background: With the increase in life expectancy, aging has been a subject of frequent research, and multiple strategies have therefore been proposed to quantify the advance of the years based on the known physiology of human senescence. For several decades, attempts have been made to characterize these changes through the concept of biological age, which aims to integrate, in a measure of time, structural or functional variation through biomarkers, in comparison with simple chronological age. The objective of this scoping review is to deepen the updated concept of a biological age measure composed of biomarkers in the general population and to summarize recent evidence in order to identify gaps and priorities for future research. Methods: A scoping review was conducted according to the five-phase methodology developed by Arksey and O'Malley through a search of five bibliographic databases up to February 2021. Original articles with no time or language limit were included if they described a biological age composed of at least two biomarkers in people over 18 years of age. Results: 674 articles were identified, of which 105 were evaluated for eligibility and 65 were included, with information on the measurement of biological age composed of biomarkers. Articles from 1974 onwards and of 15 nationalities were found, mostly observational studies, in which clinical or paraclinical biomarkers were used, and 11 different methods for the calculation of composite biological age were described. The outcomes reported were the relationship with the same measured biomarkers, specified risk factors, comorbidities, physical or cognitive functionality, and mortality. Conclusions: The concept of biological age composed of biomarkers has evolved since the 1970s, and multiple methods for its quantification have been described through the combination of different clinical and paraclinical variables from observational studies.
Future research should consider the population characteristics and the choice of biomarkers against the proposed outcomes to improve the understanding of aging variables and direct effective strategies for a proper approach.Keywords: biological age, biological aging, aging, senescence, biomarker
Procedia PDF Downloads 18645 Microthermometry of Carbonated Rocks of the Hondita-Lomagorda Formations, the Tiger Cave Sector, Municipality of Yaguara, Colombia
Authors: Camila Lozano-Vivas, Camila Quevedo-Villamil, Ingrid Munoz-Quijano, Diego Loaiza
Abstract:
Colombia's limited oil reserves make finding new extraction fields, or enhancing the existing ones, an ever more important task; exploration projects that provide better knowledge of the oil basins are essential. The Upper Magdalena Valley basin (VSM), whose reserves are limited, has been one of the first basins for the exploration and production of hydrocarbons in Colombia. The Hondita and Lomagorda formations were deposited in the Late Cretaceous, from the middle Albian to the Coniacian, and are characterized as the hydrocarbon-generating rocks of the VSM basin petroleum system, together with the Bambucá Shale; multiple studies have therefore been conducted on them. In the oil industry, geochemical properties are used to understand the origin, migration, accumulation and alteration of hydrocarbons and, in general, the evolution of the basin containing them. One of the most important parameters for understanding this evolution is the formation temperature of the petroleum system. For this reason, a microthermometric study of fluid inclusions was carried out to recognize formation temperatures and to determine certain basic physicochemical variables, homogenization temperature, pressure, density and salinity of the fluid at the time of entrapment, providing evidence on the history of different events, in different geological environments, in the evolution of a sedimentary basin. Prior to this study, macroscopic and microscopic petrographic analyses of the samples collected in the field were performed. The fluid inclusions in the different samples analyzed have salinities ranging from 20.22% to 26.37% eq.
by weight NaCl, densities in the range of 1.05 to 1.16 g/cc, and an average homogenization temperature of 142.92°C, indicating that, at the time of entrapment, the rock was in the generation window of medium to light hydrocarbons, with brittle characteristics that would make it useful to treat these rocks as naturally fractured reservoirs.Keywords: homogenization temperature, fluid inclusions, microthermometry, salinity
Procedia PDF Downloads 14844 Spatial Data Science for Data Driven Urban Planning: The Youth Economic Discomfort Index for Rome
Authors: Iacopo Testi, Diego Pajarito, Nicoletta Roberto, Carmen Greco
Abstract:
Today, a consistent segment of the world's population lives in urban areas, and this proportion will vastly increase in the coming decades. Therefore, understanding the key trends in urbanization likely to unfold over the coming years is crucial to the implementation of sustainable urban strategies. In parallel, the daily amount of digital data produced will expand at an exponential rate over the following years. The analysis of various types of data sets and their derived applications has incredible potential across crucial sectors such as healthcare, housing, transportation, energy, and education. Nevertheless, in city development, architects and urban planners appear to rely mostly on traditional and analogue techniques of data collection. This paper investigates the prospects of the data science field, which appears to be a formidable resource for assisting city managers in identifying strategies to enhance the social, economic, and environmental sustainability of our urban areas. The collection of new layers of information would definitely enhance planners' capability to comprehend urban phenomena such as gentrification, land use definition, mobility, or critical infrastructural issues in more depth. Specifically, the research correlates economic, commercial, demographic, and housing data with the purpose of defining a youth economic discomfort index. The statistical composite index provides insights into the economic disadvantage of citizens aged between 18 and 29 years, and the results clearly display that central urban zones are more disadvantaged than peripheral ones. The experimental setup selected the city of Rome as the testing ground of the whole investigation.
The methodology applies statistical and spatial analysis to construct a composite index supporting informed data-driven decisions for urban planning.Keywords: data science, spatial analysis, composite index, Rome, urban planning, youth economic discomfort index
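As an illustrative sketch of how such a composite index can be assembled, the snippet below standardizes a few made-up indicators per zone and averages them into a single score; the indicator names, zones and values are all hypothetical, and the actual index may select and weight variables quite differently.

```python
import numpy as np

# Illustrative indicators per urban zone (rows): unemployment rate,
# rent-to-income ratio, share of NEET youth. All names and values are
# hypothetical stand-ins for the economic, housing and demographic
# layers described in the abstract.
indicators = np.array([
    [0.12, 0.45, 0.20],   # central zone A
    [0.09, 0.40, 0.15],   # central zone B
    [0.06, 0.30, 0.10],   # peripheral zone C
])

# Z-score standardization so indicators are comparable, then an
# unweighted mean as a simple composite discomfort index.
z = (indicators - indicators.mean(axis=0)) / indicators.std(axis=0)
composite = z.mean(axis=1)
ranking = composite.argsort()[::-1]  # most disadvantaged zones first
```

With these toy values the central zones score higher than the peripheral one, mirroring the pattern the abstract reports for Rome.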
Procedia PDF Downloads 13543 Photoprotective and Antigenotoxic Effects of a Mixture of Posoqueria latifolia Flower Extract and Kaempferol Against Ultraviolet B Radiation
Authors: Silvia Ximena Barrios, Diego Armando Villamizar Mantilla, Raquel Elvira Ocazionez, Elena E. Stashenko, María Pilar Vinardell, Jorge Luis Fuentes
Abstract:
Introduction: Skin overexposure to solar radiation is a serious public health concern because of its potential carcinogenicity. Therefore, preventive protection strategies using photoprotective agents are critical to counteract the harmful effects of solar radiation. Plants may be a source of photoprotective compounds that inhibit the cellular mutations involved in skin cancer initiation. This work evaluated the photoprotective and antigenotoxic effects against ultraviolet B (UVB) radiation of a mixture of Posoqueria latifolia flower extract and kaempferol (MixPoKa). Methods: The photoprotective efficacy of MixPoKa (Posoqueria latifolia flower extract 250 μg/ml and kaempferol 349.5 μM) was evaluated using in vitro indices such as the sun protection factor SPFᵢₙ ᵥᵢₜᵣₒ and the critical wavelength (λc). The MixPoKa photostability (Eff) at human minimal erythema doses (MED), according to the Fitzpatrick skin scale, was also estimated. Cytotoxicity and genotoxicity/antigenotoxicity were studied in MRC5 human fibroblasts using the trypan blue exclusion and comet assays, respectively. The kinetics of repair of the genetic damage post-irradiation, in the presence and absence of MixPoKa, was also evaluated. Results: The MixPoKa UV absorbance spectrum was high across the spectral bands between 200 and 400 nm. The UVB photoprotection efficacy of MixPoKa was high (SPFᵢₙ ᵥᵢₜᵣₒ = 25.70 ± 0.06), showed a wide photoprotection spectrum (λc = 380 ± 0), and proved photostable (Eff = 92.3–100.0%). MixPoKa was neither cytotoxic nor genotoxic in MRC5 human fibroblasts but presented a significant antigenotoxic effect against UVB radiation. Additionally, MixPoKa stimulated DNA repair post-irradiation. The potential of this phytochemical mixture as a sunscreen ingredient is discussed. Conclusion: MixPoKa showed a significant antigenotoxic effect against UVB radiation and stimulated DNA repair after irradiation.
MixPoKa could be used as an ingredient in a sunscreen cream.Keywords: flower extract, photoprotection, antigenotoxicity, cytotoxicity, genotoxicity
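For context, an in vitro SPF of the kind reported above is commonly estimated from the absorbance spectrum with the Mansur equation, SPF = CF · Σ EE(λ)·I(λ)·Abs(λ). The sketch below uses the standard EE×I weighting factors, but the absorbance readings are purely illustrative placeholders, not the MixPoKa spectrum.

```python
# Mansur equation: SPF = CF * sum(EE(lambda) * I(lambda) * Abs(lambda)).
# EE*I are the standard normalized erythemal-effect weights (290-320 nm);
# the absorbance values below are invented for illustration only.
EE_I = {290: 0.0150, 295: 0.0817, 300: 0.2874, 305: 0.3278,
        310: 0.1864, 315: 0.0839, 320: 0.0180}
CF = 10  # correction factor

absorbance = {290: 2.1, 295: 2.1, 300: 2.0, 305: 2.0,
              310: 1.9, 315: 1.9, 320: 1.8}  # hypothetical spectrum

spf = CF * sum(EE_I[wl] * absorbance[wl] for wl in EE_I)
```

The critical wavelength λc reported in the abstract is obtained separately, as the wavelength below which 90% of the integrated absorbance between 290 and 400 nm accumulates.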
Procedia PDF Downloads 8742 Enhancing Sell-In and Sell-Out Forecasting Using Ensemble Machine Learning Method
Authors: Vishal Das, Tianyi Mao, Zhicheng Geng, Carmen Flores, Diego Pelloso, Fang Wang
Abstract:
Accurate sell-in and sell-out forecasting is a ubiquitous problem in the retail industry and an important element of any demand planning activity. As a global food and beverage company, Nestlé has hundreds of products in each geographical location in which it operates. Each product has its own sell-in and sell-out time series, which are forecasted on weekly and monthly scales for demand and financial planning. To address this challenge, Nestlé Chile, in collaboration with the Amazon Machine Learning Solutions Lab, has developed an in-house solution that uses machine learning models for forecasting. Similar products are combined so that there is one model per product category. In this way, the models learn from a larger set of data, and there are fewer models to maintain. The solution is scalable to all product categories and is flexible enough to include any new product, or eliminate any existing product, in a product category as required. We show how the machine learning development environment on Amazon Web Services (AWS) can be used to explore a set of forecasting models and create business intelligence dashboards that work with the existing demand planning tools at Nestlé. We explored recent deep neural networks (DNN), which show promising results for a variety of time series forecasting problems. Specifically, we used a DeepAR autoregressive model, which can group similar time series together and provide robust predictions. To further enhance the accuracy of the predictions and incorporate domain-specific knowledge, we designed an ensemble approach combining DeepAR with an XGBoost regression model. As part of the ensemble approach, we interlinked the sell-out and sell-in information to ensure that a future sell-out influences the current sell-in predictions. Our approach outperforms the benchmark statistical models by more than 50%.
The machine learning (ML) pipeline implemented in the cloud is currently being extended to other product categories and is being adopted by other geomarkets.Keywords: sell-in and sell-out forecasting, demand planning, DeepAR, retail, ensemble machine learning, time-series
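The ensemble combination and sell-out/sell-in interlinking steps can be sketched as follows. The forecast arrays, the 0.6/0.4 weighting and the one-week lag are invented placeholders, since the abstract does not publish actual model outputs or weights.

```python
import numpy as np

# Hypothetical weekly forecasts for one product category; in the system
# described, these would come from DeepAR and XGBoost respectively.
deepar_forecast  = np.array([120.0, 118.0, 125.0, 130.0])
xgboost_forecast = np.array([115.0, 121.0, 123.0, 128.0])

# Simple weighted ensemble; the weight would be tuned on a validation window.
w = 0.6
ensemble = w * deepar_forecast + (1 - w) * xgboost_forecast

# Interlinking sell-out and sell-in: use the ensemble sell-out forecast,
# shifted by an assumed one-week replenishment lag, as a prior/feature
# for the sell-in prediction.
sell_in_prior = np.roll(ensemble, 1)
sell_in_prior[0] = ensemble[0]  # no earlier forecast for the first week
```

In production, the DeepAR member would be trained per product category on grouped time series, with the XGBoost member carrying the domain-specific tabular features.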
Procedia PDF Downloads 27341 Development of PVA/polypyrrole Scaffolds by Supercritical CO₂ for Its Application in Biomedicine
Authors: Antonio Montes, Antonio Cozar, Clara Pereyra, Diego Valor, Enrique Martinez de la Ossa
Abstract:
Tissues and organs can be damaged by trauma, congenital illnesses, or cancer, and traditional therapeutic alternatives, such as surgery, cannot usually repair the damaged tissues completely. Tissue engineering allows the regeneration of the patient's tissues, reducing the problems caused by traditional methods. Scaffolds, polymeric structures with interconnected porosity, can promote the proliferation and adhesion of the patient's cells in the damaged area. Furthermore, by impregnating the scaffold with beneficial active substances, tissue regeneration can be induced through a drug delivery process. The objective of this work is the fabrication of a PVA scaffold coated with gallic acid and polypyrrole through a one-step foaming and impregnation process using the SSI (Supercritical Solvent Impregnation) technique. In this technique, supercritical CO₂ penetrates the polymer chains, producing the plasticization of the polymer. In the depressurization step, CO₂ cell nucleation and growth take place, leading to an interconnected porous structure in the polymer. The foaming process using supercritical CO₂ as solvent and expansion agent presents advantages over traditional scaffold fabrication methods, such as the polymer's high solubility in the solvent and the possibility of carrying out the process at low temperature, avoiding the inactivation of the active substance. In this sense, supercritical CO₂ avoids the use of organic solvents and reduces solvent residues in the final product. Moreover, this process does not require the long processing times that could cause stratification of the substance inside the scaffold and reduce the therapeutic efficiency of the formulation.
An experimental design was carried out to optimize the SSI operating conditions, together with a study of the morphological characteristics relevant to the scaffold's use in tissue engineering, such as porosity, conductivity and the release profile of the active substance. It was shown that the obtained scaffolds are partially porous, electrically conductive, and able to release gallic acid over the long term.Keywords: scaffold, foaming, supercritical, PVA, polypyrrole, gallic acid
Procedia PDF Downloads 18140 A Risk Assessment Tool for the Contamination of Aflatoxins on Dried Figs Based on Machine Learning Algorithms
Authors: Kottaridi Klimentia, Demopoulos Vasilis, Sidiropoulos Anastasios, Ihara Diego, Nikolaidis Vasileios, Antonopoulos Dimitrios
Abstract:
Aflatoxins are highly poisonous and carcinogenic compounds produced by species of the genus Aspergillus spp. that can infect a variety of agricultural foods, including dried figs. Biological and environmental factors, such as population, pathogenicity, and aflatoxinogenic capacity of the strains, topography, soil, and climate parameters of the fig orchards, are believed to have a strong effect on aflatoxin levels. Existing methods for aflatoxin detection and measurement, such as high performance liquid chromatography (HPLC), and enzyme-linked immunosorbent assay (ELISA), can provide accurate results, but the procedures are usually time-consuming, sample-destructive, and expensive. Predicting aflatoxin levels prior to crop harvest is useful for minimizing the health and financial impact of a contaminated crop. Consequently, there is interest in developing a tool that predicts aflatoxin levels based on topography and soil analysis data of fig orchards. This paper describes the development of a risk assessment tool for the contamination of aflatoxin on dried figs, based on the location and altitude of the fig orchards, the population of the fungus Aspergillus spp. in the soil, and soil parameters such as pH, saturation percentage (SP), electrical conductivity (EC), organic matter, particle size analysis (sand, silt, clay), the concentration of the exchangeable cations (Ca, Mg, K, Na), extractable P, and trace of elements (B, Fe, Mn, Zn and Cu), by employing machine learning methods. 
In particular, our proposed method integrates three machine learning techniques, i.e., dimensionality reduction on the original dataset (principal component analysis), metric learning (Mahalanobis metric for clustering), and k-nearest neighbors learning algorithm (KNN), into an enhanced model, with mean performance equal to 85% by terms of the Pearson correlation coefficient (PCC) between observed and predicted values.Keywords: aflatoxins, Aspergillus spp., dried figs, k-nearest neighbors, machine learning, prediction
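A simplified sketch of such a pipeline (standardization, PCA dimensionality reduction, then KNN regression) is shown below on synthetic stand-in data. Note that the paper's intermediate Mahalanobis metric-learning stage is omitted here, and all feature values are randomly generated rather than real orchard measurements.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the orchard dataset: rows are orchards, columns
# mimic soil/topography features (pH, EC, sand %, altitude, ...); the
# target mimics an aflatoxin level. Values are random, for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 15))
y = X[:, 0] * 2.0 + X[:, 3] + rng.normal(scale=0.1, size=60)

# Pipeline: standardize -> PCA -> KNN regression (plain Euclidean metric
# here, as a simplification of the paper's learned Mahalanobis metric).
model = make_pipeline(StandardScaler(), PCA(n_components=5),
                      KNeighborsRegressor(n_neighbors=3))
model.fit(X[:50], y[:50])
pred = model.predict(X[50:])
```

With real data, the reported 85% Pearson correlation would be estimated by comparing `pred` against held-out measured aflatoxin levels.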
Procedia PDF Downloads 18439 Determinants of Post-Psychotic Depression in Schizophrenia Patients in ACSH and Mekellle Hospital Tigray, Ethiopia, 2019
Authors: Ashenafi Ayele, Shewit Haftu, Tesfalem Araya
Abstract:
Background: “Post-psychotic depression”, “post-schizophrenic depression”, and “secondary depression” have been used to describe the occurrence of depressive symptoms during the chronic phase of schizophrenia. Post-psychotic depression is the most common cause of death due to suicide in schizophrenia patients. The overall lifetime risk for patients with schizophrenia is 50% for suicide attempts and 9-13% for completed suicide, and it is also associated with poor prognosis and poor quality of life. Objective: To assess the determinants of post-psychotic depression in schizophrenia patients at ACSH and Mekelle General Hospital, Tigray, Ethiopia, 2019. Methods: An institution-based unmatched case-control study was conducted among 69 cases and 138 controls, with a case-to-control ratio of 1:2. The sample size was calculated using Epi Info 3.1 to assess the determinant factors of post-psychotic depression in schizophrenia patients. The cases were schizophrenia patients who had been diagnosed at least one year earlier and had been stable for two months; the controls were other patients diagnosed with schizophrenia. Study subjects were selected using a consecutive sampling technique. The Calgary Depression Scale for Schizophrenia self-administered questionnaire was used. Before the interview, the client's capacity to give the intended information was assessed using the University of California, San Diego Brief Assessment of Capacity to Consent (UBACC) scale. Bivariate and multiple logistic regression analyses were performed on the independent and dependent variables. Significant independent predictors were declared at a 95% confidence interval and a P-value of less than 0.05.
Result: Females were more likely to be affected by post-psychotic depression (AOR=2.01, 95%CI: 1.003-4.012, P=0.049). Patients who had a mild form of the positive symptoms of schizophrenia were affected by post-psychotic depression (AOR=4.05, 95%CI: 1.888-8.7.8, P=0.001). Patients who had a minimal form of the negative symptoms of schizophrenia were affected by post-psychotic depression (AOR=4.23, 95%CI: 1.081-17.092, P=0.038). Conclusion: In this study, sex (female) and the presence of positive and negative symptoms of schizophrenia were significantly associated with post-psychotic depression. It is recommended that post-psychotic depression be assessed in every schizophrenia patient to decrease the severity of illness and improve patients' quality of life.Keywords: determinants, post-psychotic depression, Mekelle city
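The adjusted odds ratios above come from multiple logistic regression. As a simplified illustration of the underlying quantity, a crude (unadjusted) odds ratio can be computed from a 2×2 case-control table; the counts below are hypothetical and are not the study's data.

```python
# Hypothetical 2x2 case-control counts (illustrative only):
# "exposure" = female sex; "cases" = patients with post-psychotic depression.
cases_exposed, cases_unexposed = 40, 29
controls_exposed, controls_unexposed = 55, 83

# Crude odds ratio = (a/b) / (c/d). The study's AORs additionally adjust
# for covariates via multiple logistic regression, which this omits.
odds_ratio = (cases_exposed / cases_unexposed) / (controls_exposed / controls_unexposed)
```

An odds ratio above 1 indicates the exposure is more common among cases; the confidence interval and P-value reported in the abstract quantify whether that excess is statistically significant.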
Procedia PDF Downloads 12238 Analysis of Genic Expression of Honey Bees Exposed to Sublethal Pesticides Doses Using the Transcriptome Technique
Authors: Ricardo de Oliveira Orsi, Aline Astolfi, Daniel Diego Mendes, Isabella Cristina de Castro Lippi, Jaine da Luz Scheffer, Yan Souza Lima, Juliana Lunardi, Giovanna do Padro Ribeiro, Samir Moura Kadri
Abstract:
The NECTAR Brazilian group (Center of Education, Science, and Technology in Rational Beekeeping) conducted studies on the effects of pesticides on honey bees using transcriptome sequencing (RNA-Seq) for gene expression analyses. We analyzed the effects of pyraclostrobin and fipronil on 21-day-old (forager) honey bees under laboratory conditions. For this, frames containing sealed brood were removed from the beehives and kept in an incubator (32°C and 75% humidity) until the bees emerged. Newly emerged workers were marked on the pronotum with a non-toxic pen and reintroduced into their original hives. After 21 days, 120 marked bees were collected with entomological forceps, immediately placed in Petri dishes perforated to ensure ventilation, and kept fasting for 3 hours. These honey bees were exposed to food contaminated or not with a sublethal dose of pyraclostrobin (850 ppb/bee) or fipronil (2.5 ppb/bee). After four hours of exposure, 15 bees from each treatment were taken for transcriptome analysis. Total RNA was extracted from pools of brains (3 brains per pool) using the TRIzol® reagent protocol according to the manufacturer's instructions. cDNA libraries were constructed, and the FastQC program was used to check adapter content and assess the quality of raw reads. Differential expression analysis was performed with the DESeq2 package. Genes with an adjusted p-value of less than 0.05 were considered significantly differentially expressed. Regarding pyraclostrobin, alterations were observed in the expression pattern of 17 genes related to the antioxidant system, cellular respiration, glucose metabolism, and regulation of juvenile hormone and the hormone insulin. Fipronil altered 10 genes related to the digestive system, exoskeleton composition, vitamin E transport, and the antioxidant system.
The results indicate the necessity of studies using sublethal doses to evaluate pesticide use and risks on crops and their effects on honey bees.Keywords: beekeeping, honey bees, pesticides, transcriptome
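The "adjusted p-value < 0.05" criterion used with DESeq2 corresponds to the Benjamini–Hochberg false discovery rate correction. A minimal sketch of that adjustment on hypothetical per-gene p-values (not the study's data) is shown below.

```python
# Benjamini-Hochberg step-up adjustment, as DESeq2 applies by default.
def bh_adjust(pvals):
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices by ascending p
    adjusted = [0.0] * m
    prev = 1.0
    # Walk from the largest p-value down, enforcing monotonicity.
    for rank in range(m, 0, -1):
        i = order[rank - 1]
        prev = min(prev, pvals[i] * m / rank)
        adjusted[i] = prev
    return adjusted

# Hypothetical raw p-values for six genes, for illustration only.
pvals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.74]
padj = bh_adjust(pvals)
significant = [p < 0.05 for p in padj]
```

Note how the gene with raw p = 0.039 is no longer significant after adjustment, which is exactly why the adjusted rather than raw p-values are thresholded.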
Procedia PDF Downloads 12537 Use of Didactic Bibliographic Resources to Improve the Teaching and Learning Processes of Animal Reproduction in Veterinary Science
Authors: Yasser Y. Lenis, Amy Jo Montgomery, Diego F. Carrillo-Gonzalez
Abstract:
Introduction: The use of didactic instruments in different learning environments plays a pivotal role in enhancing the level of knowledge of veterinary science students. Direct instruction in basic animal reproduction concepts allows students enrolled in veterinary medicine programs to elucidate the biological and molecular mechanisms that perpetuate animal species in an ecosystem. Therefore, universities must implement didactic strategies that facilitate the teaching and learning processes for students and, in turn, enrich learning environments. Objective: To evaluate the effect of the use of a didactic textbook on the level of theoretical knowledge of embryo-maternal recognition among veterinary medicine students. Methods: The participants (n=24) were divided into two experimental groups: control (Ctrl) and treatment (Treat). Both groups received 4 hours of theoretical training on the basic concepts of bovine embryo-maternal recognition. The Treat group, however, was also exposed to a guided lecture and a play-to-learn activity from a didactic textbook on cow reproduction. A pre-test and a post-test were applied to assess the participants' prior and subsequent knowledge. Descriptive statistics were applied to identify the success rates for each of the tests. Afterwards, a repeated measures model was applied in which the effect of the intervention was considered. Results: No significant difference (p>0.05) was observed in the number of right answers between the Ctrl (54.2%±12.7) and Treat (40.8%±16.8) groups in the pre-test. There was no difference (p>0.05) when comparing the number of right answers in the Ctrl pre-test (54.2%±12.7) and post-test (60.8%±18.8). However, the Treat group showed a significant (p<0.05) difference in the number of right answers when comparing the pre-test (40.8%±16.8) and post-test (71.7%±14.7).
Finally, after the theoretical training and the didactic activity in the Treat group, an increase of 10.9% (p<0.05) in the number of right answers was found compared with the Ctrl group. Conclusion: The use of didactic tools that include guided lectures and play-to-learn activities from a didactic textbook enhances the level of knowledge in an animal reproduction course for veterinary medicine students.Keywords: animal reproduction, pedagogic, level of knowledge, learning environment
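The pre/post comparison underlying these results can be illustrated with a paired t statistic on made-up scores. The numbers below are hypothetical, merely shaped to echo the reported treatment-group means (~40.8% pre, ~71.7% post); the study itself used a repeated measures model rather than this simple paired test.

```python
import math

# Illustrative pre/post test scores (% correct) for a treatment group;
# invented values, not the study's raw data.
pre  = [38, 45, 30, 52, 41, 39]
post = [70, 75, 62, 80, 71, 72]

diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
mean_gain = sum(diffs) / n

# Paired t statistic: mean within-subject difference over its standard error.
var = sum((d - mean_gain) ** 2 for d in diffs) / (n - 1)
t_stat = mean_gain / math.sqrt(var / n)
```

A t statistic well above the critical value for n-1 degrees of freedom corresponds to the significant pre/post difference (p<0.05) the abstract reports for the treatment group.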
Procedia PDF Downloads 6336 Study of Mixing Conditions for Different Endothelial Dysfunction in Arteriosclerosis
Authors: Sara Segura, Diego Nuñez, Miryam Villamil
Abstract:
In this work, we studied the microscale interaction of foreign substances with blood inside an artificial transparent artery system that represents medium and small muscular arteries. This artery system had channels ranging from 75 μm to 930 μm and was fabricated using glass and transparent polymer blends such as phenylbis(2,4,6-trimethylbenzoyl)phosphine oxide, poly(ethylene glycol) and PDMS, so that it could be monitored in real time. The setup comprised a computer-controlled precision micropump and a high-resolution optical microscope capable of tracking fluids at fast capture rates. Observation and analysis were performed using real-time software that reconstructs the fluid dynamics, determining flow velocity, injection dependency, turbulence and rheology. All experiments were carried out with fully computer-controlled equipment. Interactions between substances such as water, serum (0.9% sodium chloride and electrolyte at a ratio of 4 ppm) and blood cells were studied at the microscale, at resolutions as fine as 400 nm, and the analysis was performed using frame-by-frame observation and HD video capture. These observations allow us to understand the fluid and mixing behavior of the substance of interest in the bloodstream and shed light on the use of implantable devices for drug delivery in arteries with different endothelial dysfunctions. Several substances were tested using the artificial artery system. Initially, Milli-Q water was used as a control substance to study the basic fluid dynamics of the artificial artery system. Serum and other low-viscosity substances were then pumped into the system in the presence of other liquids to study mixing profiles and behaviors. Finally, mammal blood was used for the final test while serum was injected. Different flow conditions, pumping rates, and time rates were evaluated to determine the optimal mixing conditions.
Our results suggest the use of very finely controlled microinjection, at an approximate rate of 135,000 μm³/s, to obtain better mixing profiles in the administration of drugs inside arteries.Keywords: artificial artery, drug delivery, microfluidics dynamics, arteriosclerosis
Procedia PDF Downloads 29435 The Relationship between the Content of Inner Human Experience and Well-Being: An Experience Sampling Study
Authors: Xinqi Guo, Karen R. Dobkins
Abstract:
Background and Objectives: Humans are probably the only animals whose minds are constantly filled with thoughts, feelings and emotions. Previous studies have investigated the human mind along different dimensions, including the proportion of time spent not being present, its representative format, its personal relevance, its temporal locus, and its affective valence. The current study aims at characterizing the human mind by employing the Experience Sampling Method (ESM), a self-report research procedure for studying daily experience. This study emphasizes answering the following questions: 1) How do the contents of inner experience vary across demographics? 2) Are certain types of inner experience correlated with levels of mindfulness and mental well-being (e.g., are people who spend more time being present happier, and are more mindful people more at-present)? 3) Does being prompted to report one's inner experience increase mindfulness and mental well-being? Methods: Participants were recruited from the subject pool of UC San Diego or from social media. They began by filling out two questionnaires, 1) the Five Facet Mindfulness Questionnaire-Short Form and 2) the Warwick-Edinburgh Mental Well-being Scale, plus demographic information. They then participated in the ESM part by responding to prompts containing questions about their real-time inner experience: whether they were 'at-present', 'mind-wandering', or 'zoned-out'. The temporal locus, clarity, affective valence, and personal importance of the thought they had the moment before the prompt were also assessed. A mobile app, 'RealLife Exp', randomly delivered these prompts 3 times/day for 6 days during waking time. After the 6 days, participants completed questionnaires (1) and (2) again. Their changes in score were compared to those of a control group who did not participate in the ESM procedure (yet completed (1) and (2) one week apart). Results: Results are currently preliminary as we continue to collect data.
So far, there is a trend for participants to be present, mind-wandering and zoned-out about 53%, 23% and 24% of waking time, respectively. Participants' thoughts are rated as clearer and more neutral when they are present vs. mind-wandering. Mind-wandering thoughts are 66% about the past, with 80% consisting of inner speech. Discussion and Conclusion: This study investigated the subjective account of the human mind with a tool of high ecological validity, broadening the understanding of the relationship between the contents of mind and well-being.Keywords: experience sampling method, meta-memory, mindfulness, mind-wandering
Procedia PDF Downloads 132