Search results for: computer application
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10260

210 Attention Treatment for People With Aphasia: Language-Specific vs. Domain-General Neurofeedback

Authors: Yael Neumann

Abstract:

Attention deficits are common in people with aphasia (PWA). Two treatment approaches address these deficits: domain-general methods like Play Attention, which focus on cognitive functioning, and domain-specific methods like Language-Specific Attention Treatment (L-SAT), which use linguistically based tasks. Research indicates that L-SAT can improve both attentional deficits and functional language skills, while Play Attention has shown success in enhancing attentional capabilities among school-aged children with attention issues compared to standard cognitive training. This study employed a randomized controlled cross-over single-subject design to evaluate the effectiveness of these two attention treatments over 25 weeks. Four PWA participated, undergoing a battery of eight standardized tests measuring language and cognitive skills. The treatments were counterbalanced. Play Attention used EEG sensors to detect brainwaves, enabling participants to manipulate items in a computer game while learning to suppress theta activity and increase beta activity. An algorithm tracked changes in the theta-to-beta ratio, allowing points to be earned during the games. L-SAT, on the other hand, involved hierarchical language tasks that increased in complexity, requiring greater attention from participants. Results showed that for language tests, Participant 1 (moderate aphasia) aligned with existing literature, showing L-SAT was more effective than Play Attention. However, Participants 2 (very severe), 3 (mild), and 4 (mild) did not conform to this pattern; both treatments yielded similar outcomes. This may be due to the extremes of aphasia severity: the very severe participant faced significant overall deficits, making both approaches equally challenging, while the mild participants performed well initially, leaving limited room for improvement. In attention tests, Participants 1 and 4 exhibited results consistent with prior research, indicating Play Attention was superior to L-SAT.
Participant 2, however, showed no significant improvement with either program, although L-SAT had a slight edge on the Visual Elevator task, which measures switching and mental flexibility. This advantage was not sustained at the one-month follow-up, likely due to the participant's struggles with complex attention tasks. Participant 3's results similarly did not align with prior studies, revealing no difference between the two treatments, possibly due to the challenging nature of the attention measures used. Regarding participation and ecological tests, all participants showed similar mild improvements with both treatments. This limited progress could stem from the short study duration, with only five weeks allocated for each treatment, which may not have been enough time to achieve meaningful changes affecting life participation. In conclusion, participants' performance appeared influenced by their level of aphasia severity. The moderate PWA's results were most aligned with existing literature, indicating better attention improvement from the domain-general approach (Play Attention) and better language improvement from the domain-specific approach (L-SAT).
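The theta-to-beta tracking described above can be illustrated with a minimal sketch (not the study's actual Play Attention algorithm): band powers are estimated from an EEG segment with Welch's method, and their ratio is the quantity a neurofeedback loop would reward the participant for reducing. The sampling rate, band edges, and synthetic test signal below are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

def theta_beta_ratio(eeg, fs, theta=(4.0, 8.0), beta=(13.0, 30.0)):
    """Estimate the theta-to-beta power ratio of a single EEG channel."""
    f, psd = welch(eeg, fs=fs, nperseg=min(len(eeg), 2 * fs))

    def band_power(lo, hi):
        # Relative band power: sum of PSD bins inside the band.
        mask = (f >= lo) & (f <= hi)
        return psd[mask].sum()

    return band_power(*theta) / band_power(*beta)

# Synthetic check: a beta-dominated (20 Hz) signal should give a ratio well below 1.
fs = 256
t = np.arange(0, 10, 1 / fs)
beta_signal = np.sin(2 * np.pi * 20 * t) + 0.1 * np.random.randn(len(t))
print(theta_beta_ratio(beta_signal, fs))
```

In a neurofeedback loop, this ratio would be recomputed on a sliding window and mapped to the in-game score.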

Keywords: attention, language, cognitive rehabilitation, neurofeedback

Procedia PDF Downloads 20
209 Decision Making on Smart Energy Grid Development for Availability and Security of Supply Achievement Using Reliability Merits

Authors: F. Iberraken, R. Medjoudj, D. Aissani

Abstract:

The development of the smart grid concept is built around two separate definitions, namely the European one, oriented towards sustainable development, and the American one, oriented towards reliability and security of supply. In this paper, we have investigated reliability merits enabling decision-makers to provide a high quality of service. The work is based on system behavior, using interruption and failure modeling and forecasting on the one hand, and on the contribution of information and communication technologies (ICT) to mitigate catastrophic events such as blackouts on the other. It was found that this concept has been adopted by developing and emerging countries for short- and medium-term planning, followed by the sustainability concept for long-term planning. This work has highlighted the reliability merits, namely benefits, opportunities, costs, and risks (BOCR), considered as consistent units for measuring power customer satisfaction. From the decision-making point of view, we have used the analytic hierarchy process (AHP) to achieve customer satisfaction, based on the reliability merits and the contribution of the available energy resources. Nowadays, fossil and nuclear resources dominate energy production, but great advances have already been made toward cleaner ones. It was demonstrated that these resources are not only environmentally but also economically and socially sustainable. The paper is organized as follows: Section one is devoted to the introduction, where an implicit review of smart grid development is given for the two main concepts (for the USA and European countries). The AHP method and the BOCR development of reliability merits against power customer satisfaction are presented in section two. The benefits were expressed by the high level of availability, the applicability of maintenance actions, and power quality.
Opportunities were highlighted by the implementation of ICT in data transfer and processing, the mastering of peak demand control, the decentralization of production, and power system management under fault conditions. Costs were evaluated using cost-benefit analysis, including the investment expenditures in network security (the grid having become a target for hackers and terrorists) and the profits of operating as decentralized systems with a reduced energy not supplied, thanks to the availability of storage units fed from renewable resources and to power line communication (CPL), enabling the power dispatcher to manage load shedding optimally. For risks, we have raised the question of citizens' willingness to contribute financially to the system and to the utility restructuring: what is their degree of agreement with the guarantees proposed by the managers regarding information integrity? From a technical point of view, do they have sufficient information and knowledge to operate a smart home and a smart system? In section three, the AHP method is applied to achieve power customer satisfaction with the main energy resources as alternatives, using knowledge from a country that is well advanced in its energy transition. Results and discussion are given in section four. We conclude that the choice of a given resource depends on the attitude of the decision maker (prudent, optimistic, or pessimistic) and that the status quo is neither sustainable nor satisfactory.
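As a minimal sketch of the AHP step (with a hypothetical 3×3 pairwise comparison matrix, not the paper's actual judgments), priorities can be read from the principal eigenvector of the comparison matrix and checked with Saaty's consistency ratio:

```python
import numpy as np

# Saaty's random consistency index for matrix sizes 1..9.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_priorities(A):
    """Principal-eigenvector priorities and consistency ratio of a pairwise matrix."""
    A = np.asarray(A, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)              # principal eigenvalue (lambda_max)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                             # normalize priorities to sum to 1
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)     # consistency index
    cr = ci / RI[n] if RI[n] > 0 else 0.0    # consistency ratio (< 0.1 is acceptable)
    return w, cr

# Hypothetical comparison of three criteria (e.g. benefits vs. opportunities vs. costs).
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w, cr = ahp_priorities(A)
print(w, cr)
```

The same machinery extends to the full BOCR hierarchy by weighting alternative-level priority vectors with the criterion-level ones.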

Keywords: reliability, AHP, renewable energy resources, smart grids

Procedia PDF Downloads 443
208 Poly (3,4-Ethylenedioxythiophene) Prepared by Vapor Phase Polymerization for Stimuli-Responsive Ion-Exchange Drug Delivery

Authors: M. Naveed Yasin, Robert Brooke, Andrew Chan, Geoffrey I. N. Waterhouse, Drew Evans, Darren Svirskis, Ilva D. Rupenthal

Abstract:

Poly(3,4-ethylenedioxythiophene) (PEDOT) is a robust conducting polymer (CP) exhibiting high conductivity and environmental stability. It can be synthesized by chemical, electrochemical, or vapour phase polymerization (VPP). Dexamethasone sodium phosphate (dexP) is an anionic drug molecule which has previously been loaded onto PEDOT as a dopant via electrochemical polymerisation; however, this technique requires conductive surfaces from which polymerization is initiated. On the other hand, VPP produces highly organized biocompatible CP structures, while polymerization can be achieved onto a range of surfaces with a relatively straightforward scale-up process. Following VPP of PEDOT, dexP can be loaded and subsequently released via ion-exchange. This study aimed at preparing and characterising both non-porous and porous VPP PEDOT structures, including examining drug loading and release via ion-exchange. Porous PEDOT structures were prepared by first depositing a sacrificial polystyrene (PS) colloidal template on a substrate, heat curing this deposition, and then spin coating it with the oxidant solution (iron tosylate) at 1500 rpm for 20 s. VPP of both porous and non-porous PEDOT was achieved by exposing the samples to monomer vapours in a vacuum oven at 40 mbar and 40 °C for 3 h. Non-porous structures were prepared similarly on the same substrate but without any sacrificial template. Surface morphology, composition, and electrochemical behaviour were then characterized by atomic force microscopy (AFM), scanning electron microscopy (SEM), X-ray photoelectron spectroscopy (XPS), and cyclic voltammetry (CV), respectively. Drug loading was achieved by 50 CV cycles in a 0.1 M dexP aqueous solution. For drug release, each sample was exposed to 20 mL of phosphate buffered saline (PBS) placed in a water bath operating at 37 °C and 100 rpm. The film was stimulated (continuous pulses of ±1 V at 0.5 Hz for 17 min) while immersed in PBS.
Samples were collected at 1, 2, 6, 23, 24, 26, and 27 h and were analysed for dexP by high performance liquid chromatography (HPLC, Agilent 1200 series). AFM and SEM revealed the honeycomb nature of the prepared porous structures. XPS data showed the elemental composition of the dexP-loaded film surface, which agreed well with that of PEDOT and also showed that one dexP molecule was present per approximately three EDOT monomer units. The reproducible electroactive nature was shown by several cycles of reduction and oxidation via CV. Drug release experiments confirmed successful drug loading via ion-exchange, with stimulated porous and non-porous structures exhibiting a proof-of-concept burst release upon application of an electrical stimulus. A similar drug release pattern was observed for porous and non-porous structures without any statistically significant difference, possibly due to the thin nature of these structures. To our knowledge, this is the first report to explore the potential of VPP-prepared PEDOT for stimuli-responsive drug delivery via ion-exchange. The produced porous structures were well ordered and highly porous, as indicated by AFM and SEM, and exhibited good electroactivity, as shown by CV. Future work will investigate porous structures as nano-reservoirs to increase drug loading while sealing these structures to minimize spontaneous drug leakage.
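A release profile sampled as above is commonly reported as cumulative percent released, correcting for the drug withdrawn with earlier aliquots. The sketch below uses the sampling times and 20 mL vessel from the text, but the measured concentrations, aliquot volume, and loaded dose are hypothetical placeholders, not data from the study:

```python
# Cumulative drug release (%) from sampled aliquot concentrations, correcting
# for drug removed with previous samples. All numeric inputs are illustrative.
def cumulative_release(concs_ug_ml, vessel_ml=20.0, sample_ml=1.0, loaded_ug=500.0):
    released = []
    removed = 0.0  # micrograms of drug withdrawn with earlier aliquots
    for c in concs_ug_ml:
        total = c * vessel_ml + removed          # drug released so far
        released.append(100.0 * total / loaded_ug)
        removed += c * sample_ml                 # account for this aliquot
    return released

times_h = [1, 2, 6, 23, 24, 26, 27]              # sampling times from the text
concs = [2.0, 3.5, 5.0, 6.0, 9.0, 10.5, 11.0]    # ug/mL, hypothetical HPLC readings
print(dict(zip(times_h, cumulative_release(concs))))
```

With monotonically rising concentrations the corrected profile is strictly increasing, as expected for a release curve.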

Keywords: PEDOT for ion-exchange drug delivery, stimuli-responsive drug delivery, template based porous PEDOT structures, vapour phase polymerization of PEDOT

Procedia PDF Downloads 231
207 Teachers' Engagement in Teaching: Exploring Australian Teachers' Attribute Constructs of Resilience, Adaptability, Commitment, and Self/Collective Efficacy Beliefs

Authors: Lynn Sheridan, Dennis Alonzo, Hoa Nguyen, Andy Gao, Tracy Durksen

Abstract:

Disruptions to teaching (e.g., COVID-related) have increased work demands for teachers. There is an opportunity for research to explore evidence-informed steps to support teachers. Collective evidence indicates that teachers' personal attributes (e.g., self-efficacy beliefs) in the workplace promote success in teaching and support teacher engagement. Teacher engagement plays a role in students' learning and teachers' effectiveness. Engaged teachers are better at overcoming work-related stress and burnout and are more likely to take on active roles. Teachers' commitment is influenced by a host of personal factors (e.g., teacher well-being) and environmental factors (e.g., job stresses). The job demands-resources model provided a conceptual basis for examining how teachers' well-being is influenced by job demands and job resources. Job demands potentially evoke strain and exceed the employee's capability to adapt. Job resources entail what the job offers to individual teachers (e.g., organisational support), helping to reduce job demands. Applying the job demands-resources model involves gathering an evidence base on personal attributes (job resources) and their connections. The study explored the association between the constructs (resilience, adaptability, commitment, self/collective efficacy) and a teacher's engagement with the job. The paper sought to elaborate on the model and determine the associations between key constructs of well-being (resilience, adaptability), commitment, and motivation (self- and collective-efficacy beliefs) and teachers' engagement in teaching. Data collection involved an online multi-dimensional instrument using validated items, distributed from 2020 to 2022. The instrument was designed to identify construct relationships. There were 170 participants. Data analysis: the reliability coefficients, means, standard deviations, skewness, and kurtosis statistics for the six variables were computed.
All scales have good reliability coefficients (.72-.96). A confirmatory factor analysis (CFA) and a structural equation model (SEM) were performed to provide measurement support and to obtain latent correlations among factors. The final analysis was performed using structural equation modelling. Several fit indices were used to evaluate the model fit, including chi-square statistics and the root mean square error of approximation. The correlations of constructs indicated that positive correlations exist, with the highest found between teacher engagement and resilience (r = .80) and the lowest between teacher adaptability and collective teacher efficacy (r = .22). Given these associations, we proceeded with the CFA. The CFA yielded an adequate fit: χ²(270, 1019) = 1836.79, p < .001, RMSEA = .04, CFI = .94, TLI = .93, and SRMR = .04. All values were within the threshold values, indicating a good model fit. Results indicate that increasing teachers' self-efficacy beliefs will increase their level of engagement, and that teacher adaptability and resilience are positively associated with self-efficacy beliefs, as are collective teacher efficacy beliefs. Implications for school leaders and school systems: 1. invest in increasing teachers' sense of efficacy to manage work demands; 2. adopt leadership approaches that enhance teachers' adaptability and resilience; and 3. foster a culture of collective efficacy support. Preparing teachers for now and for the future offers an important reminder to policymakers and school leaders of the importance of supporting teachers' personal attributes when they face the challenging demands of the job.
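Scale reliabilities like the .72-.96 range quoted above are typically Cronbach's alpha values. As a minimal sketch of that computation, on made-up Likert-style responses rather than the study's instrument:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    X = np.asarray(items, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1)        # sample variance of each item
    total_var = X.sum(axis=1).var(ddof=1)    # variance of respondents' total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Toy data: five respondents answering four items on a 1-5 scale.
scores = [[4, 5, 4, 5],
          [2, 2, 3, 2],
          [5, 4, 5, 5],
          [3, 3, 2, 3],
          [4, 4, 4, 3]]
print(round(cronbach_alpha(scores), 2))
```

Highly correlated items, as in this toy matrix, push alpha toward 1; values above roughly .7 are conventionally taken as acceptable.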

Keywords: collective teacher efficacy, teacher self-efficacy, job demands, teacher engagement

Procedia PDF Downloads 130
206 Synthesis by Mechanical Alloying and Characterization of FeNi₃ Nanoalloys

Authors: Ece A. Irmak, Amdulla O. Mekhrabov, M. Vedat Akdeniz

Abstract:

There is growing interest in the synthesis and characterization of nanoalloys, since the unique chemical and physical properties of nanoalloys can be tuned and, consequently, new structural motifs can be created by varying the type of constituent elements, the atomic and magnetic ordering, and the size and shape of the nanoparticles. Due to fine-size effects, magnetic nanoalloys have attracted considerable attention for their enhanced mechanical, electrical, optical, and magnetic behavior. As an important class of magnetic nanoalloys, Fe-Ni based nanoalloys are expected to find widening application in the chemical and aerospace industries and in magnetic biomedical applications. Noble metals have been used in biomedical applications for several years because of their surface plasmon properties. In this respect, iron-nickel nanoalloys are promising materials for magnetic biomedical applications because they show novel properties such as superparamagnetism and surface plasmon resonance. There is also great interest in using Fe-Ni based nanoalloys as radar-absorbing materials in the aerospace and stealth industries due to their high Curie temperature, high permeability, and high saturation magnetization with good thermal stability. In this study, FeNi₃ bimetallic nanoalloys were synthesized by mechanical alloying in a planetary high-energy ball mill. In mechanical alloying, micron-size powders are placed into the mill with the milling media. The powders are repeatedly deformed, fractured, and alloyed by high-energy collisions under the impact of the balls until the desired composition and particle size are achieved. The experimental studies were carried out in two parts. Firstly, dry mechanical alloying with high-energy dry planetary ball milling was applied to obtain FeNi₃ nanoparticles. Secondly, dry milling was followed by surfactant-assisted ball milling to observe the surfactant and solvent effect on the structure, size, and properties of the FeNi₃ nanoalloys.
In the first part, the iron-nickel powder sample was prepared with a 1:3 iron-to-nickel ratio to produce FeNi₃ nanoparticles and a 1:10 powder-to-ball weight ratio. To avoid oxidation during milling, the vials were filled with Ar inert gas before milling started. The powders were milled for 80 hours in total, and the synthesis of the FeNi₃ intermetallic nanoparticles was achieved by mechanical alloying within 40 hours. Regarding particle size, it was found that the amount of nano-sized particles increased with increasing milling time. In the second part of the study, dry milling of the Fe and Ni powders with the same stoichiometric ratio was repeated. Then, to prevent agglomeration and to obtain smaller nanoparticles with superparamagnetic behavior, surfactants and a solvent were added to the system after the 40-hour milling time, upon completion of the mechanical alloying. During surfactant-assisted ball milling, heptane was used as the milling medium, and oleic acid and oleylamine were used as surfactants in the high-energy ball milling processes. The alloyed particles were characterized in terms of microstructure, morphology, particle size, and thermal and magnetic properties with respect to milling time by X-ray diffraction, scanning electron microscopy, energy dispersive spectroscopy, vibrating-sample magnetometry, and differential scanning calorimetry.
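For the 1:3 Fe:Ni stoichiometry (taken here as a molar ratio, as the FeNi₃ formula implies) and the 1:10 powder-to-ball weight ratio above, the powder charge can be worked out as a quick sketch; the 10 g batch size is an arbitrary illustration, not a value from the study:

```python
# Fe and Ni powder masses for a FeNi3 charge, plus the matching ball mass
# at a 1:10 powder-to-ball weight ratio.
M_FE, M_NI = 55.845, 58.693  # atomic masses, g/mol

def feni3_charge(total_powder_g):
    moles_formula = total_powder_g / (M_FE + 3 * M_NI)  # mol of FeNi3 formula units
    fe = moles_formula * M_FE                            # g of Fe powder
    ni = moles_formula * 3 * M_NI                        # g of Ni powder
    balls = 10 * total_powder_g                          # 1:10 powder-to-ball ratio
    return fe, ni, balls

fe, ni, balls = feni3_charge(10.0)
print(fe, ni, balls)
```

The mass split (roughly a quarter Fe, three quarters Ni) follows directly from the near-equal atomic masses of the two elements.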

Keywords: iron-nickel systems, magnetic nanoalloys, mechanical alloying, nanoalloy characterization, surfactant-assisted ball milling

Procedia PDF Downloads 180
205 Mineralized Nanoparticles as a Contrast Agent for Ultrasound and Magnetic Resonance Imaging

Authors: Jae Won Lee, Kyung Hyun Min, Hong Jae Lee, Sang Cheon Lee

Abstract:

To date, imaging techniques have attracted much attention in medicine because the detection of diseases at an early stage provides greater opportunities for successful treatment. Consequently, over the past few decades, diverse imaging modalities including magnetic resonance (MR), positron emission tomography, computed tomography, and ultrasound (US) have been developed and applied widely in the field of clinical diagnosis. However, each of the above-mentioned imaging modalities possesses unique strengths and intrinsic weaknesses, which limit its ability to provide accurate information. Therefore, multimodal imaging systems may be a solution that can provide improved diagnostic performance. Among the current medical imaging modalities, US is a widely available real-time imaging modality. It has many advantages, including safety, low cost, and easy access for patients. However, its low spatial resolution precludes accurate discrimination of diseased regions such as cancer sites. In contrast, MR has no tissue-penetration limit and can provide images possessing exquisite soft-tissue contrast and high spatial resolution. However, it cannot offer real-time images and needs a comparatively long imaging time. The characteristics of these imaging modalities may be considered complementary, and the modalities have been frequently combined in the clinical diagnostic process. Biominerals such as calcium carbonate (CaCO3) and calcium phosphate (CaP) exhibit pH-dependent dissolution behavior. They demonstrate pH-controlled drug release due to the dissolution of the minerals under acidic pH conditions. In particular, the application of this mineralization technique to a US contrast agent has been reported recently. The CaCO3 mineral reacts with acids and decomposes to generate carbon dioxide (CO2) gas in an acidic environment.
These gas-generating mineralized nanoparticles generated CO2 bubbles in the acidic environment of the tumor, thereby allowing for strongly echogenic US imaging of tumor tissues. On the basis of this previous work, it was hypothesized that the loading of MR contrast agents into CaCO3 mineralized nanoparticles may be a novel strategy for designing a contrast agent for dual imaging. Herein, CaCO3 mineralized nanoparticles capable of generating CO2 bubbles to trigger the release of entrapped MR contrast agents in response to tumoral acidic pH were developed for US and MR dual-modality imaging of tumors. Gd2O3 nanoparticles were selected as the MR contrast agent. A key strategy employed in this study was to prepare Gd2O3 nanoparticle-loaded mineralized nanoparticles (Gd2O3-MNPs) using block copolymer-templated CaCO3 mineralization in the presence of calcium cations (Ca2+), carbonate anions (CO32-), and positively charged Gd2O3 nanoparticles. The CaCO3 core was considered suitable because it may effectively shield the Gd2O3 nanoparticles from water molecules in the blood (pH 7.4) before decomposing to generate CO2 gas, triggering the release of Gd2O3 nanoparticles in tumor tissues (pH 6.4~7.4). The kinetics of CaCO3 dissolution and CO2 generation from the Gd2O3-MNPs were examined as a function of pH, as was the pH-dependent in vitro magnetic relaxation; additionally, the echogenic properties were estimated to demonstrate the potential of the particles for tumor-specific US and MR imaging.
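The CO2 yield behind the echogenic signal follows directly from the dissolution stoichiometry CaCO3 + 2H+ → Ca2+ + H2O + CO2 (one mole of gas per mole of mineral). A back-of-the-envelope sketch with an illustrative particle dose (not a value from the paper), assuming ideal-gas behavior at body temperature:

```python
# CO2 gas volume generated by acid dissolution of a CaCO3 core.
# The 100 ug dose below is purely illustrative.
M_CACO3 = 100.09                    # g/mol
R, T, P = 8.314, 310.15, 101325.0   # J/(mol K), body temperature (K), ambient pressure (Pa)

def co2_volume_ul(caco3_ug):
    moles = caco3_ug * 1e-6 / M_CACO3   # mol of CO2 (1:1 with dissolved CaCO3)
    return moles * R * T / P * 1e9      # ideal gas volume, m^3 -> microliters

print(co2_volume_ul(100.0))  # gas volume from 100 micrograms of CaCO3
```

Even microgram quantities of mineral yield tens of microliters of gas, which is why bubble generation at tumoral pH is readily detectable by US.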

Keywords: calcium carbonate, mineralization, ultrasound imaging, magnetic resonance imaging

Procedia PDF Downloads 238
204 Process of Production of an Artisanal Brewery in a City in the North of the State of Mato Grosso, Brazil

Authors: Ana Paula S. Horodenski, Priscila Pelegrini, Salli Baggenstoss

Abstract:

The artisanal brewing industry seeks to serve a specific market with diversified production and has been gaining ground nationally, including in the Amazon region. This growth is driven by more demanding consumers with diversified tastes who want to try new types of beer, enjoying products with new aromas and flavors as an alternative to what is so widely distributed by the big industrial brands. Thus, through qualitative research methods, the study aimed to investigate how production is managed at a craft brewery in a city in the northern part of the State of Mato Grosso (Brazil), providing knowledge of production processes and strategies in the industry. With the efficient use of resources, it is possible to obtain the necessary quality and achieve better performance and differentiation for the company, in addition to analyzing the best management model. The research is descriptive, with a qualitative approach through a case study. For the data collection, a semi-structured interview was developed, covering the following areas: microbrewery characterization, the artisanal beer production process, and the company's supply chain management. Production processes were also observed during technical visits. The study verified that the artisanal brewery researched develops preventive maintenance strategies for its inputs, machines, and equipment so that the quality of the product and the production process is achieved. It was observed that the distance from supply centers requires process and supply chain management to be carried out with a longer planning horizon so that the delivery of the final product is satisfactory. The production process of the brewery comprises machines and equipment that allow control and quality of the product, and the manager states that, for the productive capacity of the industry and its consumer market, the available equipment meets demand.
This study also highlights one of the challenges for the development of small breweries facing the market giants, namely legislation, which classifies microbreweries simply as producers of alcoholic beverages. This causes the micro and small business segment to be taxed like the major producers, who have advantages in purchasing large batches of raw materials and receive tax incentives because they are large employers and tax contributors. It was possible to observe that the supply chain management system relies on spreadsheets and notes kept manually, which could be simplified with a computer program to streamline procedures and reduce the risks and failures of the manual process. The control of waste and effluents generated by the industry is outsourced and meets current needs. Finally, the results showed that the industry uses preventive maintenance as a production strategy, which allows better conditions for the production and quality of artisanal beer. Quality is directly related to the satisfaction of the final consumer and is pursued throughout the production process, with the selection of better inputs, the effectiveness of the production processes, and the relationship with commercial partners.

Keywords: artisanal brewery, production management, production processes, supply chain

Procedia PDF Downloads 121
203 Reduction and Smelting of Magnetic Fraction Obtained by Magnetic-Gravimetric-Separation (MGS) of Electric Arc Furnace Dust

Authors: Sara Scolari, Davide Mombelli, Gianluca Dall'Osto, Jasna Kastivnik, Gašper Tavčar, Carlo Mapelli

Abstract:

The EIT Raw Materials RIS-DustRec-II project aims to transform Electric Arc Furnace Dust (EAFD) into a valuable resource by overcoming the challenges associated with traditional recycling approaches. EAFD, a zinc-rich industrial by-product typically recycled by the Waelz process, contains complex oxides such as franklinite (ZnFe₂O₄), which hinder the efficient extraction of zinc and also divert other valuable elements (Fe, Ni, Cr, Cu, …) into the slag. The project aims to develop a multistage, multidisciplinary approach to separate EAFD into two streams: a magnetic and a non-magnetic one. In this paper, the production of self-reducing briquettes from the magnetic stream of EAFD with a reducing agent, aiming to drive carbothermic reduction and recover iron as a usable alloy, was investigated. Research focused on optimizing the magnetic and subsequent gravimetric separation (MGS) processes, followed by high-temperature smelting to evaluate reduction efficiency and phase separation. Two selected raw EAFD samples were characterized by X-ray diffraction and scanning electron microscopy, and their magnetic-gravimetric separation was performed to isolate zinc- and iron-rich fractions. The iron-enriched concentrates were then agglomerated into self-reducing briquettes by mixing them with either biochar (olive pomace pyrolyzed at 350 and 750 °C, or wood chips pyrolyzed at 750 °C) or cupola furnace dust as reducing agents, combined with gelatinized corn starch as a binder. Cylindrical briquettes were produced and cured for 14 days to ensure structural integrity during the subsequent thermal treatments. Smelting tests were carried out at 1400 °C in an inert argon atmosphere to assess the metallization efficiency and the separation between the metal and slag phases. A carbon/oxides mass ratio of 0.262 (C/(ZnO+Fe₂O₃)) was used in these tests to maintain continuity with previous studies and to standardize the reduction conditions.
The magnetic and gravimetric separations effectively isolated zinc- and iron-enriched fractions, particularly for one of the two EAFDs, where the Zn content of the concentrate fraction was reduced by 8 wt.% while Fe reached 45 wt.%. The reduction tests conducted at 1400 °C showed that the chosen carbon/oxides ratio was sufficient for the smelting of the reducible oxides within the briquettes. However, an important limitation became apparent: the amount of carbon, exceeding the stoichiometric value, proved to be excessive for the effective coalescence of metal droplets, preventing clear metal-slag separation. To address this, further smelting tests were carried out in an air atmosphere rather than under inert conditions to burn off the excess carbon. This paper demonstrates the potential of controlled carbothermic reduction for EAFD recycling. By carefully optimizing the C/(ZnO+Fe₂O₃) ratio, the process can maximize metal recovery while achieving better separation of the metal and slag phases. This approach offers a promising alternative to traditional EAFD recycling methods, with further studies recommended to refine the parameters for industrial application.
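Dosing the reductant from the C/(ZnO+Fe₂O₃) = 0.262 mass ratio used in the study is simple arithmetic. The sketch below computes the mass of reducing agent for a briquette batch; the briquette mass, oxide fractions, and fixed-carbon content of the biochar are hypothetical placeholders:

```python
# Reductant dosing for self-reducing briquettes at the study's
# C/(ZnO + Fe2O3) mass ratio of 0.262. All other numbers are illustrative.
RATIO = 0.262

def reductant_required(briquette_g, zno_frac, fe2o3_frac, fixed_carbon_frac=0.80):
    """Mass of reducing agent (e.g. biochar) needed, given its fixed-carbon content."""
    oxides = briquette_g * (zno_frac + fe2o3_frac)  # g of reducible oxides
    carbon = RATIO * oxides                         # g of elemental carbon required
    return carbon / fixed_carbon_frac               # g of reductant to weigh in

print(reductant_required(100.0, 0.10, 0.55))
```

Varying the ratio in such a calculation is the natural handle for the optimization the authors recommend: enough carbon to reduce the oxides, but not so much that droplet coalescence is blocked.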

Keywords: biochars, electrical arc furnace dust, metallization, smelting

Procedia PDF Downloads 14
202 Development of One-Pot Sequential Cyclizations and Photocatalyzed Decarboxylative Radical Cyclization: Application Towards Aspidospermatan Alkaloids

Authors: Guillaume Bélanger, Jean-Philippe Fontaine, Clémence Hauduc

Abstract:

There is an undeniable thirst, from organic chemists and from the pharmaceutical industry, to access complex alkaloids through short syntheses. While medicinal chemists are interested in the fascinatingly wide range of biological properties of alkaloids, synthetic chemists are rather interested in finding new routes to access these challenging natural products, which are often of low availability from nature. To synthesize the complex polycyclic cores of natural products, reaction cascades or sequences performed one-pot offer a neat advantage over classical methods through their rapid increase in molecular complexity in a single operation. On the other hand, reaction cascades need to be run on substrates bearing all the functional groups required for the key cyclizations. Chemoselectivity is thus a major issue associated with such a strategy, in addition to diastereocontrol and regiocontrol of the overall transformation. In the pursuit of synthetic efficiency, our research group developed an innovative one-pot transformation of linear substrates into bi- and tricyclic adducts, applied to the construction of Aspidospermatan-type alkaloids. The latter constitute a rich class of indole alkaloids bearing a unique bridged azatricyclic core. Despite many efforts toward the synthesis of members of this family, efficient and versatile synthetic routes are still coveted. Indeed, very short, non-racemic approaches are rather scarce: for example, in the cases of aspidospermidine and aspidospermine, the syntheses are all fifteen steps and over. We envisaged a unified approach to access several members of the Aspidospermatan alkaloid family. The key sequence features a highly chemoselective formamide activation that triggers a Vilsmeier-Haack cyclization, followed by azomethine ylide generation and intramolecular cycloaddition.
Despite the high density and variety of functional groups on the substrates (electron-rich and electron-poor alkenes, nitrile, amide, ester, enol ether), the sequence generated three new carbon-carbon bonds and three rings in a single operation with good yield and high chemoselectivity. A detailed study of the amide, nucleophile, and dipolarophile variations that finally led to the successful combination required for the key transformation will be presented. To complete the indoline fragment of the natural products, we developed an original approach. Indeed, all reported routes to Aspidospermatan alkaloids introduce the indoline or indole early in the synthesis. In our work, the indoline needs to be installed on the azatricyclic core after the key cyclization sequence. As a result, typical Fischer indolization is not suited, since this reaction is known to fail on such substrates. We thus envisaged a unique photocatalyzed decarboxylative radical cyclization. The development of this reaction, as well as the scope and limitations of the methodology, will also be presented. The original Vilsmeier-Haack and azomethine ylide cyclization sequence, as well as the new photocatalyzed decarboxylative radical cyclization, will undoubtedly open access to new routes toward polycyclic indole alkaloids and derivatives of pharmaceutical interest in general.

Keywords: Aspidospermatan alkaloids, azomethine ylide cycloaddition, decarboxylative radical cyclization, indole and indoline synthesis, one-pot sequential cyclizations, photocatalysis, Vilsmeier-Haack cyclization

Procedia PDF Downloads 81
201 A Perspective on Allelopathic Potential of Corylus avellana L.

Authors: Tugba G. Isin Ozkan, Yoshiharu Fujii

Abstract:

Weeds are among the most important constraints that decrease crop yields. Increasing amounts and numbers of chemical herbicides are applied every day to control them. The environmental effects of chemical herbicides, and the limitations on their use, have driven the search for nonchemical alternatives in weed management. The application of allelopathy as a nonherbicidal innovation to control weed populations is increasingly needed in integrated weed management, not only because of public concern about herbicide use but also because of rising agricultural costs and herbicide-resistant weeds. Allelopathy is a common biological phenomenon: a direct or indirect interaction in which one plant or organism produces biochemicals that influence the physiological processes of a neighboring plant or organism. The biochemicals involved in allelopathy are called allelochemicals; they influence, beneficially or detrimentally, the growth, survival, development, and reproduction of other plants or organisms. All plant parts can contain allelochemicals, which are secondary plant metabolites. Allelochemicals released into the environment influence the germination and seedling growth of neighboring weeds; this is how allelopathy is applied to weed control. Crop cultivars differ significantly in their ability to inhibit the growth of certain weeds. Therefore, a crop of high commercial value, Corylus avellana L., and its byproducts were chosen to introduce their allelopathic potential in this research. The edible nut of Corylus avellana L., commonly known as hazelnut, is a commercially valuable crop with several byproducts: skin, hard shell, green leafy cover, and tree leaf. Research on the allelopathic potential of a plant using the sandwich bioassay method, together with investigation of its growth-inhibitory activity, is the first step toward developing new and environmentally friendly alternatives for weed control. Thus, the objective of this research is to determine the allelopathic potential of C.
avellana L. and its byproducts using the sandwich method, and to determine the effective concentrations (EC) of their extracts that induce half-maximal inhibition of radicle elongation in a test plant, EC₅₀. The sandwich method is a reliable and fast bioassay, very useful for allelopathic screening under laboratory conditions. In the experiments, lettuce (Lactuca sativa L.) seeds will be the test plant because of their high sensitivity to inhibition by allelochemicals and their reliable germination. In the sandwich method, the radicle lengths of lettuce seeds treated with dry material and of control lettuce seeds will be measured, and the inhibition of radicle elongation will be determined. Lettuce seeds will also be treated with methanol extracts of dry hazelnut parts to calculate EC₅₀ values, the concentrations required to induce half-maximal inhibition of growth, in mg dry weight equivalent mL⁻¹. The inhibitory activity of the extracts against lettuce seedling elongation will be evaluated, as in the sandwich method, by comparing the radicle lengths of treated seeds with those of control seeds, and EC₅₀ values will be determined. The research samples are dry parts of the Turkish hazelnut, C. avellana L. The results would demonstrate the allelopathic potential of C. avellana L. and its byproducts in plant-plant interactions, might be utilized in further research, and could be beneficial in finding bioactive chemicals from natural products and developing natural herbicides.
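As a rough illustration of the EC₅₀ computation described above, the sketch below estimates the concentration producing half-maximal inhibition of radicle elongation by log-linear interpolation between the two concentrations bracketing 50% inhibition. The data values, and the choice of interpolating on a log-concentration scale, are assumptions for illustration only, not results or methods from this study.

```python
import math

def percent_inhibition(control_mm, treated_mm):
    """Inhibition of radicle elongation relative to the untreated control."""
    return 100.0 * (1.0 - treated_mm / control_mm)

def ec50(concentrations, radicle_mm, control_mm):
    """Estimate EC50 (mg dry weight equivalent / mL) by log-linear
    interpolation between the two concentrations bracketing 50% inhibition.
    Concentrations must be sorted in increasing order."""
    inhib = [percent_inhibition(control_mm, r) for r in radicle_mm]
    for (c_lo, i_lo), (c_hi, i_hi) in zip(zip(concentrations, inhib),
                                          zip(concentrations[1:], inhib[1:])):
        if i_lo <= 50.0 <= i_hi:
            frac = (50.0 - i_lo) / (i_hi - i_lo)
            log_ec50 = math.log10(c_lo) + frac * (math.log10(c_hi) - math.log10(c_lo))
            return 10.0 ** log_ec50
    raise ValueError("50% inhibition not bracketed by the tested concentrations")

# Hypothetical dose-response data: mean radicle length (mm) at each extract
# concentration (mg dry weight equivalent / mL), with a 30 mm untreated control.
concs = [1.0, 3.0, 10.0, 30.0]
lengths = [28.0, 22.0, 12.0, 4.0]
print(round(ec50(concs, lengths, 30.0), 2))  # ≈ 6.97
```

In practice, a sigmoidal dose-response fit over all points would be preferred to two-point interpolation, but the bracketing sketch conveys the definition of EC₅₀ directly.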

Keywords: allelopathy, Corylus avellana L., EC50, Lactuca sativa L., sandwich method, Turkish hazelnut

Procedia PDF Downloads 176
200 Pre-conditioning and Hot Water Sanitization of Reverse Osmosis Membrane for Medical Water Production

Authors: Supriyo Das, Elbir Jove, Ajay Singh, Sophie Corbet, Noel Carr, Martin Deetz

Abstract:

Water is a critical commodity in the healthcare and medical field. The uses of medical-grade water span from washing surgical equipment and drug preparation to key elements of life-saving therapy such as hydrotherapy and hemodialysis. Properly treated medical water reduces the bioburden load and mitigates the risk of infection, ensuring patient safety. However, any compromised condition during the production of medical-grade water can create a favorable environment for microbial growth, putting patient safety at high risk. Therefore, proper upstream treatment of medical water is essential before its application in the healthcare, pharma, and medical space. Reverse osmosis (RO) is one of the most preferred treatments within the healthcare industries and is recommended by all international pharmacopeias to achieve the quality level demanded by global regulatory bodies. The RO process can remove up to 99.5% of constituents from feed water sources, eliminating bacteria, proteins, and particles of 100 Dalton and above. The combination of RO with other downstream water treatment technologies, such as electrodeionization and ultrafiltration, meets the quality requirements of various pharmacopeia monographs for producing highly purified water or water for injection for medical use. In the reverse osmosis process, water from a feed stream with a high concentration of dissolved solids is forced through a specially engineered semi-permeable membrane to the low-concentration side, resulting in high-quality water. However, these specially engineered RO membranes need to be sanitized, either chemically or at high temperature, at regular intervals to keep the bioburden at the minimum required level. In this paper, we discuss DuPont's FilmTec heat-sanitizable reverse osmosis (HSRO) membrane for the production of medical-grade water.
An HSRO element must be pre-conditioned prior to initial use by exposure to hot water (80°C-85°C) for stable performance and to meet the manufacturer's specifications. Without pre-conditioning, the membrane shows variations in feed pressure operation and salt rejection. The paper will discuss the critical variables of the pre-conditioning step that can affect the overall performance of the HSRO membrane and present data supporting the need for pre-conditioning of HSRO elements. Our preliminary data suggest that initial heat treatment can reduce flow by up to 35%, accompanied by an increase in salt rejection. The paper will go into detail about the fundamental understanding of the performance change of HSRO after the pre-conditioning step and its effect on the quality of the medical water produced. The paper will also discuss another critical point: regular hot water sanitization of these HSRO membranes. Regular hot water sanitization (at 80°C-85°C) is necessary to keep the membrane free of bioburden; however, it can negatively impact the performance of the membrane over time. We will present several data points on hot water sanitization using FilmTec HSRO elements and challenge their robustness for producing quality medical water. The last part of this paper will discuss the construction details of the FilmTec HSRO membrane and the features that make it suitable for pre-conditioning and sanitization at high temperatures.
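The membrane performance quantities discussed above reduce to simple ratios. The sketch below, with hypothetical feed/permeate conductivities and permeate flows (not measurements from this paper), shows how observed salt rejection and the pre-conditioning flow loss of up to 35% would be computed.

```python
def salt_rejection_pct(feed_conductivity, permeate_conductivity):
    """Observed salt rejection: the fraction of dissolved solids held back
    by the membrane, expressed as a percentage."""
    return 100.0 * (1.0 - permeate_conductivity / feed_conductivity)

def flow_change_pct(flow_before, flow_after):
    """Relative change in permeate flow, e.g. after hot-water pre-conditioning;
    negative values indicate a flow reduction."""
    return 100.0 * (flow_after - flow_before) / flow_before

# Hypothetical readings for an HSRO element before/after pre-conditioning.
print(salt_rejection_pct(1000.0, 5.0))  # ≈ 99.5, the upper removal figure cited
print(flow_change_pct(10.0, 6.5))       # -35.0, i.e. a 35% flow reduction
```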

Keywords: heat sanitizable reverse osmosis, HSRO, medical water, hemodialysis water, water for injection, pre-conditioning, heat sanitization

Procedia PDF Downloads 213
199 Comparing Test Equating by Item Response Theory and Raw Score Methods with Small Sample Sizes on a Study of the ARTé: Mecenas Learning Game

Authors: Steven W. Carruthers

Abstract:

The purpose of the present research is to equate two test forms as part of a study to evaluate the educational effectiveness of the ARTé: Mecenas art history learning game. The researcher applied Item Response Theory (IRT) procedures to calculate item, test, and mean-sigma equating parameters. With the sample size n=134, test parameters indicated “good” model fit but low Test Information Functions and more acute than expected equating parameters. Therefore, the researcher applied equipercentile equating and linear equating to raw scores and compared the equated form parameters and effect sizes from each method. Item scaling in IRT enables the researcher to select a subset of well-discriminating items. The mean-sigma step produces a mean-slope adjustment from the anchor items, which was used to scale the score on the new form (Form R) to the reference form (Form Q) scale. In equipercentile equating, scores are adjusted to align the proportion of scores in each quintile segment. Linear equating produces a mean-slope adjustment, which was applied to all core items on the new form. The study followed a quasi-experimental design with purposeful sampling of students enrolled in a college level art history course (n=134) and counterbalancing design to distribute both forms on the pre- and posttests. The Experimental Group (n=82) was asked to play ARTé: Mecenas online and complete Level 4 of the game within a two-week period; 37 participants completed Level 4. Over the same period, the Control Group (n=52) did not play the game. The researcher examined between group differences from post-test scores on test Form Q and Form R by full-factorial Two-Way ANOVA. The raw score analysis indicated a 1.29% direct effect of form, which was statistically non-significant but may be practically significant. The researcher repeated the between group differences analysis with all three equating methods. For the IRT mean-sigma adjusted scores, form had a direct effect of 8.39%. 
Mean-sigma equating with a small sample may have resulted in inaccurate equating parameters. Equipercentile equating aligned test means and standard deviations, but the resulting skewness and kurtosis worsened compared to the raw score parameters. Form had a 3.18% direct effect. Linear equating produced the lowest form effect, approaching 0%. Using linearly equated scores, the researcher conducted an ANCOVA to examine the effect size in terms of prior knowledge. The between-group effect size for the Control Group versus the Experimental Group participants who completed the game was 14.39%, with a 4.77% effect size attributed to pre-test score. Playing and completing the game increased art history knowledge, and individuals with low prior knowledge tended to gain more from pre- to post-test. Ultimately, researchers should approach test equating based on their theoretical stance on Classical Test Theory and IRT and the respective assumptions. Regardless of the approach or method, test equating requires a representative sample of sufficient size. With small sample sizes, the application of a range of equating approaches can expose item and test features for review, inform interpretation, and identify paths for improving instruments for future study.
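Both the mean-sigma and the linear equating steps described above are mean/SD adjustments. A minimal sketch, using made-up anchor-item difficulties and raw-score moments rather than the study's data, could look like this:

```python
import statistics

def mean_sigma(b_new, b_ref):
    """Mean-sigma transformation: place new-form (Form R) IRT item
    difficulties on the reference-form (Form Q) metric via anchor items.
    Returns slope A and intercept B of the rescaling b* = A*b + B."""
    A = statistics.stdev(b_ref) / statistics.stdev(b_new)
    B = statistics.mean(b_ref) - A * statistics.mean(b_new)
    return A, B

def linear_equate(x, mean_x, sd_x, mean_y, sd_y):
    """Linear equating of a raw score x from Form X onto the Form Y scale:
    matches the means and standard deviations of the two score distributions."""
    return mean_y + (sd_y / sd_x) * (x - mean_x)

# Hypothetical anchor-item difficulties estimated separately on each form.
b_R = [-1.0, 0.0, 1.0]
b_Q = [-0.5, 0.5, 1.5]
A, B = mean_sigma(b_R, b_Q)
print(A, B)                             # 1.0 0.5
print(linear_equate(20, 18, 4, 21, 5))  # 23.5
```

With only three anchor items the slope and intercept are of course unstable, which is exactly the small-sample sensitivity the abstract reports.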

Keywords: effectiveness, equipercentile equating, IRT, learning games, linear equating, mean-sigma equating

Procedia PDF Downloads 193
198 Analysis of Complex Business Negotiations: Contributions from Agency-Theory

Authors: Jan Van Uden

Abstract:

The paper reviews classical agency theory and its contributions to the analysis of complex business negotiations, and proposes a modification of the basic agency model to examine the negotiation-specific dimensions of agency problems. By illustrating the fundamental potential for modifying agency theory in the context of business negotiations, the paper highlights recent empirical research on agent-based negotiations and inter-team constellations. A general theoretical analysis of complex negotiation would proceed on two levels. First, the basic agency model is modified to reflect the organizational context of business negotiations (i.e., multi-agent issues, common agencies, multi-period models, and the concept of bounded rationality). Second, the modified agency model is applied to complex business negotiations to identify agency problems and the related areas of risk in the negotiation process. This paper is situated on the first level of analysis, the modification. The method builds on insights from behavioral decision research (BRD) on the one hand, and on findings from agency theory as normative directives for the modification of the basic model on the other. Through neoclassical assumptions concerning the fundamental aspects of agency relationships in business negotiations (i.e., asymmetric information, self-interest, risk preferences, and conflicts of interest), agency theory helps derive solutions to worst-case scenarios taken from the daily negotiation routine. As agency theory is the only universal approach able to identify trade-offs between certain aspects of economic cooperation, the insights obtained provide a deeper understanding of the forces that shape the complexity of business negotiations.
The need for a modification of the basic model is illustrated by highlighting selected issues of business negotiations from an agency-theory perspective. Negotiation teams require a multi-agent approach, given that decision-makers, as superior agents, are often part of the team. The diversity of competences and decision-making authority is a phenomenon that overrides the assumptions of classical agency theory and varies greatly across certain forms of business negotiations. Further, the basic model is bound to dyadic relationships preceded by the delegation of decision-making authority and builds on a contractually created (vertical) hierarchy. As a result, horizontal dynamics within the negotiation team, which play an important role in negotiation success, are not considered in the investigation of agency problems. Also, the trade-off between short-term relationships within the negotiation sphere and long-term relationships in the corporate sphere calls for a multi-period perspective that takes into account the sphere-specific governance mechanisms already established (i.e., reward and monitoring systems). Within the analysis, the implementation of bounded rationality is closely related to findings from BRD, used to assess the impact of negotiation behavior on the underlying principal-agent relationships. As empirical findings show, the disclosure or withholding of information to the agent affects his negotiation behavior as well as final negotiation outcomes. Last, in the context of business negotiations, asymmetric information is often intended by decision-makers acting as superior agents or principals, which calls for a bilateral risk approach to agency relations.

Keywords: business negotiations, agency-theory, negotiation analysis, interteam negotiations

Procedia PDF Downloads 140
197 Environmentally Sustainable Transparent Wood: A Fully Green Approach from Bleaching to Impregnation for Energy-Efficient Engineered Wood Components

Authors: Francesca Gullo, Paola Palmero, Massimo Messori

Abstract:

Transparent wood is considered a promising structural material for the development of environmentally friendly, energy-efficient engineered components. Two approaches can be used to obtain transparent wood from natural wood: i) bottom-up and ii) top-down. In the second, the color of natural wood samples is lightened through a chemical bleaching process that acts on the chromophore groups of lignin, such as benzene rings, quinonoids, vinyl, phenolic, and carbonyl groups. These chromophoric units form complex conjugated systems responsible for the brown color of wood. There are two strategies to remove color and increase the whiteness of wood: i) lignin removal and ii) lignin bleaching. In the lignin removal strategy, strong chlorine-containing chemicals (chlorine, hypochlorite, and chlorine dioxide) and oxidizers (oxygen, ozone, and peroxide) are used to completely destroy and dissolve the lignin. In lignin bleaching methods, a moderate reductant (hydrosulfite) or oxidant (hydrogen peroxide) is commonly used to alter or remove the chromophore groups and systems of lignin, selectively discoloring the lignin while keeping the macrostructure intact. It is therefore essential to manipulate nanostructured wood by precisely controlling the nanopores in the cell walls, monitoring both the chemical treatments and the process conditions, for instance the treatment time, the concentration of the chemical solutions, the pH value, and the temperature. The elimination of light scattering in wood is the second step in the fabrication of transparent wood materials, which can be achieved through two approaches: i) the polymer impregnation method and ii) the densification method.
In the polymer impregnation method, the wood scaffold is treated under vacuum with polymers of a matching refractive index (e.g., PMMA and epoxy resins) to obtain a transparent composite material, which can finally be pressed to align the cellulose fibers and reduce interfacial defects, yielding a finished product with high transmittance (>90%) and excellent light-guiding properties. However, both the solution-based bleaching and the impregnation processes used to produce transparent wood generally consume large amounts of energy and chemicals, including some toxic or polluting agents, and are difficult to scale up industrially. Here, we report a method to produce optically transparent wood by modifying the lignin structure with a chemical reaction at room temperature using small amounts of hydrogen peroxide in an alkaline environment. This method preserves the lignin, which is merely deconjugated and acts as a binder, providing both a strong wood scaffold and suitable porosity for the infiltration of biobased polymers, while reducing chemical consumption, the toxicity of the reagents used, polluting waste, petroleum by-products, energy, and processing time. The resulting transparent wood demonstrates high transmittance and low thermal conductivity. Through the combination of process efficiency and scalability, the obtained materials are promising candidates for application in the construction of modern energy-efficient buildings.

Keywords: bleached wood, energy-efficient components, hydrogen peroxide, transparent wood, wood composites

Procedia PDF Downloads 55
196 Influence of Drier Autumn Conditions on Weed Control Based on Soil-Active Herbicides

Authors: Juergen Junk, Franz Ronellenfitsch, Michael Eickermann

Abstract:

Appropriate weed management in autumn is a prerequisite for an economically successful harvest in the following year. In Luxembourg, oilseed rape, wheat, and barley are sown from August until October, accompanied by chemical weed control with soil-active herbicides, depending on the state of the weeds and the meteorological conditions. Regular ground- and surface-water analyses have found high levels of contamination by transformation products of the respective herbicide compounds in Luxembourg. The most favorable conditions for incorporating soil-active herbicides are single rain events. Weed control may be reduced if application is made when weeds are under drought stress, or if repeated light rain events are followed by dry spells, because the herbicides then tend to bind tightly to soil particles. These effects have been frequently reported for Luxembourg throughout recent years. In the framework of a multisite long-term field experiment (EFFO), weed monitoring, plant observations, and corresponding meteorological measurements were conducted. A long-term time series (1947-2016) from the SYNOP station Findel-Airport (WMO ID = 06590) showed a decrease in the number of days with precipitation. As the total precipitation amount has not changed significantly, this indicates a trend towards rain events of higher intensity. All analyses are based on decades (10-day periods) for September and October of each individual year. To assess future meteorological conditions for Luxembourg, two different approaches were applied. First, multi-model ensembles from the CORDEX experiments (spatial resolution ~12.5 km; transient projections until 2100) were analysed for two different Representative Concentration Pathways (RCP8.5 and RCP4.5), covering the time span from 2005 until 2100. The multi-model ensemble approach allows the uncertainties to be quantified and the differences between the two emission scenarios to be assessed.
Second, to assess smaller-scale differences within the country, a high-resolution model projection using the COSMO-CLM model was used (spatial resolution 1.3 km). To account for the higher computational demand caused by the increased spatial resolution, only 10-year time slices were simulated (reference period 1991-2000; near future 2041-2050; far future 2091-2100). Statistically significant trends towards higher air temperatures, +1.6 K for September (+5.3 K in the far future) and +1.3 K for October (+4.3 K), were projected for the near future compared to the reference period. Precipitation simultaneously decreased by 9.4 mm (September) and 5.0 mm (October) in the near future, and by 49 mm (September) and 10 mm (October) in the far future. Besides the monthly values, decades were also analyzed for the two future time periods of the CLM model. For all decades of September and October, the number of days with precipitation decreased in the projected near and far future. Changes in meteorological variables such as air temperature and precipitation have already induced transformations in the weed communities (composition, late emergence, etc.) of arable ecosystems in Europe. Therefore, adaptations of agronomic practices as well as effective weed control strategies must be developed to maintain crop yields.
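The decade-based (10-day period) counting of precipitation days described above can be sketched as follows. The 0.1 mm wet-day threshold and the sample records are illustrative assumptions, not the study's data or thresholds.

```python
import datetime
from collections import Counter

WET_DAY_MM = 0.1  # assumed threshold for counting a "day with precipitation"

def wet_days_per_decade(records, months=(9, 10)):
    """Count days with precipitation per 10-day 'decade' of each month.

    records: iterable of (datetime.date, precipitation_mm) pairs.
    Returns a Counter keyed by (month, decade), decade in {1, 2, 3};
    days 21 to month-end all fall into decade 3.
    """
    counts = Counter()
    for day, mm in records:
        if day.month in months and mm >= WET_DAY_MM:
            decade = min((day.day - 1) // 10 + 1, 3)
            counts[(day.month, decade)] += 1
    return counts

# Tiny illustrative record: three wet days and one dry day in September.
sample = [
    (datetime.date(2016, 9, 2), 4.2),
    (datetime.date(2016, 9, 11), 0.0),
    (datetime.date(2016, 9, 15), 1.1),
    (datetime.date(2016, 9, 30), 7.8),
]
print(wet_days_per_decade(sample))  # one wet day in each of decades 1, 2, 3
```

Applied to a daily series per year, the per-decade counts give exactly the quantity whose long-term decrease the SYNOP analysis reports.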

Keywords: CORDEX projections, dry spells, ensembles, weed management

Procedia PDF Downloads 235
195 Embedded Test Framework: A Solution Accelerator for Embedded Hardware Testing

Authors: Arjun Kumar Rath, Titus Dhanasingh

Abstract:

Embedded product development requires software to test hardware functionality during development and to find issues during manufacturing in larger quantities. As components become more integrated, devices are tested for their full functionality using advanced software tools. Benchmarking tools are used to measure and compare the performance of product features. At present, these tests are based on a variety of methods involving varying hardware and software platforms. Typically, they are custom built for every product and remain unusable for other variants. A majority of the tests go undocumented, are not updated, and become unusable once the product is released. To bridge this gap, a solution accelerator in the form of a framework can address these issues by running all these tests from one place, using an off-the-shelf test library in a continuous integration environment. There are many open-source test frameworks and tools (Fuego, LAVA, Autotest, KernelCI, etc.) designed for testing embedded system devices, each with several unique strengths, but no single tool or framework satisfies all the testing needs of embedded systems; hence the need for an extensible framework incorporating a multitude of tools. Embedded product testing includes board bring-up testing, testing during manufacturing, firmware testing, application testing, and assembly testing. Traditional test methods involve developing test libraries and support components for every new hardware platform in the same domain with identical hardware architecture. This approach has drawbacks: non-reusability, where platform-specific libraries cannot be reused; the need to maintain source infrastructure for individual hardware platforms; and, most importantly, the time taken to re-develop test cases for new hardware platforms. These limitations create challenges in environment setup for testing, scalability, and maintenance.
A desirable strategy is certainly one focused on maximizing reusability, continuous integration, and leveraging artifacts across the complete development cycle, during all phases of testing and across a family of products. To overcome the challenges of the conventional method and deliver the benefits of embedded testing, an embedded test framework (ETF), a solution accelerator, was designed that can be deployed in embedded system products with minimal customization and maintenance to accelerate hardware testing. The embedded test framework supports testing different hardware, including microprocessors and microcontrollers. It offers benefits such as: (1) Time-to-market: it accelerates board bring-up with prepackaged test suites supporting all necessary peripherals, which can speed up the design and development stages (board bring-up, manufacturing, and device drivers). (2) Reusability: framework components isolated from platform-specific hardware initialization and configuration make adapting test cases across various platforms quick and simple. (3) An effective build and test infrastructure with multiple test interface options, pre-integrated with the Fuego framework. (4) Continuous integration: pre-integration with Jenkins enables continuous testing and an automated software update feature. Applying the embedded test framework accelerator throughout the design and development phases enables the development of well-tested systems before functional verification and improves time to market to a large extent.

Keywords: board diagnostics software, embedded system, hardware testing, test frameworks

Procedia PDF Downloads 147
194 Solutions for Food-Safe 3D Printing

Authors: Geremew Geidare Kailo, Igor Gáspár, András Koris, Ivana Pajčin, Flóra Vitális, Vanja Vlajkov

Abstract:

Three-dimensional (3D) printing, a very popular additive manufacturing technology, has recently undergone rapid growth and replaced conventional technology from prototyping to producing end-user parts and products. 3D printing involves a digital manufacturing machine that produces three-dimensional objects according to designs created by the user via 3D modeling or computer-aided design/manufacturing (CAD/CAM) software. The most popular 3D printing system is Fused Deposition Modeling (FDM), also called Fused Filament Fabrication (FFF). A 3D-printed object is considered food safe if it can have direct contact with food without any toxic effects, even after cleaning, storing, and reusing the object. This work analyzes the processing timeline of the filament (the material for 3D printing) from unboxing to extrusion through the nozzle. It is an important task to analyze the growth of bacteria on the 3D-printed surface and in the gaps between layers. By default, a 3D-printed object is not food safe after longer use and direct contact with food (even when food-safe filaments are used), but there are solutions to this problem. The aim of this work was to evaluate 3D-printed objects from different food safety perspectives. The first was testing antimicrobial 3D printing filaments from a food safety aspect, since 3D-printed objects in the food industry may have direct contact with food; the main purpose here is to reduce the microbial load on the surface of a 3D-printed part. Coating with epoxy resin was investigated, too, to see its effect on mechanical strength, thermal resistance, surface smoothness, and food safety (cleanability). Another aim of this study was to test new temperature-resistant filaments and the effect of high temperature on 3D-printed materials, to see if they can be cleaned by boiling or a similar high-temperature treatment.
This work proved that all three methods can improve the food safety of a 3D-printed object, but the size of the effect varies. The best result was obtained by coating with epoxy resin: the object became cleanable like any injection-molded plastic object with a smooth surface. Very good results were also obtained by boiling the objects, and it is encouraging that more and more special filaments now carry a food-safe certificate and can withstand boiling temperatures. Using antibacterial filaments reduced bacterial colonies to one fifth, but the biggest advantage of this method is that it requires no post-processing: the object is ready straight out of the 3D printer. Acknowledgements: The research was supported by the Hungarian and Serbian bilateral scientific and technological cooperation project funded by the Hungarian National Office for Research, Development and Innovation (NKFI, 2019-2.1.11-TÉT-2020-00249) and the Ministry of Education, Science and Technological Development of the Republic of Serbia. The authors acknowledge the Hungarian University of Agriculture and Life Sciences' Doctoral School of Food Science for its support of this study.

Keywords: food safety, 3D printing, filaments, microbial, temperature

Procedia PDF Downloads 143
193 Social Media Governance in UK Higher Education Institutions

Authors: Rebecca Lees, Deborah Anderson

Abstract:

Whilst the majority of research into social media in education focuses on applications for teaching and learning environments, this study looks at how such activities can be managed by investigating the current state of social media regulation within UK higher education. Social media has pervaded almost all aspects of higher education, from marketing, recruitment, and alumni relations to both distance and classroom-based learning and teaching activities. In terms of who uses it and how, social media is growing at an unprecedented rate, particularly amongst the target market for higher education. Whilst the platform presents opportunities not found in more traditional methods of communication and interaction, such as speed and reach, it also carries substantial risks associated with inappropriate use, lack of control, and issues of privacy. Typically, organisations rely on the concept of a social contract to guide employee behaviour to conform to the expectations of the organisation. Yet where academia and social media intersect, applying the notion of a social contract to enforce governance may be problematic: firstly, considering the emphasis on treating students as customers, with a growing focus on the use and collection of satisfaction metrics; and secondly, regarding academics' freedom of speech, opinion, and discussion, which is a long-held tradition of learning instruction. The need for sound governance procedures to support expectations of online behaviour is therefore vital, especially when the speed and breadth of adoption of social media activities has in the past outrun organisations' abilities to manage it. An analysis of the current level of governance was conducted by gathering relevant policies, guidelines, and best practice documentation available online via internet searches and institutional requests.
The documents were then subjected to a content analysis in the second phase of this study to determine the approach taken by institutions to apply such governance. Documentation was separated according to audience, i.e., applicable to staff, students, or all users. Given that many of these documents included guests and visitors to the institution within their scope, being easily accessible was considered important. Yet within the UK, only about half of all education institutions had explicit social media governance documentation available online without requiring member access or considerable searching. Where documentation existed, the majority focused solely on employee activities and tended to be policy-based rather than rooted in guidelines or best practice, or fell back on governing online behaviour via implicit instructions within IT and computer regulations. Explicit instruction on expected online behaviour is therefore lacking within UK higher education. Given the number of educational practices that now include significant online components, it is imperative that education organisations keep up to date with the progress of social media use. Initial results from the second phase of this study, which analyses the content of the governance documentation, suggest that the documents require reading levels at or above that of the target audience, with considerable variability in length and layout. Further analysis will add to this growing field of investigation into social media governance within higher education.

Keywords: governance, higher education, policy, social media

Procedia PDF Downloads 185
192 Challenges and Proposals for Public Policies Aimed At Increasing Energy Efficiency in Low-Income Communities in Brazil: A Multi-Criteria Approach

Authors: Anna Carolina De Paula Sermarini, Rodrigo Flora Calili

Abstract:

Energy Efficiency (EE) needs investments, new technologies, greater awareness and management on the part of citizens and organizations, and more planning. However, this issue is usually remembered and discussed only in moments of energy crisis, and opportunities are missed to take better advantage of the potential of EE in the various sectors of the economy. In addition, there is little concern about the subject among the less favored classes, especially in low-income communities. Accordingly, this article presents suggestions for public policies aimed at increasing EE in low-income housing and communities, based on international and national experiences. After reviewing the literature, eight policies were listed, and to evaluate them, a multicriteria decision model was developed using the AHP (Analytic Hierarchy Process) and TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution) methods, combined with fuzzy logic. Nine experts analyzed the policies according to nine criteria: economic impact, social impact, environmental impact, previous experience, difficulty of implementation, possibility/ease of monitoring and evaluating the policies, expected impact, political risks, and public governance and sustainability of the sector. The results, in order of preference, are: (i) incentive program for equipment replacement; (ii) community awareness program; (iii) EE program with a greater focus on low income; (iv) staggered and compulsory certification of social-interest buildings; (v) programs for the expansion of smart metering, energy monitoring and digitalization; (vi) financing program for construction and retrofitting of houses with an emphasis on EE; (vii) income tax deduction for investment in EE projects in low-income households made by companies; (viii) white energy certificates for low-income households. First, the policy of equipment replacement has been employed in Brazil and worldwide and has proven effective in promoting EE.
For implementation, efforts are needed from the federal and state governments, which can encourage companies to reduce prices, and provide some type of aid for the purchase of such equipment. In second place is the community awareness program, promoting socio-educational actions on EE concepts and with energy conservation tips. This policy is simple to implement and has already been used by many distribution utilities in Brazil. It can be carried out through bids defined by the government in specific areas, being executed by third sector companies with public and private resources. Third on the list is the proposal to continue the Energy Efficiency Program (which obliges electric energy companies to allocate resources for research in the area) by suggesting the return of the mandatory investment of 60% of the resources in projects for low income. It is also relatively simple to implement, requiring efforts by the federal government to make it mandatory, and on the part of the distributors, compliance is needed. The success of the suggestions depends on changes in the established rules and efforts from the interested parties. For future work, we suggest the development of pilot projects in low-income communities in Brazil and the application of other multicriteria decision support methods to compare the results obtained in this study.
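The ranking stage of the method can be sketched compactly. The following is an illustrative, non-fuzzy TOPSIS implementation over a hypothetical decision matrix of three policies and three criteria with assumed AHP-derived weights; the study's actual model used eight policies, nine criteria, fuzzy scores, and nine expert raters:

```python
import math

# Hypothetical scores (higher = better) for 3 policies on 3 criteria.
matrix = [
    [8.0, 6.0, 7.0],  # equipment replacement
    [6.0, 9.0, 5.0],  # community awareness
    [7.0, 7.0, 6.0],  # EE program for low income
]
weights = [0.5, 0.3, 0.2]  # assumed AHP weights, summing to 1

# 1. Vector-normalise each criterion column, then apply the weights.
norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(3)]
v = [[weights[j] * row[j] / norms[j] for j in range(3)] for row in matrix]

# 2. Ideal and anti-ideal solutions (all criteria treated as benefits here).
best = [max(col) for col in zip(*v)]
worst = [min(col) for col in zip(*v)]

# 3. Closeness coefficient: distance to anti-ideal / (sum of both distances).
def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

closeness = [dist(row, worst) / (dist(row, best) + dist(row, worst)) for row in v]
ranking = sorted(range(3), key=lambda i: -closeness[i])
print(ranking)
```

Alternatives are ranked by descending closeness coefficient; in the fuzzy variant used in the study, crisp scores are replaced by fuzzy numbers before the same distance computations.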

Keywords: energy efficiency, low-income community, public policy, multicriteria decision making

Procedia PDF Downloads 119
191 The Dark History of American Psychiatry: Racism and Ethical Provider Responsibility

Authors: Mary Katherine Hoth

Abstract:

Despite racial and ethnic disparities in American psychiatry being well documented, there remains an apathetic attitude among nurses and providers within the field toward engaging in active antiracism and providing equitable, recovery-oriented care. It is insufficient to be a “colorblind” nurse or provider and state that all care provided is identical for every patient. Maintaining an attitude of “colorblindness” perpetuates the racism prevalent throughout healthcare and leads to negative patient outcomes. The purpose of this literature review is to highlight how the historical beginnings of psychiatry have evolved into the disparities seen in today’s practice, as well as to provide some insight on methods that providers and nurses can employ to actively participate in challenging these racial disparities. Background: The application of psychiatric medicine to White people versus Black, Indigenous, and other People of Color has been distinctly different as a direct result of chattel slavery and the development of pseudoscientific “diagnoses” in the 19th century. This weaponization of the mental health of Black people continues to this day. Population: The populations discussed are Black, Indigenous, and other People of Color, with a primary focus on Black people’s experiences with their mental health and the field of psychiatry. Methods: A literature review was conducted using the CINAHL, EBSCO, MEDLINE, and PubMed databases with the following terms: psychiatry, mental health, racism, substance use, suicide, trauma-informed care, disparities, and recovery-oriented care. Articles were further filtered based on meeting the criteria of being peer-reviewed, available in full text, written in English, and published between 2018 and 2023. Findings: Black patients are more likely to be diagnosed with psychotic disorders and prescribed antipsychotic medications compared to White patients, who are more often diagnosed with mood disorders and prescribed antidepressants.
This same disparity is also seen in children and adolescents: Black children are more likely to be diagnosed with behavior problems such as Oppositional Defiant Disorder (ODD), while White children with the same presentation are more likely to be diagnosed with Attention-Deficit/Hyperactivity Disorder. Medication advertisements for antipsychotics like Haldol as recently as 1974 portrayed a Black man labeled as “agitated” and “aggressive”, a trope still seen today in police violence cases. The majority of nursing and medical school programs do not provide education on racism and how to actively combat it in practice, leaving many healthcare professionals acutely uneducated and unaware of their own biases and racism, as well as of structural and institutional racism. Conclusions: Racism will continue to grow wherever it is given time, space, and energy. Providers and nurses have an ethical obligation to educate themselves, actively deconstruct their personal racism and bias, and continuously engage in active antiracism by dismantling racism wherever it is encountered, be it structural, institutional, or scientific. Agents of change at the patient care level will not only improve the outcomes of Black patients but will also lead the way in ensuring that Black, Indigenous, and other People of Color are included in future psychiatric research on methods and medications.

Keywords: disparities, psychiatry, racism, recovery-oriented care, trauma-informed care

Procedia PDF Downloads 130
190 Effects of Irrigation Applications during Post-Anthesis Period on Flower Development and Pyrethrin Accumulation in Pyrethrum

Authors: Dilnee D. Suraweera, Tim Groom, Brian Chung, Brendan Bond, Andrew Schipp, Marc E. Nicolas

Abstract:

Pyrethrum (Tanacetum cinerariifolium) is a perennial plant belonging to the family Asteraceae. It is cultivated commercially for the extraction of the natural insecticide pyrethrins, which accumulate in the achenes of its flower heads. Approximately 94% of the pyrethrins are produced within the secretory ducts and trichomes of the achenes of the mature pyrethrum flower. Pyrethrum is the most widely used botanical insecticide in the world, and Australia is currently the world's largest producer. Rainfall in Australia's pyrethrum-growing regions during the flowering period, in late spring and early summer, is significantly low; the lack of adequate soil moisture, combined with elevated temperatures during the post-anthesis period, results in yield reductions. Understanding the yield response of pyrethrum to irrigation is therefore important for its management as a commercial crop, and irrigation management has been identified as a key area of pyrethrum crop management that could be manipulated to increase yield. Pyrethrum is a comparatively drought-tolerant plant with some ability to survive in dry conditions due to its deep rooting. But in dry areas and dry seasons, the crop cannot reach its full yield potential without adequate soil moisture. Irrigation is therefore essential during the flowering period to prevent crop water stress and maximise yield. Irrigation during a water-deficit period results in an overall increased rate of water uptake and growth, which is essential to achieve the maximum yield benefit from commercial crops. The effects of irrigation treatments applied during the post-anthesis period on pyrethrum yield responses were studied using two irrigation methods. The experiment was conducted in a first-harvest commercial pyrethrum field in Waubra, Victoria, during the 2012/2013 season.
Drip irrigation and overhead sprinkler irrigation treatments applied throughout the flowering period were compared with a ‘rainfed’ treatment in relation to flower yield and pyrethrin yield responses. The results showed that the application of 180 mm of irrigation throughout the post-anthesis period, from early flowering to physiological maturity, under the drip irrigation treatment increased pyrethrin concentration by 32%, which, combined with a 95% increase in flower yield, gave a total pyrethrin yield increase of 157% compared to the ‘rainfed’ treatment. In contrast, the overhead sprinkler irrigation treatment increased pyrethrin concentration by 19%, which, combined with a 60% increase in flower yield, gave a total pyrethrin yield increase of 91% compared to the ‘rainfed’ treatment. Irrigation applied throughout the post-anthesis period significantly increased flower yield through an increase in both the number of flowers and flower size. Irrigation provides adequate soil moisture for flower development, which slows the rate of flower development and lengthens the flowering period, resulting in a crop harvest delayed by 11 days compared to the ‘rainfed’ treatment. Overall, irrigation has a major impact on pyrethrin accumulation, increasing both the rate and duration of accumulation and resulting in a higher pyrethrin yield per flower at physiological maturity. The findings of this study will be important for future yield predictions and for developing advanced agronomic strategies to maximise pyrethrin yield in pyrethrum.
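Because pyrethrin yield per area is the product of flower yield and pyrethrin concentration, the percentage gains reported above combine multiplicatively. A one-line check reproduces the reported totals (the sprinkler figure comes out at ~90% before rounding of the inputs):

```python
def total_pyrethrin_increase(conc_pct: float, flower_pct: float) -> float:
    """Pyrethrin yield = concentration x flower mass, so gains combine multiplicatively."""
    return ((1 + conc_pct / 100) * (1 + flower_pct / 100) - 1) * 100

print(round(total_pyrethrin_increase(32, 95)))  # drip irrigation treatment
print(round(total_pyrethrin_increase(19, 60)))  # overhead sprinkler treatment
```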

Keywords: achene, drip irrigation, overhead irrigation, pyrethrin

Procedia PDF Downloads 410
189 From Intuitive to Constructive Audit Risk Assessment: A Complementary Approach to CAATTs Adoption

Authors: Alon Cohen, Jeffrey Kantor, Shalom Levy

Abstract:

The use of the audit risk model in auditing has faced limitations and difficulties, leading auditors to rely on a conceptual level of its application. The qualitative approach to assessing risks has resulted in divergent risk assessments, affecting the quality of audits and decision-making on the adoption of CAATTs. This study aims to investigate the risk factors impacting the implementation of the audit risk model and to propose a complementary risk-based instrument, key risk indicators (KRIs), to form substantive risk judgments and mitigate a heightened risk of material misstatement (RMM). The study addresses the question of how risk factors impact the implementation of the audit risk model, improve risk judgments, and aid in the adoption of CAATTs. The study uses a three-stage scale development procedure involving a pretest and a subsequent study with two independent samples. The pretest involves an exploratory factor analysis, while the subsequent study employs confirmatory factor analysis for construct validation. Additionally, the authors test the ability of the KRIs to predict the audit effort needed to mitigate a heightened RMM. Data were collected through two independent samples involving 767 participants. The collected data were analyzed using exploratory and confirmatory factor analysis to assess scale validity and construct validation. The suggested KRIs, comprising two risk components and seventeen risk items, are found to have high predictive power in determining the audit effort needed to reduce the RMM. The study validates the suggested KRIs as an effective instrument for risk assessment and for decision-making on the adoption of CAATTs. This study contributes to the existing literature by implementing a holistic approach to risk assessment and providing a quantitative expression of assessed risks. It bridges the gap between intuitive risk evaluation and the theoretical domain, clarifying the mechanism of risk assessments.
It also helps improve the uniformity and quality of risk assessments, aiding audit standard-setters in issuing updated guidelines on CAATT adoption. A few limitations and recommendations for future research should be mentioned. First, the process of developing the scale was conducted in the Israeli auditing market, which follows the International Standards on Auditing (ISAs). Although ISAs are adopted in European countries, for greater generalization, future studies could focus on other countries that adopt additional or local auditing standards. Second, this study revealed risk factors that have a material impact on the assessed risk. However, there could be additional risk factors that influence the assessment of the RMM. Therefore, future research could investigate other risk segments, such as operational and financial risks, to bring a broader generalizability to our results. Third, although the sample size in this study fits acceptable scale development procedures and enables drawing conclusions from the body of research, future research may develop standardized measures based on larger samples to reduce the generation of equivocal results and suggest an extended risk model.
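As a rough illustration of how a two-component, seventeen-item instrument of this kind might be operationalized, the sketch below aggregates hypothetical item ratings into component scores and an overall KRI. The item groupings, the equal weighting, and the linear mapping to audit hours are all assumptions for illustration, not the instrument validated in the study:

```python
# Hypothetical: 17 risk items rated on a 1-5 scale by an auditor.
items = {f"item_{i}": score for i, score in enumerate(
    [4, 3, 5, 2, 4, 3, 4, 5, 2, 3, 4, 3, 2, 5, 4, 3, 4], start=1)}

# Assumed split into the two risk components reported in the abstract.
component_a = [f"item_{i}" for i in range(1, 10)]
component_b = [f"item_{i}" for i in range(10, 18)]

def component_score(keys):
    # Mean rating over the component's items.
    return sum(items[k] for k in keys) / len(keys)

# Overall KRI as an equally weighted average of the two components.
kri = 0.5 * component_score(component_a) + 0.5 * component_score(component_b)

# A higher KRI would then predict greater planned audit effort, e.g. via a
# hypothetical linear mapping calibrated around a baseline of 100 hours.
audit_hours = 100 * (1 + (kri - 3) / 2)
print(round(kri, 2), round(audit_hours, 1))
```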

Keywords: audit risk model, audit efforts, CAATTs adoption, key risk indicators, sustainability

Procedia PDF Downloads 77
188 Community Strengths and Indigenous Resilience as Drivers for Health Reform Change

Authors: Shana Malio-Satele, Lemalu Silao Vaisola Sefo

Abstract:

Introductory Statement: South Seas Healthcare is Ōtara’s largest Pacific health provider in South Auckland, New Zealand. Our vision is excellent health and well-being for Pacific people and all communities through strong Pacific values. During the DELTA and Omicron outbreak of COVID-19, our Pacific people, indigenous Māori, and the community of South Auckland were disproportionately affected and faced significant hardship with existing inequities magnified. This study highlights the community-based learnings of harnessing community-based strengths such as indigenous resilience, family-informed experiences and stories that provide critical insights that inform health reform changes that will be sustainable and equitable for all indigenous populations. This study is based on critical learnings acquired during COVID-19 that challenge the deficit narrative common in healthcare about indigenous populations. This study shares case studies of marginalised groups and religious groups and the successful application of indigenous cultural strengths, such as collectivism, positive protective factors, and using trusted relationships to create meaningful change in the way healthcare is delivered. The significance of this study highlights the critical conditions needed to adopt a community-informed way of creating integrated healthcare that works and the role that the community can play in being part of the solution. Methodologies: Key methodologies utilised are indigenous and Pacific-informed. To achieve critical learnings from the community, Pacific research methodologies, heavily informed by the Polynesian practice, were applied. 
Specifically, these include: Teu Le Va (understanding the importance of trusted relationships as a way of creating positive health solutions); the Fonofale methodology (a way of understanding how health incorporates culture, family, the physical, spiritual, mental and other dimensions of health, as well as time, context and environment); the Fonua methodology (understanding the overall wellbeing and health of communities, families and individuals, their holistic needs and environmental factors); and the Talanoa methodology (researching through conversation, where the individual and community are understood through the stories of their history and future). Major Findings: Key findings in the study included: 1. The collectivist approach in the community is a strengths-based response specific to populations, highlighting the importance of trusted relationships and cultural values in achieving meaningful outcomes. 2. The development of a “village model” identified critical components for achieving health reform change: system navigation, a culturally responsive sense of service, critical leadership roles, culturally appropriate support, and the ability to influence system enablers to support an alternative way of working. Concluding Statement: There is a strong connection between community-based strengths being implemented in healthcare strategies and reforms and the sustainable success of indigenous populations and marginalised communities in accessing services that are cohesive, equitably resourced, accessible and meaningful for families. This study highlights the successful community-informed approaches and practices used during the COVID-19 response in New Zealand that are now being implemented in the current health reform.

Keywords: indigenous voice, community voice, health reform, New Zealand

Procedia PDF Downloads 91
187 Automatic Adult Age Estimation Using Deep Learning of the ResNeXt Model Based on CT Reconstruction Images of the Costal Cartilage

Authors: Ting Lu, Ya-Ru Diao, Fei Fan, Ye Xue, Lei Shi, Xian-e Tang, Meng-jun Zhan, Zhen-hua Deng

Abstract:

Accurate adult age estimation (AAE) is a significant and challenging task in the forensic and archaeological fields. Attempts have been made to explore optimal adult age metrics, and the rib is considered a potential age marker. The traditional approach is to extract age-related features designed by experts from macroscopic or radiological images, followed by classification or regression analysis. Those results have still not met the high-level requirements of practice, and a limitation of manual feature design and extraction is the loss of information, since such features are unlikely to be designed explicitly for extracting age-relevant information. Deep learning (DL) has recently garnered much interest in image analysis and computer vision. It enables learning important features without a prior bias or hypothesis and could therefore support AAE. This study aimed to develop DL models for AAE based on CT images and compare their performance to a manual visual scoring method. Chest CT data were reconstructed using volume rendering (VR). Retrospective data of 2500 patients aged 20.00-69.99 years were obtained between December 2019 and September 2021. Five-fold cross-validation was performed, and datasets were randomly split into training and validation sets in a 4:1 ratio for each fold. Before being fed into the networks, all images were augmented with random rotation and vertical flip, normalized, and resized to 224×224 pixels. ResNeXt was chosen as the DL baseline due to its higher efficiency and accuracy in image classification. Mean absolute error (MAE) was the primary evaluation metric. Independent data from 100 patients acquired between March and April 2022 were used as a test set. The manual method followed a prior study, which reported the lowest MAEs (5.31 in males and 6.72 in females) among similar studies. CT data and VR images were used.
The radiation density of the first costal cartilage was recorded from the CT data on the workstation. The osseous and calcified projections of the first to seventh costal cartilages were scored on the VR images using an eight-stage staging technique. Following the prior study, the optimal manual models were a decision tree regression model in males and a stepwise multiple linear regression equation in females. Predicted ages for the test set were calculated separately using the different models by sex. A total of 2600 patients (training and validation sets, mean age = 45.19 years ± 14.20 [SD]; test set, mean age = 46.57 ± 9.66) were evaluated in this study. In ResNeXt model training, MAEs of 3.95 in males and 3.65 in females were obtained. On the test set, DL achieved MAEs of 4.05 in males and 4.54 in females, far better than the manual method's MAEs of 8.90 and 6.42, respectively. These results show that the DL ResNeXt model outperformed the manual method in AAE based on CT reconstruction of the costal cartilage, and the developed system may be a supportive tool for AAE.
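The primary metric is straightforward to compute: MAE is the mean absolute difference between predicted and chronological age in years. A minimal sketch with hypothetical ages (not the study's data):

```python
def mean_absolute_error(y_true, y_pred):
    """MAE = average of |predicted age - chronological age|, in years."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical chronological and model-predicted ages for six test cases.
true_ages = [23.5, 34.0, 45.2, 51.8, 60.3, 67.1]
dl_preds  = [27.0, 30.5, 48.0, 55.0, 56.2, 63.0]

print(round(mean_absolute_error(true_ages, dl_preds), 2))
```

A lower MAE means the model's age estimates deviate less, on average, from the true ages, which is how the DL and manual methods are compared above.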

Keywords: forensic anthropology, age determination by the skeleton, costal cartilage, CT, deep learning

Procedia PDF Downloads 74
186 3D Structuring of Thin Film Solid State Batteries for High Power Demanding Applications

Authors: Alfonso Sepulveda, Brecht Put, Nouha Labyedh, Philippe M. Vereecken

Abstract:

High energy and power density are the main requirements of today's demanding applications in consumer electronics. Lithium-ion batteries (LIBs) have the highest energy density of all known systems and are thus the best choice for rechargeable micro-batteries. Liquid-electrolyte LIBs present limitations in safety, size and design; thin film all-solid-state batteries are therefore predominantly considered to overcome these restrictions in small devices. Although planar all-solid-state thin film LIBs are at present commercially available, they have low capacity (<1 mAh/cm²), which limits their application scenarios. By using micro- or nanostructured surfaces (i.e., 3D batteries) and appropriate conformal coating technology (e.g., electrochemical deposition, ALD), the capacity can be increased while still keeping a high rate performance. The main challenges in the introduction of solid-state LIBs are low ionic conductance and limited cycle lifetime due to mechanical stress and shearing interfaces. Novel materials and innovative nanostructures have to be explored in order to overcome these limitations. Thin film 3D-compatible materials need to meet the requirements for functional and viable thin film stacks. Thin film electrodes offer shorter Li-diffusion paths and high gravimetric and volumetric energy densities, which allow them to be used at ultra-fast charging rates while retaining their full capacities. Thin film electrolytes with intrinsically high ionic conductivity (~10⁻³ S/cm) do exist but are not electrochemically stable. On the other hand, electronically insulating electrolytes with a large electrochemical window and good chemical stability are known, but they typically have intrinsically low ionic conductivities (<10⁻⁶ S/cm). In addition, there is a need for conformal deposition techniques that can offer pinhole-free coverage over large surface areas with high-aspect-ratio features for electrode, electrolyte and buffer layers.
To tackle the scaling of electrodes and the conformal deposition requirements of future 3D batteries, we study LiMn2O4 (LMO) and Li4Ti5O12 (LTO). These materials are among the most interesting electrode candidates for thin film batteries, offering low cost, low toxicity, high voltage and high capacity. LMO and LTO are considered 3D-compatible materials since they can be prepared through conformal deposition techniques. Here, we show the scaling effects on the rate performance and cycling stability of thin film LMO cathode layers created by RF sputtering. Planar LMO thin films below 100 nm have been electrochemically characterized. The thinnest films show the highest volumetric capacity and the best cycling stability. The increased stability of films below 50 nm allows cycling in both the 4 V and 3 V potential regions, resulting in a high volumetric capacity of 1.2 Ah/cm³. The creation of LTO anode layers through post-lithiation of TiO2 is also demonstrated. Planar LTO thin films below 100 nm have been electrochemically characterized; a 70 nm film retains 85% of its original capacity after 100 (dis)charging cycles at 10C. These layers can be implemented in high-aspect-ratio structures. IMEC develops high-aspect-ratio Si pillar arrays, which form the basis for the advancement of 3D thin film all-solid-state batteries in future technologies.
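The cycling-stability figure implies an average per-cycle capacity fade that can be back-calculated, assuming (as a simplification not stated in the abstract) constant exponential fade per cycle:

```python
# "85% capacity after 100 cycles" under an assumed constant-fade model.
retention_100 = 0.85
per_cycle = retention_100 ** (1 / 100)          # retention factor per cycle
fade_per_cycle_pct = (1 - per_cycle) * 100

def capacity_after(n_cycles: int) -> float:
    """Fraction of original capacity after n cycles, assuming constant fade."""
    return per_cycle ** n_cycles

print(round(fade_per_cycle_pct, 3))         # ~0.162% capacity lost per cycle
print(round(capacity_after(500) * 100, 1))  # projected retention after 500 cycles
```

Real degradation is rarely perfectly exponential, so the 500-cycle projection is only indicative of the trend.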

Keywords: Li-ion rechargeable batteries, thin film, nanostructures, rate performance, 3D batteries, all-solid state

Procedia PDF Downloads 338
185 Improved Anatomy Teaching by the 3D Slicer Platform

Authors: Ahmedou Moulaye Idriss, Yahya Tfeil

Abstract:

Medical imaging technology has become an indispensable tool in many branches of the biomedical and health fields and of research, and it is vitally important for the training of professionals in these fields. This training project is not only about the tools, technologies, and knowledge provided but also about the community it proposes. In order to raise the level of anatomy teaching in the medical school of Nouakchott in Mauritania, it is necessary, and even urgent, to facilitate access to modern technology for African countries. The role of technology as a key driver of sustainable development has long been recognized. Anatomy is an essential discipline for the training of medical students and a key element in the training of medical specialists. The quality and results of the work of a young surgeon depend on a thorough knowledge of anatomical structures. The teaching of anatomy is difficult, and the discipline is being neglected by medical students in many academic institutions. However, anatomy remains a vital part of any medical education program. When anatomy is presented in various planes, medical students report difficulties in understanding: they do not develop the ability to visualize and mentally manipulate 3D structures, and they are sometimes unable to correctly identify neighbouring or associated structures. This is the case, for example, when they have to identify structures related to the caudate lobe when the liver is moved to different positions. In recent decades, modern educational tools using digital sources have tended to replace older methods. One of the main reasons for this change is the lack of cadavers in laboratories with poorly qualified staff. The emergence of increasingly sophisticated mathematical models, image processing, and visualization tools in biomedical imaging research has enabled sophisticated three-dimensional (3D) representations of anatomical structures.
In this paper, we report our current experience at the Faculty of Medicine in Nouakchott, Mauritania. One of our main aims is to create a local learning community in the field of anatomy. The main technological platform used in this project is 3D Slicer, an open-source application, available for free, for the visualization and analysis of, and interaction with, biomedical imaging data. Using the 3D Slicer platform, we created, from real medical images, anatomical atlases of parts of the human body, including the head, thorax, abdomen, liver, pelvis, and upper and lower limbs. Data were collected from several local hospitals and also from the web. We used MRI and CT scan imaging data from children and adults. Many different anatomy atlases exist, in both print and digital forms. An anatomy atlas displays three-dimensional anatomical models, image cross-sections of labelled structures with the source radiological imaging, and a text-based hierarchy of structures. The open and free online anatomical atlases developed by our anatomy laboratory team will be available to our students. This will allow pedagogical autonomy and remedy current shortcomings, responding more fully to the objectives of sustainable local development of quality education and good health at the national level. To make this work a reality, our team has produced several atlases, available in our faculty in the form of research projects.

Keywords: anatomy, education, medical imaging, three dimensional

Procedia PDF Downloads 244
184 Regenerating Habitats. A Housing Based on Modular Wooden Systems

Authors: Rui Pedro de Sousa Guimarães Ferreira, Carlos Alberto Maia Domínguez

Abstract:

Despite the ambition to achieve climate neutrality by 2050 and fulfill the Paris Agreement's goals, the building and construction sector remains one of the most resource-intensive and greenhouse-gas-emitting industries in the world, accounting for 40% of worldwide CO₂ emissions. Over the past few decades, globalization and population growth have led to an exponential rise in demand in the housing market and, by extension, in the building industry. Considering this housing crisis, it is obvious that we will not stop building in the near future. However, the transition, which has already started, is challenging and complex because it calls for the worldwide participation of numerous organizations in altering how building systems, which have been a part of our everyday existence for over a century, are used. Wood is one of the most frequently used alternatives nowadays (under responsible forestry conditions) because of its physical qualities and, most importantly, because it produces fewer carbon emissions during manufacturing than steel or concrete. Furthermore, as wood retains its capacity to store CO₂ after application and throughout the life of the building, working as a natural carbon sink, it helps to reduce greenhouse gas emissions. After a century-long focus on other materials, technological advancements in the last few decades have made it possible to innovate systems centered on the use of wood. However, some questions still require further exploration. It is necessary to standardize production and manufacturing processes based on prefabrication and modularization principles to achieve greater precision and optimization of the solutions, decreasing building time, prices, and raw material waste. In addition, this approach will make it possible to develop new architectural solutions to address the rigidity and irreversibility of buildings, two of the most important issues facing housing today.
Most current models are still created as inflexible, fixed, monofunctional structures that discourage any kind of regeneration, based on matrices that sustain the traditional model of the conventional family and founded on rigid, impenetrable compartmentalization. Adaptability and flexibility in housing are, and always have been, necessities and key components of architecture. People today need to constantly adapt to their surroundings and to themselves because of the fast-paced, disposable, and quickly obsolescent nature of modern life. Migrations on a global scale, different kinds of co-housing, and even personal changes are some of the new questions that buildings have to answer. Designing with the reversibility of construction systems and materials in mind not only allows for the concept of "looping" in construction, with environmental advantages that enable the development of a circular economy in the sector, but also unlocks multiple social benefits. In this sense, it is imperative to develop prefabricated and modular construction systems able to formalize a reversible proposition that adjusts to the scale of time and its multiple, often unpredictable, reformulations. We must allow buildings to change, grow, or shrink over their lifetime, respecting their nature and, ultimately, the nature of the people living in them. The ability to anticipate the unexpected, adapt to social factors, and take account of demographic shifts in society to stabilize communities is the foundation of real, innovative sustainability.

Keywords: modular, timber, flexibility, housing

Procedia PDF Downloads 80
183 A Regulator's Assessment of Consumer Risk When Evaluating a User Test for an Umbrella Brand Name in an Over-the-Counter Medicine

Authors: A. Bhatt, C. Bassi, H. Farragher, J. Musk

Abstract:

Background: All medicines placed on the EU market are legally required to be accompanied by labelling and a package leaflet, which provide comprehensive information enabling their safe and appropriate use. Mock-ups, together with the results of assessments using a target patient group, must be submitted with a marketing authorisation application. Consumers need confidence in non-prescription (OTC) medicines in order to manage their minor ailments, and umbrella brands assist purchasing decisions by enabling easy identification within a particular therapeutic area. A number of regulatory agencies have risk management tools and guidelines to assist in developing umbrella brands for OTC medicines; however, assessment and decision making are subjective and inconsistent. This study presents an evaluation in the UK following the US FDA warning concerning methaemoglobinaemia after 21 reported cases (11 in children under 2 years) caused by OTC oral analgesics containing benzocaine. Methods: A standard face-to-face testing methodology of 25 structured, task-based user interviews, using a standard questionnaire and rating scale, was conducted independently with consumers aged 15-91 years in their homes between June and October 2015. It evaluated whether individuals could discriminate between the labelling, safety information, and warnings on the cartons and PILs of 3 different OTC medicine packs carrying the same umbrella name. Each pack was presented with a differing information hierarchy, using different coloured cartons and containing 3 different active ingredients: benzocaine (oromucosal spray) and two lozenges containing 2,4-dichlorobenzyl alcohol with amylmetacresol, and hexylresorcinol, respectively (for the symptomatic relief of sore throat pain). 
The test was designed to determine whether the warnings on the carton and leaflet were sufficiently prominent and accessible to alert users that one product contained benzocaine, carrying a risk of methaemoglobinaemia, and to refer them to the leaflet for the signs of the condition and what to do should it occur. Results: Two consumers did not locate the warnings on the side of the pack, eventually finding them on the back, and two suggestions were made to further improve the accessibility of the methaemoglobinaemia warning. With a gold pack design for the oromucosal spray, all consumers could differentiate between the 3 drugs, the minimum-age particulars, the pharmaceutical form, and the risk of methaemoglobinaemia. The warnings for benzocaine were deemed clear or very clear; the appearance of the 3 packs was either very well or quite well differentiated. The PIL test passed on all criteria. All consumers could use the product correctly and identify risk factors, ensuring that the critical information necessary for safe use was legible and easily accessible so that confusion and errors were minimised. Conclusion: Patients with known methaemoglobinaemia are likely to be vigilant in checking for benzocaine-containing products, despite similar umbrella brand names across a range of active ingredients. Despite these findings, the package design and spray format were not deemed sufficient to mitigate the potential safety risks associated with differences in target populations and contraindications when submitted to the Regulatory Agency. Although risk management tools are increasingly used by agencies to provide objective assurance of package safety, further transparency, reduced subjectivity, and proportionate risk assessment should be demonstrated.

Keywords: labelling, OTC, risk, user testing

Procedia PDF Downloads 309
182 A Postmodern Framework for Quranic Hermeneutics

Authors: Christiane Paulus

Abstract:

Post-Islamism assumes that the Quran should not be viewed in terms of what Lyotard identifies as a 'meta-narrative'. However, its socio-ethical content can be viewed as critical of power discourse (Foucault). Practicing religion seems to be limited to rites and individual spirituality, taqwa. Alternatively, can we build on Muhammad Abduh's classic-modern reform and develop it through a postmodernist frame? This is the main question of this study. Through his general and vague remarks on the context of the Quran, Abduh was the first to refer to the historical and cultural distance of the text as an obstacle to interpretation. His application, however, corresponded to the modern absolute idea of an authentic sharia. He was followed by Amin al-Khuli, who hermeneutically linked the content of the Quran to the theory of evolution. Fazlur Rahman and Nasr Hamid Abu Zayd remained reluctant to go beyond the general level in terms of context. The hermeneutic circle therefore persists as a challenge: how to get out and overcome one's own assumptions. Insight into, and acceptance of, the lasting ambivalence of understanding can be grasped as a postmodern approach; it is documented in Derrida's discovery of the shift in text meanings, différance, and in Lyotard's theory of the différend. The resulting mixture of meanings (Wolfgang Welsch) can be read together with the classic ambiguity of the premodern interpreters of the Quran (Thomas Bauer). Confronting hermeneutic difficulties in general, Niklas Luhmann shows that every description is an attribution, a tautology, i.e., it remains within the circle. 'De-tautologization' is possible, namely by analyzing the distinctions, in the sense of objective, temporal, and social information, that every text contains. This could be expanded with the Kantian aesthetic dimension of reason (the Critique of Judgment), corresponding to the i'jaz of the Quran. 
Luhmann asks, 'What distinction does the observer/author make?' The Quran, as speech from God to its first listeners, could be seen as a discourse responding to the problems of everyday life of that time, which can be viewed as the general aim of the entire Quran. Through reconstructing Quranic lifeworlds (Alfred Schütz) in detail, the social structure crystallizes the socio-economic differences and the enormous poverty. The Quranic instruction to provide for the basic needs of neglected groups, which often intersect (the old, the poor, slaves, women, children), can be seen immediately in the text. First, the references to lifeworlds/social problems and discourses in longer Quranic passages should be hypothesized. Subsequently, information could be extracted from the classic commentaries; the classical tafsir, in particular, contains rich narrative material for such reconstruction. By selecting and assigning suitable, specific context information, the meaning of the description becomes condensed (Clifford Geertz). In this manner, the text necessarily acquires a certain alienation and becomes newly accessible. The socio-ethical implications can thus be grasped from the difference between the original problem and the revealed/improved order or procedure; this small step can be materialized as such, not as an absolute solution but as offering plausible patterns for today's challenges, such as Agenda 2030.

Keywords: postmodern hermeneutics, condensed description, sociological approach, small steps of reform

Procedia PDF Downloads 221
181 Learning Recomposition after the Remote Period with Finalist Students of the Technical Course in Environment at the IFPA, Paragominas Campus, Pará State, Brazilian Amazon

Authors: Liz Carmem Silva-Pereira, Raffael Alencar Mesquita Rodrigues, Francisco Helton Mendes Barbosa, Emerson de Freitas Ferreira

Abstract:

Due to the Covid-19 pandemic, declared in March 2020 by the World Health Organization, social coexistence across the planet was affected, especially in educational processes, through the implementation of remote teaching as a strategy. This teaching-learning modality changed the routine and learning of basic education students, with serious consequences for the return to face-to-face teaching in 2021. In 2022, finalist students at the Federal Institute of Education, Science and Technology of Pará (IFPA) – Campus Paragominas had their training process severely affected, having studied the initial half of their training in the remote modality, which compromised the practical classes, technical visits, and field classes essential to the formation of an environmental technician. With the objective of promoting the recomposition of these students' learning after the return to the face-to-face modality, an educational strategy was developed in the last period of the course. The teaching methodologies used were research as an educational principle, the integrative project, and parallel recovery, applied jointly, aiming to recompose basic knowledge of the natural sciences together with the technical knowledge of the environmental area applied to the course. The project assisted 58 finalist students of the environmental technical course. A research instrument was elaborated with parameters for evaluating environmental quality at 19 collection points in the urban hydrographic basin of the Uraim River, in Paragominas City, Pará, Brazilian Amazon. Students were separated into groups under the supervision of professors and laboratory assistants and, in the field, observed and evaluated the environmental conditions of the sites and collected physical data and water samples, which were taken to the chemistry and biology laboratories at Campus Paragominas for further analysis. 
With the results obtained, each group prepared a technical report on the environmental conditions of each evaluated point. This working methodology enabled the practical application of the theoretical knowledge received in various disciplines during the remote teaching modality, integrating knowledge, people, skills, and abilities for the best technical training of the finalist students. At the end of the activity, the students' satisfaction with the project was evaluated through a form, with signed informed consent, using a Likert scale as the evaluation parameter. The satisfaction survey yielded the following results: 82% satisfaction with the use of research projects within the disciplines attended; 84% with the revision of contents during the execution of the project; 76.9% with the field experience acquired; 86.2% with the laboratory experience; and 71.8% with the use of this methodology as parallel recovery. In addition to the students' excellent performance in acquiring knowledge, it was possible to remedy the deficiencies caused by the absence of practical classes, technical visits, and field classes during the remote teaching modality, fulfilling the desired educational recomposition.

Keywords: integrative project, parallel recovery, research as an educational principle, teaching-learning

Procedia PDF Downloads 66