Search results for: flat panel display
229 Investigation of Alumina Membrane Coated Titanium Implants on Osseointegration
Authors: Pinar Erturk, Sevde Altuntas, Fatih Buyukserin
Abstract:
For effective integration between an implant and bone, implant surfaces should resemble bone-tissue surfaces; in particular, mimicking the chemical, mechanical, and topographic properties of bone is crucial for fast and effective osseointegration. Titanium-based biomaterials are preferred in clinical use, and several studies have coated such implants with oxide layers whose chemical and nanotopographic properties stimulate cell interactions for enhanced osseointegration. Current implantations show low success rates, especially in craniofacial applications, which involve large and vital zones; oxide-layer coatings increase bone-implant integration, providing long-lasting implants that do not require revision surgery. Our aim in this study is to examine bone-cell behavior on titanium implants carrying an anodic aluminum oxide (AAO) layer, with respect to osseointegration potential in large defect zones where spontaneous healing is difficult. In our study, aluminum-coated titanium surfaces were anodized in sulfuric, phosphoric, and oxalic acid, the most commonly used AAO anodization electrolytes. After morphological, chemical, and mechanical characterization of the AAO-coated Ti substrates, the viability, adhesion, and mineralization of adult bone cells on these substrates were analyzed. In addition, using atomic layer deposition (ALD) as a sensitive and conformal technique, these surfaces were coated with pure alumina (5 nm), so that cell studies could also be performed on ALD-coated nanoporous oxide layers with suppressed ionic content. Lastly, to investigate the effect of topography on cell behavior, flat non-porous alumina layers formed by ALD on silicon wafers were compared with the porous ones. Cell viability was similar among the anodized surfaces, but pure-alumina-coated titanium and anodized surfaces showed higher viability than bare titanium and bare anodized ones.
Alumina-coated titanium surfaces anodized in phosphoric acid showed significantly different mineralization after 21 days compared with bare titanium and with titanium surfaces anodized in the other electrolytes. Bare titanium had the second-highest mineralization ratio, whereas titanium anodized in oxalic acid showed the lowest mineralization. No significant difference was observed between bare titanium and the anodized surfaces, except for the AAO titanium surface anodized in phosphoric acid. The osteogenic activities of these cells at the genetic level are currently being investigated by quantitative real-time polymerase chain reaction (qRT-PCR) analysis of the RUNX-2, VEGF, OPG, and osteopontin genes. Western blot will then be used to detect the proteins produced by these genes. Acknowledgment: The project is supported by The Scientific and Technological Research Council of Turkey.
Keywords: alumina, craniofacial implant, MG-63 cell line, osseointegration, oxalic acid, phosphoric acid, sulphuric acid, titanium
Procedia PDF Downloads 131
228 Organic Light Emitting Devices Based on Low Symmetry Coordination Structured Lanthanide Complexes
Authors: Zubair Ahmed, Andrea Barbieri
Abstract:
The need to reduce energy consumption has prompted a considerable research effort to develop alternative energy-efficient lighting systems to replace conventional light sources (i.e., incandescent and fluorescent lamps). Organic light emitting device (OLED) technology offers the distinctive possibility of fabricating large-area flat devices by vacuum or solution processing. Lanthanide β-diketonate complexes, owing to the unique photophysical properties of Ln(III) ions, have been explored as emitting layers in OLED displays and in solid-state lighting (SSL) to achieve high efficiency and color purity. For such applications, excellent photoluminescence quantum yield (PLQY) and stability are the two key requirements, and both can be addressed by selecting the proper organic ligands around the Ln ion in the coordination sphere. Regarding strategies to enhance the PLQY, the most common is the suppression of radiationless deactivation pathways caused by high-frequency oscillators (e.g., O–H, C–H groups) around the Ln centre. Recently, a different approach to maximizing the PLQY of Ln(β-DKs) has been proposed, named 'Escalate Coordination Anisotropy' (ECA). It is based on the assumption that coordinating the Ln ion with different ligands breaks the centrosymmetry of the molecule, making transitions less forbidden (loosening the constraints of the Laporte rule). OLEDs based on such complexes exist, but with low efficiency and stability; to obtain efficient devices, new Ln complexes with enhanced PLQYs and stabilities are needed. For this purpose, Ln complexes, both visible- and near-infrared (NIR)-emitting, with various coordination structures based on fluorinated/non-fluorinated β-diketones and O/N-donor neutral ligands, were synthesized using a one-step in situ method.
In this method, the β-diketones, base, LnCl₃·nH₂O, and neutral ligands were mixed in a 3:3:1:1 molar ratio in ethanol, giving air- and moisture-stable complexes. The complexes were characterized by elemental analysis, NMR spectroscopy, and single-crystal X-ray diffraction. Their photophysical properties were then studied to select the best complexes for the fabrication of stable and efficient OLEDs. Finally, OLEDs were fabricated and investigated using these complexes as emitting layers, together with other organic layers: NPB, N,N′-di(1-naphthyl)-N,N′-diphenyl-(1,1′-biphenyl)-4,4′-diamine (hole-transport layer); BCP, 2,9-dimethyl-4,7-diphenyl-1,10-phenanthroline (hole blocker); and Alq₃ (electron-transport layer). The layers were sequentially deposited under high vacuum by thermal evaporation onto ITO glass substrates. Moreover, co-deposition techniques were used to improve charge transport in the devices and to avoid quenching phenomena. The devices show strong electroluminescence at 612, 998, 1064, and 1534 nm, corresponding to ⁵D₀ → ⁷F₂ (Eu), ²F₅/₂ → ²F₇/₂ (Yb), ⁴F₃/₂ → ⁴I₉/₂ (Nd), and ⁴I₁₃/₂ → ⁴I₁₅/₂ (Er). All the devices fabricated show good efficiency as well as stability.
Keywords: electroluminescence, lanthanides, paramagnetic NMR, photoluminescence
Procedia PDF Downloads 121
227 Interpreter Scholarship Program That Improves Language Services in New South Wales: A Participatory Action Research Approach
Authors: Carly Copolov, Rema Nazha, Sahba C. Delshad, George Bisas
Abstract:
In New South Wales (NSW), Australia, more than 275 languages and dialects are spoken. Interpreters play an indispensable role in our multicultural society by ensuring that the people of NSW all enjoy the same opportunities. The NSW Government offers scholarships to enable people who speak in-demand and high-priority languages to become eligible to practice as interpreters. The NSW Interpreter Scholarship Program was launched in January 2019, targeting priority languages from new and emerging, as well as existing, language communities. The program offers fully funded scholarships to study at Technical and Further Education (TAFE), receive National Accreditation Authority for Translators and Interpreters (NAATI) certification, and be mentored and gain employment with the interpreter panel of Multicultural NSW. A Participatory Action Research approach was adopted to challenge the current system through which people become practicing interpreters in NSW. There were over 800 metro Sydney applications and close to 200 regional applications. Three courses were run through TAFE NSW (two in metro Sydney and one in regional NSW), and thirty-nine students graduated from the program in 2019. The first metro Sydney location had 18 graduates complete the course in Assyrian, Burmese, Chaldean, Kurdish-Kurmanji, Nepali, and Tibetan. The second metro Sydney location had 9 graduates complete the course in Tongan, Kirundi, Mongolian, and Italian. The regional location had 12 graduates complete the course, drawn from new and emerging language communities such as Kurdish-Kurmanji, Burmese, Zomi Chin, Hakha Chin, and Tigrinya. The findings showed that students were very positive about the program: the large majority said they were satisfied with the course content, felt prepared for the NAATI test at the conclusion of the course, and would definitely recommend the program to their friends. Also, 18 students from the 2019 cohort signed up to receive further mentoring by experienced interpreters.
In 2020, it is anticipated that three courses will be run through TAFE NSW (two in regional NSW and one in metro Sydney) to reflect the needs of new and emerging language communities settling in regional areas. In conclusion, the NSW Interpreter Scholarship Program has been shown to improve the supply, quality, and use of language services in NSW, Australia, so that people who speak in-demand and high-priority languages have better access to crucial government services.
Keywords: interpreting, emerging communities, scholarship program, Sydney
Procedia PDF Downloads 146
226 A New Method Separating Relevant Features from Irrelevant Ones Using Fuzzy and OWA Operator Techniques
Authors: Imed Feki, Faouzi Msahli
Abstract:
Selecting relevant parameters from a high-dimensional process-operation setting space is a problem frequently encountered in industrial process modelling. This paper presents a method for selecting the most relevant fabric physical parameters for each sensory quality feature. The proposed relevancy criterion has been developed using two approaches. The first uses a fuzzy sensitivity criterion, exploiting the relationship in experimental data between the physical parameters and all the sensory quality features for each evaluator; an OWA aggregation procedure is then applied to aggregate the ranking lists provided by the different evaluators. In the second approach, another panel of experts provides ranking lists of physical features according to their professional knowledge. By again applying OWA and a fuzzy aggregation model, the data-sensitivity-based ranking list and the knowledge-based ranking list are combined using our proposed percolation technique to determine the final ranking list. The key idea of the percolation technique is to filter the relevant features automatically and objectively by creating a gap between the scores of relevant and irrelevant parameters. It generates thresholds automatically, which effectively reduces the human subjectivity and arbitrariness involved in choosing thresholds manually. For a specific sensory descriptor, the threshold is defined systematically by iteratively aggregating (n times) the ranking lists generated by the OWA and fuzzy models, according to a specific algorithm. We applied the percolation technique to a real example, a well-known finished textile product (stonewashed denim, whose sensory quality is usually considered among the most important criteria in jeans evaluation), to separate the relevant physical features from irrelevant ones for each sensory descriptor.
The originality and performance of the proposed relevant-feature selection method are shown by the variability in the number of physical features in the selected sets. Instead of selecting identical numbers of features with a predefined threshold, the method adapts to the specific nature of the complex relations between sensory descriptors and physical features, proposing relevant-feature lists of different sizes for different descriptors. To obtain more reliable results, the percolation technique was applied to combine the fuzzy global relevancy and OWA global relevancy criteria, clearly distinguishing the scores of relevant physical features from those of irrelevant ones.
Keywords: data sensitivity, feature selection, fuzzy logic, OWA operators, percolation technique
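As a rough illustration of the aggregation-and-cut idea described above, the following Python sketch applies an OWA operator to hypothetical evaluator scores and then places the relevancy threshold at the largest gap between consecutive aggregated scores. The parameter names, scores, and position weights are invented for illustration, and the single-pass cut only approximates the paper's percolation algorithm, which iterates the aggregation n times.

```python
def owa(scores, weights):
    """Ordered Weighted Averaging: weights attach to rank positions
    (largest score first), not to particular evaluators."""
    ordered = sorted(scores, reverse=True)
    return sum(w * s for w, s in zip(weights, ordered))

# Hypothetical relevancy scores (0-1) given by three evaluators to four
# fabric physical parameters, for one sensory descriptor.
scores = {
    "thickness":   [0.9, 0.8, 0.85],
    "bending":     [0.7, 0.75, 0.6],
    "friction":    [0.3, 0.2, 0.25],
    "compression": [0.1, 0.15, 0.2],
}
weights = [0.5, 0.3, 0.2]   # position weights, summing to 1

aggregated = {p: owa(s, weights) for p, s in scores.items()}

# Percolation-style cut: rank parameters by aggregated score and place
# the threshold at the largest gap between consecutive scores.
ranked = sorted(aggregated.items(), key=lambda kv: kv[1], reverse=True)
gaps = [ranked[i][1] - ranked[i + 1][1] for i in range(len(ranked) - 1)]
cut = gaps.index(max(gaps)) + 1
relevant = [p for p, _ in ranked[:cut]]
```

Because the cut falls wherever the score gap happens to be largest, different sensory descriptors naturally yield relevant-feature lists of different sizes, as noted above.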
Procedia PDF Downloads 605
225 The Display of Environmental Information to Promote Energy Saving Practices: Evidence from a Massive Behavioral Platform
Authors: T. Lazzarini, M. Imbiki, P. E. Sutter, G. Borragan
Abstract:
While several strategies, such as the development of more efficient appliances, the financing of insulation programs, or the rollout of smart meters, are promising tools to reduce future energy consumption, their implementation relies on people's decisions and actions. Likewise, engaging with consumers to reshape their behavior has been shown to be another important way to reduce energy usage. For these reasons, integrating the human factor into the energy transition has become a major objective for researchers and policymakers. Digital education programs based on tangible and gamified user interfaces have become a new tool with the potential to reduce energy consumption. The B2020 program, developed by the firm 'Économie d'Énergie SAS', provides a digital platform to encourage pro-environmental behavior change among employees and citizens. The platform integrates 160 eco-behaviors to help save energy and water and reduce waste and CO2 emissions. A total of 13,146 citizens have used the tool so far to declare the range of eco-behaviors they adopt in their daily lives. The present work builds on this database to identify the potential impact of the adopted energy-saving behaviors (n=62) on reducing energy use in buildings. To this end, behaviors were classified into three categories according to the nature of their implementation (Eco-habits, e.g., turning off the lights; Eco-actions, e.g., installing low-carbon technology such as LED light bulbs; and Home-Refurbishments, e.g., wall insulation or double-glazed, energy-efficient windows). General Linear Models (GLM) disclosed a significantly higher frequency of Eco-habits compared to the number of Home-Refurbishments realized by the platform users. While this might be explained in part by the high financial costs associated with home renovation works, it contrasts with the up to three times larger energy savings that such refurbishments can deliver.
Furthermore, multiple regression models failed to disclose the expected relationship between energy savings and the frequency of adopted eco-behaviors, suggesting that energy-related practices are not necessarily driven by the corresponding energy savings. Finally, our results also suggested that people adopting more Eco-habits and Eco-actions were more likely to engage in Home-Refurbishments. Altogether, these results fit well with a growing body of scientific research showing that energy-related practices do not necessarily maximize utility, as postulated by traditional economic models, and suggest that other variables might be triggering them. Promoting home refurbishments could benefit from the adoption of complementary energy-saving habits and actions.
Keywords: energy-saving behavior, human performance, behavioral change, energy efficiency
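The three-way classification used in the analysis can be sketched in Python as a simple mapping from declared behaviors to categories, from which per-category frequencies (the quantities compared across users in the GLMs) are computed. The behavior names and the mapping below are hypothetical, not taken from the B2020 platform.

```python
from collections import Counter

# Hypothetical mapping of declared eco-behaviors onto the study's
# three categories (illustrative names only).
CATEGORY = {
    "turn off lights":     "Eco-habit",
    "shorter showers":     "Eco-habit",
    "lower thermostat":    "Eco-habit",
    "install LED bulbs":   "Eco-action",
    "install smart meter": "Eco-action",
    "wall insulation":     "Home-Refurbishment",
}

def category_frequencies(declared):
    """Share of a user's declared behaviors falling into each category."""
    counts = Counter(CATEGORY[b] for b in declared)
    total = sum(counts.values())
    return {c: counts[c] / total for c in counts}

declared = ["turn off lights", "shorter showers", "lower thermostat",
            "install LED bulbs", "wall insulation"]
freqs = category_frequencies(declared)
```

For this hypothetical user, habits dominate the declared set, mirroring the aggregate pattern reported above (many Eco-habits, few Home-Refurbishments).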
Procedia PDF Downloads 200
224 Economic Impact of Drought on Agricultural Society: Evidence Based on a Village Study in Maharashtra, India
Authors: Harshan Tee Pee
Abstract:
Climate elements include surface temperature, rainfall patterns, humidity, type and amount of cloudiness, air pressure, and wind speed and direction; change in one element can have an impact on the regional climate. Scientific predictions indicate that global climate change will increase the number of extreme events, leading to more frequent natural hazards. Global warming is likely to intensify the risk of drought in certain parts of the world while increasing rainfall in others. Drought is a slow-advancing, creeping disaster whose effects accumulate over a long period of time. Droughts are naturally linked with aridity, but they occur over most parts of the world (both wet and humid regions) and create severe impacts on agriculture, basic household welfare, and ecosystems. Drought conditions occur at least every three years in India, which is among the most vulnerable drought-prone countries in the world. The economic impacts resulting from extreme environmental events and disasters are huge, as many economic activities are disrupted. The focus of this paper is to develop a comprehensive understanding of the distributional impacts of disaster, especially the impact of drought on agricultural production and income, through a panel study (a drought year and one year after the drought) in Raikhel village, Maharashtra, India. The major findings indicate that both the cultivated area and the number of cultivating households fell after the drought, indicating a shift in livelihood: households moved from agriculture to non-agriculture. The decline in gross cropped area and in the production of various crops reflected the negative income from these crops in the previous agricultural season. All landholding categories of households except landlords had negative income in the drought year, and income disparities between households were higher in that year.
In the drought year, the cost of cultivation was higher for all landholding categories due to increased irrigation and input costs. In that year, agricultural products (50 per cent of total output) were used for household consumption rather than sold in the market. It is evident from the study that livelihoods based on natural resources became less attractive to people due to the risk involved, and people moved to lower-risk livelihoods for their sustenance.
Keywords: climate change, drought, agriculture economics, disaster impact
Procedia PDF Downloads 118
223 Connecting the Dots: Bridging Academia and National Community Partnerships When Delivering Healthy Relationships Programming
Authors: Nicole Vlasman, Karamjeet Dhillon
Abstract:
Over the past four years, the Healthy Relationships Program has been delivered in community organizations and schools across Canada. More than 240 groups have been facilitated in collaboration with 33 organizations. As a result, 2157 youth have been engaged in the programming. The purpose and scope of the Healthy Relationships Program are to offer sustainable, evidence-based skills through small group implementation to prevent violence and promote positive, healthy relationships in youth. The program development has included extensive networking at regional and national levels. The Healthy Relationships Program is currently being implemented, adapted, and researched within the Resilience and Inclusion through Strengthening and Enhancing Relationships (RISE-R) project. Alongside the project’s research objectives, the RISE-R team has worked to virtually share the ongoing findings of the project through a slow ontology approach. Slow ontology is a practice integrated into project systems and structures whereby slowing the pace and volume of outputs offers creative opportunities. Creative production reveals different layers of success and complements the project, the building blocks for sustainability. As a result of integrating a slow ontology approach, the RISE-R team has developed a Geographic Information System (GIS) that documents local landscapes through a Story Map feature, and more specifically, video installations. Video installations capture the cartography of space and place within the context of singular diverse community spaces (case studies). By documenting spaces via human connections, the project captures narratives, which further enhance the voices and faces of the community within the larger project scope. This GIS project aims to create a visual and interactive flow of information that complements the project's mixed-method research approach. 
In conclusion, creative project development in the form of a geographic information system can provide learning and engagement opportunities at many levels (i.e., within community organizations and educational spaces, or with the general public). In each of these disconnected spaces, fragmented stories are connected through a visual display of project outputs. A slow-ontology practice within the RISE-R project documents activities on the fringes and within internal structures, primarily by documenting project successes as further contributions to the Centre for School Mental Health framework (philosophy, recruitment techniques, allocation of resources and time, and a shared commitment to evidence-based products).
Keywords: community programming, geographic information system, project development, project management, qualitative, slow ontology
Procedia PDF Downloads 155
222 Contribution of PALB2 and BLM Mutations to Familial Breast Cancer Risk in BRCA1/2 Negative South African Breast Cancer Patients Detected Using High-Resolution Melting Analysis
Authors: N. C. van der Merwe, J. Oosthuizen, M. F. Makhetha, J. Adams, B. K. Dajee, S-R. Schneider
Abstract:
Women from high-risk breast cancer families who test negative for pathogenic mutations in BRCA1 and BRCA2 are four times more likely to develop breast cancer than women in the general population. Sequencing of genes involved in genomic stability and DNA repair has led to the identification of novel contributors to familial breast cancer risk, including BLM and PALB2. Bloom's syndrome is a rare autosomal recessive chromosomal-instability disorder with a high incidence of various types of neoplasia; in the heterozygous state, BLM mutations are associated with breast cancer. PALB2, on the other hand, binds to BRCA2, and together they participate actively in DNA damage repair. Archived DNA samples of 66 BRCA1/2-negative high-risk breast cancer patients were retrospectively selected based on an extensive family history of the disease (>3 affected individuals per family). All coding regions and splice-site boundaries of both genes were screened using high-resolution melting analysis. Samples exhibiting variation were sequenced bidirectionally by automated Sanger sequencing. The clinical significance of each variant was assessed using various in silico and splice-site prediction algorithms. Comprehensive screening identified a total of 11 BLM and 26 PALB2 variants, ranging from globally common to rare and including three novel mutations. Three BLM and two PALB2 likely pathogenic mutations were identified that could account for the disease in these extensive breast cancer families in the absence of BRCA mutations (BLM c.11T>A, p.V4D; BLM c.2603C>T, p.P868L; BLM c.3961G>A, p.V1321I; PALB2 c.421C>T, p.Gln141Ter; PALB2 c.508A>T, p.Arg170Ter). Conclusion: The study confirmed the contribution of pathogenic mutations in BLM and PALB2 to the familial breast cancer burden in South Africa, explaining the disease in 7.5% of the BRCA1/2-negative families with an extensive family history of breast cancer.
Segregation analysis will be performed to confirm the clinical impact of these mutations in each family. These results justify the inclusion of both genes in a comprehensive breast and ovarian next-generation sequencing cancer panel; they should be screened simultaneously with BRCA1 and BRCA2, as this might explain a significant percentage of familial breast and ovarian cancer in South Africa.
Keywords: Bloom syndrome, familial breast cancer, PALB2, South Africa
Procedia PDF Downloads 236
221 Impact Evaluation and Technical Efficiency in Ethiopia: Correcting for Selectivity Bias in Stochastic Frontier Analysis
Authors: Tefera Kebede Leyu
Abstract:
The purpose of this study was to estimate the impact of LIVES project participation on the technical efficiency of farm households in three regions of Ethiopia. We used household-level data gathered by IRLI between February and April 2014 for the year 2013 (retrospective). Data on 1,905 sample households (754 intervention and 1,151 control) were analyzed using the STATA software package, version 14. Efforts were made to combine stochastic frontier modeling with impact-evaluation methodology, using the Heckman (1979) two-stage model to deal with possible selectivity bias arising from unobservable characteristics in the stochastic frontier model. Results indicate that farmers in both groups are not fully efficient and operate below their potential frontiers; i.e., there is potential to increase crop productivity through efficiency improvements in both groups. In addition, the empirical results revealed selection bias in both groups of farmers, confirming the justification for using the selection-bias-corrected stochastic frontier model. Intervention farmers achieved higher technical efficiency scores than the control group. Furthermore, the selectivity-bias-corrected model showed a different technical efficiency score for intervention farmers, while scores remained more or less the same for the control group. However, the control group shows higher dispersion, as measured by the coefficient of variation, than the intervention counterparts. Among the explanatory variables, the study found that farmer's age (a proxy for farm experience), land certification, frequency of visits to improved-seed centers, farmer's education, and row planting are important contributing factors to participation decisions and hence to the technical efficiency of farmers in the study areas.
We recommend that policies targeting the design of development intervention programs in the agricultural sector focus more on providing farmers with on-farm visits by extension workers, provision of credit services, establishment of farmers' training centers, and adoption of modern farm technologies. Finally, we recommend further research applying this methodological framework to a panel data set, to test whether technical efficiency increases or decreases with the length of time that farmers participate in development programs.
Keywords: impact evaluation, efficiency analysis and selection bias, stochastic frontier model, Heckman two-step
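The selection-correction step named above can be illustrated with a short Python sketch: in the Heckman (1979) two-step approach, a first-stage probit of participation yields an index for each household, and the inverse Mills ratio computed from that index is appended to the second-stage (here, frontier) regressors. The probit indices below are invented for illustration, not estimated from the study's data.

```python
import math

def norm_pdf(z):
    """Standard normal density phi(z)."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    """Standard normal CDF Phi(z), via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def inverse_mills(z):
    """lambda(z) = phi(z) / Phi(z): the selection-correction term
    appended to second-stage regressors for participating households."""
    return norm_pdf(z) / norm_cdf(z)

# Hypothetical first-stage probit indices x_i'gamma for three participants.
probit_index = [0.0, 0.8, -0.5]
lambdas = [inverse_mills(z) for z in probit_index]
# Second stage: regress frontier output on inputs plus these lambda terms;
# a significant lambda coefficient signals selection on unobservables.
```

A significant coefficient on the lambda term in the second stage indicates selectivity bias, consistent with the finding above that the bias-corrected frontier gives different efficiency scores for intervention farmers.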
Procedia PDF Downloads 75
220 Multi-Agent System Based Distributed Voltage Control in Distribution Systems
Authors: A. Arshad, M. Lehtonen, M. Humayun
Abstract:
With increasing distributed generation (DG) penetration, distribution systems are advancing towards smart grid technology to tackle the voltage control problem in a distributed manner with the least latency. This paper proposes a multi-agent-based distributed voltage control. The method uses a flat architecture of agents; the agents involved in the control procedure are the On-Load Tap Changer Agent (OLTCA), the Static VAR Compensator Agent (SVCA), and the agents associated with DGs and loads at their locations. The objectives of the proposed voltage control model are to minimize network losses and DG curtailments while maintaining voltage within statutory limits, as close as possible to nominal. The total loss cost is the sum of the network loss cost, DG curtailment costs, and a voltage damage cost (based on a penalty-function implementation). The total cost is iteratively calculated for progressively stricter limits by plotting the voltage damage cost and loss cost against a varying voltage-limit band; the method provides the optimal limits, closer to nominal, with minimum total loss cost. To achieve voltage control, the whole network is divided into multiple control regions, each downstream of its controlling device. The OLTCA behaves as a supervisory agent and performs all the optimizations. At each time step, a token is generated by the OLTCA and transferred from node to node until a node with a voltage violation is detected. Upon detection of such a node, the token grants permission to the Load Agent (LA) to initiate possible remedial actions. The LA contacts the controlling devices in the vicinity of the violated node. If the violated node does not lie in the vicinity of a controller, or if the controlling capabilities of all downstream control devices are at their limits, the OLTC is used as a last resort.
For a realistic study, simulations were performed for a typical Finnish residential medium-voltage distribution system using MATLAB. The simulations cover two cases: simple distributed voltage control (DVC), and DVC with optimized loss cost (DVC + penalty function). A sensitivity analysis was performed with respect to DG penetration. The results indicate that the costs of losses and DG curtailments are directly proportional to DG penetration, while in case 2 there is a significant reduction in total loss. For lower DG penetration, losses are reduced by roughly 50%, while for higher DG penetration the loss reduction is less significant. Another observation is that the stricter limits calculated by the cost optimization move towards the statutory limits of ±10% of nominal as DG penetration increases: for 25, 45, and 65% penetration, the calculated limits are ±5, ±6.25, and ±8.75%, respectively. The results show that the voltage control algorithm proposed in case 1 deals with the voltage control problem instantly but with higher losses, whereas case 2 gradually reduces network losses over time through the proposed iterative loss-cost optimization by the OLTCA.
Keywords: distributed voltage control, distribution system, multi-agent systems, smart grids
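The token-passing scheme described in this abstract can be sketched in a few lines of Python. The data structures, function names, and the ±5% band below are assumptions for illustration, not the authors' MATLAB implementation: the token walks the nodes downstream until a voltage violation is found, the load agent then tries nearby controllers, and the OLTC serves as the last resort.

```python
V_MIN, V_MAX = 0.95, 1.05   # assumed per-unit statutory band

def find_violation(nodes):
    """Pass the token node to node until a voltage violation is found."""
    for node in nodes:
        if not (V_MIN <= node["voltage"] <= V_MAX):
            return node
    return None

def dispatch(node):
    """Load agent contacts controllers in the violated node's vicinity;
    if none has capacity left, the OLTC acts as last resort."""
    for ctrl in node["nearby_controllers"]:
        if ctrl["capacity_left"] > 0:
            return ctrl["name"]
    return "OLTC"

# Two-node toy feeder: the second node violates the upper limit.
nodes = [
    {"voltage": 1.01, "nearby_controllers": []},
    {"voltage": 1.08, "nearby_controllers": [
        {"name": "SVC-1", "capacity_left": 0},   # exhausted
        {"name": "DG-3",  "capacity_left": 2},   # can still curtail
    ]},
]
violated = find_violation(nodes)
actor = dispatch(violated) if violated else None
```

Here the exhausted SVC is skipped and the nearby DG agent handles the violation; only if every downstream controller were at its limit would the supervisory OLTCA act.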
Procedia PDF Downloads 312
219 The Design of a Computer Simulator to Emulate Pathology Laboratories: A Model for Optimising Clinical Workflows
Authors: M. Patterson, R. Bond, K. Cowan, M. Mulvenna, C. Reid, F. McMahon, P. McGowan, H. Cormican
Abstract:
This paper outlines the design of a simulator that allows clinical workflows through a pathology laboratory to be optimised, improving the laboratory's efficiency in the processing, testing, and analysis of specimens. Pathologists often have difficulty pinpointing and anticipating issues in the clinical workflow until tests are running late or in error; it can be difficult to pinpoint the cause and even more difficult to predict issues before they arise. For example, they often have no indication of how many samples will be delivered to the laboratory on a given day or at a given hour. If scenarios could be modelled using past information and known variables, pathology laboratories could initiate resource preparations, e.g., printing specimen labels or ensuring a sufficient number of technicians are on duty. This would expedite the clinical workload and clinical processes and improve the overall efficiency of the laboratory. The simulator design visualises the workflow of the laboratory, i.e., the clinical tests being ordered, the specimens arriving, the current tests being performed, results being validated, and reports being issued. It depicts the movement of specimens through this process, as well as the number of specimens at each stage, using an animated flow diagram updated in real time. A traffic-light colour-coding system indicates the level of flow through each stage (green for normal flow, orange for slow flow, and red for critical flow), allowing pathologists to see clearly where there are issues and bottlenecks in the process. Graphs indicate the status of specimens at each stage of the process; for example, a graph could show the percentage of specimen tests that are on time, potentially late, running late, and in error.
Clicking on potentially late samples will display more detailed information about those samples, the tests that still need to be performed on them and their urgency level. This would allow any issues to be resolved quickly. In the case of potentially late samples, this could help to ensure that critically needed results are delivered on time. The simulator will be created as a single-page web application. Various web technologies will be used to create the flow diagram showing the workflow of the laboratory. JavaScript will be used to program the logic, animate the movement of samples through each of the stages and to generate the status graphs in real time. This live information will be extracted from an Oracle database. As well as being used in a real laboratory situation, the simulator could also be used for training purposes. ‘Bots’ would be used to control the flow of specimens through each step of the process. Like existing software agent technologies, these bots would be configurable in order to simulate different situations which may arise in a laboratory, such as an emerging epidemic. The bots could then be turned on and off to allow trainees to complete the tasks required at that step of the process, for example validating test results.

Keywords: laboratory-process, optimization, pathology, computer simulation, workflow
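The traffic-light colour-coding described in this abstract can be sketched in a few lines. This is only an illustration of the idea: the stage names, counts, and load thresholds below are assumptions, not values from the paper (which implements the logic in JavaScript against an Oracle database).

```python
# Minimal sketch of the traffic-light colour-coding of workflow stages.
# Stage names, counts, and thresholds are illustrative assumptions.

def flow_status(samples_in_stage: int, capacity: int) -> str:
    """Classify the flow through one workflow stage."""
    load = samples_in_stage / capacity
    if load < 0.7:
        return "green"   # normal flow
    elif load < 0.9:
        return "orange"  # slow flow
    return "red"         # critical flow

stages = {"reception": (40, 100), "testing": (85, 100), "validation": (95, 100)}
status = {name: flow_status(n, cap) for name, (n, cap) in stages.items()}
print(status)  # {'reception': 'green', 'testing': 'orange', 'validation': 'red'}
```

In a live system, the per-stage counts would be refreshed from the database and the resulting colours pushed to the animated flow diagram.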
Procedia PDF Downloads 286

218 Measuring Biobased Content of Building Materials Using Carbon-14 Testing
Authors: Haley Gershon
Abstract:
The transition from using fossil fuel-based building material to formulating eco-friendly and biobased building materials plays a key role in sustainable building. The growing demand on a global level for biobased materials in the building and construction industries heightens the importance of carbon-14 testing, an analytical method used to determine the percentage of biobased content that comprises a material’s ingredients. This presentation will focus on the use of carbon-14 analysis within the building materials sector. Carbon-14, also known as radiocarbon, is a weakly radioactive isotope present in all living organisms. Any fossil material older than 50,000 years will not contain any carbon-14 content. The radiocarbon method is thus used to determine the amount of carbon-14 content present in a given sample. Carbon-14 testing is performed according to ASTM D6866, a standard test method developed specifically for biobased content determination of material in solid, liquid, or gaseous form, which requires radiocarbon dating. Samples are combusted and converted into a solid graphite form and then pressed onto a metal disc and mounted onto a wheel of an accelerator mass spectrometer (AMS) machine for the analysis. The AMS instrument is used in order to count the amount of carbon-14 present. By submitting samples for carbon-14 analysis, manufacturers of building materials can confirm the biobased content of ingredients used. Biobased testing through carbon-14 analysis reports results as percent biobased content, indicating the percentage of ingredients coming from biomass sourced carbon versus fossil carbon. The analysis is performed according to standardized methods such as ASTM D6866, ISO 16620, and EN 16640. Products 100% sourced from plants, animals, or microbiological material are therefore 100% biobased, while products sourced only from fossil fuel material are 0% biobased. 
Any result in between 0% and 100% biobased indicates that there is a mixture of both biomass-derived and fossil fuel-derived sources. Furthermore, biobased testing for building materials allows manufacturers to submit eligible material for certification and eco-label programs such as the United States Department of Agriculture (USDA) BioPreferred Program. This program includes a voluntary labeling initiative for biobased products, in which companies may apply to receive and display the USDA Certified Biobased Product label, attesting to third-party verification and displaying a product’s percentage of biobased content. The USDA program includes a specific category for Building Materials. In order to qualify for the biobased certification under this product category, examples of product criteria that must be met include a minimum of 62% biobased content for wall coverings, a minimum of 25% biobased content for lumber, and a minimum of 91% biobased content for floor coverings (non-carpet). As a result, consumers can easily identify plant-based products in the marketplace.

Keywords: carbon-14 testing, biobased, biobased content, radiocarbon dating, accelerator mass spectrometry, AMS, materials
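The reported biobased percentage follows from the sample's carbon-14 signature: fully modern biomass carries the reference carbon-14 level, fossil carbon carries none, and mixtures fall in between. A simplified sketch of that relationship is below; the reference factor `REF` is an assumption for illustration (the actual atmospheric correction is prescribed by ASTM D6866 and varies by method year), so this is not the standard's exact calculation.

```python
# Simplified sketch of how percent biobased content follows from a
# carbon-14 measurement (pMC, "percent modern carbon"), in the spirit
# of ASTM D6866. REF, the reference pMC corresponding to 100% biomass
# carbon, is an illustrative assumption here; the standard prescribes
# a year-dependent atmospheric correction factor.

REF = 100.0  # assumed reference pMC for fully biomass-derived carbon

def percent_biobased(pmc: float, ref: float = REF) -> float:
    """Share of total organic carbon from biomass vs. fossil sources."""
    return min(100.0, max(0.0, pmc * 100.0 / ref))

print(percent_biobased(100.0))  # 100.0 -> fully biobased
print(percent_biobased(0.0))    # 0.0   -> fully fossil-derived
print(percent_biobased(62.0))   # 62.0  -> mixed source
```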
Procedia PDF Downloads 158

217 Characterization of Surface Microstructures on Bio-Based PLA Fabricated with Nano-Imprint Lithography
Authors: D. Bikiaris, M. Nerantzaki, I. Koliakou, A. Francone, N. Kehagias
Abstract:
In the present study, the formation of structures in poly(lactic acid) (PLA) has been investigated with respect to producing areas of regular, superficial features with dimensions comparable to those of cells or biological macromolecules. Nanoimprint lithography, a method of pattern replication in polymers, has been used for the production of features ranging from tens of micrometers, covering areas up to 1 cm², down to hundreds of nanometers. Both micro- and nano-structures were faithfully replicated. Potentially, PLA has wide uses within biomedical fields, from implantable medical devices, including screws and pins, to membrane applications, such as wound covers, and even as an injectable polymer, for example for lipoatrophy. The possibility of fabricating structured PLA surfaces, with structures of the dimensions associated with cells or biological macromolecules, is of interest in fields such as cellular engineering. Imprint-based technologies have demonstrated the ability to selectively imprint polymer films over large areas, resulting in 3D imprints over flat, curved or pre-patterned surfaces. Here, we compare non-patterned PLA film with PLA film nano-patterned by nanoimprint lithography (NIL). A silicon nanostructured stamp (provided by the company Nanotypos) having positive and negative protrusions was used to pattern PLA films by means of thermal NIL. The polymer film was heated to 40–60°C above its Tg and embossed with a pressure of 60 bars for 3 min. The stamp and substrate were demolded at room temperature. Scanning electron microscope (SEM) images showed good replication fidelity of the Si stamp. Contact-angle measurements suggested that positive microstructuring of the polymer (where features protrude from the polymer surface) produced a more hydrophilic surface than negative micro-structuring. The ability to structure the surface of poly(lactic acid), allied to the polymer’s post-processing transparency and proven biocompatibility, makes it an attractive substrate for such applications.
Films produced in this way were also shown to enhance the aligned attachment behavior and proliferation of Wharton’s Jelly Mesenchymal Stem cells, leading to the observed growth contact guidance. The attachment patterns of some bacteria highlighted that the nano-patterned PLA structure can reduce the propensity for the bacteria to attach to the surface, with a greater bactericidal activity demonstrated against Staphylococcus aureus cells. These biocompatible, micro- and nanopatterned PLA surfaces could be useful for polymer–cell interaction experiments at dimensions at, or below, that of individual cells. Indeed, post-fabrication modification of the microstructured PLA surface, with materials such as collagen (which can further reduce the hydrophobicity of the surface), will extend the range of applications, possibly through the use of PLA’s inherent biodegradability. Further study is being undertaken to examine whether these structures promote cell growth on the polymer surface.

Keywords: poly(lactic acid), nano-imprint lithography, anti-bacterial properties, PLA
Procedia PDF Downloads 330

216 Establishment and Evaluation of a Nutrition Therapy Guide and 7-Day Menu for Educating Hemodialysis Patients: A Case Study of Douala General Hospital, Cameroon
Authors: Ngwa Lodence Njwe
Abstract:
This study investigated the response of hemodialysis patients to an established nutrition therapy guide accompanied by a 7-day menu plan administered for a month. End Stage Renal Disease (ESRD), also known as End Stage Kidney Disease (ESKD), is a non-communicable disease primarily caused by hypertension and diabetes, posing significant challenges in both developed and developing nations. Hemodialysis is a key treatment for these patients. In this experimental study, 100 hemodialysis patients from Douala General Hospital in Cameroon participated. A questionnaire was used to collect data on sociodemographic and anthropometric characteristics, health status, and dietary intake, while medical records provided biomedical data. The levels of the biochemical parameters (phosphorus, calcium, and hemoglobin) were determined before and one month after the distribution of the nutrition education guide and the use of a 7-day menu plan. Phosphorus and calcium levels were measured using an LTCC03 semi-automatic chemistry analyzer. Blood was collected from each patient into a test tube, allowed to clot, and centrifuged. 50 µL of the serum was aspirated by the analyzer for Ca and P level analysis, and results were read from the display. The hemoglobin level was measured using the URIT-12 hemoglobin meter. The blood sample was collected by finger prick and placed on a strip, and the results were read from the screen. The means of the biochemical parameters were then computed. The most prevalent age group was 40-49 years, with males constituting 70% and females 30% of respondents. Among these patients, 80% were hypertensive, 3% had both hypertension and diabetes, 9% were hypertensive, diabetic, and obese, and 1% suffered from hypertension and heart failure.
Analysis of anthropometric parameters revealed a high prevalence of underweight, overweight, and obesity, highlighting the urgent need for targeted nutrition interventions to modify cooking methods, enhance food choices, and increase dietary variety for improved quality of life. Before the nutrition therapy guide was implemented, average calcium levels were 73.05 mg/L for males and 89.44 mg/L for females; post-implementation, these values increased to 77.55 mg/L and 91.44 mg/L, respectively. Conversely, average phosphorus levels decreased from 42.05 mg/L for males and 43.55 mg/L for females to 41.05 mg/L and 39.3 mg/L, respectively, after the intervention. Additionally, average hemoglobin levels increased from 8.35 g/dL for males and 8.5 g/dL for females to 9.2 g/dL and 8.95 g/dL, respectively. The findings confirm that the nutrition therapy guide and the 7-day menu significantly impacted the biomedical parameters of hemodialysis patients, underscoring the need for ongoing nutrition education and counseling for this population.

Keywords: end stage kidney disease, nutrition therapy guide, nutritional status, anthropometric parameters, food frequency, biomedical data
Procedia PDF Downloads 27

215 Strategic Asset Allocation Optimization: Enhancing Portfolio Performance Through PCA-Driven Multi-Objective Modeling
Authors: Ghita Benayad
Abstract:
Asset allocation, which affects the long-term profitability of portfolios by distributing assets to fulfill a range of investment objectives, is the cornerstone of investment management in the dynamic and complicated world of financial markets. This paper offers a technique for optimizing strategic asset allocation with the goal of improving portfolio performance by addressing the inherent complexity and uncertainty of the market through the use of Principal Component Analysis (PCA) in a multi-objective modeling framework. The study's first section starts with a critical evaluation of conventional asset allocation techniques, highlighting how poorly they are able to capture the intricate relationships between assets and the volatile nature of the market. In order to overcome these challenges, the project suggests a PCA-driven methodology that isolates important characteristics influencing asset returns by decreasing the dimensionality of the investment universe. This decrease provides a stronger basis for asset allocation decisions by facilitating a clearer understanding of market structures and behaviors. Using a multi-objective optimization model, the project builds on this foundation by taking into account a number of performance metrics at once, including risk minimization, return maximization, and the accomplishment of predetermined investment goals like regulatory compliance or sustainability standards. This model provides a more comprehensive understanding of investor preferences and portfolio performance in comparison to conventional single-objective optimization techniques. The PCA-driven multi-objective optimization model is then applied to historical market data, with the aim of constructing portfolios that perform better under different market situations.
As compared to portfolios produced from conventional asset allocation methodologies, the results show that portfolios optimized using the proposed method display improved risk-adjusted returns, more resilience to market downturns, and better alignment with specified investment objectives. The study also looks at the implications of this PCA technique for portfolio management, including the prospect that it might give investors a more advanced framework for navigating financial markets. The findings suggest that by combining PCA with multi-objective optimization, investors may obtain a more strategic and informed asset allocation that is responsive to both market conditions and individual investment preferences. In conclusion, this capstone project improves the field of financial engineering by creating a sophisticated asset allocation optimization model that integrates PCA with multi-objective optimization. In addition to raising concerns about the condition of asset allocation today, the proposed method of portfolio management opens up new avenues for research and application in the area of investment techniques.

Keywords: asset allocation, portfolio optimization, principal component analysis, multi-objective modelling, financial market
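The two building blocks of the approach can be sketched in a few lines: PCA applied to an asset-return panel to expose the dominant risk factors, followed by one of the objectives (variance minimization) a multi-objective model would trade off against return targets. This is a hedged illustration, not the paper's actual model or data; the synthetic return panel and the minimum-variance closed form below are assumptions for demonstration.

```python
# Illustrative sketch (not the paper's model): PCA on an asset-return panel,
# then a minimum-variance allocation -- one objective of a multi-objective
# framework. The return data are synthetic assumptions.
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0.001, 0.02, size=(250, 10))  # 250 days x 10 assets

# PCA via eigendecomposition of the sample covariance matrix
cov = np.cov(returns, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]                  # sort components by variance
explained = eigvals[order[:3]].sum() / eigvals.sum()
print(f"variance explained by top 3 components: {explained:.1%}")

# Minimum-variance weights: w proportional to inv(cov) @ 1, scaled to sum to 1
ones = np.ones(cov.shape[0])
w = np.linalg.solve(cov, ones)
w /= w.sum()
print("portfolio weights sum:", round(w.sum(), 6))
```

A full multi-objective version would add a return-maximization term and constraint handling (e.g. sustainability screens) and trace out the resulting trade-off frontier.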
Procedia PDF Downloads 47

214 Evaluation of Differential Interaction between Flavanols and Saliva Proteins by Diffusion and Precipitation Assays on Cellulose Membranes
Authors: E. Obreque-Slier, V. Contreras-Cortez, R. López-Solís
Abstract:
Astringency is a drying, roughing, and sometimes puckering sensation that is experienced on the various oral surfaces during or immediately after tasting foods. This sensation has been closely related to the interaction and precipitation between salivary proteins and polyphenols, specifically flavanols or proanthocyanidins. In addition, the type and concentration of proanthocyanidin significantly influence the intensity of the astringency and consequently the protein/proanthocyanidin interaction. However, most of the studies are based on the interaction between saliva and highly complex polyphenols, without considering the effect of monomeric proanthocyanidins present in different foods. The aim of this study was to evaluate the effect of different monomeric proanthocyanidins on the diffusion and precipitation of salivary proteins. Thus, solutions of catechin, epicatechin, epigallocatechin and gallocatechin (0, 2.0, 4.0, 6.0, 8.0 and 10 mg/mL) were mixed with human saliva (1:1 v/v). After incubation for 5 min at room temperature, 15 µL aliquots of each mix were dotted on a cellulose membrane and allowed to dry spontaneously at room temperature. The membrane was fixed, rinsed and stained for proteins with Coomassie blue. After exhaustive washing in 7% acetic acid, the membrane was rinsed once in distilled water and dried under a heat lamp. Both the diffusion area and the stain intensity of the protein spots served as semi-qualitative estimates of protein-tannin interaction (diffusion test). The rest of the whole saliva-phenol solution mixtures of the diffusion assay were centrifuged, and 15-µL aliquots from each of the supernatants were dotted on a cellulose membrane. The membrane was processed for protein staining as indicated above. The blue-stained area of protein distribution corresponding to each of the extract dilution-saliva mixtures was quantified by ImageJ 1.45 software. Each of the assays was performed at least three times.
Initially, salivary proteins display a biphasic distribution on cellulose membranes; that is, when aliquots of saliva are placed on absorbing cellulose membranes and free diffusion of saliva is allowed to occur, a non-diffusible protein fraction becomes surrounded by highly diffusible salivary proteins. In effect, once diffusion has ended, a protein-binding dye shows an intense blue-stained, roughly circular area close to the spotting site (non-diffusible fraction, NDF) which becomes surrounded by a weaker blue-stained outer band (diffusible fraction, DF). Likewise, the diffusion test showed that epicatechin caused the complete disappearance of the DF from saliva at 2 mg/mL. Epigallocatechin and gallocatechin caused a similar effect at 4 mg/mL, while catechin generated the same effect at 8 mg/mL. In the precipitation test, the use of epicatechin and gallocatechin generated evident precipitates at the bottom of the Eppendorf tubes. In summary, the flavanol type differentially affects the diffusion and precipitation of saliva, which would affect the sensation of astringency perceived by consumers.

Keywords: astringency, polyphenols, tannins, tannin-protein interaction
Procedia PDF Downloads 199

213 Linear Evolution of Compressible Görtler Vortices Subject to Free-Stream Vortical Disturbances
Authors: Samuele Viaro, Pierre Ricco
Abstract:
Görtler instabilities develop in boundary layers over concave surfaces from an imbalance between pressure and centrifugal forces. Their spatial streamwise evolution influences transition to turbulence. It is therefore important to understand even the early stages, where perturbations are still small, grow linearly, and could be controlled more easily. This work presents a rigorous theoretical framework for compressible flows using the linearized unsteady boundary region equations, where only the streamwise pressure gradient and streamwise diffusion terms are neglected from the full governing equations of fluid motion. Boundary and initial conditions are imposed through an asymptotic analysis in order to account for the interaction of the boundary layer with free-stream turbulence. The resulting parabolic system is discretized with a second-order finite difference scheme. Realistic flow parameters are chosen from wind tunnel studies performed at supersonic and subsonic conditions. The Mach number ranges from 0.5 to 8, with two different radii of curvature, 5 m and 10 m, frequencies up to 2000 Hz, and vortex spanwise wavelengths from 5 mm to 20 mm. The evolution of the perturbation flow is shown through velocity, temperature, and pressure profiles relatively close to the leading edge, where non-linear effects can still be neglected, as well as through the growth rate. Results show that a global stabilizing effect exists with the increase of Mach number, frequency, spanwise wavenumber and radius of curvature. In particular, at high Mach numbers curvature effects are less pronounced and thermal streaks become stronger than velocity streaks. This increase of temperature perturbations saturates at approximately Mach 4 flows, and is limited to the early stage of growth, near the leading edge. In general, Görtler vortices evolve closer to the surface with respect to a flat plate scenario, but their location shifts toward the edge of the boundary layer as the Mach number increases.
In fact, a jet-like behavior appears for steady vortices having small spanwise wavelengths (less than 10 mm) at Mach 8, creating a region of unperturbed flow close to the wall. A similar response is also found at the highest frequency considered for a Mach 3 flow. Larger vortices are found to have a higher growth rate but are less influenced by the Mach number. An eigenvalue approach is also employed to study the amplification of the perturbations sufficiently downstream from the leading edge. These eigenvalue results are compared with the ones obtained through the initial value approach with inhomogeneous free-stream boundary conditions. All of the parameters studied here have a significant influence on the evolution of the instabilities for the Görtler problem, which is indeed highly dependent on initial conditions.

Keywords: compressible boundary layers, Görtler instabilities, receptivity, turbulence transition
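The numerical strategy described above, marching a parabolic system downstream with a second-order finite-difference discretization in the wall-normal direction, can be illustrated on a much simpler model problem. The sketch below uses a heat-equation analogue with an assumed wall forcing, not the actual linearized unsteady boundary region equations, so all parameters are illustrative assumptions.

```python
# Schematic of streamwise marching for a parabolic system: second-order
# central differences in the wall-normal direction, explicit stepping in x.
# The model equation (a diffusion analogue) and all parameters are
# illustrative assumptions, not the boundary region equations of the paper.
import numpy as np

ny, nx = 101, 2000                 # wall-normal points, streamwise steps
dy = 1.0 / (ny - 1)
dx = 4e-5                          # keeps nu*dx/dy**2 = 0.4 < 0.5 (stable)
nu = 1.0

u = np.zeros(ny)
u[0] = 1.0                         # assumed perturbation imposed at the wall

for _ in range(nx):                # march "downstream"
    # second-order central difference in the wall-normal direction
    d2u = (u[2:] - 2 * u[1:-1] + u[:-2]) / dy**2
    u[1:-1] += dx * nu * d2u       # explicit streamwise step
    u[0], u[-1] = 1.0, 0.0         # reimpose boundary conditions

print(f"mid-profile amplitude after marching: {u[ny // 2]:.4f}")
```

The real solver replaces the scalar diffusion operator with the coupled velocity, temperature, and pressure perturbation equations and applies the asymptotic free-stream conditions at the outer boundary.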
Procedia PDF Downloads 253

212 A Delphi Study of Factors Affecting the Forest Biorefinery Development in the Pulp and Paper Industry: The Case of Bio-Based Products
Authors: Natasha Gabriella, Josef-Peter Schöggl, Alfred Posch
Abstract:
As a mature industry, the pulp and paper industry (PPI) possesses strengths stemming from its existing infrastructure, technology know-how, and abundant availability of biomass. However, the declining trend in wood-based product sales sends a clear signal to the industry to transform its business model in order to increase its profitability. With the emerging global attention on the bio-based economy and circular economy, coupled with the low price of fossil feedstock, the PPI has started to integrate biorefinery as a value-added business model to keep the industry’s competitiveness. Nonetheless, biorefinery as an innovation exposes the PPI to some barriers, of which the uncertainty of the promising product becomes one of the major hurdles. This study aims to assess factors that affect the diffusion and development of forest biorefinery in the PPI, including drivers, barriers, advantages, and disadvantages, as well as the most promising bio-based products of forest biorefinery. The study examines the identified factors according to the layers of the business environment, being the macro-environment, industry, and strategic group level. Besides, an overview of the future state of the identified factors is elaborated so as to map necessary improvements for implementing forest biorefinery. A two-phase Delphi method is used to collect the empirical data for the study, comprising an online-based survey and interviews. The Delphi method is an effective communication tool for eliciting ideas from a group of experts in order to reach a consensus when forecasting future trends. Drawing on a panel of 50 experts, the study reveals that influential factors are found in every layer of the business environment of the PPI. The political dimension appears to have a significant influence in tackling the economic barrier while reinforcing the environmental and social benefits in the macro-environment.
At the industry level, the biomass availability appears to be a strength of the PPI, while the knowledge gaps on technology and the market seem to be barriers. Consequently, cooperation with academia and the chemical industry has to be improved. Human resource issues are indicated as one important premise behind the preceding barrier, along with the indication of the PPI’s resistance towards biorefinery implementation as an innovation. Further, cellulose-based products are acknowledged for near-term product development, whereas lignin-based products are emphasized to gain importance in the long-term future.

Keywords: forest biorefinery, pulp and paper, bio-based product, Delphi method
Procedia PDF Downloads 277

211 Hardware Implementation for the Contact Force Reconstruction in Tactile Sensor Arrays
Authors: María-Luisa Pinto-Salamanca, Wilson-Javier Pérez-Holguín
Abstract:
Reconstruction of contact forces is a fundamental technique for analyzing the properties of a touched object and is essential for regulating the grip force in slip control loops. It is based on the processing of the distribution, intensity, and direction of the forces during capture by the sensors. Currently, efficient hardware alternatives have been used more frequently in different fields of application, allowing the implementation of computationally complex algorithms, as is the case with tactile signal processing. The use of hardware for smart tactile sensing systems is a research area that promises to improve the processing time and portability requirements of applications such as artificial skin and robotics, among others. The literature review shows that hardware implementations are present today in almost all stages of smart tactile detection systems except in the force reconstruction process, a stage in which they have been less applied. This work presents a hardware implementation of a model-driven method reported in the literature for the contact force reconstruction of flat and rigid tactile sensor arrays from normal stress data. From the analysis of a software implementation of such a model, this implementation proposes the parallelization of tasks that facilitate the execution of matrix operations and a two-dimensional optimization function to obtain a force vector for each taxel in the array. This work seeks to take advantage of the parallel hardware characteristics of Field Programmable Gate Arrays, FPGAs, and the possibility of applying appropriate techniques for algorithm parallelization, using as a guide the rules of generalization, efficiency, and scalability in the tactile decoding process, and considering low latency, low power consumption, and real-time execution as the main design parameters.
The results show a maximum estimation error of 32% in the tangential forces and 22% in the normal forces with respect to the simulation by the Finite Element Modeling (FEM) technique of Hertzian and non-Hertzian contact events, over sensor arrays of 10×10 taxels of different sizes. The hardware implementation was carried out on an MPSoC XCZU9EG-2FFVB1156 platform of Xilinx® that allows the reconstruction of force vectors following a scalable approach, from the information captured by means of tactile sensor arrays composed of up to 48×48 taxels that use various transduction technologies. The proposed implementation demonstrates a roughly 180-fold reduction in estimation time compared to software implementations. Despite the relatively high values of the estimation errors, the information provided by this implementation on the tangential and normal tractions and the triaxial reconstruction of forces allows the tactile properties of the touched object to be adequately reconstructed, and these are similar to those obtained in the software implementation and in the two FEM simulations taken as reference. Although the errors could be reduced, the proposed implementation is useful for decoding contact forces for portable tactile sensing systems, thus helping to expand electronic skin applications in robotic and biomedical contexts.

Keywords: contact force reconstruction, force estimation, tactile sensor array, hardware implementation
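At its core, reconstructing contact forces from an array's stress readings is an inverse problem of the kind this abstract describes accelerating in hardware. The sketch below poses a toy version as a per-frame linear least-squares solve; the coupling matrix, noise level, and contact pattern are synthetic assumptions, not the paper's specific model or data.

```python
# Illustrative sketch (not the paper's model): contact force reconstruction
# from a tactile array posed as a linear inverse problem s = C @ f, solved
# per frame in the least-squares sense. C, the noise, and the contact
# pattern are synthetic assumptions for demonstration.
import numpy as np

rng = np.random.default_rng(1)
n_taxels = 100                      # e.g. a 10x10 array, flattened

# Assumed coupling between applied forces and measured normal stresses:
# near-diagonal (each taxel mostly senses its own load) with weak crosstalk.
C = np.eye(n_taxels) + 0.02 * rng.normal(size=(n_taxels, n_taxels))

f_true = np.zeros(n_taxels)
f_true[44:47] = [0.8, 1.0, 0.6]     # a small contact pressing three taxels

s = C @ f_true + 0.001 * rng.normal(size=n_taxels)  # noisy measurement frame

f_est, *_ = np.linalg.lstsq(C, s, rcond=None)
err = np.abs(f_est - f_true).max()
print(f"max per-taxel reconstruction error: {err:.4f}")
```

An FPGA implementation parallelizes exactly these matrix operations across taxels, which is where the reported reduction in estimation time comes from.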
Procedia PDF Downloads 195

210 Investigating the Relationship between Job Satisfaction, Role Identity, and Turnover Intention for Nurses in Outpatient Department
Authors: Su Hui Tsai, Weir Sen Lin, Rhay Hung Weng
Abstract:
There are numerous outpatient departments at hospitals, handling enormous numbers of outpatients. Although the work of outpatient nursing staff does not include the ward, emergency, and critical care units that involve life-threatening patient conditions, the work is cumbersome and requires facing and dealing with a large number of outpatients in a short period of time. Therefore, nursing staff often do not feel satisfied with their work and cannot identify with their professional role, leading to intentions to leave their job. Thus, the main purpose of this study is to explore the correlation of the job satisfaction and role identity of nursing staff with turnover intention. This research was conducted using a questionnaire, and the subjects were outpatient nursing staff in three regional hospitals in Southern Taiwan. A total of 175 questionnaires were distributed, and 166 valid questionnaires were returned. After collecting the data, the reliability and validity of the study variables were confirmed by confirmatory factor analysis. The influence of role identity and job satisfaction on nursing staff’s turnover intention was analyzed by descriptive analysis, one-way ANOVA, Pearson correlation analysis and multiple regression analysis. Results showed that 'role identity' differed significantly by marital status. Job satisfaction with 'grasp of environment' differed significantly by level of education. Job satisfaction with 'professional growth' and 'shifts and days off' differed significantly by marital status. 'Role identity' and 'job satisfaction' were each negatively correlated with turnover intention. Job satisfaction with 'salary and benefits' and 'grasp of environment' were significant predictors of role identity: the higher the job satisfaction with 'salary and benefits' and 'grasp of environment', the higher the role identity.
Job satisfaction with 'patient and family interaction' was a significant predictor of turnover intention: the lower the job satisfaction with 'patient and family interaction', the higher the turnover intention. This study found that outpatient nursing staff had the lowest satisfaction towards the salary structure. It is recommended that bonuses, promotion opportunities and other incentives be established to increase the role identity of outpatient nursing staff. The results showed that the higher the job satisfaction with 'salary and benefits' and 'grasp of environment', the higher the role identity. It is recommended that regular evaluations be conducted to reward nursing staff with excellent service, and that nursing staff be invited to share their work experiences and thoughts, to enhance nursing staff’s expectation and identification of their occupational role, as well as instilling the concept of organizational service and organizational expectations of emotional display. The results showed that the lower the job satisfaction with 'patient and family interaction', the higher the turnover intention. It is recommended that interpersonal communication and workplace violence prevention educational training courses be organized to enhance the communication and interaction of nursing staff with patients and their families.

Keywords: outpatient, job satisfaction, turnover intention
Procedia PDF Downloads 146

209 Holographic Art as an Approach to Enhance Visual Communication in Egyptian Community: Experimental Study
Authors: Diaa Ahmed Mohamed Ahmedien
Abstract:
Nowadays, it cannot be denied that the most important interactive art trends have appeared as a result of significant advances in the modern sciences, and holographic art is no exception: it is considered one of the major contemporary interactive art trends in the visual arts. The holographic technique first emerged from applications of modern physics in the late 1940s, when Dennis Gabor sought to improve the quality of electron microscope images, and eventually arrived at Margaret Benyon’s art exhibitions; over more than 70 years it has since passed through many procedures to enhance its quality and artistic applications, both technically and visually. As a modest extension of these great efforts, this research made an unusual attempt to enroll a sample of laypeople from the Egyptian community in a holographic recording program to record their appreciated objects or antiques, and thereby examine their ability to interact with modern techniques in the visual communication arts. This research thus tried to answer three main questions: 'can we use analog holographic techniques to unleash new theoretical and practical knowledge in interactive arts for the public in the Egyptian community?', 'to what extent can holographic art become familiar to the public and enable them to produce interactive artistic samples?', and 'are there possibilities to build a holographic interactive program for laypeople which leads them to enhance their understanding of visual communication in public and become aware of interactive art trends?'
The first part of this research relied on experimental methods: it was conducted in the laser lab at Cairo University, using an Nd:YAG laser at 532 nm and a holographic optical layout, with selected Egyptian participants who were asked to record an object they value after learning the recording methods. The second part consisted of discussion panels held to examine the results and how participants felt about their holographic artistic products, through surveys, questionnaires, note-taking, and critiques of the holographic artworks. The practical experiments and final discussions led us to conclude that this experimental research enabled most participants to undergo a paradigm shift in their visual and conceptual experience towards greater interaction with contemporary visual art trends. The study thereby emphasizes the role of a mature relationship between art, science and technology in spreading interactive arts through our community via the latest scientific and artistic developments around the world, particularly among those who had never enrolled in a practical arts program before.
Keywords: Egyptian community, holographic art, laser art, visual art
Procedia PDF Downloads 479
208 Frequency Interpretation of a Wave Function, and a Vertical Waveform Treated as a 'Quantum Leap'
Authors: Anthony Coogan
Abstract:
Born’s probability interpretation of wave functions would have led to nearly identical results had he chosen a frequency interpretation instead. Logically, Born may have assumed that only one electron was under consideration, making it nonsensical to propose a frequency wave. The author's suggestion: the actual experimental results were not of a single electron; rather, they were groups of reflected x-ray photons. The vertical waveform used by Schrödinger in his Particle in the Box theory makes sense if it was intended to represent a quantum leap. The author extended the single vertical panel to form a bar chart: separate panels would represent different energy levels. The proposed bar chart would be populated by reflected photons. Expansion of basic ideas: part of Schrödinger's 'Particle in the Box' theory may be valid despite negative criticism. The waveform used in the diagram is vertical, which may seem absurd because real waves decay at a measurable rate rather than instantaneously. However, there may be one notable exception. Supposedly, the Uncertainty Principle was derived from this theory; may a quantum leap not be represented as an instantaneous waveform? The great Schrödinger must have had some reason to suggest a vertical waveform if the prevalent belief was that such waveforms did not exist. Complex waveforms representing a particle are usually assumed to be continuous. The actual observations made were of x-ray photons, some of which had struck an electron, been reflected, and then moved toward a detector. From Born's perspective, doing similar work in the years in question, 1926-27, he would also have considered a single electron, leading him to choose a probability distribution. Probability distributions appear very similar to frequency distributions, but the former are considered to represent the likelihood of future events.
Born’s interpretation of the results of quantum experiments led (or perhaps misled) many researchers into claiming that humans can influence events just by looking at them, e.g. collapsing complex wave functions by 'looking at the electron to see which slit it emerged from', while in reality light reflected from the electron moved in the observer's direction after the electron had moved away. Astronomers may say that they 'look out into the universe', but this logic is opposed to the views of Newton, Hooke and many observers such as Rømer, in that light carries information from a source or reflector to an observer, rather than the reverse. Conclusion: due to the controversial nature of these ideas, especially their implications for the complex numbers used in applications across science and engineering, some time may pass before any consensus is reached.
Keywords: complex wave functions not necessary, frequency distributions instead of wave functions, information carried by light, sketch graph of uncertainty principle
Procedia PDF Downloads 199
207 Confessors in Im Sun-dŭk’s Short Stories: Interiority of Korean Women under the End of Japanese Colonial Rule
Authors: Min Koo Choi
Abstract:
The paper examines two of Im Sun-dŭk's short stories, 'Iryoil' (Sunday, 1937) and 'Nazuoya' (A Godmother, 1942), which illuminate Korean intellectuals living through the late, harshly oppressive period of Japanese colonial rule. When Japan went to war against China in 1937, Korea, a colony of Japan since 1910, became an outpost for Japanese expansionism into China, and the Korean people were mobilized into the war effort. Nationalist movements and radical ideas that posed a threat to Japanese colonial rule in Korea and its colonial expansionism were ruthlessly suppressed, and Koreans were forcibly assimilated into becoming Japanese citizens without political rights. Racial discrimination between Koreans and Japanese was prevalent. Im Sun-dŭk, who participated in the socialist movement in the 1930s, made her debut as a literary writer and critic in the late 1930s, when Korean literary society was being reincorporated to collaborate with the Japanese war effort through writing and public speech. Her writing illuminates the unique internal landscape of a female subject who strives to live on while preserving her commitment and dignity under circumstances that forced Korean intellectuals either to collaborate with or acquiesce to Japanese colonial rule. 'Iryoil' foregrounds an educated intellectual, Hyeyŏng, who supports her fiancé, imprisoned for his political involvement in resistance against Japan. On Sundays she turns down her friends' suggestions to enjoy the holiday out of a sense of indebtedness to him; his imprisonment marks a social conscience that still remains, and she seeks to share his commitment and suffering. The short story 'Nazuoya', written in Japanese owing to the suppression of Korean-language publications at the time, also problematizes the Japanese policy that forced Koreans to change their names into Japanese ones.
Through the narrator 'I', who struggles to find a meaningful name for her cousin's baby, the story highlights how meaningful one's name is for one's life and identity. What makes these two stories unique is that the writing draws other people's confessions into its own narrative through fragmentary forms, such as parts of letters or reflections. The voices of others intersect with those of the main character in 'Iryoil' and the narrator in 'Nazuoya'. In many ways, the narrator and main character provide confessional voices that display the characters' gloomy interiorities. Even though these confessional voices do not share their commitments and values, both the main character and the narrator reveal a more open set of viewpoints towards them. In this way, they seek to form bonds of encouragement and acquire a more resilient sensibility that embraces those who strove to survive and endure in the gloomy final years of Japanese colonial rule.
Keywords: Im Sun-dŭk, Japanese colonial rule, Korean literature, socialist movement
Procedia PDF Downloads 279
206 Benefits of Using Social Media and Collaborative Online Platforms in PBL
Authors: Susanna Graziano, Lydia Krstic Ward
Abstract:
The purpose of this presentation is to demonstrate the steps of using multimedia and collaborative platforms in project-based learning. The presentation will walk through the stages of a learning project with various components of independent and collaborative learning, in which students research the topic, share information, prepare a survey, and use social media (Facebook, Instagram, WhatsApp) and collaborative platforms (wikispaces.com and Google Docs) to collect, analyze and process data, then produce reports and logos to be displayed as a final product. At the beginning of the presentation, participants will answer a questionnaire about project-based learning and share their experience of using social media, real-world project work and collaborative learning. Using a PPP, the presentation will walk participants through the steps of a completed project in which tertiary education students put together a multimedia campaign for safe driving in Kuwait. The research component of the project entails taking a holistic view of the problem of the high death rate in traffic accidents. The final goal of the project is to lead students to raise public awareness of the importance of safe driving. The project steps involve using social media and collaborative platforms to collect data and share the materials needed for the final product: a display of written reports, slogans and videos, as well as oral presentations. The same structure can be used to organize a multimedia campaign focusing on other issues, scaffolding on students' ability to brainstorm, retrieve information, organize it and engage in collaborative/cooperative learning while being immersed in content-based learning and authentic tasks. More specifically, the project we carried out at Box Hill College was a real-world one and involved a multimedia campaign for safe driving, since reckless driving is one of the major problems in the country.
The idea for the whole project started with a presentation given by a board member of the Kuwaiti Society for Traffic Safety, who was invited to the college and spoke about:
• Driving laws in the country,
• What causes car accidents,
• Driving safety tips.
The principal goal of this project was to have students consider the problems of traffic in Kuwait from different points of view. They also had to address the number and causes of accidents, evaluate the effectiveness of the local traffic law in order to send a warning about the importance of safe driving and, finally, suggest ways to improve it. Benefits included:
• Engagement,
• Autonomy,
• Motivation,
• Content knowledge,
• Language mastery,
• Enhanced critical thinking,
• Increased metacognitive awareness,
• Improved social skills,
• Authentic experience.
Keywords: social media, online learning platforms, collaborative platforms, project based learning
Procedia PDF Downloads 425
205 Analysis of the Barriers and Aids That Lecturers Offer to Students with Disabilities
Authors: Anabel Moriña
Abstract:
In recent years, advances have been made in disability policy at Spanish universities, especially in terms of creating more inclusive learning environments. Nevertheless, while efforts to foster inclusion at the tertiary level, and the growing number of students with disabilities at university, are clear signs of progress, serious barriers to full participation in learning still exist. The research shows that university responses to diversity tend to be reactive, not proactive; as a result, higher education (HE) environments can be especially disabling. It has been demonstrated that the performance of students with disabilities is closely linked to the goodwill of university faculty and staff. Lecturers are key players when it comes to helping or hindering students throughout the teaching/learning process. This paper presents an analysis of how lecturers respond to students with disabilities, the initial question being: do lecturers aid or hinder students? The general aim is to analyse, by listening to the students themselves, the barriers and supports from lecturers that students identify as affecting their academic performance and their overall perception of the HE experience. A biographical-narrative methodology was employed, and the results were analysed by field of knowledge. The research was conducted in two phases: discussion groups and individual oral/written interviews were set up with 44 students with disabilities, and mini life histories were completed for 16 students who had participated in the first stage. The study group consisted of students with disabilities enrolled over three academic years. Participating students identified many more barriers than bridges when speaking about the role lecturers play in their learning experience.
Findings are grouped into several categories: faculty attitudes when "dealing with" students with disabilities, teaching methodologies, curricular adaptations, and faculty training in working with students. Faculty do not always display appropriate attitudes towards students with disabilities: study participants speak of lecturers turning their backs on their problems or behaving in an awkward manner. In many cases, it seems lecturers feel that curricular adaptations of any kind are a form of favouritism. Positive attitudes, however, often depend almost entirely on the goodwill of faculty and, although well received by students, are hard to come by. As the participants themselves suggest, this study confirms that good teaching practices benefit not only students with disabilities but the student body as a whole; in this sense, inclusive curricula provide new opportunities for all students. A recurrent finding was lecturers' lack of training to adequately attend to students with disabilities, and the need to remedy this shortage. This can become a primary barrier, and it is more often due to deficient faculty training than to inappropriate attitudes on the part of lecturers. In conclusion, based on this research we find that more barriers than bridges exist. That said, students do report receiving a good deal of support from their lecturers, although almost exclusively in a spirit of goodwill; when lecturers do help, it tends to have a very positive impact on students' academic performance.
Keywords: barriers, disability, higher education, lecturers
Procedia PDF Downloads 255
204 Narcissism in the Life of Howard Hughes: A Psychobiographical Exploration
Authors: Alida Sandison, Louise A. Stroud
Abstract:
Narcissism is a personality configuration with both normal and pathological expressions. It is highly complex and linked to a broad field of research: there are both dimensional and categorical conceptualisations of narcissism, and a variety of theoretical formulations have been put forward to understand the narcissistic personality configuration. Currently, Kernberg's object relations theory is well supported for this purpose. The complexity of the narcissistic personality, and the particular defense mechanisms at play in it, make it a difficult configuration to study and one worth further research. Psychobiography as a methodology allows for the exploration of a lived life and is thus useful for surmounting these inherent challenges. Although narcissism has long been a focus of academic interest and much research has been done in this area, to the researchers' knowledge narcissistic dynamics have never been explored in a psychobiographical format. Thus, the primary aim of the research was to explore and describe narcissism in the life of Howard Hughes, with the objective of gaining further insight into narcissism through this unconventional research approach. Hughes was chosen as the subject of the study because he is renowned as an eccentric billionaire who had a revolutionary effect on the world but was at the same time disturbed by his personal pathologies. Hughes was active in three different sectors, namely motion pictures, aviation and gambling. He became increasingly reclusive as he entered middle age; from his early fifties he was agoraphobic, and the social network of connectivity that could reasonably be expected of someone at the top of their field was notably distorted. Due to his strong narcissistic personality configuration and the interpersonal difficulties he experienced, Hughes represents an ideal figure through whom to explore narcissism.
The study used a single case study design and purposive sampling to select Hughes. Qualitative data were sampled using secondary data sources. Given that Hughes was a famous figure, there is a wealth of information on his life, primarily biographical, including books written about his life and archival material in the form of newspaper articles, interviews and films. Gathered data were triangulated to avoid author bias and to increase the credibility of the data used. Data were collected following Yin's guidelines for data collection and analysed using Miles and Huberman's strategy of data analysis, which consists of three steps: data reduction, data display, and conclusion drawing and verification. Patterns that emerged in the data highlighted the defense mechanisms used by Hughes in defending his sense of self, in particular splitting and projection. These defense mechanisms help us to understand the high levels of entitlement and paranoia experienced by Hughes. Findings provide further insight into his sense of isolation and difference and the consequent difficulty he experienced in maintaining connections with others. Findings furthermore confirm the effectiveness of Kernberg's theory in understanding narcissism when observing an individual life.
Keywords: Howard Hughes, narcissism, narcissistic defenses, object relations
Procedia PDF Downloads 356
203 Design of Smart Catheter for Vascular Applications Using Optical Fiber Sensor
Authors: Lamiek Abraham, Xinli Du, Yohan Noh, Polin Hsu, Tingting Wu, Tom Logan, Ifan Yen
Abstract:
In the field of minimally invasive surgery, smart medical instruments such as catheters and guidewires are typically operated from a remote distance to gain access to the diseased artery, often negotiating tortuous, complex, and diseased vessels in the process. Three optical fiber sensors, each 1.5 mm in diameter and spaced 120° apart, are proposed to be mounted in a catheter-based pump device with a diameter of 10 mm. These sensors are configured to address the challenges surgeons face during insertion through curvy major vessels such as the aortic arch, providing information on wall rubbing and on shape sensing. This study presents experimental and mathematical models of the optical fiber sensors with 2 degrees of freedom. The set-up consists of two connected, eight-gear-shaped tubes 3D-printed in thermoplastic polyurethane (TPU). The optical fiber sensors are mounted inside the first tube, which shields them from external light and serves as a TPU prototype of a catheter. The second tube acts as a flat reflector for the light-intensity-modulation-based optical fiber sensors. The first tube is attached to a linear guide for insertion and withdrawal and can be manually turned 45° by manipulating the tube gear. A 3D hard-material phantom that mimics the anatomical structure of the aortic arch was developed, in which the tests were carried out. During insertion of the sensors into the 3D phantom, datasets were obtained in terms of voltage, distance, and sensor position. These datasets reflect the characteristics of the light intensity modulation of the optical fiber sensors with a plane projection of the aortic arch structure shape. Mathematical modeling of the light intensity was carried out based on the projection plane and the experimental set-up.
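The intensity-modulation principle underlying such reflective fiber sensors can be sketched with a simple geometric model. The sketch below is an illustrative assumption, not the authors' actual formulation: light leaving the emitting fiber spreads in a cone, reflects off the flat surface, and the received power falls as the illuminated spot grows with distance; all parameter values (core radius, divergence half-angle) are hypothetical.

```python
import math

def reflected_intensity(d, r_fiber=0.75e-3, theta=0.21, i0=1.0):
    """Illustrative reflective intensity-modulation model (an assumption,
    not the paper's exact formula): light exits the fiber in a cone with
    half-angle theta, travels a round trip of 2*d to the flat reflector
    and back, and the detector captures the fraction of reflected power
    falling on a core of radius r_fiber."""
    # Radius of the illuminated spot after the round trip 2*d
    spot = r_fiber + 2.0 * d * math.tan(theta)
    # Fraction of reflected power intercepted by the receiving core
    return i0 * (r_fiber / spot) ** 2
```

In such a model the photodiode voltage decreases monotonically as the fiber tip moves away from the reflector, which is consistent with the voltage-versus-distance datasets described above.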
The performance of the system was evaluated in terms of its accuracy in navigating through the curvature and in reporting sensor position, by investigating 40 single insertions of the sensors into the 3D phantom. The experiments demonstrated that the sensors were effectively steered through the curvature of the 3D phantom to the desired target references in both degrees of freedom. The performance of the sensors echoes the theory of light reflectance: the smaller the radius of curvature, the more of the emitted LED light is reflected and received by the photodiode. The mathematical model results are in good agreement with the experimental results and with the operating principle of the light intensity modulation of the optical fiber sensors. A prototype catheter made of TPU, with three optical fiber sensors mounted inside, has been developed that is capable of navigating through different radii of curvature with 2 degrees of freedom. The proposed system supports operators with pre-scan data to make maneuvering and bending through curvy major vessels easier, more accurate, and safer.
Keywords: intensity-modulated optical fiber sensor, mathematical model, plane projection, shape sensing
Procedia PDF Downloads 252
202 A Grid Synchronization Method Based on Adaptive Notch Filter for SPV System with Modified MPPT
Authors: Priyanka Chaudhary, M. Rizwan
Abstract:
This paper presents a grid synchronization technique based on an adaptive notch filter for a solar photovoltaic (SPV) system, along with a maximum power point tracking (MPPT) technique. An efficient grid synchronization technique offers proficient detection of the components of the grid signal, such as phase and frequency, and also acts as a barrier against harmonics and other disturbances in the grid signal. The grid synchronization unit provides a reference phase signal synchronized with the grid voltage, bringing the system into compliance with grid codes and power quality standards; it therefore plays an important role in grid-connected SPV systems. Because the output of the PV array fluctuates with meteorological parameters such as irradiance, temperature, and wind, MPPT control is required to track the maximum power point of the PV array in order to maintain a constant DC voltage at the voltage source converter (VSC) input. In this work, a variable-step-size P&O (Perturb and Observe) MPPT technique with a DC/DC boost converter is used in the first stage of the system. The algorithm divides the dPpv/dVpv curve of the PV panel into three separate zones: zone 0, zone 1, and zone 2. A fine tracking step size is used in zone 0, while zones 1 and 2 require a larger step size in order to obtain high tracking speed. Further, an adaptive notch filter (ANF) based control technique is proposed for the VSC in the PV generation system. The ANF approach is used to synchronize the interfaced PV system with the grid, maintaining amplitude, phase, and frequency as well as improving power quality; it offers compensation of harmonic currents and reactive power with both linear and nonlinear loads. To maintain a constant DC link voltage, a PI controller is also implemented and presented in this paper. The complete system has been designed, developed, and simulated using the SimPowerSystems and Simulink toolboxes of MATLAB.
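The three-zone variable-step P&O idea described above can be sketched as a single tracking iteration. This is a minimal illustration of the concept rather than the authors' MATLAB implementation; the zone thresholds on |dP/dV| and the step sizes are illustrative assumptions.

```python
def variable_step_po(v, p, v_prev, p_prev,
                     thresholds=(0.5, 2.0), steps=(0.1, 0.5, 1.0)):
    """One iteration of a variable-step Perturb & Observe MPPT.
    The |dP/dV| slope selects the zone: a flat slope (zone 0, near the
    MPP) uses the fine step for low steady-state ripple, while steeper
    slopes (zones 1 and 2, far from the MPP) use larger steps for fast
    tracking. Thresholds and steps are hypothetical example values.
    Returns the next reference voltage."""
    dp = p - p_prev
    dv = (v - v_prev) if v != v_prev else 1e-9   # guard against /0
    slope = abs(dp / dv)
    if slope < thresholds[0]:      # zone 0: fine step
        step = steps[0]
    elif slope < thresholds[1]:    # zone 1: medium step
        step = steps[1]
    else:                          # zone 2: large step
        step = steps[2]
    # Classic P&O sign rule: keep perturbing in the direction that
    # increased power, reverse otherwise
    direction = 1.0 if dp * (v - v_prev) > 0 else -1.0
    return v + direction * step
```

In a simulation loop, the returned voltage would serve as the reference for the boost converter's duty-cycle controller at the next sampling instant.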
The performance analysis of the three-phase grid-connected solar photovoltaic system has been carried out on the basis of various parameters, such as PV output power, PV voltage, PV current, DC link voltage, point of common coupling (PCC) voltage, grid voltage, grid current, voltage source converter current, and power supplied by the voltage source converter. The results obtained from the proposed system are found to be satisfactory.
Keywords: solar photovoltaic systems, MPPT, voltage source converter, grid synchronization technique
Procedia PDF Downloads 594
201 Adapting an Accurate Reverse-Time Migration Method to USCT Imaging
Authors: Brayden Mi
Abstract:
Reverse time migration (RTM) has been widely used in the petroleum exploration industry since the early 1980s to reveal subsurface images and to detect rock and fluid properties. The seismic workflow involves constructing a velocity model through interpretive model building, seismic tomography, or full waveform inversion, and then reverse-time propagating the acquired seismic data together with the original wavelet used in the acquisition. The methodology has matured from 2D imaging of simple media to handling full 3D imaging challenges in extremely complex geological conditions. Conventional ultrasound computed tomography (USCT) utilizes travel-time inversion to reconstruct the velocity structure of an organ. With this velocity structure, USCT data can be migrated with the 'bent-ray' method. Its counterpart in seismic applications is Kirchhoff depth migration, in which the source of reflected energy is located by ray tracing and summed to produce a subsurface image. It is well known that ray-tracing-based migration has severe limitations in strongly heterogeneous media and irregular acquisition geometries. RTM, on the other hand, fully accounts for wave phenomena, including multiple arrivals and turning rays due to complex velocity structure, and is capable of fully reconstructing the image detectable within its acquisition aperture. RTM algorithms typically require a rather accurate velocity model and demand high computing power, and so may not be applicable to real-time imaging as normally required in day-to-day medical operations; with the improvement of computing technology, however, this computational bottleneck may no longer be a challenge in the near future. Present-day RTM algorithms are typically implemented from a flat datum for the seismic industry, but they can be modified to accommodate any acquisition geometry and aperture, as long as sufficient illumination is provided.
Such flexibility makes RTM convenient to implement for USCT imaging, provided the spatial coordinates of the transmitters and receivers are known and enough data is collected to provide full illumination. This paper proposes an implementation of a full 3D RTM algorithm for USCT imaging that produces an accurate 3D acoustic image, based on the phase-shift-plus-interpolation (PSPI) method for wavefield extrapolation. In this method, each acquired data set (shot) is propagated back in time, and a known ultrasound wavelet is propagated forward in time, using PSPI wavefield extrapolation and a piecewise-constant velocity model of the organ (breast). The imaging condition is then applied to produce a partial image. Although each image is subject to the limitation of its own illumination aperture, the stack of multiple partial images produces a full image of the organ, with a much-reduced noise level compared with the individual partial images.
Keywords: illumination, reverse time migration (RTM), ultrasound computed tomography (USCT), wavefield extrapolation
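The per-shot imaging condition and partial-image stacking described above can be sketched generically. This is a schematic of the standard zero-lag cross-correlation imaging condition used in RTM, not the authors' PSPI code: the wavefield extrapolation step that would produce the two wavefields is assumed to have been done elsewhere, and the array shapes are illustrative.

```python
import numpy as np

def rtm_image(src_wavefields, rcv_wavefields):
    """Zero-lag cross-correlation imaging condition with stacking.
    For each shot, the forward-propagated source wavefield S(t, x, z)
    is correlated in time with the backward-propagated receiver
    wavefield R(t, x, z); the per-shot partial images, each limited to
    its own illumination aperture, are stacked to cover the full organ.
    Each wavefield is an array of shape (nt, nx, nz)."""
    image = None
    for S, R in zip(src_wavefields, rcv_wavefields):
        partial = np.sum(S * R, axis=0)   # zero-lag correlation over time
        image = partial if image is None else image + partial
    return image
```

Stacking many shots this way is also what suppresses the noise of individual partial images, since coherent reflectivity adds constructively while aperture-limited artifacts do not.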
Procedia PDF Downloads 74
200 Identification, Synthesis, and Biological Evaluation of the Major Human Metabolite of NLRP3 Inflammasome Inhibitor MCC950
Authors: Manohar Salla, Mark S. Butler, Ruby Pelingon, Geraldine Kaeslin, Daniel E. Croker, Janet C. Reid, Jong Min Baek, Paul V. Bernhardt, Elizabeth M. J. Gillam, Matthew A. Cooper, Avril A. B. Robertson
Abstract:
MCC950 is a potent and selective inhibitor of the NOD-like receptor pyrin domain-containing protein 3 (NLRP3) inflammasome that shows early promise for the treatment of inflammatory diseases. The identification of the major metabolites of a lead molecule is an important step in the drug development process: it provides information about the metabolically labile sites in the molecule, helping medicinal chemists to design metabolically stable molecules. To identify the major metabolites of MCC950, the compound was incubated with human liver microsomes; subsequent analysis by (+)- and (−)-QTOF-ESI-MS/MS revealed a major metabolite formed by hydroxylation of the 1,2,3,5,6,7-hexahydro-s-indacene moiety of MCC950. Because this major metabolite can lose two water molecules, three possible regioisomers were synthesized. Co-elution of the major metabolite with each of the synthesized compounds using HPLC-ESI-SRM-MS/MS revealed the structure of the metabolite to be (±)-N-((1-hydroxy-1,2,3,5,6,7-hexahydro-s-indacen-4-yl)carbamoyl)-4-(2-hydroxypropan-2-yl)furan-2-sulfonamide. Subsequent synthesis of the individual enantiomers and co-elution in HPLC-ESI-SRM-MS/MS using a chiral column identified the metabolite as R-(+)-N-((1-hydroxy-1,2,3,5,6,7-hexahydro-s-indacen-4-yl)carbamoyl)-4-(2-hydroxypropan-2-yl)furan-2-sulfonamide. To study the cytochrome P450 enzyme(s) possibly responsible for the formation of the major metabolite, MCC950 was incubated with a panel of cytochrome P450 enzymes; the results indicated that CYP1A2, CYP2A6, CYP2B6, CYP2C9, CYP2C18, CYP2C19, CYP2J2 and CYP3A4 are the most likely contributors. The biological activity of the major metabolite and of the other synthesized regioisomers was also investigated by screening for NLRP3 inflammasome inhibitory activity and cytotoxicity. The major metabolite had 170-fold lower inhibitory activity (IC50 1238 nM) than MCC950 (IC50 7.5 nM).
Interestingly, one regioisomer showed nanomolar inhibitory activity (IC50 232 nM). However, no evidence of cytotoxicity was observed with any of the synthesized compounds when tested in human embryonic kidney 293 (HEK293) cells and human liver hepatocellular carcinoma (HepG2) cells. These key findings give insight into the SAR of the hexahydroindacene moiety of MCC950 and reveal a metabolic soft spot that could be blocked by chemical modification.
Keywords: cytochrome P450, inflammasome, MCC950, metabolite, microsome, NLRP3
Procedia PDF Downloads 252