Search results for: small technology
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11739

1509 Online Faculty Professional Development: An Approach to the Design Process

Authors: Marie Bountrogianni, Leonora Zefi, Krystle Phirangee, Naza Djafarova

Abstract:

Faculty development is critical for any institution, as it impacts students’ learning experiences and faculty performance with regard to course delivery. With that in mind, The Chang School at Ryerson University embarked on an initiative to develop a comprehensive, relevant faculty development program for online faculty and instructors. Teaching Adult Learners Online (TALO) is a professional development program designed to build capacity among online teaching faculty, to enhance communication/facilitation skills for online instruction, and to establish a Community of Practice that gives online faculty opportunities to network and exchange ideas and experiences. TALO comprises four online modules, each providing three hours of learning materials. The topics focus on the online teaching and learning experience, principles and practices, opportunities and challenges in online assessment, and course design and development. TALO offers a unique experience for online instructors, who are placed in the role of both student and instructor through interactive activities involving discussions, hands-on assignments, and peer mentoring while experimenting with the technological tools available for their online teaching. Through these exchanges and informal peer mentoring, a small interdisciplinary community of practice has started to take shape. Successful participants have to meet four requirements for completion: i) participate actively in online discussions and activities, ii) develop a communication plan for the course they are teaching, iii) design one learning activity or media component, and iv) design one online learning module. This study adopted a mixed methods exploratory sequential design. For the qualitative phase of this study, a thorough literature review was conducted on what constitutes effective faculty development programs. Based on that review, the design team identified desired competencies for online teaching/facilitation and course design. Once the competencies were identified, a focus group interview with The Chang School teaching community was conducted as a needs assessment and to validate the competencies. In the quantitative phase, questionnaires were distributed to instructors and faculty after the program was launched to continue ongoing evaluation and revision, in hopes of further improving the program to meet the teaching community’s needs. Four faculty members participated in a one-hour focus group interview. Major findings from the focus group interview revealed that, from the training program, faculty wanted i) to better engage students online, ii) to enhance their online teaching with specific strategies, and iii) to explore different ways to assess students online. Ninety-one faculty members completed the questionnaire, and the findings indicated that: i) the majority of faculty stated that they gained the necessary skills to demonstrate instructor presence through communication and use of the technological tools provided, ii) faculty confidence with course management strategies increased, and iii) learning from peers is most effective; the Community of Practice is strengthened and valued even more as program alumni become facilitators. Although this professional development program is not mandatory for online instructors, since its launch in Fall 2014, over 152 online instructors have successfully completed the program. A Community of Practice has emerged as a result of the program, and participants continue to exchange thoughts and ideas about online teaching and learning.

Keywords: community of practice, customized, faculty development, inclusive design

Procedia PDF Downloads 163
1508 Transmission Line Protection Challenges under High Penetration of Renewable Energy Sources and Proposed Solutions: A Review

Authors: Melake Kuflom

Abstract:

European power networks involve the use of multiple overhead transmission lines to construct a highly duplicated system that delivers reliable and stable electrical energy to the distribution level. The transmission line protection applied in the existing GB transmission network normally comprises independent unit differential and time-stepped distance protection schemes, referred to as main-1 and main-2 respectively, with overcurrent protection as a backup. The increasing penetration of renewable energy sources, commonly referred to as “weak sources,” into the power network has resulted in a decline of the fault level. Traditionally, the fault level of the GB transmission network has been strong; hence the fault current contribution is more than sufficient to ensure the correct operation of the protection schemes. However, numerous conventional coal and nuclear generators have been, or are about to be, shut down due to the societal requirement for CO2 emission reduction, and this has resulted in a reduction in the fault level on some transmission lines; therefore, adaptive transmission line protection is required. Generally, greater utilization of renewable energy sources generated from wind or direct solar energy results in a reduction of CO2 emissions and can increase system security and reliability, but it reduces the fault level, which has an adverse effect on protection. Consequently, the effectiveness of conventional protection schemes under low fault levels needs to be reviewed, particularly for future GB transmission network operating scenarios. The proposed paper will evaluate the transmission line protection challenges under high penetration of renewable energy sources and provide alternative viable protection solutions based on the problems observed. The paper will consider the assessment of renewable energy sources (RES) based on fully rated converter technology. The DIgSILENT PowerFactory software tool will be used to model the network.

Keywords: fault level, protection schemes, relay settings, relay coordination, renewable energy sources

Procedia PDF Downloads 186
1507 Atmospheric Circulation Patterns Inducing Coastal Upwelling in the Baltic Sea

Authors: Ewa Bednorz, Marek Polrolniczak, Bartosz Czernecki, Arkadiusz Marek Tomczyk

Abstract:

This study is meant as a contribution to research on the upwelling phenomenon, which is one of the most pronounced examples of sea-atmosphere coupling. The aim is to confirm the atmospheric forcing of sea water circulation and sea surface temperature along the variously oriented Baltic Sea coasts and to identify macroscale and regional circulation patterns triggering upwelling along different sections of this relatively small and semi-enclosed sea basin. Mean daily sea surface temperature data from the summer seasons (June–August) of the years 1982–2017 formed the basis for the detection of upwelling cases. For the atmospheric part of the analysis, monthly indices of the Northern Hemisphere macroscale circulation patterns were used. Besides, in order to identify the local direction of airflow, daily zonal and meridional regional circulation indices were constructed and introduced to the analysis. Finally, daily regional circulation patterns over the Baltic Sea region were distinguished by applying principal component analysis to gridded mean daily sea level pressure data. Within the Baltic Sea, upwelling is most frequent along the zonally oriented northern coast of the Gulf of Finland, the southern coasts of Sweden, and along the middle part of the western Gulf of Bothnia coast. Among the macroscale circulation patterns, the Scandinavian type (SCAND), with a primary circulation center located over Scandinavia, has the strongest impact on the horizontal flow of surface sea waters in the Baltic Sea, which triggers upwelling. An anticyclone center over Scandinavia in the positive phase of SCAND enhances the eastern airflow, which increases upwelling frequency along the southeastern Baltic coasts. The study showed that zonal circulation has a stronger impact on upwelling occurrence than meridional circulation, and that it can increase or decrease the chance of upwelling formation by more than 70% in some coastal sections. The positive and negative phases of the six distinguished regional daily circulation patterns yielded 12 different synoptic situations, which were analyzed in terms of their influence on upwelling formation. Each of them revealed some impact on the frequency of upwelling in some coastal section of the Baltic Sea; however, two kinds of synoptic situations seemed to have the strongest influence, namely, the first representing pressure patterns enhancing the zonal flow and the second representing synoptic patterns with a cyclone or anticyclone center over southern Scandinavia. Upwelling occurrence appeared to be particularly strongly reliant on atmospheric conditions in some specific coastal sections, namely the Gulf of Finland, the southeastern Baltic coasts (the Polish and Latvian-Lithuanian sections), and the western part of the Gulf of Bothnia. In conclusion, it can be stated that atmospheric conditions strongly control the occurrence of upwelling within the Baltic Sea basin. Both local and macroscale circulation patterns, expressed by the location of the pressure centers, influence the frequency of this phenomenon; however, the strength of the impact varies depending on the coastal region. Acknowledgment: This research was funded by the National Science Centre, Poland, grant number 2016/21/B/ST10/01440.
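
As a purely illustrative sketch of the pattern-extraction step described above, the snippet below applies principal component analysis to gridded daily sea level pressure anomalies. The synthetic data, grid shape, and number of retained components are assumptions for the example, not the study's settings.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic stand-in for gridded daily mean sea level pressure (hPa) over the
# Baltic region: 36 summers x 92 days on a 20 x 30 lat-lon grid.
rng = np.random.default_rng(0)
n_days, n_lat, n_lon = 36 * 92, 20, 30
slp = 1013 + rng.normal(0, 8, size=(n_days, n_lat, n_lon))

# PCA acts on daily anomalies (long-term mean removed at every grid point).
anomalies = slp.reshape(n_days, -1)
anomalies = anomalies - anomalies.mean(axis=0)

# Each retained component is one regional circulation pattern: the loading map
# shows its pressure structure, and the daily score gives its phase/strength.
pca = PCA(n_components=6)
scores = pca.fit_transform(anomalies)                # (n_days, 6) daily indices
patterns = pca.components_.reshape(6, n_lat, n_lon)  # loading maps

print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
```

Positive and negative daily scores of each component would then correspond to the positive and negative phases of the regional circulation patterns analyzed in the abstract.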

Keywords: Baltic Sea, circulation patterns, coastal upwelling, synoptic conditions

Procedia PDF Downloads 115
1506 Formulation and Evaluation of Metformin Hydrochloride Microparticles via BÜCHI Nano-Spray Dryer B-90

Authors: Tamer Shehata

Abstract:

Recently, nanotechnology has acquired great interest in the field of pharmaceutical production. Several pieces of pharmaceutical equipment have been introduced into the research field for the production of nanoparticles, among them BÜCHI’s fourth-generation nano-spray dryer B-90. The B-90 is distinguished by a single step of production and drying of nano- and microparticles. Currently, our research group is investigating several pharmaceutical formulations utilizing BÜCHI Nano-Spray Dryer B-90 technology. One of our projects is the formulation and evaluation of metformin hydrochloride mucoadhesive microparticles for the treatment of type 2 diabetes. Several polymers were investigated, among them gelatin and sodium alginate, which are natural polymers with mucoadhesive properties. Preformulation studies such as atomization head mesh size, flow rate, head temperature, polymer solution viscosity, and surface tension were performed. Postformulation characteristics such as particle size, flowability, surface scan, and dissolution profile were evaluated. Finally, the pharmacological activity of a selected formula was evaluated in streptozotocin-induced diabetic rats. The B-90 spray head had a 7 µm mesh and was heated to 120 °C with a flow rate of 3.5 mL/min. The viscosity of the solution was less than 11.5 cP, with a surface tension of less than 70.1 dyne/cm. Discrete, non-aggregated particles and free-flowing powders with particle sizes of less than 2000 nm were successfully obtained. A gelatin and sodium alginate combination in a 1:3 ratio successfully sustained the in vitro release profile of the drug. Hypoglycemic evaluation of this formula showed a significant reduction of blood glucose level over 24 h. In conclusion, mucoadhesive metformin hydrochloride microparticles obtained from the B-90 could offer a convenient dosage form with enhanced hypoglycemic activity.

Keywords: mucoadhesive, microparticles, metformin hydrochloride, nano-spray dryer

Procedia PDF Downloads 296
1505 Factors Promoting French-English Tweets in France

Authors: Taoues Hadour

Abstract:

Twitter has become a popular means of communication used in a variety of fields, such as politics, journalism, and academia. This widely used online platform has an impact on the way people express themselves and is changing language usage worldwide at an unprecedented pace. The language used online reflects the linguistic battle that has been going on for several decades in French society. This study enables a deeper understanding of users' linguistic behavior online. The implications are important and allow for a rise in awareness of intercultural and cross-language exchanges. This project investigates the mixing of French-English language usage among French users of Twitter using a topic analysis approach. This analysis draws on Gumperz's theory of conversational switching. In order to collect tweets at a large scale, the data was collected in R using the rtweet package to access and retrieve French tweets data through Twitter’s REST and stream APIs (Application Program Interface) using the software RStudio, the integrated development environment for R. The dataset was filtered manually and certain repetitions of themes were observed. A total of nine topic categories were identified and analyzed in this study: entertainment, internet/social media, events/community, politics/news, sports, sex/pornography, innovation/technology, fashion/make up, and business. The study reveals that entertainment is the most frequent topic discussed on Twitter. Entertainment includes movies, music, games, and books. Anglicisms such as trailer, spoil, and live are identified in the data. Change in language usage is inevitable and is a natural result of linguistic interactions. The use of different languages online is just an example of what the real world would look like without linguistic regulations. Social media reveals a multicultural and multilinguistic richness which can deepen and expand our understanding of contemporary human attitudes.

Keywords: code-switching, French, sociolinguistics, Twitter

Procedia PDF Downloads 121
1504 Understanding Stock-Out of Pharmaceuticals in Timor-Leste: A Case Study in Identifying Factors Impacting on Pharmaceutical Quantification in Timor-Leste

Authors: Lourenco Camnahas, Eileen Willis, Greg Fisher, Jessie Gunson, Pascale Dettwiller, Charlene Thornton

Abstract:

Stock-out of pharmaceuticals is a common issue at all levels of health services in Timor-Leste, a small post-conflict country. This led to the research questions: what are the current methods used to quantify pharmaceutical supplies, and what factors contribute to the ongoing pharmaceutical stock-outs? The study examined factors that influence the pharmaceutical supply chain system. Methodology: The Privett and Goncalvez dependency model was adopted for the design of the qualitative interviews. The model examines pharmaceutical supply chain management at three management levels: management of individual pharmaceutical items, health facilities, and health systems. The interviews were conducted in order to collect information on inventory management, the logistics management information system (LMIS), and the provision of pharmaceuticals. Andersen’s behavioural model for healthcare utilization also informed the interview schedule, specifically factors linked to the environment (healthcare system and external environment) and the population (enabling factors). Forty health professionals (bureaucrats, clinicians) and six senior officers from a United Nations agency, a global multilateral agency, and a local non-governmental organization were interviewed on their perceptions of factors (healthcare system/supply chain and wider environment) impacting on stock-out. Additionally, policy documents for the entire healthcare system, along with population data, were collected. Findings: An analysis using Pozzebon’s critical interpretation identified a range of difficulties within the system, from poor coordination to failure to adhere to policy guidelines, along with major difficulties with inventory management, quantification, forecasting, and budgetary constraints. A weak logistics management information system and a lack of capacity in inventory management, monitoring, and supervision are additional organizational factors that contributed to the issue. Various methods of quantification of pharmaceuticals were applied in the government sector and in non-governmental organizations. Lack of reliable data is one of the major problems in pharmaceutical provision. The Global Fund has the best quantification methods, fed by consumption data and malaria cases. Other issues worsen stock-out: political intervention, work ethic, and basic infrastructure such as unreliable internet connectivity. Major issues impacting on pharmaceutical quantification have been identified. However, current data collection identified limitations within the Andersen model; specifically, a failure to take account of predictors in the healthcare system and the environment (culture/politics/social factors). The next step is to (a) compare models used by three non-governmental agencies with the government model; (b) run the Andersen explanatory model for pharmaceutical expenditure for 2 to 5 drug items used by these three development partners in order to see how it correlates with the present model in terms of quantification and forecasting of needs; (c) repeat objectives (a) and (b) using the government model; and (d) draw a conclusion about their strengths.

Keywords: inventory management, pharmaceutical forecasting and quantification, pharmaceutical stock-out, pharmaceutical supply chain management

Procedia PDF Downloads 217
1503 Urinary Exosome miR-30c-5p as a Biomarker for Early-Stage Clear Cell Renal Cell Carcinoma

Authors: Shangqing Song, Bin Xu, Yajun Cheng, Zhong Wang

Abstract:

miRNAs derived from exosomes in body fluids such as urine have been regarded as potential biomarkers for the diagnosis and prognosis of various human cancers, as mature miRNAs can be stably preserved by exosomes. However, their potential value in clear cell renal cell carcinoma (ccRCC) diagnosis and prognosis remains unclear. In the present study, differentially expressed miRNAs from urinary exosomes were identified by next-generation sequencing (NGS) technology. Sixteen differentially expressed miRNAs were identified between ccRCC patients and healthy donors. To explore a specific diagnostic biomarker of ccRCC, we validated these urinary exosomes by qRT-PCR in 70 early-stage renal cancer patients, 30 healthy people, and patients with other urinary system cancers, including 30 early-stage prostate cancer patients and 30 early-stage bladder cancer patients. The results showed that urinary exosomal miR-30c-5p could be stably amplified, and its expression showed no significant difference between the other urinary system cancers and healthy controls; however, the expression level of miR-30c-5p in urinary exosomes of ccRCC patients was lower than in healthy people, and the receiver operating characteristic (ROC) curve showed that the area under the curve (AUC) value was 0.8192 (95% confidence interval 0.7388-0.8996, P=0.0000). In addition, up-regulating miR-30c-5p expression could inhibit renal cell carcinoma cell growth. Lastly, HSPA5 was found to be a direct target gene of miR-30c-5p, and HSPA5 depletion reversed the promoting effect on ccRCC growth caused by the miR-30c-5p inhibitor. In conclusion, this study demonstrated that urinary exosomal miR-30c-5p is readily accessible as a diagnostic biomarker of early-stage ccRCC, and that miR-30c-5p might modulate the expression of HSPA5, which correlates with the progression of ccRCC.
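
For illustration only, the sketch below shows how an ROC curve and AUC of the kind reported here can be computed from per-sample expression values with scikit-learn. The expression values are simulated stand-ins, not the study's measurements.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical qRT-PCR readouts (e.g., -deltaCt values): expression is lower
# in ccRCC urine samples than in healthy donors.
ccrcc   = rng.normal(loc=-1.0, scale=1.0, size=70)   # 70 early-stage ccRCC patients
healthy = rng.normal(loc=0.0, scale=1.0, size=30)    # 30 healthy donors

labels = np.r_[np.ones(70), np.zeros(30)]
# Lower expression indicates disease, so the diagnostic score is the negated value.
scores = -np.r_[ccrcc, healthy]

auc = roc_auc_score(labels, scores)
fpr, tpr, thresholds = roc_curve(labels, scores)
print(f"AUC = {auc:.4f}")   # the abstract reports 0.8192 (95% CI 0.7388-0.8996)
```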

Keywords: clear cell renal cell carcinoma, exosome, HSPA5, miR-30c-5p

Procedia PDF Downloads 245
1502 Assessing Brain Targeting Efficiency of Ionisable Lipid Nanoparticles Encapsulating Cas9 mRNA/gGFP Following Different Routes of Administration in Mice

Authors: Meiling Yu, Nadia Rouatbi, Khuloud T. Al-Jamal

Abstract:

Background: Treatment of neurological disorders with modern medical and surgical approaches remains difficult. Gene therapy, allowing the delivery of genetic material that encodes potential therapeutic molecules, represents an attractive option. The treatment of brain diseases with gene therapy requires the gene-editing tool to be delivered efficiently to the central nervous system. In this study, we explored the efficiency of different delivery routes, namely intravenous (i.v.), intra-cranial (i.c.), and intra-nasal (i.n.), to deliver stable nucleic acid-lipid particles (SNALPs) containing gene-editing tools, namely Cas9 mRNA and sgRNA encoding for GFP as a reporter protein. We hypothesise that SNALPs can reach the brain and perform gene-editing to different extents depending on the administration route. Intranasal administration (i.n.) offers an attractive and non-invasive way to access the brain, circumventing the blood–brain barrier. Successful delivery of gene-editing tools to the brain offers a great opportunity for therapeutic target validation and nucleic acid therapeutics delivery to improve treatment options for a range of neurodegenerative diseases. In this study, we utilised Rosa26-Cas9 knock-in mice, expressing GFP, to study brain distribution and gene-editing efficiency of SNALPs after i.v., i.c., and i.n. routes of administration. Methods: A single guide RNA (sgRNA) against GFP was designed and validated by an in vitro nuclease assay. SNALPs were formulated and characterised using dynamic light scattering. The encapsulation efficiency of nucleic acids (NA) was measured by the RiboGreen™ assay. SNALPs were incubated in serum to assess their ability to protect NA from degradation. Rosa26-Cas9 knock-in mice were administered SNALPs i.v., i.n., or i.c. to test in vivo gene-editing (GFP knockout) efficiency. SNALPs were given as three doses of 0.64 mg/kg sgGFP following i.v. and i.n. administration, or as a single dose of 0.25 mg/kg sgGFP following i.c. administration. Knockout efficiency was assessed after seven days using Sanger sequencing and Inference of CRISPR Edits (ICE) analysis. In vivo biodistribution of DiR-labelled SNALPs (SNALPs-DiR) was assessed at 24 h post-administration using an IVIS Lumina Series III. Results: The serum-stable SNALPs produced were 130-140 nm in diameter with ~90% nucleic acid loading efficiency. SNALPs could reach and stay in the brain for up to 24 h following i.v., i.n., and i.c. administration. Decreasing GFP expression (around 50% after i.v. and i.c. and 20% following i.n.) was confirmed by optical imaging. Despite the small number of mice used, ICE analysis confirmed GFP knockout in mouse brains. Additional studies are currently taking place to increase mouse numbers. Conclusion: The results confirmed efficient gene knockout achieved by SNALPs in Rosa26-Cas9 knock-in mice expressing GFP following the different routes of administration, in the order i.v. = i.c. > i.n. Each of the administration routes has its pros and cons. The next stages of the project involve assessing gene-editing efficiency in wild-type mice and replacing GFP as a model target with therapeutic target genes implicated in Motor Neuron Disease pathology.

Keywords: CRISPR, nanoparticles, brain diseases, administration routes

Procedia PDF Downloads 83
1501 Enhanced Multi-Scale Feature Extraction Using a DCNN by Proposing Dynamic Soft Margin SoftMax for Face Emotion Detection

Authors: Armin Nabaei, M. Omair Ahmad, M. N. S. Swamy

Abstract:

Many facial expression and emotion recognition methods have been proposed using traditional approaches such as LDA, PCA, and EBGM. In recent years, deep learning models have provided a unique platform by automatically extracting the features needed for the detection of facial expressions and emotions. However, deep networks require large training datasets to extract automatic features effectively. In this work, we propose an efficient emotion detection algorithm using face images when only small datasets are available for training. We design a deep network whose feature extraction capability is enhanced by utilizing several parallel modules between the input and output of the network, each focusing on the extraction of different types of coarse features with fine-grained details to break the symmetry of the produced information. In effect, we leverage long-range dependencies, the lack of which is one of the main drawbacks of CNNs. We extend this work by introducing a Dynamic Soft-Margin SoftMax. The conventional SoftMax suffers from reaching the gold labels very quickly, which drives the model to over-fitting, because it is not able to determine adequately discriminant feature vectors for some variant class labels. We reduce the risk of over-fitting by using a dynamic, rather than static, shape of the input tensor in the SoftMax layer and by specifying a desired soft margin. In effect, the margin acts as a controller of how hard the model should work to push dissimilar embedding vectors apart. The proposed categorical loss has the objective of compacting the same class labels and separating different class labels in the normalized log domain. We penalize predictions with high divergence from the ground-truth labels, shortening correct feature vectors and enlarging false prediction tensors; that is, we assign more weight to classes that lie close to each other (namely, “hard labels to learn”). By doing so, we constrain the model to generate more discriminative feature vectors for variant class labels. Finally, for the proposed optimizer, our focus is on addressing the weak convergence of the Adam optimizer for non-convex problems. Our optimizer works with an alternative gradient-updating procedure using an exponentially weighted moving average function for faster convergence, and exploits a weight decay method to help drastically reduce the learning rate near optima in order to reach the dominant local minimum. We demonstrate the superiority of the proposed work by surpassing the first rank on three widely used facial expression recognition datasets, with 93.30% on FER-2013, a 16% improvement compared to the first rank after 10 years; 90.73% on RAF-DB; and 100% k-fold average accuracy on the CK+ dataset, and we show that it provides top performance compared to other networks that require much larger training datasets.
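
The abstract does not give the exact formulation, so the PyTorch sketch below only illustrates a soft-margin softmax loss of this general kind, with a tunable margin subtracted from the true-class logit; the margin and scale values, class count, and batch shapes are illustrative assumptions, not the authors' settings.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SoftMarginSoftmaxLoss(nn.Module):
    """Cross-entropy with an additive margin subtracted from the true-class logit.

    A larger margin forces the network to push the correct-class score further
    above the others before the loss saturates, which is one way to delay the
    premature convergence to 'gold labels' described in the abstract.
    """
    def __init__(self, margin: float = 0.35, scale: float = 1.0):
        super().__init__()
        self.margin = margin
        self.scale = scale

    def forward(self, logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        # Subtract the margin only from the logit of the ground-truth class.
        one_hot = F.one_hot(targets, num_classes=logits.size(1)).to(logits.dtype)
        adjusted = self.scale * (logits - self.margin * one_hot)
        return F.cross_entropy(adjusted, targets)

# Usage: 7 emotion classes, a batch of 8 logit vectors from the backbone head.
criterion = SoftMarginSoftmaxLoss(margin=0.35)
logits = torch.randn(8, 7, requires_grad=True)
targets = torch.randint(0, 7, (8,))
loss = criterion(logits, targets)
loss.backward()
```

A dynamic variant in the spirit of the paper could adjust `margin` per batch or per class during training; the fixed value here is only a placeholder.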

Keywords: computer vision, facial expression recognition, machine learning, algorithms, deep learning, neural networks

Procedia PDF Downloads 60
1500 Shape Management Method of Large Structure Based on Octree Space Partitioning

Authors: Gichun Cha, Changgil Lee, Seunghee Park

Abstract:

The objective of this study is to construct a shape management method contributing to the safety of large structures. In Korea, research on shape management is scarce because the technology has only recently been attempted. Terrestrial Laser Scanning (TLS) is used for measurements of large structures. TLS provides an efficient way to actively acquire accurate point clouds of object surfaces or environments. The point clouds provide a basis for rapid modeling in industrial automation, architecture, construction, or maintenance of civil infrastructure. TLS produces a huge amount of point-cloud data, and registration, extraction, and visualization of the data require the processing of a massive amount of scan data. The octree can be applied to shape management of large structures because the scan data is reduced in size while the data attributes are maintained. Octree space partitioning generates voxels of 3D space, and each voxel is recursively subdivided into eight sub-voxels. The point cloud of the scan data was converted to voxels and sampled. The experimental site is located at Sungkyunkwan University, and the scanned structure is a steel-frame bridge. The TLS used was a Leica ScanStation C10/C5. The scan data was condensed by 92%, and the octree model was constructed with a 2-millimeter resolution. This study presents octree space partitioning for handling point clouds, creating a basis for shape management of large structures such as double-deck tunnels, buildings, and bridges. The research is expected to improve the efficiency of structural health monitoring and maintenance. This work is financially supported by the 'U-City Master and Doctor Course Grant Program' and the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (NRF-2015R1D1A1A01059291).
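
As a rough illustration of the leaf-voxel reduction described (one representative point per occupied voxel at a 2 mm resolution), the Python sketch below downsamples a point cloud; the synthetic input, extents, and function name are assumptions, and a production pipeline would normally use a dedicated point-cloud library.

```python
import numpy as np

def octree_downsample(points: np.ndarray, resolution: float = 0.002) -> np.ndarray:
    """Keep one representative point (the centroid) per occupied voxel.

    The voxel grid corresponds to the leaf level of an octree subdivision of
    the bounding box; per-voxel attributes could be averaged the same way.
    """
    origin = points.min(axis=0)
    keys = np.floor((points - origin) / resolution).astype(np.int64)
    # Group points that fall into the same voxel.
    _, inverse, counts = np.unique(keys, axis=0, return_inverse=True, return_counts=True)
    inverse = inverse.reshape(-1)
    centroids = np.zeros((counts.size, 3))
    np.add.at(centroids, inverse, points)      # accumulate points per voxel
    return centroids / counts[:, None]

# Illustrative use: a dense synthetic patch standing in for TLS scan data,
# reduced with 2 mm leaf voxels as in the study.
cloud = np.random.rand(1_000_000, 3) * np.array([0.2, 0.2, 0.02])
reduced = octree_downsample(cloud, resolution=0.002)
print(f"points kept: {reduced.shape[0]:,} "
      f"(compression {1 - reduced.shape[0] / cloud.shape[0]:.1%})")
```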

Keywords: 3D scan data, octree space partitioning, shape management, structural health monitoring, terrestrial laser scanning

Procedia PDF Downloads 286
1499 Impact of Anthropogenic Stresses on Plankton Biodiversity in Indian Sundarban Megadelta: An Approach towards Ecosystem Conservation and Sustainability

Authors: Dibyendu Rakshit, Santosh K. Sarkar

Abstract:

The study presents a comprehensive account of large-scale changes in plankton community structure in relation to water quality characteristics caused by anthropogenic stresses, mainly the annual Gangasagar Festival (AGF) at the southern tip of Sagar Island in the Indian Sundarban wetland, over a 3-year period (2012-2014; n=36). This prograding, vulnerable, and tide-dominated megadelta has formed in the estuarine phase of the Hooghly Estuary and is covered by the largest continuous tract of luxuriant mangrove forest, enriched with high native flora and fauna. The sampling strategy was designed to characterize the changes in the plankton community and water quality over three distinct phases, namely the festival period (January) and the pre- (December) and post-festival (February) periods. Surface water samples were collected for the estimation of different environmental variables as well as for phytoplankton and microzooplankton biodiversity measurement. The preservation and identification of both biotic and abiotic parameters were carried out by standard chemical and biological methods. The intensive human activities led to sharp ecological changes, reflected in a poor water quality index (WQI) due to high turbidity (14.02±2.34 NTU) coupled with low chlorophyll a (1.02±0.21 mg m-3) and dissolved oxygen (3.94±1.1 mg l-1), compared to the pre- and post-festival periods. A sharp reduction in the abundance (4140 to 2997 cells l-1) and diversity (H′=2.72 to 1.33) of phytoplankton and of microzooplankton tintinnids (450 to 328 ind l-1; H′=4.31 to 2.21) was very pronounced. Small-sized tintinnids (average lorica length=29.4 µm; average LOD=10.5 µm), composed of Tintinnopsis minuta, T. lobiancoi, T. nucula, and T. gracilis, were predominant and reached some of their greatest abundances during the festival period. Results of ANOVA revealed significant variation across festival periods for phytoplankton (F=1.77; p=0.006) and tintinnid abundance (F=2.41; p=0.022). RELATE analyses revealed a significant correlation between the variations of the planktonic communities and the environmental data (R=0.107; p=0.005). Three distinct groups were delineated from principal component analysis, in which a set of hydrological parameters acted as the causative factor(s) maintaining the diversity and distribution of the planktonic organisms. The pronounced adverse impact of anthropogenic stresses on the plankton community could lead to environmental deterioration, disrupting the productivity of benthic and pelagic ecosystems as well as fishery potential, which is directly related to livelihood services. The festival can be considered a driver of multiple changes, including beach erosion, shoreline changes, pollution from discarded plastic and electronic wastes, and the destruction of natural habitats resulting in loss of biodiversity. In addition, deterioration in water quality was also evident from the immersion of idols, causing detrimental effects on aquatic biota. The authors strongly recommend adopting integrated scientific and administrative strategies for the resilience, sustainability, and conservation of this megadelta.
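
For readers unfamiliar with the diversity values quoted (H′), the short sketch below computes the Shannon-Wiener index from species counts; the counts are invented for illustration, and the study's logarithm base may differ from the natural log used here.

```python
import numpy as np

def shannon_index(counts):
    """Shannon-Wiener diversity H' = -sum(p_i * ln p_i) from species counts."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

# Hypothetical tintinnid counts (ind/l) per species for one pre-festival sample
# versus one festival-period sample; the loss of evenness lowers H'.
pre_festival = [120, 95, 80, 60, 45, 30, 20]
festival     = [210, 60, 30, 15, 8, 3, 2]

print(f"H' pre-festival: {shannon_index(pre_festival):.2f}")
print(f"H' festival:     {shannon_index(festival):.2f}")
```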

Keywords: Gangasagar festival, phytoplankton, Sundarban megadelta, tintinnid

Procedia PDF Downloads 216
1498 Dynamic Control Theory: A Behavioral Modeling Approach to Demand Forecasting amongst Office Workers Engaged in a Competition on Energy Shifting

Authors: Akaash Tawade, Manan Khattar, Lucas Spangher, Costas J. Spanos

Abstract:

Many grids are increasing the share of renewable energy in their generation mix, which is causing the energy generation to become less controllable. Buildings, which consume nearly 33% of all energy, are a key target for demand response: i.e., mechanisms for demand to meet supply. Understanding the behavior of office workers is a start towards developing demand response for one sector of building technology. The literature notes that dynamic computational modeling can be predictive of individual action, especially given that occupant behavior is traditionally abstracted from demand forecasting. Recent work founded on Social Cognitive Theory (SCT) has provided a promising conceptual basis for modeling behavior, personal states, and environment using control theoretic principles. Here, an adapted linear dynamical system of latent states and exogenous inputs is proposed to simulate energy demand amongst office workers engaged in a social energy shifting game. The energy shifting competition is implemented in an office in Singapore that is connected to a minigrid of buildings with a consistent 'price signal.' This signal is translated into a 'points signal' by a reinforcement learning (RL) algorithm to influence participant energy use. The dynamic model functions at the intersection of the points signals, baseline energy consumption trends, and SCT behavioral inputs to simulate future outcomes. This study endeavors to analyze how the dynamic model trains an RL agent and, subsequently, the degree of accuracy to which load deferability can be simulated. The results offer a generalizable behavioral model for energy competitions that provides the framework for further research on transfer learning for RL, and more broadly— transactive control.
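
A minimal sketch of the kind of linear dynamical system described, with latent behavioural states driven by exogenous inputs (the points signal and a baseline demand), is shown below. All matrices, signals, and the 24-hour horizon are illustrative placeholders rather than fitted values from the study.

```python
import numpy as np

# Latent-state dynamics (e.g., motivation, habit) and input response; values assumed.
A = np.array([[0.90, 0.05],
              [0.00, 0.95]])          # state transition
B = np.array([[0.10, 0.00],
              [0.02, 0.05]])          # response to (points signal, baseline demand)
C = np.array([[1.0, 1.0]])            # map latent states to observed demand (kWh)

rng = np.random.default_rng(1)
x = np.zeros(2)                        # latent behavioural state
demand = []
for t in range(24):
    points_signal = 1.0 if 12 <= t < 18 else 0.0     # RL agent discourages afternoon use
    baseline = 0.5 + 0.3 * np.sin(2 * np.pi * t / 24)
    u = np.array([-points_signal, baseline])          # higher points -> deferred load
    x = A @ x + B @ u + rng.normal(0, 0.01, size=2)   # latent-state update with noise
    demand.append((C @ x).item())

print([round(d, 3) for d in demand])
```

Fitting A, B, and C to observed consumption and points signals, rather than assuming them, would be the step that lets the model simulate load deferability for the RL agent.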

Keywords: energy demand forecasting, social cognitive behavioral modeling, social game, transfer learning

Procedia PDF Downloads 97
1497 Consistent Testing for an Implication of Supermodular Dominance with an Application to Verifying the Effect of Geographic Knowledge Spillover

Authors: Chung Danbi, Linton Oliver, Whang Yoon-Jae

Abstract:

Supermodularity, or complementarity, is a popular concept in economics which can characterize many objective functions such as utility, social welfare, and production functions. Further, supermodular dominance captures a preference for greater interdependence among inputs of those functions, and it can be applied to examine which input set would produce higher expected utility, social welfare, or production. Therefore, we propose and justify a consistent testing for a useful implication of supermodular dominance. We also conduct Monte Carlo simulations to explore the finite sample performance of our test, with critical values obtained from the recentered bootstrap method, with and without the selective recentering, and the subsampling method. Under various parameter settings, we confirmed that our test has reasonably good size and power performance. Finally, we apply our test to compare the geographic and distant knowledge spillover in terms of their effects on social welfare using the National Bureau of Economic Research (NBER) patent data. We expect localized citing to supermodularly dominate distant citing if the geographic knowledge spillover engenders greater social welfare than distant knowledge spillover. Taking subgroups based on firm and patent characteristics, we found that there is industry-wise and patent subclass-wise difference in the pattern of supermodular dominance between localized and distant citing. We also compare the results from analyzing different time periods to see if the development of Internet and communication technology has changed the pattern of the dominance. In addition, to appropriately deal with the sparse nature of the data, we apply high-dimensional methods to efficiently select relevant data.
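
The paper's test statistic is not spelled out in the abstract, so the sketch below only illustrates the recentering device it mentions, applied to a generic one-sided dominance-type test; the statistic, data, and sample sizes are toy stand-ins, not the authors' construction.

```python
import numpy as np

def recentered_bootstrap_pvalue(x, y, stat, n_boot=999, seed=0):
    """Generic recentered bootstrap for a one-sided dominance-type test.

    stat(x, y) is assumed to be large when sample x 'dominates' sample y.
    Resamples are drawn from data recentered to equal means, so the bootstrap
    distribution mimics the least-favourable null of equality.
    """
    rng = np.random.default_rng(seed)
    t_obs = stat(x, y)
    x_centered, y_centered = x - x.mean(), y - y.mean()
    t_boot = np.empty(n_boot)
    for b in range(n_boot):
        xb = rng.choice(x_centered, size=x.size, replace=True)
        yb = rng.choice(y_centered, size=y.size, replace=True)
        t_boot[b] = stat(xb, yb)
    return (1 + np.sum(t_boot >= t_obs)) / (n_boot + 1)

# Toy statistic: a scaled mean difference of some welfare proxy, standing in
# for the paper's functional comparing localized versus distant citing.
def toy_stat(x, y):
    return np.sqrt(min(x.size, y.size)) * (x.mean() - y.mean())

rng = np.random.default_rng(2)
localized = rng.normal(0.2, 1.0, 400)   # hypothetical outcomes, localized citations
distant   = rng.normal(0.0, 1.0, 400)   # hypothetical outcomes, distant citations
print("bootstrap p-value:", recentered_bootstrap_pvalue(localized, distant, toy_stat))
```

The paper's selective recentering and subsampling alternatives would replace the simple recentering step above with their own critical-value constructions.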

Keywords: supermodularity, supermodular dominance, stochastic dominance, Monte Carlo simulation, bootstrap, subsampling

Procedia PDF Downloads 120
1496 Assessment of Hydrogen Demand for Different Technological Pathways to Decarbonise the Aviation Sector in Germany

Authors: Manish Khanra, Shashank Prabhu

Abstract:

The decarbonization of hard-to-abate sectors is currently high on the agenda in the EU and its member states, as these sectors have substantial shares in overall GHG emissions while facing serious challenges to decarbonize. In particular, the aviation sector accounts for 2.8% of global anthropogenic CO₂ emissions. These emissions are anticipated to grow dramatically unless immediate mitigating efforts are implemented. Hydrogen and its derivatives based on renewable electricity can have a key role in the transition towards CO₂-neutral flights. Substantial shares for these energy carriers, in the form of drop-in fuels, direct combustion, and hydrogen-to-electric propulsion, appear promising in most scenarios towards 2050. For creating appropriate policies to ramp up the production and utilisation of hydrogen commodities in the German aviation sector, a detailed analysis of the spatial distribution of supply and demand sites is essential. The objective of this research is to assess the demand for hydrogen-based alternative fuels in the German aviation sector needed to achieve the perceived goal of the ‘Net Zero’ scenario by 2050. The analysis covers the technological pathways for the production and utilisation of these fuels in various aircraft options for reaching mitigation targets. Our method is based on a data-driven bottom-up assessment, considering production and demand sites and their spatial distribution. The resulting energy demand and its spatial distribution, with consideration of technology diffusion, lead to a possible transition pathway for the aviation sector to meet short-term and long-term mitigation targets. Additionally, to achieve mitigation targets in this sector, cost and policy aspects are discussed, which would support decision-makers from airline industries, policymakers, and the producers of energy commodities.
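
A toy bottom-up calculation of the kind implied by the method is sketched below: annual jet fuel energy is split across assumed pathway shares and converted into hydrogen demand. Every figure (baseline energy, shares, conversion factors) is a placeholder assumption, not data from the study.

```python
# All numbers are illustrative placeholders for a back-of-the-envelope estimate.
KEROSENE_BASELINE_PJ = 400        # assumed annual German jet fuel demand (PJ)
LHV_H2_MJ_PER_KG = 120            # lower heating value of hydrogen
H2_PER_MJ_SYNFUEL = 0.009         # assumed kg H2 per MJ of e-kerosene (PtL route)

scenario_2050 = {                 # assumed shares of final energy by pathway
    "drop-in e-kerosene": 0.55,
    "direct H2 combustion": 0.25,
    "H2 fuel cell electric": 0.10,
    "fossil kerosene": 0.10,
}
direct_h2 = {"direct H2 combustion", "H2 fuel cell electric"}

h2_tonnes = 0.0
for pathway, share in scenario_2050.items():
    energy_mj = KEROSENE_BASELINE_PJ * 1e9 * share
    if pathway == "drop-in e-kerosene":
        h2_tonnes += energy_mj * H2_PER_MJ_SYNFUEL / 1000   # H2 feedstock for PtL
    elif pathway in direct_h2:
        h2_tonnes += energy_mj / LHV_H2_MJ_PER_KG / 1000    # H2 used on board

print(f"indicative hydrogen demand: {h2_tonnes / 1e6:.2f} Mt/yr")
```

A full assessment would replace the single national baseline with airport-level traffic and fuel data, which is what gives the spatial distribution of demand discussed above.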

Keywords: the aviation sector, hard-to-abate sectors, hydrogen demand, alternative fuels, technological pathways, data-driven approach

Procedia PDF Downloads 115
1495 Development of Scenarios for Sustainable Next Generation Nuclear System

Authors: Muhammad Minhaj Khan, Jaemin Lee, Suhong Lee, Jinyoung Chung, Johoo Whang

Abstract:

The Republic of Korea has been facing a severe storage crisis from nuclear waste generation, as At-Reactor (AR) temporary storage sites are about to reach saturation. Since the country is densely populated, with 491.78 persons per square kilometer, construction of a high-level waste repository is not a feasible option. In order to tackle the storage waste generation problem, which is increasing at rates of 350 tHM/yr and 380 tHM/yr for the 20 PWRs and 4 PHWRs respectively, the study focuses on the advancement of current nuclear power plants to GEN-IV sustainable and ecological nuclear systems by burning TRUs (Pu, MAs). First, calculations have been made to estimate the generation of SNF, including Pu and MA, from PWR and PHWR NPPs using the IAEA code Nuclear Fuel Cycle Simulation System (NFCSS) for 2016, 2030 (including the saturation period of each site from 2024~2028), 2089, and 2109, as the number of NPPs will increase due to the high import cost of non-nuclear energy sources. Second, in order to produce environmentally sustainable nuclear energy systems, four scenarios to burn out the plutonium and MAs are analyzed, concentrating on burning MA only, or MA and Pu together, by utilizing SFR, LFR, and KALIMER-600 burner reactors after recycling the spent oxide fuel from PWRs through the pyroprocessing technology developed by the Korea Atomic Energy Research Institute (KAERI), which shows promising and sustainable future benefits by minimizing HLW generation with regard to waste amount, decay heat, and activity. Finally, concentrating on the front-end and back-end fuel cycles for the open and closed fuel cycles of PWR and Pyro-SFR respectively, an overall assessment has been made which evaluates the quantitative as well as economic competitiveness of the SFR metallic fuel cycle against the PWR once-through nuclear fuel cycle.
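
Purely to illustrate the scale implied by the quoted arisings, the back-of-the-envelope projection below accumulates spent fuel at the stated constant rates; the constant-fleet assumption and the choice of horizon years are simplifications of the scenarios actually modelled with NFCSS.

```python
# Spent-fuel accumulation implied by the quoted arisings (350 tHM/yr from 20 PWRs,
# 380 tHM/yr from 4 PHWRs), assuming a constant fleet from a 2016 start year.
pwr_rate, phwr_rate = 350, 380      # tHM per year
start_year = 2016

for horizon in (2030, 2089, 2109):
    years = horizon - start_year
    total = (pwr_rate + phwr_rate) * years
    print(f"by {horizon}: ~{total:,} tHM accumulated (constant-fleet assumption)")
```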

Keywords: GEN IV nuclear fuel cycle, nuclear waste, waste sustainability, transmutation

Procedia PDF Downloads 342
1494 Enhancing Employee Innovative Behaviours Through Human Resource Wellbeing Practices

Authors: Jarrod Haar, David Brougham

Abstract:

The present study explores the links between supporting employee well-being and the potential benefits to employee performance. We focus on employee innovative work behaviors (IWBs), which have three stages: (1) development, (2) adoption, and (3) implementation of new ideas and work methods. We explore the role of organizational support focusing on employee well-being via High-Performance Work Systems (HPWS). HPWS are HR practices that are designed to enhance employees’ skills, commitment, and ultimately, productivity. HPWS influence employee performance through building their skills, knowledge, and abilities and there is meta-analytic support for firm-level HPWS influencing firm performance, but less attention towards employee outcomes, especially innovation. We explore HPWS-wellbeing being offered (e.g., EAPs, well-being App, etc.) to capture organizational commitment to employee well-being. Under social exchange theory, workers should reciprocate their firm's offering of HPWS-wellbeing with greater efforts towards IWBs. Further, we explore playful work design as a mediator, which represents employees proactively creating work conditions that foster enjoyment/challenge but don’t require any design change to the job itself. We suggest HPWS-wellbeing can encourage employees to become more playful, and ultimately more innovative. Finally, beyond direct effects, we examine whether these relations are similar by gender and ultimately test a moderated mediation model. Using N=1135 New Zealand employees, we established measures with confirmatory factor analysis (CFA), and all measures had good psychometric properties (α>.80). We controlled for age, tenure, education, and hours worked and analyzed data using the PROCESS macro (version 4.2) specifically model 8 (moderated mediation). We analyzed overall IWB, and then again across the three stages. Overall, we find HPWS-wellbeing is significantly related to overall IWBs and the three stages (development, adoption, and implementation) individually. Similarly, HPWS-wellbeing shapes playful work design and playful work design predicts overall IWBs and the three stages individually. It only partially mediates the effects of HPWS-wellbeing, which retains a significant indirect effect. Moderation effects are supported, with males reporting a more significant effect from HPWS-wellbeing on playful work design but not IWB (or any of the three stages) than females. Females report higher playful work design when HPWS-wellbeing is low, but the effects are reversed when HPWS-wellbeing is high (males higher). Thus, males respond stronger under social exchange theory from HPWS-wellbeing, at least towards expressing playful work design. Finally, evidence of moderated mediation effects is found on overall IWBs and the three stages. Males report a significant indirect effect from HPWS-wellbeing on IWB (through playful work design), while female employees report no significant indirect effect. The benefits of playful work design fully account for their IWBs. The models account for small amounts of variance towards playful work design (12%) but larger for IWBs (26%). The study highlights a gap in the literature on HPWS-wellbeing and provides empirical evidence of their importance towards worker innovation. Further, gendered effects suggest these benefits might not be equal. 
The findings provide useful insights for organizations into why HR practices that support employee well-being are important, although how they work for different genders needs further exploration.
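
The analysis described (PROCESS macro, model 8) can be approximated with two regressions and a comparison of conditional indirect effects. The heavily simplified Python sketch below uses synthetic data; all coefficients and variable names are illustrative, and the bootstrap confidence intervals that PROCESS reports are omitted.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the survey data (N = 1135); variable names follow the
# abstract: hpws = HPWS-wellbeing, pwd = playful work design, iwb = innovative
# work behaviours, male = gender dummy. Effect sizes are invented.
rng = np.random.default_rng(3)
n = 1135
hpws = rng.normal(0, 1, n)
male = rng.integers(0, 2, n)
pwd = 0.30 * hpws + 0.20 * hpws * male + rng.normal(0, 1, n)
iwb = 0.25 * pwd + 0.10 * hpws + rng.normal(0, 1, n)
df = pd.DataFrame(dict(hpws=hpws, male=male, pwd=pwd, iwb=iwb))

# PROCESS model 8: the moderator (gender) conditions the a-path and the direct
# path; the mediator then predicts the outcome (b-path).
m_model = smf.ols("pwd ~ hpws * male", data=df).fit()
y_model = smf.ols("iwb ~ pwd + hpws * male", data=df).fit()

a1, a3 = m_model.params["hpws"], m_model.params["hpws:male"]
b1 = y_model.params["pwd"]
print("indirect effect, females:", round(a1 * b1, 3))
print("indirect effect, males:  ", round((a1 + a3) * b1, 3))
print("index of moderated mediation:", round(a3 * b1, 3))
```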

Keywords: human resource practices, wellbeing, innovation, playful work design

Procedia PDF Downloads 69
1493 Polymer Mixing in the Cavity Transfer Mixer

Authors: Giovanna Grosso, Martien A. Hulsen, Arash Sarhangi Fard, Andrew Overend, Patrick. D. Anderson

Abstract:

In many industrial applications and, in particular in polymer industry, the quality of mixing between different materials is fundamental to guarantee the desired properties of finished products. However, properly modelling and understanding polymer mixing often presents noticeable difficulties, because of the variety and complexity of the physical phenomena involved. This is the case of the Cavity Transfer Mixer (CTM), for which a clear understanding of mixing mechanisms is still missing, as well as clear guidelines for the system optimization. This device, invented and patented by Gale at Rapra Technology Limited, is an add-on to be mounted downstream of existing extruders, in order to improve distributive mixing. It consists of two concentric cylinders, the rotor and stator, both provided with staggered rows of hemispherical cavities. The inner cylinder (rotor) rotates, while the outer (stator) remains still. At the same time, the pressure load imposed upstream, pushes the fluid through the CTM. Mixing processes are driven by the flow field generated by the complex interaction between the moving geometry, the imposed pressure load and the rheology of the fluid. In such a context, the present work proposes a complete and accurate three dimensional modelling of the CTM and results of a broad range of simulations assessing the impact on mixing of several geometrical and functioning parameters. Among them, we find: the number of cavities per row, the number of rows, the size of the mixer, the rheology of the fluid and the ratio between the rotation speed and the fluid throughput. The model is composed of a flow part and a mixing part: a finite element solver computes the transient velocity field, which is used in the mapping method implementation in order to simulate the concentration field evolution. Results of simulations are summarized in guidelines for the device optimization.

Keywords: mixing, non-Newtonian fluids, polymers, rheology

Procedia PDF Downloads 366
1492 Design Transformation to Reduce Cost in Irrigation Using Value Engineering

Authors: F. S. Al-Anzi, M. Sarfraz, A. Elmi, A. R. Khan

Abstract:

Researchers are responding to the environmental challenges of Kuwait in localized, innovative, effective, and economic ways. One of the most significant natural challenges is the lack of water and desertification. In this research, the project team focuses on redesigning a prototype, using Value Engineering Methodology, that would provide functionalities similar to the well-known Waterboxx technology while reducing the capital and operational costs and simplifying the process of manufacturing and use by ordinary farmers. The design employs used tires and recycled plastic sheets as raw materials. Hence, this approach will help not only in fighting desertification but also in getting rid of the ever-growing tire dumpsters in Kuwait, as well as in avoiding the hazards of tire fires, yielding a safer and friendlier environment. Several alternatives for implementing the prototype have been considered. The best alternative in terms of value was selected after a thorough Function Analysis System Technique (FAST) exercise was carried out. A prototype has been fabricated and tested in a controlled, simulated lab environment, and this is being followed by field testing in a real environment. Water and soil analyses were conducted on the site of the experiment to compare the composition of the soil before and after the experiment, to ensure that the prototype being tested is environmentally safe. Experimentation shows that the design was at least as effective as, and may exceed, the original design, with significant savings in cost: an estimated total cost reduction of 43.84% over the original design using the VE approach. This cost reduction does not consider the intangible environmental benefits of waste recycling, which may further increase the total savings of the alternative VE design. This case study shows that Value Engineering Methodology can be an important tool in innovating new designs that reduce costs.

Keywords: desertification, functional analysis, scrap tires, value engineering, waste recycling, water irrigation rationing

Procedia PDF Downloads 191
1491 Effects of the In-Situ Upgrading Project in Afghanistan: A Case Study on the Formally and Informally Developed Areas in Kabul

Authors: Maisam Rafiee, Chikashi Deguchi, Akio Odake, Minoru Matsui, Takanori Sata

Abstract:

Cities in Afghanistan have been rapidly urbanized; however, many parts of these cities have been developed with no detailed land use plan or infrastructure. In other words, they have been informally developed without any government leadership. The new government started the In-situ Upgrading Project in Kabul to upgrade roads, the water supply network system, and the surface water drainage system on the existing street layout in 2002, with the financial support of international agencies. This project is an appropriate emergency improvement for living life, but not an essential improvement of living conditions and infrastructure problems because the life expectancies of the improved facilities are as short as 10–15 years, and residents cannot obtain land tenure in the unplanned areas. The Land Readjustment System (LRS) conducted in Japan has good advantages that rearrange irregularly shaped land lots and develop the infrastructure effectively. This study investigates the effects of the In-situ Upgrading Project on private investment, land prices, and residents’ satisfaction with projects in Kart-e-Char, where properties are registered, and in Afshar-e-Silo Lot 1, where properties are unregistered. These projects are located 5 km and 7 km from the CBD area of Kabul, respectively. This study discusses whether LRS should be applied to the unplanned area based on the questionnaire and interview responses of experts experienced in the In-situ Upgrading Project who have knowledge of LRS. The analysis results reveal that, in Kart-e-Char, a lot of private investment has been made in the construction of medium-rise (five- to nine-story) buildings for commercial and residential purposes. Land values have also incrementally increased since the project, and residents are commonly satisfied with the road pavement, drainage systems, and water supplies, but dissatisfied with the poor delivery of electricity as well as the lack of public facilities (e.g., parks and sport facilities). In Afshar-e-Silo Lot 1, basic infrastructures like paved roads and surface water drainage systems have improved from the project. After the project, a few four- and five-story residential buildings were built with very low-level private investments, but significant increases in land prices were not evident. The residents are satisfied with the contribution ratio, drainage system, and small increase in land price, but there is still no drinking water supply system or tenure security; moreover, there are substandard paved roads and a lack of public facilities, such as parks, sport facilities, mosques, and schools. The results of the questionnaire and interviews with the four engineers highlight the problems that remain to be solved in the unplanned areas if LRS is applied—namely, land use differences, types and conditions of the infrastructure still to be installed by the project, and time spent for positive consensus building among the residents, given the project’s budget limitation.

Keywords: in-situ upgrading, Kabul city, land readjustment, land value, planned area, private investment, residents' satisfaction, unplanned area

Procedia PDF Downloads 188
1490 Interpretations of Disaster: A Comparative Study on Disaster Film Cycles

Authors: Chi-Ying Yu

Abstract:

In real life, the occurrence of disasters is always dreadful and heartbreaking, yet paradoxically, disaster film is a genre that has been popular at periodic intervals in motion picture history. This study attempts to compare the disaster film cycles of the 1970s, 1990s, and the early 21st century. Two research questions are addressed: First, how this genre has responded to the existing conditions of society in different periods in terms of the disaster proposition? Second, how this genre reflects a certain eternal substance of the human mind in light of its lasting appeal? Through cinematic textual analysis and literature review, this study finds that the emergence of disaster films in the 1970s reflected the turmoil in international relations and domestic politics situation in contemporary American society, and cinema screens showed such disaster stories as shipwrecks, air accidents, and skyscraper blazes due to human negligence. The 1990s saw the fervor of millennial apocalypse legends, and the awakening of environmental consciousness, which, together with the rapid advances in digital technology, once again gave rise to a frenzy of disaster films, with natural disasters and threats from aliens as the major themes of disasters. Since the beginning of the 21st century, the 911 Incident and natural disasters around the world have generated a consciousness of imminent crisis. Cinematic images simulated actual disasters, while aesthetic techniques focused on creating a kind of ‘empathetic’ experience in their exploration of the essence of the disaster experience. At the same time, post-apocalypse films that focus on post-disaster reconstruction have become an even more popular theme. Taking the approach of Jungian/post-Jungian film study, this study also reviews and interprets the commonly exhibited subliminal feelings in the disaster films of the three different periods. The imagination of disaster seems to serve as an underlying state of the human mind.

Keywords: disaster film, Jungian/post-Jungian film studies, stimulation, sublime

Procedia PDF Downloads 243
1489 Legal Personality and Responsibility of Robots

Authors: Mehrnoosh Abouzari, Shahrokh Sahraei

Abstract:

The arrival of artificial intelligence and smart robots in the modern world puts them in charge of precise and risky tasks. Having robots perform human activities raises criminal or civil responsibility for their acts or behavior. The practical use of smart robots places them in a unique situation when naturalization happens and smart robots are identified as members of society. Several legal situations would arise from adopting these new smart citizens. The first situation concerns the legal responsibility of robots. Recognizing the naturalization of robots involves some basic rights: humans have the rights to employment, property, housing, use of energy, and other human rights, and these may be extended to robots. How would these rights be exercised in society, and if problems arise with these rights, how would civil responsibility and punishment be handled? Should we consider robots as part of the population and include them in social programs? The second situation concerns the criminal responsibility of robots performing important activities in place of humans, which is the aim of inventing robots that handle work using AI technology; the problem arises when accidents are caused by robots in charge of important activities such as the military, surgery, transportation, and judgment. Moreover, recognizing an independent identity for robots in the legal world, through registered ID cards, naturalization, and civil rights, prepares for them the same rights and obligations as humans. Civil responsibility is therefore not avoidable, and if a robot commits a crime, it would bear criminal responsibility and have to be punished. The basic components of criminal responsibility may change in such situations. For example, if criminal responsibility for humans is bound to sanity, maturity, and voluntariness, for robots it could be bound to being intelligent, well programmed, not being hacked, and so on. It would be irrational to punish robots by imprisonment, execution, or other bodily punishments designed for humans. We may instead devise digital punishments, such as changing or repairing programs, exchanging some parts of the robot's body, or wrecking it completely. Finally, the responsibility of the smart robot's creators, programmers, the chief executive, the organization that employed the robot, and the government that permitted the use of the robot in important facilities and activities will be analyzed and investigated in this article.

Keywords: robot, artificial intelligence, personality, responsibility

Procedia PDF Downloads 133
1488 Organizational Inertia as a Control Mechanism for Organizational Creativity and Agility in a Disruptive Environment

Authors: Doddy T. P. Enggarsyah, Soebowo Musa

Abstract:

The Covid-19 pandemic has changed business environments and spread economic contagion rapidly, as the stringent lockdowns and social distancing measures initially intended to cut off the spread of the virus have instead cut off the flow of economies. With no existing experience or playbook for dealing with such a crisis, the prolonged pandemic can lead to bankruptcies, despite cases of companies that are able not only to survive but also to increase sales and create more jobs amid the economic crisis. This quantitative study clarifies conflicting findings on organizational inertia and whether it is a better strategy to adopt in a disruptive environment. 316 respondents working in diverse firms across various industry types in Indonesia completed the survey, a response rate of 63.2%. Further, this study clarifies the roles of, and relationships between, organizational inertia, organizational creativity, organizational agility, and organizational resilience as potential determinants of firm performance in a disruptive environment. The findings confirm that a firm's organizational inertia provides strong protection of the organization's fundamental orientation, which eventually confines the organization's ability to build adequate creative and adaptive responses; this fundamental orientation is built from path dependency along with past success and prolonged firm performance. Organizational inertia acts as a control mechanism to ensure the adequacy of the given responses. The term adequate is important, as being overly creative in a disruptive environment may have a contradictory result, since it can burden firm performance. In a disruptive environment, organizations will limit creativity by focusing on creativity that supports resilience, and new technology adoption will be limited since the costs of learning and implementation are perceived as greater than the potential gains. The optimal path towards firm performance runs through organizational resilience, as in a disruptive environment the survival of the organization takes precedence over firm performance.

Keywords: disruptive environment, organizational agility, organizational creativity, organizational inertia, organizational resilience

Procedia PDF Downloads 102
1487 Financial Policies in the Process of Global Crisis: Case Study of Kosovo

Authors: Shpetim Rezniqi

Abstract:

The current crisis has swept the world, hitting hardest the most developed countries — those that account for the largest share of world gross product and enjoy a high standard of living. Even non-experts can describe the visible consequences of the crisis, but how far it will go is impossible to predict; even the leading experts offer only conjecture and diverge widely, though they agree on one point: the devastating effects of this crisis will be more severe than ever before and cannot be predicted. For a long time, the world was dominated by the economic theory of free market laws, with the belief that the market is the regulator of all economic problems: like river water, the market would flow until it found the best course and the necessary solution. Hence the call for fewer market barriers and less state intervention, with the market treated as economically self-regulating. The free market economy became the model of global economic development and progress; it transcended national barriers and became the law of development of the entire world economy. Globalization and global market freedom were the principles of development and international cooperation. All international organizations, such as the World Bank, and the economically powerful states laid down the free market economy and the elimination of state intervention as principles of development and cooperation. The less state intervention, the more freedom of action — this was the leading international market principle. We live in an era of financial tragedy. Financial markets, and banking economies in particular, are in a dire state: US stock markets fell by about 40%, making this one of the darkest moments since 1920. Ahead of it rank only the collapse of Wall Street stocks in 1929, the technology collapse of 2000, the crisis of 1973 after the Yom Kippur war, when the price of oil quadrupled, and the famous collapse of 1937-38, when Europe was entering World War II. In 2000, even though it seemed as if the end of the world was around the corner, the world economy survived almost intact; of course, there were small recessions in the United States, Europe, and Japan. The situation was much more difficult in the crises of the 1930s and the 1970s, yet the world pulled through. The recent financial crisis, however, shows every sign of being much sharper and of carrying more consequences. The decline in stock prices is more a by-product of what is really happening: financial markets began their dance of death with the credit crisis, which came as a result of the large increase in real estate prices and household debt. These last two phenomena match very well the excesses of the 1920s, a period during which people spent as if there were no tomorrow. The word recession is on everyone's lips, and that is no longer sudden or abrupt. But the more the financial markets melt down, the greater the risk of a troubled economy for years to come. The banking crisis in Japan, for example, proved much more severe than initially expected, partly because the assets on which many loans were based, especially land, kept falling in value; land prices in Japan have now been falling for about 15 years (Adri Nurellari, published in the newspaper "Classifieds"). At this moment, it is still difficult to estimate the extent to which the crisis has affected the economy and what its consequences will be. What we know is that many banks will need more time before they can resume lending; since lending is a bank's primary function, this means huge losses.

Keywords: globalisation, finance, crisis, recommendation, bank, credits

Procedia PDF Downloads 375
1486 Synthesis of Double Dye-Doped Silica Nanoparticles and Its Application in Paper-Based Chromatography

Authors: Ka Ho Yau, Jan Frederick Engels, Kwok Kei Lai, Reinhard Renneberg

Abstract:

The lateral flow test is a prevalent technology in various sectors such as food, pharmacology and the biomedical sciences. Colloidal gold (CG) is widely used as the signalling molecule because of its ease of synthesis, its biomolecular conjugation and its red colour arising from the intrinsic surface plasmon resonance effect (SPRE). However, the production of colloidal gold is costly and requires vigorous conditions, and the stability of colloidal gold is easily affected by environmental factors such as pH and high salt content. Silica nanoparticles are well known for their ease of production and stability over a wide range of solvents. Using a reverse micro-emulsion (w/o), silica nanoparticles of different sizes can be produced precisely by controlling the amount of water, and by incorporating different water-soluble dyes, a rainbow of silica nanoparticle colours can be produced. Conjugation with biomolecules such as antibodies can be achieved after surface modification of the silica nanoparticles with an organosilane. The optimum amount of antibody for labelling was determined by the Bradford assay. In this work, we demonstrated the ability of dye-doped silica nanoparticles to act as the signalling molecule in a lateral flow test, which provided a semi-quantitative measurement of the analyte. Image analysis gave an LOD of 10 ng of the analyte. The working range and the linear range of the test were 0-2.15 μg/mL and 0-1.07 μg/mL (R²=0.988), respectively. The performance of the tests was comparable to those using colloidal gold, with the advantages of lower cost, enhanced stability and a wide spectrum of colours. The positive lines can be read by the naked eye or imaged with a mobile phone camera for better quantification. Further research has been carried out on the simultaneous multicolour detection of different biomarkers, and the preliminary results were promising, as little cross-reactivity was observed for an optimized system. This approach provides a platform for multicolour detection of a set of biomarkers that enhances the accuracy of disease diagnostics.
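As an illustration only (not the authors' image-analysis pipeline), the semi-quantitative read-out described above amounts to fitting a linear calibration of background-corrected test-line intensity against analyte concentration; the intensity values in the sketch below are hypothetical placeholders, since the abstract reports only the resulting ranges and R².

```python
# Minimal sketch: linear calibration of a lateral-flow test line read from a phone image.
# The intensity values are hypothetical placeholders, not the study's data.
import numpy as np

# Analyte concentration (ug/mL) within the reported linear range (0 - 1.07 ug/mL)
conc = np.array([0.0, 0.27, 0.54, 0.80, 1.07])
# Hypothetical background-corrected test-line intensities from image analysis
intensity = np.array([0.02, 0.21, 0.44, 0.63, 0.85])

slope, intercept = np.polyfit(conc, intensity, 1)     # least-squares straight line
pred = slope * conc + intercept
r2 = 1 - np.sum((intensity - pred) ** 2) / np.sum((intensity - intensity.mean()) ** 2)

print(f"intensity = {slope:.3f} * conc + {intercept:.3f}, R^2 = {r2:.3f}")
# An unknown sample's concentration is then estimated as (I - intercept) / slope.
```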

Keywords: colorimetric detection, immunosensor, paper-based biosensor, silica

Procedia PDF Downloads 371
1485 Cultural Barriers in the Communication of Breast Cancer in Sub-Saharan Africa

Authors: Kayum Fokoue Carole

Abstract:

This paper aims to verify the effectiveness of reaching target populations by paying attention to their cultural background when communicating new knowledge, ideas or technology in a multicultural world. Our case study is an experiment on the communication of knowledge about breast cancer in three sub-Saharan countries (Ghana, Chad, and Cameroon). The methodology consisted of submitting a semi-structured questionnaire to local populations in selected localities in these target countries in order to determine the cultural barriers hindering the effective communication of knowledge about breast cancer. Once this was done, sensitization documents on breast cancer were translated into Ewe (Ghana), Mbaye (Chad), Ghomala', Ewondo, and Fufulde (Cameroon). In each locality, a sensitization programme was organised for two groups: for one group, the cultural barriers identified were taken into consideration during the communication, whereas for the other group they were not. Another questionnaire was distributed after three months to assess the level of appropriation among those who attended the campaign, based on Chumbow's appropriation theory. This paper therefore discusses some of the spiritual beliefs, representations and practices in the target African communities that hinder the effective communication of issues relating to breast cancer in the target localities. Findings reveal that only 38% of respondents in the group for whom cultural barriers were not taken into account had a high level of appropriation, whereas 86% of the other group did. This is evidence that the communication of issues relating to breast cancer can be more effective when different populations are reached in the language they master best and with attention paid to their culture. International communication of new knowledge should therefore be culturally contextualised, and suggestions at the end of the paper are directed towards achieving this goal. The present work promotes international partnership in addressing and resolving global health concerns, since research findings from one community or country can be mutualized in partnership with other communities and countries.

Keywords: cultural barriers, communication, health, breast cancer

Procedia PDF Downloads 321
1484 Environmental Impacts Assessment of Power Generation via Biomass Gasification Systems: Life Cycle Analysis (LCA) Approach for Tars Release

Authors: Grâce Chidikofan, François Pinta, A. Benoist, G. Volle, J. Valette

Abstract:

Statement of the problem: Biomass gasification systems may be relevant for decentralized power generation from recoverable agricultural and wood residues available in rural areas. In recent years, many systems have been implemented all over the world, especially in Cambodia and India. Although they have many positive effects, these systems can also affect the environment and human health. During the process of biomass gasification, black wastewater containing tars is produced and is generally discharged into the local environment, either into rivers or onto soil. However, in most environmental assessment studies of biomass gasification systems, the impact of these releases is underestimated because of the difficulty of identifying their chemical substances. This work deals with the analysis of the environmental impacts of tars from wood gasification in terms of human toxicity cancer effect, human toxicity non-cancer effect, and freshwater ecotoxicity. Methodology: A Life Cycle Assessment (LCA) approach was adopted. The inventory of tar chemical substances was based on experimental data from a downdraft gasification system. The composition was determined for six samples from two batches of raw materials: one batch made of a mix of tree wood species (oak + plane tree + pine) at 25% moisture content and a second batch made of oak at 11% moisture content. The tests were carried out at different gasifier load rates, in the ranges 50-75% and 50-100%, respectively. To choose the environmental impact assessment method, we compared the methods available in the SIMAPRO tool (8.2.0) that take into account most of the chemical substances. The environmental impacts of 1 kg of tars discharged were characterized with the ILCD 2011+ method (V.1.08). Findings: Experimental results revealed 38 important chemical substances in proportions that varied from one test to another. Only 30 are characterized by the ILCD 2011+ method, which is one of the best performing methods. The results show that wood species and moisture content have no significant impact on the human toxicity non-cancer effect (HTNCE) and freshwater ecotoxicity (FWE) for release to water. For the human toxicity cancer effect (HTCE), a small gap is observed between the impact factors of the two batches, namely 3.08E-7 CTUh/kg against 6.58E-7 CTUh/kg. On the other hand, the risk of negative effects was found to be higher when tars are released into water than onto soil for all impact categories. Considering the full set of samples, the average impact factor obtained for HTNCE varies between 1.64E-7 and 1.60E-8 CTUh/kg, and for HTCE it varies between 4.83E-07 CTUh/kg and 2.43E-08 CTUh/kg; the variability of these impact factors is relatively low for these two impact categories. For FWE, the variability of the impact factor is very high: 1.3E+03 CTUe/kg for tars released into water against 2.01E+01 CTUe/kg for tars released onto soil. Concluding statement: The results of this study show that the environmental impacts of tar emissions from biomass gasification systems can be substantial, and it is important to investigate ways to reduce them. For environmental research, these results represent an important step towards a global environmental assessment of the studied systems. They could be used to better manage wastewater containing tars, so as to reduce as far as possible the impacts of the numerous systems still running all over the world.
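For readers unfamiliar with LCIA characterization, the impact factors quoted above are, in essence, sums of each substance's emitted mass multiplied by its characterization factor for a given impact category. The sketch below illustrates that calculation only; the substances and factors are hypothetical placeholders, not the ILCD 2011+ values used in the study.

```python
# Minimal sketch of LCIA characterization: impact = sum(mass_i * CF_i) over substances.
# Substance names and characterization factors (CFs) below are hypothetical placeholders.
inventory_kg_per_kg_tar = {        # emitted mass per kg of tar released to water
    "phenol": 0.12,
    "naphthalene": 0.05,
    "benzene": 0.02,
}
cf_htce_ctuh_per_kg = {            # hypothetical CFs for human toxicity, cancer effect
    "phenol": 1.0e-8,
    "naphthalene": 3.0e-8,
    "benzene": 5.0e-8,
}

impact_ctuh_per_kg_tar = sum(
    mass * cf_htce_ctuh_per_kg.get(substance, 0.0)   # substances without a CF contribute 0
    for substance, mass in inventory_kg_per_kg_tar.items()
)
print(f"HTCE impact factor: {impact_ctuh_per_kg_tar:.2e} CTUh per kg of tar")
```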

Keywords: biomass gasification, life cycle analysis, LCA, environmental impact, tars

Procedia PDF Downloads 268
1483 To Include or Not to Include: Resolving Ethical Concerns over the 20% High Quality Cassava Flour Inclusion in Wheat Flour Policy in Nigeria

Authors: Popoola I. Olayinka, Alamu E. Oladeji, B. Maziya-Dixon

Abstract:

Cassava, an indigenous crop grown locally by subsistence farmers in Nigeria, has the potential to bring economic benefits to the country. Consumption of bread and other confectionery has been rising as the lifestyles of Nigerian consumers change. However, wheat, the major ingredient in bread and confectionery production, does not thrive well in the Nigerian climate, hence the heavy spending on wheat importation. To reduce this spending, the Federal Government of Nigeria intends to pass into law the mandatory inclusion of 20% high-quality cassava flour (HQCF) in wheat flour. While the proposed policy may reduce post-harvest losses of cassava and increase food security and domestic agricultural productivity, there are downsides to the policy, including a reduction in the nutritional quality and lower sensory appeal of cassava-wheat bread, the reluctance of flour millers to use HQCF, and technology and processing challenges, among others. The policy thus presents an ethical dilemma that must be resolved for its successful implementation. While the inclusion of HQCF in wheat flour for bread and confectionery is a topic that has been well addressed, resolving the ethical dilemma arising from it has received little attention. This paper attempts to resolve the dilemma using various approaches in food ethics (cost-benefit, utilitarian, deontological and deliberative). The cost-benefit approach did not adequately resolve the dilemma, as not all the costs and benefits of the policy could be stated in quantitative terms. The utilitarian approach suggests that the policy delivers the greatest good to the greatest number, while the deontological approach suggests that the act (inclusion of HQCF in wheat flour) is right, hence the policy is not utterly wrong. The deliberative approach suggests a win-win situation through deliberation with the parties involved.

Keywords: HQCF, ethical dilemma, food security, composite flour, cassava bread

Procedia PDF Downloads 396
1482 Effect of Resistance Exercise on Hypothalamic-Pituitary-Gonadal Axis

Authors: Alireza Barari, Saeed Shirali, Ahmad Abdi

Abstract:

Introduction: Physical activity may be related to male reproductive function through its effect on the hypothalamic-pituitary-gonadal (HPG) axis. Our aim was to determine the effects of 6 weeks of resistance exercise on the reproductive hormones of the HPG axis. The HPG axis refers to the actions of endocrine glands at three levels: (i) the hypothalamic releasing hormone GnRH, which is synthesized in a small heterogeneous neuronal population and released in a pulsatile fashion; (ii) the anterior pituitary hormones, follicle-stimulating hormone (FSH) and luteinizing hormone (LH); and (iii) the gonadal hormones, which include both steroids, such as testosterone (T), estradiol and progesterone, and peptide hormones (such as inhibin). Hormonal changes that create a more anabolic environment have been suggested to contribute to the adaptation to strength exercise. Physical activity has an extensive impact on male reproductive function, depending on the intensity and duration of the exercise and the fitness level of the individual. However, strenuous exercise represents a physical stress and an inflammatory change that challenges homeostasis. Materials and methods: Sixteen male volunteers were included in a 6-week control period followed by 6 weeks of resistance training (leg press, lat pull, chest press, squat, seated row, abdominal crunch, shoulder press, biceps curl and triceps press-down) four times per week. Training load intensity was 60%-75% of the one-repetition maximum, and participants performed 3 sets of 10 repetitions with two-minute rest periods between exercises and sets. Each session started with a 10-minute warm-up of muscle relaxation and stretching exercises. Body composition, VO2max and the circulating levels of free testosterone (fT), luteinizing hormone (LH), follicle-stimulating hormone (FSH), sex hormone binding globulin (SHBG) and inhibin B were measured before and after the 6-week intervention. The hormone levels of each serum sample were measured using commercially available ELISA kits. Anthropometric data and hormone levels were compared between the two groups using the independent samples t-test in SPSS (version 19); P ≤ 0.05 was considered statistically significant. Results: Both lower- and upper-body muscle strength increased significantly. Aerobic fitness improved in trained participants from 39.4 ± 5.6 to 41.9 ± 5.3 (P = 0.002). fT concentrations rose progressively in the trained group and were significantly greater than in the control group (P = 0.000). By the end of the 6-week resistance training, serum SHBG had increased significantly in the trained group compared with the control group (P = 0.013). In response to resistance training, LH, FSH and inhibin B did not change significantly. Discussion: According to our findings, 6 weeks of resistance training induced fat loss without any change in body weight or BMI. A decline of 25.3% in body fat percentage at a statistically unchanged weight was due to the increase in muscle mass that occurred during the resistance exercise period. Six weeks of resistance training resulted in significant improvements in BF%, VO2max and strength, and in the levels of fT and SHBG.
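A minimal sketch of the kind of independent-samples t-test described in the methods, assuming hypothetical hormone values rather than the study's measurements:

```python
# Minimal sketch (not the study's SPSS analysis): independent-samples t-test comparing
# trained vs control groups. The values below are hypothetical placeholders.
from scipy import stats

trained_ft = [18.2, 19.5, 21.0, 17.8, 20.3, 19.1, 22.4, 18.9]   # free testosterone, pg/mL (hypothetical)
control_ft = [15.1, 16.0, 14.8, 15.9, 16.3, 15.5, 14.9, 15.7]

t_stat, p_value = stats.ttest_ind(trained_ft, control_ft)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")   # p <= 0.05 taken as statistically significant
```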

Keywords: resistance, hypothalamic, pituitary, gonadal axis

Procedia PDF Downloads 389
1481 Ultrasonic Agglomeration of Protein Matrices and Its Effect on Thermophysical, Macro- and Microstructural Properties

Authors: Daniela Rivera-Tobar Mario Perez-Won, Roberto Lemus-Mondaca, Gipsy Tabilo-Munizaga

Abstract:

Different dietary trends worldwide seek foods with anti-inflammatory properties, rich in antioxidants, proteins, and unsaturated fatty acids, that lead to better metabolic, intestinal, mental, and cardiac health. In this sense, food matrices with high protein content based on macro- and microalgae are an excellent alternative for meeting these new consumer needs. An emerging and environmentally friendly technology for producing protein matrices is ultrasonic agglomeration, which consists of the formation of permanent bonds between particles, improving the agglomeration of the matrix compared with conventionally agglomerated (compressed) products. Among the advantages of this process are the reduction of nutrient loss and the avoidance of binding agents. The objective of this research was to optimize the ultrasonic agglomeration process for matrices composed of Spirulina (Arthrospira platensis) powder and Cochayuyo (Durvillaea antarctica) flour. The response variable was Young's modulus, and the independent variables were the process conditions: ultrasonic amplitude (70, 80 and 90%), agglomeration time (20, 25 and 30 seconds) and number of cycles (3, 4 and 5). The process was evaluated using a central composite design and analysed using response surface methodology. In addition, the effects of agglomeration on thermophysical and microstructural properties were evaluated. Fourier transform infrared spectroscopy (FTIR) analysis showed that ultrasonic compression at 80 and 90% amplitude caused conformational changes; according to the scanning electron microscopy (SEM) images of the microstructure and the differential scanning calorimetry (DSC) analysis, the best condition was 90% amplitude for 25 and 30 seconds with 3 and 4 ultrasound cycles. In conclusion, the agglomerated matrices present good macro- and microstructural properties, which would allow the design of food systems with better nutritional and functional properties.
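A minimal sketch, under the assumption that the response surface analysis fits a standard second-order polynomial in the three factors; the design points and Young's modulus values below are hypothetical placeholders, not the study's data:

```python
# Minimal sketch of response surface methodology: fit a second-order polynomial model
# of Young's modulus as a function of amplitude, time and cycles. Hypothetical data only.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Factors: amplitude (%), time (s), cycles  -- hypothetical design points
X = np.array([
    [70, 20, 3], [70, 30, 5], [90, 20, 5], [90, 30, 3],
    [80, 25, 4], [80, 25, 4], [70, 25, 4], [90, 25, 4],
    [80, 20, 4], [80, 30, 4], [80, 25, 3], [80, 25, 5],
])
y = np.array([1.2, 1.5, 1.9, 1.7, 1.6, 1.65, 1.4, 1.8, 1.55, 1.7, 1.5, 1.75])  # Young's modulus (MPa), hypothetical

quad = PolynomialFeatures(degree=2, include_bias=True)          # linear, interaction and squared terms
model = LinearRegression(fit_intercept=False).fit(quad.fit_transform(X), y)

best = X[np.argmax(model.predict(quad.transform(X)))]           # best condition among tested points
print("Predicted best condition (amplitude, time, cycles):", best)
```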

Keywords: ultrasonic agglomeration, physical properties of food, protein matrices, macro and microalgae

Procedia PDF Downloads 48
1480 Creative Skills Supported by Multidisciplinary Learning: Case Innovation Course at the Seinäjoki University of Applied Sciences

Authors: Satu Lautamäki

Abstract:

This paper presents findings from a multidisciplinary bachelor-level course implemented at Seinäjoki University of Applied Sciences, Finland. The course aims to develop students' innovative thinking through projects given by companies, the use of design thinking methods as a tool for creativity, and the integration of students into multidisciplinary teams working on the given projects. The course is obligatory for all first-year bachelor students across four faculties (business and culture, food and agriculture, health care and social work, and technology). It involves around 800 students and 30 pedagogical coaches and is implemented as an intensive one-week course each year. The paper discusses the pedagogy, structure and coordination of the course and reflects on methods for developing creative skills. Experts in a contemporary, global context often work in teams consisting of people who have different areas of expertise and represent various professional backgrounds. That is why there is a strong need for new training methods in which a multidisciplinary approach is at the heart of learning. Creative learning takes place when different parties bring information to the discussion and learn from each other. When students in different fields seek professional growth for themselves and take responsibility for the professional growth of other learners, they form a mutual learning relationship. Members of multidisciplinary teams make decisions both individually and collectively, which helps them to understand and appreciate other disciplines. Our results show that creative, multidisciplinary project learning can develop a diversity of knowledge and competences, for instance students' cultural knowledge, teamwork and innovation competences, and time management and presentation skills, as well as supporting a student's personal development as an expert. It is highly recommended that higher education curricula include studies in which students from different fields work in multidisciplinary teams.

Keywords: multidisciplinary learning, creative skills, innovative thinking, project-based learning

Procedia PDF Downloads 95