Search results for: attention-based fully convolutional network
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6345

1155 Estimation of Twist Loss in the Weft Yarn during Air-Jet Weft Insertion

Authors: Muhammad Umair, Yasir Nawab, Khubab Shaker, Muhammad Maqsood, Adeel Zulfiqar, Danish Mahmood Baitab

Abstract:

Fabric is a flexible woven material consisting of a network of natural or artificial fibers often referred to as thread or yarn. Today fabrics are produced by weaving, braiding, knitting, tufting and non-woven processes. Weaving is a method of fabric production in which warp and weft yarns are interlaced perpendicular to each other. There is an infinite number of ways of interlacing warp and weft yarn, and each produces a different fabric structure. The yarns parallel to the machine direction are called warp yarns and the yarns perpendicular to the machine direction are called weft or filling yarns. Air-jet weaving is the modern method of weft insertion and is considered a high-speed loom. The twist loss that occurs during air-jet weft insertion affects the yarn strength. The aim of this study was to investigate the change in twist of the weft yarn during air-jet weft insertion. A total of 8 samples were produced using 1/1 plain and 3/1 twill weave designs with two fabric widths, all with the same loom settings. Two different yarn types, cotton and a PC blend, were used. The effect of material type, weave design and fabric width on the twist change of the weft yarn was measured and discussed. The twist of each type of weft yarn and weave design after insertion was compared with the twist of the yarn before insertion, and the twist loss was determined. Wider fabric leads to higher twist loss in the yarn.

Keywords: air jet loom, twist per inch, twist loss, weft yarn

Procedia PDF Downloads 382
1154 Neuroinflammation in Late-Life Depression: The Role of Glial Cells

Authors: Chaomeng Liu, Li Li, Xiao Wang, Li Ren, Qinge Zhang

Abstract:

Late-life depression (LLD) is a prevalent mental disorder among the elderly, frequently accompanied by significant cognitive decline, and has emerged as a worldwide public health concern. Microglia, astrocytes, and peripheral immune cells play pivotal roles in regulating inflammatory responses within the central nervous system (CNS) across diverse cerebral disorders. This review commences with clinical research findings and highlights the recent advancements pertaining to microglia and astrocytes in the neuroinflammation process of LLD. The reciprocal communication network between the CNS and immune system is of paramount importance in the pathogenesis of depression and cognitive decline. Stress-induced downregulation of tight and gap junction proteins in the brain results in increased blood-brain barrier permeability and impaired astrocyte function. Concurrently, activated microglia release inflammatory mediators, initiating the kynurenine metabolic pathway and exacerbating the quinolinic acid/kynurenic acid imbalance. Moreover, the balance between Th17 and Treg cells is implicated in the preservation of immune homeostasis within the cerebral milieu of individuals suffering from LLD. The ultimate objective of this review is to present future strategies for the management and treatment of LLD, informed by the most recent advancements in research, with the aim of averting or postponing the onset of Alzheimer's disease (AD).

Keywords: neuroinflammation, late-life depression, microglia, astrocytes, central nervous system, blood-brain barrier, Kynurenine pathway

Procedia PDF Downloads 12
1153 Implementation of Conceptual Real-Time Embedded Functional Design via Drive-By-Wire ECU Development

Authors: Ananchai Ukaew, Choopong Chauypen

Abstract:

Design concepts of real-time embedded systems can be realized initially by introducing novel design approaches. In this work, a model-based design approach and in-the-loop testing were employed early in the conceptual and preliminary phases to formulate design requirements and perform quick real-time verification. The design and analysis methodology includes simulation analysis, model-based testing, and in-the-loop testing. The design of a conceptual drive-by-wire, or DBW, algorithm for an electronic control unit, or ECU, was presented to demonstrate the conceptual design process, analysis, and functionality evaluation. The DBW ECU function concepts can be implemented in the vehicle system to improve electric vehicle, or EV, conversion drivability. However, within a new development process, conceptual ECU functions and parameters need to be evaluated. As a result, the testing system was employed to support the evaluation of conceptual DBW ECU functions. For the current setup, the system components consisted of actual DBW ECU hardware, electric vehicle models, and the controller area network, or CAN, protocol. The vehicle models and the CAN bus interface were both implemented as real-time applications, and ECU and CAN protocol functionality were verified against the design requirements. The proposed system could potentially benefit rapid real-time analysis of design parameters during conceptual system or software algorithm development.
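
For illustration, a minimal in-the-loop CAN exchange of the kind described above can be scripted with the python-can library; the sketch below is not the paper's test harness, and the channel name, message identifier, and pedal-signal scaling are purely hypothetical.

    import can  # python-can library

    # Connect to a SocketCAN interface; 'vcan0' is an assumed virtual test channel.
    bus = can.interface.Bus(channel='vcan0', bustype='socketcan')

    # Hypothetical DBW message: pedal position 0-100% scaled to one byte.
    pedal_percent = 42
    msg = can.Message(arbitration_id=0x1A0,
                      data=[int(pedal_percent * 255 / 100)],
                      is_extended_id=False)
    bus.send(msg)

    # Wait up to 1 s for the ECU (or its real-time vehicle model) to reply.
    reply = bus.recv(timeout=1.0)
    if reply is not None:
        print(f"ECU replied: id=0x{reply.arbitration_id:X} data={reply.data.hex()}")
    else:
        print("No response within timeout - check bus wiring or model timing.")
    bus.shutdown()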

Keywords: drive-by-wire ECU, in-the-loop testing, model-based design, real-time embedded system

Procedia PDF Downloads 339
1152 In-Farm Wood Gasification Energy Micro-Generation System in Brazil: A Monte Carlo Viability Simulation

Authors: Erich Gomes Schaitza, Antônio Francisco Savi, Glaucia Aparecida Prates

Abstract:

The penetration of renewable energy into the electricity supply in Brazil is high, one of the highest in the world. Centralized hydroelectric generation is the main source of energy, followed by biomass and wind. Surprisingly, mini and micro-generation are negligible, with less than 2,000 connections to the national grid. In 2015, a new regulatory framework was put in place to change this situation. In the agricultural sector, the framework was complemented by the offer of low-interest-rate loans for in-farm renewable generation. Brazil proposed to more than double its area of planted forests as part of its Intended Nationally Determined Contribution (INDC) to the U.N. Framework Convention on Climate Change (UNFCCC). This is an ambitious target which will be achieved only if forests are attractive to farmers. Therefore, this paper analyses whether planting forests for in-farm energy generation with a woodchip gasifier is economically viable for micro-generation under the new framework and whether such forests could be an economic driver for forest plantation. At first, a static case was analyzed with data from Eucalyptus plantations on five farms. Then, a broader analysis was developed with the use of the Monte Carlo technique. Planting short-rotation forests to generate energy could be a viable alternative, and the low-interest loans contribute to that. There are some barriers to such systems, such as the absence of a mature market for small-scale equipment and of a reference network of good practices and examples.
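
A minimal sketch of the Monte Carlo viability logic is shown below; the investment, revenue, and cost distributions, the horizon, and the discount rate are placeholder assumptions for illustration, not the values used in the study.

    import numpy as np

    rng = np.random.default_rng(42)
    n_sim, years, discount_rate = 10_000, 7, 0.08   # assumed horizon and rate

    # Placeholder distributions (not the study's data): investment, annual
    # energy revenue, and operating cost of the gasifier system, in R$.
    investment = rng.normal(120_000, 15_000, n_sim)
    annual_revenue = rng.normal(35_000, 6_000, (n_sim, years))
    annual_cost = rng.normal(12_000, 2_500, (n_sim, years))

    discount = (1 + discount_rate) ** np.arange(1, years + 1)
    npv = (annual_revenue - annual_cost) @ (1 / discount) - investment

    print(f"Mean NPV: {npv.mean():,.0f}")
    print(f"Probability NPV > 0: {(npv > 0).mean():.1%}")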

Keywords: biomass, distributed generation, small-scale, Monte Carlo

Procedia PDF Downloads 272
1151 Global Digital Peer-to-Peer (P2P) Lending Platform Empowering Rural India: Determinants of Funding

Authors: Ankur Mehra, M. V. Shivaani

Abstract:

With increasing digitization, the world is coming closer, not only in terms of informational flows but also in terms of capital flows. And micro-finance institutions (MFIs) have perfectly leveraged this digital world by resorting to innovative digital social peer-to-peer (P2P) lending platforms, such as Kiva. These digital P2P platforms bring together micro-borrowers and lenders from across the world. The main objective of this study is to understand the funding preferences of social investors, primarily from developed countries (such as the US, UK, and Australia), lending money to borrowers from rural India at zero interest rates through Kiva. A further objective of this study is to increase awareness about such platforms among the various MFIs engaged in providing micro-loans to those in need. The sample comprises India-based micro-loan applications posted by various MFIs on the Kiva lending platform over the period Sept 2012-March 2016. Out of 7,359 loans, 256 failed to get funded by social investors. On average, a micro-loan with 30 days to expiry gets fully funded in 7,593 minutes, or 5.27 days. 62% of the loans raised on Kiva are related to livelihood, 32.5% are for funding basic necessities, and the remaining 5.5% are for funding education. 47% of the loan applications have more than one borrower, while the currency exchange risk is on the social lenders for 45% of the loans. Controlling for the loan amount and loan tenure, the analyses suggest that loan applications with more than one borrower have a lower chance of getting funded than applications made by a sole borrower. Such group applications also take more time to get funded. Further, a loan application by a solo woman not only has a higher chance of getting funded but also gets funded faster. The results also suggest that loan applications supported by an MFI with a religious affiliation not only have a lower chance of getting funded but also take longer to get funded than applications posted by secular MFIs. The results do not support cross-border currency risk as a factor in explaining the determinants of loan funding. Finally, the analyses suggest that loans raised for the purpose of earning a livelihood and for education have a higher chance of getting funded and get funded faster than loans applied for purposes related to basic necessities such as clothing, housing, food, health, and personal use. The results are robust to controls for an ‘MFI dummy’ and a ‘year dummy’. The key implication of this study is that global social investors tend to develop an emotional connect with single women borrowers, and consequently such loans get funded faster. Hence, MFIs should look for alternative ways of funding loans whose purpose is to meet basic needs, while more loans related to livelihood and education should be raised via digital platforms.
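
Funding determinants of this kind are commonly estimated with a binary-response model; the sketch below fits a logistic regression on synthetic loan-level data whose variable names and effect directions are illustrative assumptions, not the study's dataset or results.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500
    # Synthetic loan-level covariates; names and values are illustrative only.
    df = pd.DataFrame({
        "loan_amount": rng.normal(300, 80, n),        # USD
        "tenure_days": rng.integers(6, 26, n) * 30,
        "group_loan": rng.integers(0, 2, n),
        "solo_woman": rng.integers(0, 2, n),
        "religious_mfi": rng.integers(0, 2, n),
        "currency_risk": rng.integers(0, 2, n),
    })
    # Synthetic outcome loosely mimicking the reported directions of effect.
    logit = 2.0 + 0.8 * df.solo_woman - 0.6 * df.group_loan - 0.5 * df.religious_mfi
    df["funded"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

    X = sm.add_constant(df.drop(columns="funded"))
    result = sm.Logit(df["funded"], X).fit(disp=False)
    print(result.summary())   # sign and significance of each determinant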

Keywords: P2P lending, social investing, fintech, financial inclusion

Procedia PDF Downloads 124
1150 Deep Learning Approach for Chronic Kidney Disease Complications

Authors: Mario Isaza-Ruget, Claudia C. Colmenares-Mejia, Nancy Yomayusa, Camilo A. González, Andres Cely, Jossie Murcia

Abstract:

Quantification of the risks associated with the development of complications from chronic kidney disease (CKD) through accurate survival models can help with patient management. A retrospective cohort study that included patients diagnosed with CKD in a primary care program and followed up between 2013 and 2018 was carried out. Time-dependent and static covariates associated with demographic, clinical, and laboratory factors were included. Deep Learning (DL) survival analyses were developed for three CKD outcomes: CKD stage progression, >25% decrease in Estimated Glomerular Filtration Rate (eGFR), and Renal Replacement Therapy (RRT). Models were evaluated and compared with Random Survival Forest (RSF) based on the concordance index (C-index) metric. A total of 2,143 patients were included. Two models were developed for each outcome. The Deep Neural Network (DNN) model reported a C-index of 0.9867 for CKD stage progression, 0.9905 for reduction in eGFR, and 0.9867 for RRT. Regarding the RSF model, a C-index of 0.6650 was reached for CKD stage progression, 0.6759 for decreased eGFR, and 0.8926 for RRT. DNN models applied in a survival analysis context, with longitudinal covariates considered at the start of follow-up, can predict renal stage progression, a significant decrease in eGFR, and RRT. The success of these survival models lies in the appropriate definition of survival times and the analysis of covariates, especially those that vary over time.
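
For reference, the C-index used above can be computed for any fitted model in a few lines; the sketch below uses the lifelines utility on placeholder follow-up data and illustrates the convention that higher predicted risk should pair with shorter observed time.

    import numpy as np
    from lifelines.utils import concordance_index

    # Placeholder follow-up data: time to event (months), event indicator,
    # and a model's predicted risk score for each patient.
    time_to_event = np.array([12.0, 30.5, 8.2, 24.0, 18.7])
    event_observed = np.array([1, 0, 1, 1, 0])     # 1 = progression, 0 = censored
    predicted_risk = np.array([0.80, 0.10, 0.95, 0.40, 0.20])

    # lifelines expects a score that increases with survival time, so negate risk.
    c_index = concordance_index(time_to_event, -predicted_risk, event_observed)
    print(f"C-index: {c_index:.3f}")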

Keywords: artificial intelligence, chronic kidney disease, deep neural networks, survival analysis

Procedia PDF Downloads 119
1149 Aluminum Based Hexaferrite and Reduced Graphene Oxide a Suitable Microwave Absorber for Microwave Application

Authors: Sanghamitra Acharya, Suwarna Datar

Abstract:

Extensive use of digital and smart communication creates prolonged exposure to unwanted electromagnetic (EM) radiation. This harmful radiation not only causes malfunctioning of nearby electronic gadgets but also severely affects human beings. A suitable microwave absorbing material (MAM) has therefore become an urgent need in the field of stealth and radar technology. Initially, aluminum-based hexaferrite was prepared by the sol-gel technique, and the carbon-derived composite was prepared by a simple one-pot chemical reduction method. Finally, composite films of poly(vinylidene) fluoride (PVDF) were prepared by a simple gel casting technique. The present work demonstrates that the aluminum-based hexaferrite phase conjugated with graphene in a PVDF matrix is a suitable candidate in both the commercially important X and Ku bands. The structural and morphological nature was characterized by X-ray diffraction (XRD), field emission scanning electron microscopy (FESEM) and Raman spectra, which confirm that 30-40 nm particles are well decorated over the graphene sheet. Magnetic force microscopy (MFM) and conducting force microscopy (CFM) studies further confirm the magnetic and conducting nature of the composite. Finally, the shielding effectiveness (SE) of the composite film was studied using a vector network analyzer (VNA) in both the X band and Ku band frequency ranges and found to be more than 30 dB and 40 dB, respectively. The as-prepared composite films are excellent microwave absorbers.

Keywords: carbon nanocomposite, microwave absorbing material, electromagnetic shielding, hexaferrite

Procedia PDF Downloads 161
1148 Generative Adversarial Network Based Fingerprint Anti-Spoofing Limitations

Authors: Yehjune Heo

Abstract:

Fingerprint anti-spoofing approaches have been actively developed and applied in real-world applications. One of the main problems for fingerprint anti-spoofing is that it is not robust to unseen samples, especially in real-world scenarios. A possible solution is to generate artificial but realistic fingerprint samples and use them for training in order to achieve good generalization. This paper contains experimental and comparative results with currently popular GAN-based methods and uses realistic synthesis of fingerprints in training in order to increase the performance. Among various GAN models, the most popular model, StyleGAN, is used for the experiments. The CNN models were first trained with the dataset that did not contain generated fake images, and the accuracy along with the mean average error rate were recorded. Then, the generated fake images (fake images of live fingerprints and fake images of spoof fingerprints) were each combined with the original images (real images of live fingerprints and real images of spoof fingerprints), and the various CNN models were trained. For each CNN model trained with the dataset of generated fake images, the best performance was recorded, along with the accuracy and the mean average error rate in each case. We observe that current GAN-based approaches need significant improvements in anti-spoofing performance, although the overall quality of the synthesized fingerprints seems to be reasonable. We include an analysis of this performance degradation, especially with a small number of samples. In addition, we suggest several approaches towards improved generalization with a small number of samples, by focusing on what GAN-based approaches should learn and should not learn.
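
A minimal sketch of the data-mixing step described above, combining real and GAN-generated live/spoof images before CNN training, is given below in PyTorch; the stand-in datasets, backbone choice, and hyperparameters are assumptions, not the paper's exact setup.

    import torch
    from torch import nn, optim
    from torchvision import datasets, transforms, models
    from torch.utils.data import ConcatDataset, DataLoader

    tfm = transforms.ToTensor()

    # Stand-ins for the two image sources; in practice these would be ImageFolder
    # datasets of real live/spoof fingerprints and StyleGAN-generated ones.
    real_ds = datasets.FakeData(size=200, image_size=(3, 224, 224),
                                num_classes=2, transform=tfm)
    fake_ds = datasets.FakeData(size=200, image_size=(3, 224, 224),
                                num_classes=2, transform=tfm)
    train_loader = DataLoader(ConcatDataset([real_ds, fake_ds]),
                              batch_size=32, shuffle=True)

    model = models.resnet18(num_classes=2)           # illustrative backbone
    optimizer = optim.Adam(model.parameters(), lr=1e-4)
    criterion = nn.CrossEntropyLoss()

    model.train()
    for images, labels in train_loader:              # one epoch shown
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()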

Keywords: anti-spoofing, CNN, fingerprint recognition, GAN

Procedia PDF Downloads 174
1147 The Misuse of Social Media in Order to Exploit "Generation Y"; The Tactics of IS

Authors: Ali Riza Perçin, Eser Bingül

Abstract:

Internet technologies have created opportunities with which people share their ideologies, thoughts and products. This virtual world, named social media, has given the chance of gathering individual users and people from the world's remote locations and establishing an interaction between them. However, to an increasingly higher degree, terrorist organizations today use the internet, and most notably social-network media, to create the effects they desire through a series of on-line activities. These activities, designed to support their aims, include information collection (intelligence), target selection, propaganda, fundraising and recruitment, to name a few. Meanwhile, social media has been used as the most important tool for recruitment, especially of disenfranchised youth from different regions of the world, including the West, in order to mobilize support and recruit “foreign fighters.” The recruits have obtained a status that is not accessible in their own society and have preferred the style of life offered by the terrorist organizations to their current life. Like other terrorist groups, for a while now the terrorist organization Islamic State (IS) in Iraq and Syria has employed a social-media strategy in order to advance its strategic objectives. At the moment, however, IS seems to be more successful in its on-line activities than other similar organizations. IS uses social media strategically as part of its armed activities and for the sustainability of its military presence in Syria and Iraq. In this context, “Generation Y”, which occupies a critical position and can undertake an active role, has been examined. Additionally, the defining characteristics of “Generation Y” have been put forward, and the duties of families and society have been stated as well.

Keywords: social media, "Generation Y", terrorist organization, Islamic State (IS)

Procedia PDF Downloads 412
1146 A Review on the Level of Development of Macedonia and Iran's Organic Agriculture as Compared to Nigeria

Authors: Yusuf Ahmad Sani, Adamu Alhaji Yakubu, Alhaji Abdullahi Jamilu, Joel Omeke, Ibrahim Jumare Sambo

Abstract:

With the rising global threat to food security and of cancer and related (carcinogenic) diseases resulting from the increased usage of inorganic substances in agricultural food production, the Ministry of Food, Agriculture and Livestock of the Republic of Turkey organized an International Workshop on Organic Agriculture between 8 and 12 December 2014 at the International Agricultural Research and Training Center, Izmir. About 21 countries, including Nigeria, were invited to attend the training workshop. Several topics on organic agriculture were presented by renowned scholars, ranging from regulation, certification, crop, animal, and seed production, pest and disease management, and soil composting to the marketing of organic agricultural products, among others. This paper purposely selected two countries (Macedonia and Iran) out of the 21 countries to assess their level of development in terms of organic agriculture as compared to Nigeria. Macedonia, with a population of only 2.1 million people as of 2014, started organic agriculture in 2005 with only 266 ha of land, which grew significantly to over 5,000 ha in 2010, covering such crops as cereals (62%), forage (20%), fruit orchards (7%), vineyards (5%), vegetables (4%), and oil seed and industrial crops (1% each). Organic beekeeping likewise grew from 110 hives to over 15,000 certified colonies. As part of government commitment, the level of government subsidy for organic products was 30%, compared to the direct support for conventional agricultural products. About 19 by-laws were introduced on organic agricultural production that were fully consistent with European Union regulations. The Republic of Iran, on the other hand, embarked on organic agriculture because the country recorded the highest rate of cancer in the world, with over 30,000 people dying every year and 297 people diagnosed every day. However, the host country, Turkey, is well advanced in organic agricultural production and is now the largest exporter of organic products to Europe and other parts of the globe. A technical trip to one of the villages under the government scheme on organic agriculture revealed that organic agriculture was market-demand driven and that the support of the government was very visible, linking the farmers with private companies that provide inputs to them while the companies purchase the products at harvest at a high premium price. In Nigeria, however, research on organic agriculture is very recent, and there is very scanty information on organic agriculture due to poor documentation and very low awareness, even among the elites. The paper, therefore, recommends that the government should provide funds to national agricultural research institutes (NARIs) to conduct research on organic agriculture and should establish a clear government policy and good pre-conditions for sustainable organic agricultural production in the country.

Keywords: organic agriculture, food security, food safety, food nutrition

Procedia PDF Downloads 10
1145 Distributed System Computing Resource Scheduling Algorithm Based on Deep Reinforcement Learning

Authors: Yitao Lei, Xingxiang Zhai, Burra Venkata Durga Kumar

Abstract:

As the quantity and complexity of computing in large-scale software systems increase, distributed computing becomes increasingly important. A distributed system realizes high-performance computing through collaboration between different computing resources. Without efficient resource scheduling, the misuse of distributed computing may cause resource waste and high costs. Resource scheduling is usually an NP-hard problem, so a general solution cannot be found, although some optimization algorithms exist, such as genetic algorithms and ant colony optimization. The large scale of distributed systems makes these traditional optimization algorithms challenging to apply. Heuristic and machine learning algorithms are usually applied in this situation to ease the computing load. As a result, we review traditional resource scheduling optimization algorithms and introduce a deep reinforcement learning method that utilizes the perceptual ability of neural networks and the decision-making ability of reinforcement learning. Using this machine learning method, we try to find the important factors that influence the performance of distributed computing and help the distributed system perform efficient computing resource scheduling. This paper surveys the application of deep reinforcement learning to distributed computing resource scheduling, proposes a deep reinforcement learning method that uses a recurrent neural network to optimize resource scheduling, and outlines the challenges and improvement directions for DRL-based resource scheduling algorithms.
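
A compact sketch of a recurrent scheduling policy of the kind discussed above is given below; the state encoding (per-task resource demands), the action space (which node receives each task), and the REINFORCE-style update are illustrative assumptions rather than the surveyed method.

    import torch
    from torch import nn, optim
    from torch.distributions import Categorical

    class RecurrentSchedulerPolicy(nn.Module):
        """Maps a sequence of task feature vectors to a distribution over nodes."""
        def __init__(self, task_features=4, hidden=64, n_nodes=8):
            super().__init__()
            self.rnn = nn.GRU(task_features, hidden, batch_first=True)
            self.head = nn.Linear(hidden, n_nodes)

        def forward(self, task_seq):                 # (batch, seq_len, features)
            out, _ = self.rnn(task_seq)
            return Categorical(logits=self.head(out))

    policy = RecurrentSchedulerPolicy()
    optimizer = optim.Adam(policy.parameters(), lr=1e-3)

    # One illustrative REINFORCE update on a random batch of task sequences.
    tasks = torch.rand(16, 10, 4)                    # 16 episodes, 10 tasks each
    dist = policy(tasks)
    actions = dist.sample()                          # node chosen for every task
    reward = -torch.rand(16, 10)                     # e.g. negative makespan/latency
    loss = -(dist.log_prob(actions) * reward).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()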

Keywords: resource scheduling, deep reinforcement learning, distributed system, artificial intelligence

Procedia PDF Downloads 92
1144 Bioresorbable Medicament-Eluting Grommet Tube for Otitis Media with Effusion

Authors: Chee Wee Gan, Anthony Herr Cheun Ng, Yee Shan Wong, Subbu Venkatraman, Lynne Hsueh Yee Lim

Abstract:

Otitis media with effusion (OME) is the leading cause of hearing loss in children worldwide. Surgery to insert a grommet tube into the eardrum is usually indicated for OME unresponsive to antimicrobial therapy; it is the most common surgery performed on children. However, current commercially available grommet tubes are non-bioresorbable and not drug-treated, with an unpredictable duration of retention on the eardrum to ventilate the middle ear. Their functionality is impaired when clogged or chronically infected, requiring additional surgery to remove or reinsert the grommet tubes. We envisaged that a novel, fully bioresorbable grommet tube with sustained antibiotic release technology could address these drawbacks. In this study, drug-loaded bioresorbable poly(L-lactide-co-ε-caprolactone) (PLC) copolymer grommet tubes were fabricated by a microinjection moulding technique. In vitro drug release and a degradation model of the PLC tubes were studied. Antibacterial properties were evaluated by incubating PLC tubes with P. aeruginosa broth. Surface morphology was analyzed using scanning electron microscopy. A preliminary animal study was conducted using guinea pigs as an in vivo model to evaluate PLC tubes with and without drug, with a commercial Mini Shah grommet tube as comparison. Our in vitro data showed sustained drug release over 3 months. All PLC tubes revealed exponential degradation profiles over time. Modeling predicted the loss of tube functionality in water to be approximately 14 weeks and 17 weeks for PLC with and without drug, respectively. Generally, PLC tubes had less bacterial adherence, which was attributed to their much smoother surfaces compared to the Mini Shah tube. The antibiotic released from the PLC tube further made bacterial adherence on the surface negligible. The tubes showed neither inflammation nor otorrhea after 18 weeks post-insertion in the eardrums of guinea pigs but had demonstrated a severe degree of bioresorption. Histology confirmed that the new PLC tubes were biocompatible. Analyses of the PLC tubes in the eardrums showed bioresorption profiles close to our in vitro degradation models. The bioresorbable antibiotic-loaded grommet tubes showed good predictability in functionality. The smooth surface and sustained release technology reduced the risk of tube infection. A tube functional duration of 18 weeks allows a sufficient ventilation period to treat OME. Our ongoing studies include modifying the surface properties with a protein coating, optimizing the drug dosage in the tubes to enhance their performance, evaluating the functional outcome on hearing after full resorption of the grommet tube and healing of the eardrum, and developing an animal model with OME to further validate our in vitro models.
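
The exponential degradation profiles mentioned above lend themselves to a simple fit; the sketch below estimates a rate constant from placeholder mass-retention data and predicts the week at which an assumed functional threshold is crossed. The numbers are illustrative, not the study's measurements.

    import numpy as np
    from scipy.optimize import curve_fit

    # Placeholder in vitro data: weeks in buffer vs. fraction of tube mass retained.
    weeks = np.array([0, 2, 4, 8, 12, 16])
    mass_retained = np.array([1.00, 0.93, 0.85, 0.71, 0.58, 0.47])

    def exp_decay(t, k):
        return np.exp(-k * t)

    (k,), _ = curve_fit(exp_decay, weeks, mass_retained)

    threshold = 0.5   # assumed fraction below which the tube loses function
    t_fail = np.log(1 / threshold) / k
    print(f"Fitted rate constant k = {k:.4f} per week")
    print(f"Predicted loss of functionality at ~{t_fail:.1f} weeks")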

Keywords: bioresorbable polymer, drug release, grommet tube, guinea pigs, otitis media with effusion

Procedia PDF Downloads 437
1143 An Application of Path Planning Algorithms for Autonomous Inspection of Buried Pipes with Swarm Robots

Authors: Richard Molyneux, Christopher Parrott, Kirill Horoshenkov

Abstract:

This paper aims to demonstrate how various algorithms can be implemented within swarms of autonomous robots to provide continuous inspection within underground pipeline networks. Current methods of fault detection within pipes are costly, time-consuming and inefficient. As such, solutions tend toward a more reactive approach, repairing faults, as opposed to proactively seeking leaks and blockages. The paper presents an efficient inspection method, showing that autonomous swarm robotics is a viable way of monitoring underground infrastructure. Tailored adaptations of various Vehicle Routing Problems (VRP) and path-planning algorithms provide a customised inspection procedure for complicated networks of underground pipes. The performance of multiple algorithms is compared to determine their effectiveness and feasibility. Notable inspirations come from ant colonies and stigmergy, graph theory, the k-Chinese Postman Problem (k-CPP) and traffic theory. Unlike most swarm behaviours which rely on fast communication between agents, underground pipe networks are a highly challenging communication environment with extremely limited communication ranges. This is due to the extreme variability in the pipe conditions and relatively high attenuation of acoustic and radio waves with which robots would usually communicate. This paper illustrates how to optimise the inspection process and how to increase the frequency with which the robots pass each other, without compromising the routes they are able to take to cover the whole network.
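
As a minimal illustration of the k=1 Chinese Postman routing idea on a pipe network, the sketch below uses networkx on an invented toy graph; the eulerize-based route is a generic heuristic, not the tailored algorithm developed in the paper.

    import networkx as nx

    # Toy pipe network: nodes are junctions, edge weights are pipe lengths (m).
    pipes = nx.Graph()
    pipes.add_weighted_edges_from([
        ("A", "B", 40), ("B", "C", 25), ("C", "D", 30),
        ("D", "A", 35), ("B", "D", 20), ("C", "A", 50),
    ])

    # Duplicate edges so that every node has even degree, then a closed
    # Eulerian circuit traverses every pipe at least once.
    eulerized = nx.eulerize(pipes)
    route = list(nx.eulerian_circuit(eulerized, source="A"))

    total = sum(pipes[u][v]["weight"] for u, v in route)
    print("Inspection route:", " -> ".join([route[0][0]] + [v for _, v in route]))
    print(f"Total distance travelled: {total} m")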

Keywords: autonomous inspection, buried pipes, stigmergy, swarm intelligence, vehicle routing problem

Procedia PDF Downloads 151
1142 Potential of Aerodynamic Feature on Monitoring Multilayer Rough Surfaces

Authors: Ibtissem Hosni, Lilia Bennaceur Farah, Saber Mohamed Naceur

Abstract:

In order to assess water availability in the soil, it is crucial to have information about the distributed soil moisture content; this parameter helps to understand the effect of humidity on the exchange between soil, plant cover and atmosphere, in addition to giving a full understanding of surface processes and the hydrological cycle. On the other hand, aerodynamic roughness length is a surface parameter that scales the vertical profile of the horizontal component of the wind speed and characterizes the surface's ability to absorb the momentum of the airflow. In numerous applications of surface hydrology and meteorology, aerodynamic roughness length is an important parameter for estimating momentum, heat and mass exchange between the soil surface and the atmosphere. In this regard, it is important to consider the impact of atmospheric factors in general, and natural erosion in particular, on soil evolution and on the characterization and prediction of its physical parameters. The study of wind-induced movements over a vegetated soil surface, whether spaced plants or plant cover, is motivated by significant research efforts in agronomy and biology. The major known problem in this area concerns crop damage by wind, which is a booming field of research. Obviously, most models of the soil surface require information about the aerodynamic roughness length and its temporal and spatial variability. We have used a bi-dimensional multi-scale (2D MLS) roughness description in which the surface is considered as a superposition of a finite number of one-dimensional Gaussian processes, each one having its own spatial scale, using the wavelet transform and the Mallat algorithm to describe natural surface roughness. We have introduced a multi-layer description of the humidity of the soil surface to take into account a volume component in the radar signal backscattering problem. As humidity increases, the dielectric constant of the soil-water mixture increases, and this change is detected by microwave sensors. Nevertheless, many existing models in the field of radar imagery cannot be applied directly to areas covered with vegetation due to the vegetation backscattering. Thus, the radar response corresponds to the combined signature of the vegetation layer and the soil surface layer. Therefore, the key issue in the numerical estimation of soil moisture is to separate the two contributions and calculate the scattering behaviors of both layers by defining the scattering of the vegetation and of the soil below. This paper presents a synergistic methodology for estimating roughness and soil moisture from C-band radar measurements. The methodology relies on a microwave/optical model which has been used to calculate the scattering behavior of the aerodynamically rough, vegetation-covered area by defining the scattering of the vegetation and of the soil below.
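
As an illustration of the multi-scale description above, the sketch below decomposes a synthetic 1-D surface profile into components at several spatial scales with the Mallat (fast wavelet transform) algorithm via PyWavelets; the profile and wavelet choice are assumptions for demonstration only.

    import numpy as np
    import pywt

    # Synthetic 1-D surface height profile (mm) sampled along a transect.
    x = np.linspace(0, 10, 1024)
    profile = (0.5 * np.sin(2 * np.pi * 0.3 * x)        # large-scale undulation
               + 0.1 * np.sin(2 * np.pi * 3.0 * x)      # intermediate scale
               + 0.02 * np.random.default_rng(0).standard_normal(x.size))  # fine roughness

    # Mallat's pyramid algorithm: multilevel discrete wavelet decomposition.
    coeffs = pywt.wavedec(profile, wavelet="db4", level=5)
    approx, details = coeffs[0], coeffs[1:]

    # RMS of the detail coefficients gives a roughness measure per spatial scale.
    for level, d in enumerate(details, start=1):
        print(f"Scale level {level}: RMS detail = {np.sqrt(np.mean(d**2)):.4f}")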

Keywords: aerodynamic, bi-dimensional, vegetation, synergistic

Procedia PDF Downloads 255
1141 A Program Evaluation of TALMA Full-Year Fellowship Teacher Preparation

Authors: Emilee M. Cruz

Abstract:

Teachers take part in short-term teaching fellowships abroad, and their preparation before, during, and after the experience is critical to teachers’ feelings of success in the international classroom. A program evaluation of the teacher preparation within the TALMA: The Israel Program for Excellence in English (TALMA) full-year teaching fellowship was conducted. A questionnaire was developed that examined the professional development, deliberate reflection, and cultural and language immersion offered before, during, and after the short-term experience. The evaluation also surveyed teachers’ feelings of preparedness for the Israeli classroom and any recommendations they had for future teacher preparation within the fellowship program. The review suggests that the TALMA program include integrated professional learning communities between fellows and Israeli co-teachers, more opportunities for immersive Hebrew language learning, a broader professional network with Israelis, opportunities for guided discussion with the TALMA community, and continued participation in TALMA events and learning following the full-year fellowship. Similar short-term international programs should consider these findings in the design of their participant preparation programs. The review also offers direction for future program evaluation of short-term participant preparation, including the need for frequent response item updates to match current offerings and evaluation of participant feelings of preparedness before, during, and after the full-year fellowship.

Keywords: educational program evaluation, international teaching, short-term teaching, teacher beliefs, teaching fellowship, teacher preparation

Procedia PDF Downloads 168
1140 SIP Flooding Attacks Detection and Prevention Using Shannon, Renyi and Tsallis Entropy

Authors: Neda Seyyedi, Reza Berangi

Abstract:

Voice over IP (VOIP) networks, also known as Internet telephony, are growing rapidly and have occupied a large part of the communications market. With the growth of each technology, the related security issues become of particular importance. Taking advantage of this technology in different environments, with the numerous features put at our disposal, there arises an increasing need to address the security threats. Being IP-based and playing a signaling role in VOIP networks, the Session Initiation Protocol (SIP) lets attackers use weaknesses of the protocol to disable VOIP service. One of the most important threats is the denial of service attack, a branch of which we discuss in this article as flooding attacks. These attacks waste server resources and prevent the server from delivering service to authorized users. Distributed denial of service attacks and attacks with a low rate can mislead many attack detection mechanisms. In this paper, we introduce a mechanism which not only detects distributed denial of service attacks and low-rate attacks but can also identify the attackers accurately. We detect and prevent flooding attacks in the SIP protocol using Shannon (FDP-S), Renyi (FDP-R) and Tsallis (FDP-T) entropy. We conducted an experiment to compare the percentage of detection and the rate of false alarm messages using each of the Shannon, Renyi and Tsallis entropies as a measure of disorder. Implementation results show that, owing to the parametric nature of the Renyi and Tsallis entropies, different detection percentages and false alarm rates are obtained by changing the parameters, with the possibility of adjusting the sensitivity of the detection mechanism.
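
For reference, the three entropy measures named above reduce to a few lines; the sketch below computes Shannon, Renyi, and Tsallis entropy over a hypothetical distribution of SIP INVITE counts per source address, with illustrative counts and parameter values.

    import numpy as np

    # Hypothetical INVITE counts per source IP within one observation window.
    counts = np.array([120, 80, 60, 40, 10, 5])
    p = counts / counts.sum()

    def shannon(p):
        return -np.sum(p * np.log2(p))

    def renyi(p, alpha=2.0):
        return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

    def tsallis(p, q=2.0):
        return (1.0 - np.sum(p ** q)) / (q - 1.0)

    print(f"Shannon : {shannon(p):.3f} bits")
    print(f"Renyi   : {renyi(p, alpha=2.0):.3f}")
    print(f"Tsallis : {tsallis(p, q=2.0):.3f}")
    # During a flooding attack the source distribution skews toward the attacker,
    # so these measures shift sharply away from their normal-traffic baseline.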

Keywords: VOIP networks, flooding attacks, entropy, computer networks

Procedia PDF Downloads 389
1139 An Emergentist Defense of Incompatibility between Morally Significant Freedom and Causal Determinism

Authors: Lubos Rojka

Abstract:

The common perception of morally responsible behavior is that it presupposes freedom of choice, and that free decisions and actions are not determined by natural events, but by a person. In other words, the moral agent has the ability and the possibility of doing otherwise when making morally responsible decisions, and natural causal determinism cannot fully account for morally significant freedom. The incompatibility between a person’s morally significant freedom and causal determinism appears to be a natural position. Nevertheless, some of the most influential philosophical theories on moral responsibility are compatibilist or semi-compatibilist, and they exclude the requirement of alternative possibilities, which contradicts the claims of classical incompatibilism. The compatibilists often employ Frankfurt-style thought experiments to prove their theory. The goal of this paper is to examine the role of imaginary Frankfurt-style examples in compatibilist accounts. More specifically, the compatibilist accounts defended by John Martin Fischer and Michael McKenna will be inserted into the broader understanding of a person elaborated by Harry Frankfurt, Robert Kane and Walter Glannon. Deeper analysis reveals that the exclusion of alternative possibilities based on Frankfurt-style examples is problematic and misleading. A more comprehensive account of moral responsibility and morally significant (source) freedom requires higher-order complex theories of human will and consciousness, in which rational and self-creative abilities and a real possibility of choosing otherwise, at least on some occasions during a lifetime, are necessary. Theoretical moral reasons and their logical relations seem to require a sort of higher-order agent-causal incompatibilism. The capacity for theoretical or abstract moral reasoning requires complex (strongly emergent) mental and conscious properties, among them an effective free will, together with first- and second-order desires. Such a hierarchical theoretical model unifies reasons-responsiveness, mesh theory and emergentism. It is incompatible with physical causal determinism, because such determinism only allows non-systematic processes that may be hard to predict, but not complex (strongly) emergent systems. An agent’s effective will and conscious reflectivity are the starting point of a morally responsible action, which explains why a decision is 'up to the subject'. A free decision does not always have a complete causal history. This kind of emergentist source hyper-incompatibilism seems to be the best direction of the search for an adequate explanation of moral responsibility in the traditional (merit-based) sense. Physical causal determinism as a universal theory would exclude morally significant freedom and responsibility in the traditional sense because it would exclude the emergence of and supervenience by the essential complex properties of human consciousness.

Keywords: consciousness, free will, determinism, emergence, moral responsibility

Procedia PDF Downloads 150
1138 Internet of Things for Smart Dedicated Outdoor Air System in Buildings

Authors: Dararat Tongdee, Surapong Chirarattananon, Somchai Maneewan, Chantana Punlek

Abstract:

Recently, the Internet of Things (IoT) has become an important technology that connects devices to the network so that people can access real-time communication. This technology is used to report, collect, and analyze big data for achieving a purpose. For a smart building, there are many IoT technologies that enable management and building operators to improve occupant thermal comfort, indoor air quality, and building energy efficiency. In this research, we propose monitoring and controlling the performance of a smart dedicated outdoor air system (SDOAS) based on an IoT platform. The SDOAS was specifically designed with a desiccant unit and a thermoelectric module. The designed system was intended to monitor, notify, and control indoor environmental factors such as temperature, humidity, and carbon dioxide (CO₂) level. The SDOAS was tested against the American Society of Heating, Refrigerating and Air-Conditioning Engineers standard (ASHRAE 62.2) and the indoor air quality standard. The system notifies the user by Blynk notification when the status of the building is uncomfortable or tolerable limits are reached according to the conditions that were set. The user can then control the system via the Blynk application on a smartphone. The experimental results indicate that the temperature and humidity of indoor fresh air in the comfort zone are approximately 26 degrees Celsius and 58%, respectively. Furthermore, the CO₂ level was controlled below 1000 ppm, in accordance with the indoor air quality standard condition. Therefore, the proposed system works efficiently and is easy to use in buildings.
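
A minimal sketch of the monitoring-and-notification logic is given below; the sensor-reading and notification functions are hypothetical placeholders (the actual system uses the Blynk platform), while the set-points echo the comfort and air-quality limits quoted above.

    import time

    TEMP_SET, RH_SET, CO2_LIMIT = 26.0, 58.0, 1000   # degC, %RH, ppm (from the abstract)

    def read_sensors():
        """Placeholder: return (temperature, relative_humidity, co2_ppm)."""
        return 25.4, 60.2, 850          # dummy values for illustration

    def notify(message):
        """Placeholder for a push notification (e.g. via a Blynk app)."""
        print("NOTIFY:", message)

    while True:
        temp, rh, co2 = read_sensors()
        if co2 > CO2_LIMIT:
            notify(f"CO2 {co2} ppm exceeds {CO2_LIMIT} ppm - increase fresh-air flow")
        if abs(temp - TEMP_SET) > 2 or abs(rh - RH_SET) > 10:
            notify(f"Comfort deviation: {temp:.1f} degC / {rh:.0f} %RH")
        time.sleep(60)                  # poll once per minute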

Keywords: internet of things, indoor air quality, smart dedicated outdoor air system, thermal comfort

Procedia PDF Downloads 181
1137 The Prevalence and Impact of Anxiety Among Medical Students in the MENA Region: A Systematic Review, Meta-Analysis, and Meta-Regression

Authors: Kawthar F. Albasri, Abdullah M. AlHudaithi, Dana B. AlTurairi, Abdullaziz S. AlQuraini, Adoub Y. AlDerazi, Reem A. Hubail, Haitham A. Jahrami

Abstract:

Several studies have found that medical students have a significant prevalence of anxiety. The purpose of this review paper is to carefully evaluate the current research on anxiety among medical students in the MENA region and, as a result, estimate the prevalence of these disturbances. Multiple databases, including CINAHL (Cumulative Index to Nursing and Allied Health Literature), the Cochrane Library, Embase, MEDLINE (Medical Literature Analysis and Retrieval System Online), PubMed, PsycINFO (Psychological Information Database), Scopus, Web of Science, UpToDate, ClinicalTrials.gov, the WHO Global Health Library, EbscoHost, ProQuest, JAMA Network, and ScienceDirect, were searched. The reference lists of the retrieved articles were rigorously searched, and the articles were rated for quality. A random-effects meta-analysis was performed to compute estimates. The current meta-analysis revealed an alarming estimated pooled prevalence of anxiety (K = 46, N = 27023) of 52.5% [95%CI: 43.3%–61.6%]. A total of 62.0% [95% CI 42.9%; 78.0%] of the students (K = 18, N = 16466) suffered from anxiety during the COVID-19 pandemic, while 52.5% [95% CI 43.3%; 61.6%] had anxiety before COVID-19. Based on the GAD-7 measure, a total of 55.7% [95%CI 30.5%; 78.3%] of the students (K = 10, N = 5830) had anxiety, and a total of 54.7% of the students (K = 18, N = 12154) [95%CI 42.8%; 66.0%] had anxiety based on the DASS-21 or DASS-42 measure. Anxiety is a common and genuine problem among medical students. Further research should be conducted post-COVID-19, with a focus on anxiety prevention and intervention initiatives for medical students.
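
The pooled prevalence figures above come from a random-effects model; the sketch below implements DerSimonian-Laird pooling of logit-transformed prevalences for a handful of invented studies. The event counts are placeholders, not the review's data.

    import numpy as np

    def pooled_prevalence(events, totals):
        """DerSimonian-Laird random-effects pooling on the logit scale."""
        p = events / totals
        y = np.log(p / (1 - p))                  # logit-transformed prevalences
        v = 1.0 / (totals * p * (1 - p))         # approximate within-study variances
        w = 1.0 / v
        y_fixed = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - y_fixed) ** 2)       # heterogeneity statistic
        C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - (len(y) - 1)) / C)  # between-study variance
        w_star = 1.0 / (v + tau2)
        pooled = np.sum(w_star * y) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))

        def inv_logit(z):
            return 1.0 / (1.0 + np.exp(-z))

        return (inv_logit(pooled),
                inv_logit(pooled - 1.96 * se),
                inv_logit(pooled + 1.96 * se))

    # Invented per-study counts of anxious students / students screened.
    events = np.array([220, 310, 150, 95, 400])
    totals = np.array([400, 650, 280, 210, 700])
    est, lo, hi = pooled_prevalence(events, totals)
    print(f"Pooled prevalence: {est:.1%} [95% CI {lo:.1%}; {hi:.1%}]")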

Keywords: anxiety, medical students, MENA, meta-analysis, prevalence

Procedia PDF Downloads 52
1136 Cut-Off of CMV Cobas® Taqman® (CAP/CTM Roche®) for Introduction of Ganciclovir Pre-Emptive Therapy in Allogeneic Hematopoietic Stem Cell Transplant Recipients

Authors: B. B. S. Pereira, M. O. Souza, L. P. Zanetti, L. C. S. Oliveira, J. R. P. Moreno, M. P. Souza, V. R. Colturato, C. M. Machado

Abstract:

Background: The introduction of prophylactic or preemptive therapies has effectively decreased the CMV mortality rates after hematopoietic stem cell transplantation (HSCT). CMV antigenemia (pp65) and quantitative PCR are methods currently approved for CMV surveillance in pre-emptive strategies. Commercial assays are preferred, as cut-off levels defined by in-house assays may vary among different protocols and in general show low reproducibility. Moreover, comparison of published data among different centers is only possible if international standards of quantification are included in the assays. Recently, the World Health Organization (WHO) established the first international standard for CMV detection. The real-time PCR COBAS AmpliPrep/COBAS TaqMan (CAP/CTM) assay (Roche®) was developed using the WHO standard for CMV quantification. However, the cut-off for the introduction of antivirals has not been determined yet. Methods: We conducted a retrospective study to determine: 1) the sensitivity and specificity of the new CMV CAP/CTM test in comparison with pp65 antigenemia to detect episodes of CMV infection/reactivation, and 2) the cut-off of viral load for introduction of ganciclovir (GCV). Pp65 antigenemia was performed and the corresponding plasma samples were stored at -20°C for further CMV detection by CAP/CTM. Comparison of the tests was performed by kappa index. The appearance of positive antigenemia was considered the state variable to determine the cut-off of CMV viral load by ROC curve. Statistical analysis was performed using SPSS software version 19 (SPSS, Chicago, IL, USA). Results: Thirty-eight patients were included and followed from August 2014 through May 2015. The antigenemia test detected 53 episodes of CMV infection in 34 patients (89.5%), while CAP/CTM detected 37 episodes in 33 patients (86.8%). AG and PCR results were compared in 431 samples, and the kappa index was 30.9%. The median time to first AG detection was 42 (28-140) days, while CAP/CTM detected CMV a median of 7 days earlier (34 days, ranging from 7 to 110 days). The optimum cut-off value of CMV DNA to detect positive antigenemia was 34.25 IU/mL, with 88.2% sensitivity, 100% specificity and an AUC of 0.91. This cut-off value is below the limit of detection and quantification of the equipment, which is 56 IU/mL. According to the CMV recurrence definition, 16 episodes of CMV recurrence were detected by antigenemia (47.1%) and 4 (12.1%) by CAP/CTM. The duration of viremia as detected by antigenemia was shorter (60.5% of the episodes lasted ≤ 7 days) in comparison with CAP/CTM (57.9% of the episodes lasted 15 days or more). These data suggest that the use of antigenemia to define the duration of GCV therapy might prompt early interruption of the antiviral, which may favor CMV reactivation. The CAP/CTM PCR could possibly provide safer information concerning the duration of GCV therapy. As prolonged treatment may increase the risk of toxicity, this hypothesis should be confirmed in prospective trials. Conclusions: Even though the CAP/CTM by Roche showed great qualitative correlation with the antigenemia technique, the fully automated CAP/CTM did not demonstrate increased sensitivity. The cut-off value below the limit of detection and quantification may result in delayed introduction of pre-emptive therapy.
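
The ROC-based cut-off determination described above can be reproduced with standard tools; the sketch below applies scikit-learn to placeholder paired data (viral load versus antigenemia result) and selects the threshold maximizing Youden's J, purely as an illustration of the procedure.

    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    # Placeholder paired samples: CAP/CTM viral load (IU/mL) and antigenemia result.
    viral_load = np.array([0, 12, 20, 28, 33, 35, 40, 55, 90, 150, 400, 800])
    antigenemia_pos = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1])

    fpr, tpr, thresholds = roc_curve(antigenemia_pos, viral_load)
    youden_j = tpr - fpr
    best = np.argmax(youden_j)

    print(f"AUC: {roc_auc_score(antigenemia_pos, viral_load):.3f}")
    print(f"Optimal cut-off: {thresholds[best]:.2f} IU/mL "
          f"(sensitivity {tpr[best]:.1%}, specificity {1 - fpr[best]:.1%})")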

Keywords: antigenemia, CMV COBAS/TAQMAN, cytomegalovirus, antiviral cut-off

Procedia PDF Downloads 177
1135 Optimized Techniques for Reducing the Reactive Power Generation in Offshore Wind Farms in India

Authors: Pardhasaradhi Gudla, Imanual A.

Abstract:

The electrical power generated offshore needs to be transmitted to the onshore grid using subsea cables. Long subsea cables produce reactive power, which should be compensated in order to limit transmission losses, to optimize the transmission capacity, and to keep the grid voltage within safe operational limits. The installation cost of a wind farm includes the structural design cost and the electrical system cost. India has targeted 175 GW of renewable energy capacity by 2022, including offshore wind power generation. Because sea depths are greater in India, the installation cost will be higher than in European countries, where offshore wind energy is already being generated successfully. Innovations are therefore required to reduce the cost of offshore wind power projects. This paper presents optimized techniques to reduce the installation cost of an offshore wind farm with respect to the electrical transmission system. It provides techniques for increasing the current-carrying capacity of the subsea cable by decreasing the reactive power generation (capacitance effect) of the cable. Many methods for reactive power compensation in wind power plants are already in use. The main reason reactive power compensation is needed is the capacitance effect of the subsea cable. If the cable capacitance is diminished, the requirement for reactive power compensation will be reduced or optimized by avoiding the intermediate substation at the midpoint of the transmission network.
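
To make the capacitance effect concrete, the sketch below evaluates the standard three-phase charging-power relation Q = 2*pi*f*C*V^2*L for an assumed 220 kV export cable; the per-kilometre capacitance and route length are typical illustrative values, not data from the paper.

    import math

    f = 50.0              # grid frequency, Hz
    v_ll = 220e3          # line-to-line voltage, V (assumed export level)
    c_per_km = 0.18e-6    # cable capacitance, F/km (typical XLPE value, assumed)
    length_km = 100.0     # offshore route length, km (assumed)

    # Three-phase charging (reactive) power generated by the cable capacitance.
    q_charging = 2 * math.pi * f * c_per_km * length_km * v_ll ** 2
    print(f"Charging reactive power: {q_charging / 1e6:.0f} MVAr "
          f"({q_charging / 1e6 / length_km:.1f} MVAr per km)")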

Keywords: offshore wind power, optimized techniques, power system, subsea cable

Procedia PDF Downloads 172
1134 Emergence of Information Centric Networking and Web Content Mining: A Future Efficient Internet Architecture

Authors: Sajjad Akbar, Rabia Bashir

Abstract:

With the growth in the number of users, Internet usage has evolved. Due to its key design principle, there has been an incredible expansion in its size. This tremendous growth of the Internet has brought new applications (mobile video and cloud computing) as well as new user requirements, i.e., a content distribution environment, mobility, ubiquity, security and trust, etc. Users are more interested in content than in their communicating peer nodes. The current Internet architecture is a host-centric networking approach, which is not suitable for these specific types of applications. With the growing use of multiple interactive applications, the host-centric approach is considered to be less efficient as it depends on the physical location; for this reason, Information Centric Networking (ICN) is considered the potential future Internet architecture. It is an approach that introduces uniquely named data as a core Internet principle. It uses a receiver-oriented approach rather than a sender-oriented one. It introduces a naming-based information system at the network layer. Although ICN is considered a future Internet architecture, there is a lot of criticism of it, mainly concerning how ICN will manage the most relevant content. Here, Web Content Mining (WCM) approaches can help in the appropriate data management of ICN. To address this issue, this paper contributes by (i) discussing multiple ICN approaches, (ii) analyzing different Web Content Mining approaches, and (iii) creating a new Internet architecture by merging ICN and WCM to solve the data management issues of ICN. From ICN, Content-Centric Networking (CCN) is selected for the new architecture, whereas an agent-based approach from Web Content Mining is selected to find the most appropriate data.

Keywords: agent based web content mining, content centric networking, information centric networking

Procedia PDF Downloads 458
1133 Multi-Objective Optimization of Intersections

Authors: Xiang Li, Jian-Qiao Sun

Abstract:

As crucial components of the city traffic network, intersections have significant impacts on urban traffic performance. Despite the rapid development of transportation systems, increasing traffic volumes result in severe congestion, especially at intersections in urban areas. Effective regulation of vehicle flows at intersections has always been an important issue in traffic control systems. This study presents a multi-objective optimization method for intersections with cellular automata to achieve better traffic performance. Vehicle conflicts and pedestrian interference are considered. Three categories of traffic performance are studied, including transportation efficiency, energy consumption and road safety. The left-turn signal type, signal timing and lane assignment are optimized for different traffic flows. The multi-objective optimization problem is solved with the cell mapping method. The optimization results show the conflicting nature of the different traffic performance objectives. The influence of different traffic variables on the intersection performance is investigated. It is observed that the proposed optimization method is effective in regulating the traffic at the intersection to meet multiple objectives. Transportation efficiency can usually be improved by the permissive left-turn signal, which sacrifices safety. Right-turn traffic suffers significantly when the right-turn lanes are shared with the through vehicles. The effect of vehicle flow on the intersection performance is significant, and the pattern of the optimization results can be changed remarkably by traffic volume variation. Pedestrians interfere strongly with the traffic system.
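
Because the results above are inherently multi-objective, trade-offs are usually presented as a non-dominated (Pareto) set; the sketch below filters candidate signal plans scored on three minimized objectives, with invented candidate values for illustration.

    import numpy as np

    # Each row: [average delay (s), energy consumption (kJ), conflict count]
    # for one candidate signal-timing / lane-assignment plan. Values are invented.
    candidates = np.array([
        [32.0, 410.0, 3],
        [28.5, 455.0, 4],
        [40.0, 380.0, 2],
        [29.0, 430.0, 5],
        [45.0, 500.0, 6],
    ])

    def pareto_front(points):
        """Return indices of non-dominated rows (all objectives minimized)."""
        keep = []
        for i, p in enumerate(points):
            dominated = np.any(np.all(points <= p, axis=1) &
                               np.any(points < p, axis=1))
            if not dominated:
                keep.append(i)
        return keep

    print("Pareto-optimal candidates:", pareto_front(candidates))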

Keywords: cellular automata, intersection, multi-objective optimization, traffic system

Procedia PDF Downloads 563
1132 Vascularized Adipose Tissue Engineering by Using Adipose ECM/Fibroin Hydrogel

Authors: Alisan Kayabolen, Dilek Keskin, Ferit Avcu, Andac Aykan, Fatih Zor, Aysen Tezcaner

Abstract:

Adipose tissue engineering is a promising field for the regeneration of soft tissue defects. However, only very thin implants can be used in vivo since vascularization is still a problem for thick implants. Another problem is finding a biocompatible scaffold with good mechanical properties. In this study, the aim is to develop a thick, vascularized adipose tissue that will integrate with the host, and to perform its in vitro and in vivo characterization. For this purpose, a hydrogel of decellularized adipose tissue (DAT) and fibroin was produced, and both endothelial cells and adipocytes that were differentiated from adipose-derived stem cells were encapsulated in this hydrogel. Mixing DAT with fibroin allowed rapid gel formation by vortexing. It also made it possible to adjust the mechanical strength by changing the fibroin-to-DAT ratio. Based on compression tests, the gel with the DAT/fibroin ratio giving mechanical properties most similar to adipose tissue was selected for cell culture experiments. In vitro characterization showed that DAT is not cytotoxic; on the contrary, it has many natural ECM components which provide biocompatibility and bioactivity. Subcutaneous implantation of the hydrogels resulted in no immunogenic reaction or infection. Moreover, localized empty hydrogels gelled successfully around the host vessel with the required shape. Implantation of cell-encapsulated hydrogels and histological analyses are under study. It is expected that endothelial cells inside the hydrogel will form a capillary network and will bind to the host vessel passing through the hydrogel.

Keywords: adipose tissue engineering, decellularization, encapsulation, hydrogel, vascularization

Procedia PDF Downloads 514
1131 Binderless Naturally-extracted Metal-free Electrocatalyst for Efficient NOₓ Reduction

Authors: Hafiz Muhammad Adeel Sharif, Tian Li, Changping Li

Abstract:

Recently, the emission of nitrogen and sulphur oxides (NOₓ, SO₂) has become a global issue, causing serious threats to health and the environment. Catalytic reduction of NOₓ and SOₓ gases into benign gases is considered one of the best approaches. However, regeneration of the catalyst, the high bond-dissociation energy of NOₓ (150.7 kcal/mol), the escape of intermediate gas (N₂O, a greenhouse gas) with the treated flue gas, and the limited activity of the catalyst remain great challenges. Here, a cheap, binderless, naturally-extracted, metal-free bass-wood thin carbon electrode (TCE) is presented, which shows excellent catalytic activity towards NOₓ reduction. The bass wood was carbonized at 900 ℃, followed by thermal activation in the presence of CO₂ gas at 750 ℃. The thermal activation resulted in an increase in epoxy groups on the surface of the TCE and an enhancement of the surface area as well as the degree of graphitization. The TCE has a unique, strongly interconnected 3D network of hierarchical micro/meso/macropores that allows a large electrode/electrolyte interface. Owing to these characteristics, the TCE exhibited excellent catalytic efficiency towards NOₓ (~83.3%) under ambient conditions and an enhanced catalytic response under pH and sulphite exposure, as well as excellent stability up to 168 hours. Moreover, a temperature-dependent activity trend was found, where the highest catalytic activity was achieved at 80 ℃, beyond which the electrolyte became evaporative, resulting in a performance decrease. The designed electrocatalyst showed great potential for effective NOₓ reduction and is highly cost-effective, green, and sustainable.

Keywords: electrocatalyst, NOx-reduction, bass-wood electrode, integrated wet-scrubbing, sustainable

Procedia PDF Downloads 58
1130 Linux Security Management: Research and Discussion on Problems Caused by Different Aspects

Authors: Ma Yuzhe, Burra Venkata Durga Kumar

Abstract:

The computer is a great invention. As people use computers more and more frequently, the demand for PCs is growing, and the performance of computer hardware is also rising to handle more complex processing and operations. However, the operating system, which provides the soul of the computer, stopped developing at a certain stage. In the face of the high price of UNIX (Uniplexed Information and Computing System), batch after batch of personal computer owners could only give up. DOS (Disk Operating System) is too simple and makes it difficult to bring innovation into play, so it is not a good choice. MacOS is a special operating system for Apple computers and cannot be widely used on other personal computers. In this environment, Linux, based on the UNIX system, was born. Linux combines the advantages of these operating systems and is composed of many kernel modules, which makes the core architecture relatively powerful. The Linux system supports all Internet protocols, so it has very good network functions. Linux supports multiple users, and each user has no influence on the files of other users. Linux can also multitask and run different programs independently at the same time. Linux is a completely open-source operating system, and users can obtain and modify the source code for free. Because of these advantages, Linux has attracted a large number of users and programmers. The Linux system is also constantly upgraded and improved, and many different versions have been issued, which are suitable for community use and commercial use. The Linux system has good security because it relies on a file permission system. However, due to constantly emerging vulnerabilities and hazards, the security of using the operating system also needs to be given more attention. This article will focus on the analysis and discussion of Linux security issues.

Keywords: Linux, operating system, system management, security

Procedia PDF Downloads 92
1129 Leveraging Remote Assessments and Central Raters to Optimize Data Quality in Rare Neurodevelopmental Disorders Clinical Trials

Authors: Pamela Ventola, Laurel Bales, Sara Florczyk

Abstract:

Background: Fully remote or hybrid administration of clinical outcome measures in rare neurodevelopmental disorders trials is increasing due to the ongoing pandemic and recognition that remote assessments reduce the burden on families. Many assessments in rare neurodevelopmental disorders trials are complex; however, remote/hybrid trials readily allow for the use of centralized raters to administer and score the scales. The use of centralized raters has many benefits, including reducing site burden; however, a specific impact on data quality has not yet been determined. Purpose: The current study has two aims: a) evaluate differences in data quality between administration of a standardized clinical interview completed by centralized raters compared to those completed by site raters and b) evaluate improvement in accuracy of scoring standardized developmental assessments when scored centrally compared to when scored by site raters. Methods: For aim 1, the Vineland-3, a widely used measure of adaptive functioning, was administered by site raters (n=52) participating in one of four rare disease trials. The measure was also administered as part of two additional trials that utilized central raters (n=7). Each rater completed a comprehensive training program on the assessment. Following completion of the training, each clinician completed a Vineland-3 with a mock caregiver. Administrations were recorded and reviewed by a neuropsychologist for administration and scoring accuracy. Raters were able to certify for the trials after demonstrating an accurate administration of the scale. For site raters, 25% of each rater's in-study administrations were reviewed by a neuropsychologist for accuracy of administration and scoring. For central raters, the first two administrations and every 10th administration were reviewed. Aim 2 evaluated the added benefit of centralized scoring on the accuracy of scoring of the Bayley-3, a comprehensive developmental assessment widely used in rare neurodevelopmental disorders trials. Bayley-3 administrations across four rare disease trials were centrally scored. For all administrations, the site rater who administered the Bayley-3 scored the scale, and a centralized rater reviewed the video recordings of the administrations and also scored the scales to confirm accuracy. Results: For aim 1, site raters completed 138 Vineland-3 administrations. Of the 138 administrations, 53 were reviewed by a neuropsychologist. Four of the administrations had errors that compromised the validity of the assessment. The central raters completed 180 Vineland-3 administrations; 38 administrations were reviewed, and none had significant errors. For aim 2, 68 administrations of the Bayley-3 were reviewed and scored by both a site rater and a centralized rater. Of these administrations, 25 had errors in scoring that were corrected by the central rater. Conclusion: In rare neurodevelopmental disorders trials, sample sizes are often small, so data quality is critical. The use of central raters inherently decreases site burden, but it also decreases rater variance, as illustrated by the small team of central raters (n=7) needed to conduct all of the assessments (n=180) in these trials compared to the number of site raters (n=53) required for even fewer assessments (n=138). In addition, the use of central raters dramatically improves the quality of scoring the assessments.
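A minimal sketch re-deriving the error rates implied by the counts reported above (administrations reviewed versus administrations with validity-compromising or corrected errors). This is purely descriptive arithmetic on the figures quoted in the abstract; the cohort labels are shorthand added for illustration.

```python
def error_rate(errors: int, reviewed: int) -> float:
    """Fraction of reviewed administrations found to have errors."""
    return errors / reviewed

# Counts taken from the abstract: (errors flagged, administrations reviewed)
cohorts = {
    "Vineland-3, site raters":    (4, 53),
    "Vineland-3, central raters": (0, 38),
    "Bayley-3, site scoring":     (25, 68),
}

for label, (errs, n) in cohorts.items():
    print(f"{label}: {error_rate(errs, n):.1%} ({errs}/{n})")
```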

Keywords: neurodevelopmental disorders, clinical trials, rare disease, central raters, remote trials, decentralized trials

Procedia PDF Downloads 147
1128 Enhancing Industrial Wastewater Treatment: Efficacy and Optimization of Ultrasound-Assisted Laccase Immobilized on Magnetic Fe₃O₄ Nanoparticles

Authors: K. Verma, V. S. Moholkar

Abstract:

In developed countries, water pollution caused by industrial discharge has emerged as a significant environmental concern over the past decades. However, despite ongoing efforts, a fully effective and sustainable remediation strategy has yet to be identified. This paper describes how enzymatic and sonochemical treatments have demonstrated great promise in degrading bio-refractory pollutants. In particular, a compelling area of interest is the combined technique of sono-enzymatic treatment, which has exhibited a synergistic enhancement surpassing that of the individual techniques. This study employed the covalent attachment method to immobilize laccase from Trametes versicolor onto amino-functionalized magnetic Fe₃O₄ nanoparticles. To comprehensively characterize the synthesized free nanoparticles and the laccase-immobilized nanoparticles, techniques such as X-ray diffraction (XRD), Fourier transform infrared spectroscopy (FT-IR), scanning electron microscopy (SEM), vibrating sample magnetometry (VSM), and Brunauer-Emmett-Teller (BET) surface area analysis were employed. The size of the immobilized Fe₃O₄@Laccase was found to be 60 nm, and the maximum loading of laccase was found to be 24 mg/g of nanoparticle. An investigation was conducted to study the effect of various process parameters, such as the immobilized Fe₃O₄@Laccase dose, temperature, and pH, on the % chemical oxygen demand (COD) removal as the response. The statistical design pinpointed the optimum conditions (immobilized Fe₃O₄@Laccase dose = 1.46 g/L, pH = 4.5, and temperature = 66 ℃), resulting in a remarkable 65.58% COD removal within 60 minutes. An even more significant improvement (90.31% COD removal) was achieved with the ultrasound-assisted enzymatic reaction utilizing a 10% duty cycle. The investigation of various kinetic models for free and immobilized laccase, such as the Haldane, Yano and Koga, and Michaelis-Menten models, showed that ultrasound application impacted the kinetic parameters Vmax and Km. Specifically, Vmax values for free and immobilized laccase were found to be 0.021 mg/L·min and 0.045 mg/L·min, respectively, while Km values were 147.2 mg/L for free laccase and 136.46 mg/L for immobilized laccase. The lower Km and higher Vmax for immobilized laccase indicate its enhanced affinity towards the substrate, likely due to ultrasound-induced alterations in the enzyme's conformation and increased exposure of active sites, leading to more efficient degradation. Furthermore, toxicity and liquid chromatography-mass spectrometry (LC-MS) analyses revealed that after the treatment process, the wastewater exhibited 70% less toxicity than before treatment, with over 25 compounds degraded by more than 75%. Finally, the prepared immobilized laccase showed excellent recyclability, retaining 70% of its activity over six consecutive cycles. A straightforward manufacturing strategy and outstanding performance make the recyclable magnetic immobilized laccase (Fe₃O₄@Laccase) an up-and-coming option for various environmental applications, particularly in water pollution control and treatment.
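A minimal sketch of the Michaelis-Menten comparison behind the kinetic parameters quoted above, v = Vmax·S / (Km + S), using the reported Vmax and Km values for free and immobilized laccase. The substrate concentration sweep is an illustrative assumption, not an experimental condition from the paper.

```python
def michaelis_menten(s: float, vmax: float, km: float) -> float:
    """Reaction rate v = Vmax * S / (Km + S)."""
    return vmax * s / (km + s)

# Reported kinetic parameters: Vmax in mg/L·min, Km in mg/L
PARAMS = {
    "free laccase":        {"vmax": 0.021, "km": 147.2},
    "immobilized laccase": {"vmax": 0.045, "km": 136.46},
}

for s in (50.0, 150.0, 500.0):  # assumed substrate concentrations, mg/L
    rates = {name: michaelis_menten(s, **p) for name, p in PARAMS.items()}
    summary = ", ".join(f"{name}: {v:.4f} mg/L·min" for name, v in rates.items())
    print(f"S = {s:5.0f} mg/L -> {summary}")
```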

Keywords: kinetic, laccase enzyme, sonoenzymatic, ultrasound irradiation

Procedia PDF Downloads 45
1127 Stem Cell Fate Decision Depending on TiO2 Nanotubular Geometry

Authors: Jung Park, Anca Mazare, Klaus Von Der Mark, Patrik Schmuki

Abstract:

In the clinical application of TiO2 implants for tooth and hip replacement, migration, adhesion, and differentiation of neighboring mesenchymal stem cells onto implant surfaces are critical steps for successful bone regeneration. Over the past decade, increasing attention has been paid to nanoscale electrochemical surface modification of the TiO2 layer for improving bone-TiO2 surface integration. We generated, on titanium surfaces, self-assembled layers of vertically oriented TiO2 nanotubes with defined diameters between 15 and 100 nm, and here we show that mesenchymal stem cells finely sense the TiO2 nanotubular geometry and quickly decide their cell fate, either differentiating into osteoblasts or undergoing programmed cell death (apoptosis) on the TiO2 nanotube layers. These cell fate decisions depend critically on nanotube diameter (15-100 nm) and are sensed through integrin clustering. We further demonstrate that nanoscale topography sensing occurs not only in mesenchymal stem cells but appears to be a generalized mechanism of nanoscale microenvironment-cell interaction in several cell types composing the bone tissue network, including osteoblasts, osteoclasts, endothelial cells, and hematopoietic stem cells. Additionally, we discuss the synergistic effect of simultaneous stimulation by nanotube-bound growth factors and nanoscale topographic cues on enhanced bone regeneration.

Keywords: TiO2 nanotube, stem cell fate decision, nano-scale microenvironment, bone regeneration

Procedia PDF Downloads 417
1126 Study on Electromagnetic Plasma Acceleration Using Rotating Magnetic Field Scheme

Authors: Takeru Furuawa, Kohei Takizawa, Daisuke Kuwahara, Shunjiro Shinohara

Abstract:

In the field of space propulsion, electric propulsion systems have been developed because their fuel efficiency is much higher than that of conventional chemical ones. However, practical electric propulsion systems, e.g., ion engines, suffer from short lifetimes due to damage to the plasma generation and acceleration electrodes. A helicon plasma thruster is proposed as a long-lifetime electric thruster with no electrodes in direct contact with the plasma. In this system, both generation and acceleration of a dense plasma are executed by antennas from the outside of a discharge tube. Development of the helicon plasma thruster has been conducted under the Helicon Electrodeless Advanced Thruster (HEAT) project. Our helicon plasma thruster involves two important processes. First, we generate a dense source plasma using a helicon wave with an excitation frequency between the ion and electron cyclotron frequencies, fci and fce, respectively, applied from the outside of the discharge using a radio frequency (RF) antenna. The helicon plasma source can provide a high density (~10¹⁹ m⁻³), a high ionization ratio (up to several tens of percent), and a high particle generation efficiency. Second, in order to achieve high thrust and specific impulse, we accelerate the dense plasma by the axial Lorentz force fz given by the product of the induced azimuthal current jθ and the static radial magnetic field Br, i.e., fz = jθ × Br. The HEAT project has proposed several kinds of electrodeless acceleration schemes, and in our particular case, a Rotating Magnetic Field (RMF) method has been extensively studied. The RMF scheme was originally developed as a concept to sustain the Field Reversed Configuration (FRC) in magnetically confined fusion research. Here, the RMF coils are expected to generate jθ through the nonlinear effect described below. First, the rotating magnetic field Bω is generated by two pairs of RMF coils carrying AC currents with a phase difference of 90 degrees between the pairs; by Faraday's law, an axial electric field is induced. Second, an axial current is generated through Ohm's law by the effects of electron-ion and electron-neutral collisions. Third, the azimuthal electric field is generated by the nonlinear term, with the retarding torque again set by the collision effects. Then, the azimuthal current is generated as jθ = −nₑ e r · 2π fRMF. Finally, the axial Lorentz force fz for plasma acceleration is generated. Here, jθ is proportional to nₑ and to the RMF coil current frequency fRMF when Bω fully penetrates the plasma. Our previous study achieved a 19% increase in ion velocity using a 5 MHz, 50 A RMF coil power supply. In this presentation, we will show the further improvement in ion velocity obtained using a lower frequency and a higher current from the RMF power supply. In conclusion, helicon high-density plasma production and electromagnetic acceleration by the RMF scheme under an electrodeless condition have been successfully executed.
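A minimal sketch of the force estimate implied by the relations quoted above, |jθ| = nₑ e r · 2π fRMF (assuming Bω fully penetrates the plasma) and fz = jθ · Br. The density (~10¹⁹ m⁻³) and the 5 MHz drive frequency come from the abstract; the radius and radial field value are illustrative assumptions.

```python
import math

E_CHARGE = 1.602e-19  # elementary charge, C

def azimuthal_current_density(n_e: float, r: float, f_rmf: float) -> float:
    """|j_theta| in A/m^2 for electrons rotating rigidly with the RMF."""
    return n_e * E_CHARGE * r * 2 * math.pi * f_rmf

def axial_force_density(j_theta: float, b_r: float) -> float:
    """Axial Lorentz force density f_z = j_theta * B_r, in N/m^3."""
    return j_theta * b_r

if __name__ == "__main__":
    n_e, f_rmf = 1e19, 5e6      # density (m^-3) and RMF frequency (Hz) from the abstract
    r, b_r = 0.03, 0.01         # assumed radius (m) and radial field (T)
    j = azimuthal_current_density(n_e, r, f_rmf)
    print(f"j_theta ~ {j:.2e} A/m^2, f_z ~ {axial_force_density(j, b_r):.2e} N/m^3")
```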

Keywords: electric propulsion, electrodeless thruster, helicon plasma, rotating magnetic field

Procedia PDF Downloads 247