Search results for: free length
860 Caffeic Acid in Cosmetic Formulations: An Innovative Assessment
Authors: Caroline M. Spagnol, Vera L. B. Isaac, Marcos A. Corrêa, Hérida R. N. Salgado
Abstract:
Phenolic compounds are abundant in the Brazilian plant kingdom and they are part of a large and complex group of organic substances. Cinnamic acids are part of this group of organic compounds, and caffeic acid (CA) is one of its representatives. Antioxidants are compounds which act as free radical scavengers and, in other cases, such as metal chelators, both in the initiation stage and the propagation of oxidative process. The tyrosinase, polyphenol oxidase, is an enzyme that acts at various stages of melanin biosynthesis within the melanocytes and is considered a key molecule in this process. Some phenolic compounds exhibit inhibitory effects on melanogenesis by inhibiting the tyrosinase enzymatic activity and therefore has been the subject of studies. However, few studies have reported the effectiveness of these products and their safety. Objectives: To assess the inhibitory activity of tyrosinase, the antioxidant activity of CA and its cytotoxic potential. The method to evaluate the inhibitory activity of tyrosinase aims to assess the reduction transformation of L-dopa into dopaquinone reactions catalyzed by the enzyme. For evaluating the antioxidant activity was used the analytical methodology of DPPH radical inhibition. The cytotoxicity evaluation was carried out using the MTT method (3-(4,5-dimethyl-2-thiazolyl)-2,5-diphenyl-2H-tetrazolium bromide), a colorimetric assay which determines the amount of insoluble violet crystals formed by the reduction of MTT in the mitochondria of living cells. Based on the results obtained during the study, CA has low activity as a depigmenting agent. However, it is a more potent antioxidant than ascorbic acid (AA), since a lower amount of CA is sufficient to inhibit 50% of DPPH radical. The results are promising since CA concentration that promoted 50% toxicity in HepG2 cells (IC50=781.8 μg/mL) is approximately 330 to 400 times greater than the concentration required to inhibit 50% of DPPH (IC50 DPPH= 2.39 μg/mL) and ABTS (IC50 ABTS= 1.96 μg/mL) radicals scavenging activity, respectively. The maximum concentration of caffeic acid tested (1140 mg /mL) did not reach 50% of cell death in HaCat cells. Thus, it was concluded that the caffeic acid does not cause toxicity in HepG2 and HaCat cells in the concentrations required to promote antioxidant activity in vitro, and it can be applied in topical products.Keywords: caffeic acid, antioxidant, cytotoxicity, cosmetic
Procedia PDF Downloads 379859 A Lightweight Blockchain: Enhancing Internet of Things Driven Smart Buildings Scalability and Access Control Using Intelligent Direct Acyclic Graph Architecture and Smart Contracts
Authors: Syed Irfan Raza Naqvi, Zheng Jiangbin, Ahmad Moshin, Pervez Akhter
Abstract:
Currently, the IoT system depends on a centralized client-servant architecture that causes various scalability and privacy vulnerabilities. Distributed ledger technology (DLT) introduces a set of opportunities for the IoT, which leads to practical ideas for existing components at all levels of existing architectures. Blockchain Technology (BCT) appears to be one approach to solving several IoT problems, like Bitcoin (BTC) and Ethereum, which offer multiple possibilities. Besides, IoTs are resource-constrained devices with insufficient capacity and computational overhead to process blockchain consensus mechanisms; the traditional BCT existing challenge for IoTs is poor scalability, energy efficiency, and transaction fees. IOTA is a distributed ledger based on Direct Acyclic Graph (DAG) that ensures M2M micro-transactions are free of charge. IOTA has the potential to address existing IoT-related difficulties such as infrastructure scalability, privacy and access control mechanisms. We proposed an architecture, SLDBI: A Scalable, lightweight DAG-based Blockchain Design for Intelligent IoT Systems, which adapts the DAG base Tangle and implements a lightweight message data model to address the IoT limitations. It enables the smooth integration of new IoT devices into a variety of apps. SLDBI enables comprehensive access control, energy efficiency, and scalability in IoT ecosystems by utilizing the Masked Authentication Message (MAM) protocol and the IOTA Smart Contract Protocol (ISCP). Furthermore, we suggest proof-of-work (PoW) computation on the full node in an energy-efficient way. Experiments have been carried out to show the capability of a tangle to achieve better scalability while maintaining energy efficiency. The findings show user access control management at granularity levels and ensure scale up to massive networks with thousands of IoT nodes, such as Smart Connected Buildings (SCBDs).Keywords: blockchain, IOT, direct acyclic graphy, scalability, access control, architecture, smart contract, smart connected buildings
Procedia PDF Downloads 122858 Mapping Forest Biodiversity Using Remote Sensing and Field Data in the National Park of Tlemcen (Algeria)
Authors: Bencherif Kada
Abstract:
In forest management practice, landscape and Mediterranean forest are never posed as linked objects. But sustainable forestry requires the valorization of the forest landscape and this aim involves assessing the spatial distribution of biodiversity by mapping forest landscaped units and subunits and by monitoring the environmental trends. This contribution aims to highlight, through object-oriented classifications, the landscaped biodiversity of the National Park of Tlemcen (Algeria). The methodology used is based on ground data and on the basic processing units of object-oriented classification that are segments, so-called image-objects, representing a relatively homogenous units on the ground. The classification of Landsat Enhanced Thematic Mapper plus (ETM+) imagery is performed on image objects, and not on pixels. Advantages of object-oriented classification are to make full use of meaningful statistic and texture calculation, uncorrelated shape information (e.g., length-to-width ratio, direction and area of an object, etc.) and topological features (neighbor, super-object, etc.), and the close relation between real-world objects and image objects. The results show that per object classification using the k-nearest neighbor’s method is more efficient than per pixel one. It permits to simplify the content of the image while preserving spectrally and spatially homogeneous types of land covers such as Aleppo pine stands, cork oak groves, mixed groves of cork oak, holm oak and zen oak, mixed groves of holm oak and thuja, water plan, dense and open shrub-lands of oaks, vegetable crops or orchard, herbaceous plants and bare soils. Texture attributes seem to provide no useful information while spatial attributes of shape, compactness seem to be performant for all the dominant features, such as pure stands of Aleppo pine and/or cork oak and bare soils. Landscaped sub-units are individualized while conserving the spatial information. Continuously dominant dense stands over a large area were formed into a single class, such as dense, fragmented stands with clear stands. Low shrublands formations and high wooded shrublands are well individualized but with some confusion with enclaves for the former. Overall, a visual evaluation of the classification shows that the classification reflects the actual spatial state of the study area at the landscape level.Keywords: forest, oaks, remote sensing, biodiversity, shrublands
Procedia PDF Downloads 30857 Endoscopic Stenting of the Main Pancreatic Duct in Patients With Pancreatic Fluid Collections After Pancreas Transplantation
Authors: Y. Teterin, S. Suleymanova, I. Dmitriev, P. Yartcev
Abstract:
Introduction: One of the most common complications after pancreas transplantation are pancreatic fluid collections (PFCs), which are often complicated not only by infection and subsequent disfunction of the pancreatoduodenal graft (PDG), but also with a rather high mortality rate of recipients. Drainage is not always effective and often requires repeated open surgical interventions, which worsens the outcome of the surgery. Percutaneous drainage of PFCs combined with endoscopic stenting of the main pancreatic duct of the pancreatoduodenal graft (MPDPDG) showed high efficiency in the treatment of PFCs. Aims & Methods: From 01.01.2012 to 31.12.2021 at the Sklifosovsky Research Institute for Emergency Medicine were performed 64 transplantations of PDG. In 11 cases (17.2%), the early postoperative period was complicated by the formation of PFCs. Of these, 7 patients underwent percutaneous drainage of pancreonecrosis with high efficiency and did not required additional methods of treatment. In the remaining 4 patients, drainage was ineffective and was an indication for endoscopic stenting of the MPDPDG. They were the ones who made up the study group. Among them were 3 men and 1 woman. The mean age of the patients was 36,4 years.PFCs in these patients formed on days 1, 12, 18, and 47 after PDG transplantation. We used a gastroscope to stent the MPDPDG, due to anatomical features of the location of the duodenoduodenal anastomosis after PDG transplantation. Through the endoscope channel was performed selective catheterization of the MPDPDG, using a catheter and a guidewire, followed by its contrasting with a water-soluble contrast agent. Due to the extravasation of the contrast, was determined the localization of the defect in the PDG duct system. After that, a plastic pancreatic stent with a diameter of 7 Fr. and a length of 7 cm. was installed along guidewire. The stent was installed in such a way that its proximal edge completely covered the defect zone, and the distal one was determined in the intestinal lumen. Results: In all patients PDG pancreaticography revealed extravasation of a contrast in the area of the isthmus and body of the pancreas, which required stenting of the MPDPDG. In 1 (25%) case, the patient had a dislocation of the stent into the intestinal lumen (III degree according to Clavien-Dindo (2009)). This patient underwent repeated endoscopic stenting of the MPDPDG. On average 23 days after endoscopic stenting of the MPDPDG, the drainage tubes were removed and after approximately 40 days all patients were discharged in a satisfactory condition with follow-up endocrinologist and surgeon consultation. Pancreatic stents were removed after 6 months ± 7 days. Conclusion: Endoscopic stenting of the main pancreatic duct of the donor pancreas is by far the most highly effective and minimally invasive method in the treatment of PFCs after transplantation of the pancreatoduodenal complex.Keywords: pancreas transplantation, endoscopy surgery, diabetes, stenting, main pancreatic duct
Procedia PDF Downloads 86856 Identification of Lipo-Alkaloids and Fatty Acids in Aconitum carmichaelii Using Liquid Chromatography–Mass Spectrometry and Gas Chromatography–Mass Spectrometry
Authors: Ying Liang, Na Li
Abstract:
Lipo-alkaloid is a kind of C19-norditerpenoid alkaloids existed in Aconitum species, which usually contains an aconitane skeleton and one or two fatty acid residues. The structures are very similar to that of diester-type alkaloids, which are considered as the main bioactive components in Aconitum carmichaelii. They have anti-inflammatory, anti-nociceptive, and anti-proliferative activities. So far, more than 200 lipo-alkaloids were reported from plants, semisynthesis, and biotransformations. In our research, by the combination of ultra-high performance liquid chromatography-quadruple-time of flight mass spectrometry (UHPLC-Q-TOF-MS) and an in-house database, 148 lipo-alkaloids were identified from A. carmichaelii, including 93 potential new compounds and 38 compounds with oxygenated fatty acid moieties. To our knowledge, this is the first time of the reporting of the oxygenated fatty acids as the side chains in naturally-occurring lipo-alkaloids. Considering the fatty acid residues in lipo-alkaloids should come from the free acids in the plant, the fatty acids and their relationship with lipo-alkaloids were further investigated by GC-MS and LC-MS. Among 17 fatty acids identified by GC-MS, 12 were detected as the side chains of lipo-alkaloids, which accounted for about 1/3 of total lipo-alkaloids, while these fatty acid residues were less than 1/4 of total fatty acid residues. And, total of 37 fatty acids were determined by UHPCL-Q-TOF-MS, including 18 oxidized fatty acids firstly identified from A. carmichaelii. These fatty acids were observed as the side chains of lipo-alkaloids. In addition, although over 140 lipo-alkaloids were identified, six lipo-alkaloids, 8-O-linoleoyl-14-benzoylmesaconine (1), 8-O-linoleoyl-14-benzoylaconine (2), 8-O-palmitoyl-14-benzoylmesaconine (3), 8-O-oleoyl-14-benzoylmesaconine (4), 8-O-pal-benzoylaconine (5), and 8-O-ole-Benzoylaconine (6), were found to be the main components, which accounted for over 90% content of total lipo-alkaloids. Therefore, using these six components as standards, a UHPLC-Triple Quadrupole-MS (UHPLC-QQQ-MS) approach was established to investigate the influence of processing on the contents of lipo-alkaloids. Although it was commonly supposed that the contents of lipo-alkaloids increased after processing, our research showed that no significant change was observed before and after processing. Using the same methods, the lipo-alkaloids in the lateral roots of A. carmichaelii and the roots of A. kusnezoffii were determined and quantified. The contents of lipo-alkaloids in A. kusnezoffii were close to that of the parent roots of A. carmichaelii, while the lateral roots had less lipo-alkaloids than the parent roots. This work was supported by Macao Science and Technology Development Fund (086/2013/A3 and 003/2016/A1).Keywords: Aconitum carmichaelii, fatty acids, GC-MS, LC-MS, lipo-alkaloids
Procedia PDF Downloads 299855 Analysis of the Brazilian Trade Balance in Relation to Mercosur: A Comparison between the Period 1989-1994 and 1994-2012
Authors: Luciana Aparecida Bastos, Tatiana Diair L. F. Rosa, Jesus Creapldi
Abstract:
The idea of Latin American integration occurred from the ideals of Simón Bolívar that, in 1824, called the Ibero-American nations to Amphictyonic Congress of Panama, on June 22, 1826, where he would defend the importance of Latin American unity. However, this congress was frustrating and the idea of Bolívar went no further. It was only after the European Union to start the process, driven by the end of World War II that the subject returned to emerge in Latin America. Thus, in 1960, supported by the European integration process, started in 1957 with the excellent result of the ECSC - European Coal and Steel Community, a result of the Customs Union of the BENELUX (integration between Belgium, the Netherlands and Luxembourg) in 1948, was created in Latin America, LAFTA - Latin American Free Trade Association, in 1960. In 1980, LAFTA was replaced by LAAI- Latin American Association, both with the same goal: to integrate Latin America, it´s economy and its trade. Most researchers in this period agree that the regional market would be expanded through the integration. The creation of one or more economic blocs in the region would provide the union of Latin American countries through a fusion of common interests and by their geographical proximity, which would try to develop common projects to promote mutual growth and economic development, tariff reductions, promotion of increased trade between, among many other goals set together. Thus, taking into account Mercosur, the main Latin-American block, created in 1994, the aim of this paper is to make a brief analysis of the trade balance performance of Brazil (larger economy of the block) in Mercosur in the periods: 1989-1994 and 1994-2012. The choice of this period was because the objective is to compare the period before and after the integration of Brazil in Mercosur. The methodologies used were the literature review and descriptive statistics. The results showed that after the integration of Brazil in Mercosur, the exports and imports grew within the bloc and the country turned out to become the leading importer of other economies of Mercosur after integration, that is, Brazil, after integration to Mercosur, was largely responsible for promoting the expansion of regional trade through the import of products from other members of the block.Keywords: Brazil, mercosur, integration, trade balance, comparison
Procedia PDF Downloads 324854 Therapeutic Role of T Subpopulations Cells (CD4, CD8 and Treg (CD25 and FOXP3+ Cells) of UC MSC Isolated from Three Different Methods in Various Disease
Authors: Kumari Rekha, Mathur K Dhananjay, Maheshwari Deepanshu, Nautiyal Nidhi, Shubham Smriti, Laal Deepika, Sinha Swati, Kumar Anupam, Biswas Subhrajit, Shiv Kumar Sarin
Abstract:
Background: Mesenchymal stem cells are multipotent stem cells derived from mesoderm and are used for therapeutic purposes because of their self-renewal, homing capacity, Immunomodulatory capability, low immunogenicity and mitochondrial transfer signaling. MSCs have the ability to regulate the mechanism of both innate as well as adaptive immune responses through the modulation of cellular response and the secretion of inflammatory mediators. Different sources of MSC are UC MSC, BM MSC, Dental Pulp, and Adipose MSC. The most frequent source used is umbilical cord tissue due to its being easily available and free of limitations of collection procedures from respective hospitals. The immunosuppressive role of MSCs is particularly interesting for clinical use since it confers resistance to rejection by the host immune response. Methodology: In this study, T helper cells (TH4), Cytotoxic T cells (CD-8), immunoregulatory cells (CD25 +FOXP3+) are compared from isolated MSC from three different methods, UC Dissociation Kit (Miltenyi), Explant Culture and Collagenase Type-IV. To check the immunomodulatory property, these MSCs were seeded with PBMC(Coculture) in CD3 coated 24 well plates. Cd28 antibody was added in coculture for six days. The coculture was analyzed in FACS Verse flow cytometry. Results: From flow cytometry analysis of coculture, it found that All over T helper cells (CD4+) number p<0.0264 increases in (All Enzymes) MSC rather than explant MSC(p>0.0895) as compared to Collagenase(p>0.7889) in a coculture of Activated T cell and Mesenchymal Stem Cell. Similar T reg cells (CD25+, FOXP3+) expression p<0.0234increases in All Enzymes), decreases in Explant and Collagenase. Experiments have shown that MSCs can also directly prevent the cytotoxic activity of CD8 lymphocytes mainly by blocking their proliferation rather than by inhibiting the cytotoxic effect. And promoting the t-reg cells, which helps in the mediation of immune response in various diseases. Conclusion: MSC suppress Cytotoxic CD8 T cell and Enhance immunoregulatory T reg (CD4+, CD25+, FOXP3+) Cell expression. Thus, MSC maintains a proper balance(ratio) between CD4 T cells and Cytotoxic CD8 T cells.Keywords: MSC, disease, T cell, T regulatory
Procedia PDF Downloads 114853 3D-Printing of Waveguide Terminations: Effect of Material Shape and Structuring on Their Characteristics
Authors: Lana Damaj, Vincent Laur, Azar Maalouf, Alexis Chevalier
Abstract:
Matched termination is an important part of the passive waveguide components. It is typically used at the end of a waveguide transmission line to prevent reflections and improve signal quality. Waveguide terminations (loads) are commonly used in microwave and RF applications. In traditional microwave architectures, usually, waveguide termination consists of a standard rectangular waveguide made by a lossy resistive material, and ended by shorting metallic plate. These types of terminations are used, to dissipate the energy as heat. However, these terminations may increase the size and the weight of the overall system. New alternative solution consists in developing terminations based on 3D-printing of materials. Designing such terminations is very challenging since it should meet the requirements imposed by the system. These requirements include many parameters such as the absorption, the power handling capability in addition to the cost, the size and the weight that have to be minimized. 3D-printing is a shaping process that enables the production of complex geometries. It allows to find best compromise between requirements. In this paper, a comparison study has been made between different existing and new shapes of waveguide terminations. Indeed, 3D printing of absorbers makes it possible to study not only standard shapes (wedge, pyramid, tongue) but also more complex topologies such as exponential ones. These shapes have been designed and simulated using CST MWS®. The loads have been printed using the carbon-filled PolyLactic Acid, conductive PLA from ProtoPasta. Since the terminations has been characterized in the X-band (from 8GHz to 12GHz), the rectangular waveguide standard WR-90 has been selected. The classical wedge shape has been used as a reference. First, all loads have been simulated with the same length and two parameters have been compared: the absorption level (level of |S11|) and the dissipated power density. This study shows that the concave exponential pyramidal shape has the better absorption level and the convex exponential pyramidal shape has the better dissipated power density level. These two loads have been printed in order to measure their properties. A good agreement between the simulated and measured reflection coefficient has been obtained. Furthermore, a study of material structuring based on the honeycomb hexagonal structure has been investigated in order to vary the effective properties. In the final paper, the detailed methodology and the simulated and measured results will be presented in order to show how 3D-printing can allow controlling mass, weight, absorption level and power behaviour.Keywords: additive manufacturing, electromagnetic composite materials, microwave measurements, passive components, power handling capacity (PHC), 3D-printing
Procedia PDF Downloads 20852 Data Model to Predict Customize Skin Care Product Using Biosensor
Authors: Ashi Gautam, Isha Shukla, Akhil Seghal
Abstract:
Biosensors are analytical devices that use a biological sensing element to detect and measure a specific chemical substance or biomolecule in a sample. These devices are widely used in various fields, including medical diagnostics, environmental monitoring, and food analysis, due to their high specificity, sensitivity, and selectivity. In this research paper, a machine learning model is proposed for predicting the suitability of skin care products based on biosensor readings. The proposed model takes in features extracted from biosensor readings, such as biomarker concentration, skin hydration level, inflammation presence, sensitivity, and free radicals, and outputs the most appropriate skin care product for an individual. This model is trained on a dataset of biosensor readings and corresponding skin care product information. The model's performance is evaluated using several metrics, including accuracy, precision, recall, and F1 score. The aim of this research is to develop a personalised skin care product recommendation system using biosensor data. By leveraging the power of machine learning, the proposed model can accurately predict the most suitable skin care product for an individual based on their biosensor readings. This is particularly useful in the skin care industry, where personalised recommendations can lead to better outcomes for consumers. The developed model is based on supervised learning, which means that it is trained on a labeled dataset of biosensor readings and corresponding skin care product information. The model uses these labeled data to learn patterns and relationships between the biosensor readings and skin care products. Once trained, the model can predict the most suitable skin care product for an individual based on their biosensor readings. The results of this study show that the proposed machine learning model can accurately predict the most appropriate skin care product for an individual based on their biosensor readings. The evaluation metrics used in this study demonstrate the effectiveness of the model in predicting skin care products. This model has significant potential for practical use in the skin care industry for personalised skin care product recommendations. The proposed machine learning model for predicting the suitability of skin care products based on biosensor readings is a promising development in the skin care industry. The model's ability to accurately predict the most appropriate skin care product for an individual based on their biosensor readings can lead to better outcomes for consumers. Further research can be done to improve the model's accuracy and effectiveness.Keywords: biosensors, data model, machine learning, skin care
Procedia PDF Downloads 97851 Change of Epidemiological Characteristics and Disease Burden of Varicella Due to Implementation of Mass Immunization Program in Taiwan from 2000 to 2012
Authors: En-Tzu Wang, Ting-Ann Wang, Yi-Hui Shen, Yu-Min Chou, Chi-Tai Fang, Chin-Hui Yang
Abstract:
Background and purpose: A mass varicella immunization program was established to provide free 1-dose vaccination for all 1-year-old children throughout Taiwan since 2004. The epidemiological characteristics and disease burden of varicella from 2000 to 2012 was investigated and the results will be essential to refine the national immunization policy. Method: We included patients (n = 17,838–164,245) with ICD-9-CM codes 052 (chickenpox) from the 2000 to 2012 National Health Insurance Database. The age, period, and cohort-specific incidence of varicella were calculated. The hospital admission rate, medical costs and indirect costs from the societal perspective of varicella including travel costs to the medical facility, registration fee, productivity losses of the patients and caregivers were also estimated. Result: There were 979,252 patients for medical treatment due to varicella from 2000 to 2012 in Taiwan. The implementation of a routine childhood varicella vaccination program has resulted in 87% decline in morbidity (881.49 to 115.17 per 100,000). The average age of patients increased from 7.9 years to 16.3 years. The overall varicella-related hospital admission rate was 15.5 per 1000 patients, and peaked in the groups of infants younger than 1 year, adults aged from 20 to 39 years and elders over 70 years. Among patients admitted to hospital, 33.5% of them had one or more complications. Patients with underlying diseases had higher admission rate (241.6 per 1,000) and longer duration of hospital stay (6.61 days vs. 4.76 days). The annual varicella-related medical expense declined after 2002 and the proportion of medical costs for admission has increased to 42%. The annual indirect costs from the societal perspective of varicella were 5.29 to 9.63 times higher than varicella-related medical costs. Every one dollar invested in the varicella immunization program, 2.97 dollars of medical and social costs were saved on average. Conclusion: The dramatic decline in morbidity, hospitalization, medical and social costs of varicella can be directly attributed to the implementation of the mass immunization program. Two-dose vaccination is recommended for both children with underlying diseases and susceptible adults to prevent serious complications and hospitalizations.Keywords: disease burden, epidemiology, medical and social costs, varicella, varicella vaccine
Procedia PDF Downloads 411850 Folding of β-Structures via the Polarized Structure-Specific Backbone Charge (PSBC) Model
Authors: Yew Mun Yip, Dawei Zhang
Abstract:
Proteins are the biological machinery that executes specific vital functions in every cell of the human body by folding into their 3D structures. When a protein misfolds from its native structure, the machinery will malfunction and lead to misfolding diseases. Although in vitro experiments are able to conclude that the mutations of the amino acid sequence lead to incorrectly folded protein structures, these experiments are unable to decipher the folding process. Therefore, molecular dynamic (MD) simulations are employed to simulate the folding process so that our improved understanding of the folding process will enable us to contemplate better treatments for misfolding diseases. MD simulations make use of force fields to simulate the folding process of peptides. Secondary structures are formed via the hydrogen bonds formed between the backbone atoms (C, O, N, H). It is important that the hydrogen bond energy computed during the MD simulation is accurate in order to direct the folding process to the native structure. Since the atoms involved in a hydrogen bond possess very dissimilar electronegativities, the more electronegative atom will attract greater electron density from the less electronegative atom towards itself. This is known as the polarization effect. Since the polarization effect changes the electron density of the two atoms in close proximity, the atomic charges of the two atoms should also vary based on the strength of the polarization effect. However, the fixed atomic charge scheme in force fields does not account for the polarization effect. In this study, we introduce the polarized structure-specific backbone charge (PSBC) model. The PSBC model accounts for the polarization effect in MD simulation by updating the atomic charges of the backbone hydrogen bond atoms according to equations derived between the amount of charge transferred to the atom and the length of the hydrogen bond, which are calculated from quantum-mechanical calculations. Compared to other polarizable models, the PSBC model does not require quantum-mechanical calculations of the peptide simulated at every time-step of the simulation and maintains the dynamic update of atomic charges, thereby reducing the computational cost and time while accounting for the polarization effect dynamically at the same time. The PSBC model is applied to two different β-peptides, namely the Beta3s/GS peptide, a de novo designed three-stranded β-sheet whose structure is folded in vitro and studied by NMR, and the trpzip peptides, a double-stranded β-sheet where a correlation is found between the type of amino acids that constitute the β-turn and the β-propensity.Keywords: hydrogen bond, polarization effect, protein folding, PSBC
Procedia PDF Downloads 270849 Data Refinement Enhances The Accuracy of Short-Term Traffic Latency Prediction
Authors: Man Fung Ho, Lap So, Jiaqi Zhang, Yuheng Zhao, Huiyang Lu, Tat Shing Choi, K. Y. Michael Wong
Abstract:
Nowadays, a tremendous amount of data is available in the transportation system, enabling the development of various machine learning approaches to make short-term latency predictions. A natural question is then the choice of relevant information to enable accurate predictions. Using traffic data collected from the Taiwan Freeway System, we consider the prediction of short-term latency of a freeway segment with a length of 17 km covering 5 measurement points, each collecting vehicle-by-vehicle data through the electronic toll collection system. The processed data include the past latencies of the freeway segment with different time lags, the traffic conditions of the individual segments (the accumulations, the traffic fluxes, the entrance and exit rates), the total accumulations, and the weekday latency profiles obtained by Gaussian process regression of past data. We arrive at several important conclusions about how data should be refined to obtain accurate predictions, which have implications for future system-wide latency predictions. (1) We find that the prediction of median latency is much more accurate and meaningful than the prediction of average latency, as the latter is plagued by outliers. This is verified by machine-learning prediction using XGBoost that yields a 35% improvement in the mean square error of the 5-minute averaged latencies. (2) We find that the median latency of the segment 15 minutes ago is a very good baseline for performance comparison, and we have evidence that further improvement is achieved by machine learning approaches such as XGBoost and Long Short-Term Memory (LSTM). (3) By analyzing the feature importance score in XGBoost and calculating the mutual information between the inputs and the latencies to be predicted, we identify a sequence of inputs ranked in importance. It confirms that the past latencies are most informative of the predicted latencies, followed by the total accumulation, whereas inputs such as the entrance and exit rates are uninformative. It also confirms that the inputs are much less informative of the average latencies than the median latencies. (4) For predicting the latencies of segments composed of two or three sub-segments, summing up the predicted latencies of each sub-segment is more accurate than the one-step prediction of the whole segment, especially with the latency prediction of the downstream sub-segments trained to anticipate latencies several minutes ahead. The duration of the anticipation time is an increasing function of the traveling time of the upstream segment. The above findings have important implications to predicting the full set of latencies among the various locations in the freeway system.Keywords: data refinement, machine learning, mutual information, short-term latency prediction
Procedia PDF Downloads 169848 Verification Protocols for the Lightning Protection of a Large Scale Scientific Instrument in Harsh Environments: A Case Study
Authors: Clara Oliver, Oibar Martinez, Jose Miguel Miranda
Abstract:
This paper is devoted to the study of the most suitable protocols to verify the lightning protection and ground resistance quality in a large-scale scientific facility located in a harsh environment. We illustrate this work by reviewing a case study: the largest telescopes of the Northern Hemisphere Cherenkov Telescope Array, CTA-N. This array hosts sensitive and high-speed optoelectronics instrumentation and sits on a clear, free from obstacle terrain at around 2400 m above sea level. The site offers a top-quality sky but also features challenging conditions for a lightning protection system: the terrain is volcanic and has resistivities well above 1 kOhm·m. In addition, the environment often exhibits humidities well below 5%. On the other hand, the high complexity of a Cherenkov telescope structure does not allow a straightforward application of lightning protection standards. CTA-N has been conceived as an array of fourteen Cherenkov Telescopes of two different sizes, which will be constructed in La Palma Island, Spain. Cherenkov Telescopes can provide valuable information on different astrophysical sources from the gamma rays reaching the Earth’s atmosphere. The largest telescopes of CTA are called LST’s, and the construction of the first one was finished in October 2018. The LST has a shape which resembles a large parabolic antenna, with a 23-meter reflective surface supported by a tubular structure made of carbon fibers and steel tubes. The reflective surface has 400 square meters and is made of an array of segmented mirrors that can be controlled individually by a subsystem of actuators. This surface collects and focuses the Cherenkov photons into the camera, where 1855 photo-sensors convert the light in electrical signals that can be processed by dedicated electronics. We describe here how the risk assessment of direct strike impacts was made and how down conductors and ground system were both tested. The verification protocols which should be applied for the commissioning and operation phases are then explained. We stress our attention on the ground resistance quality assessment.Keywords: grounding, large scale scientific instrument, lightning risk assessment, lightning standards and safety
Procedia PDF Downloads 123847 Effect of Feeding Broilers on Diets Enriching With Omega-3 Fatty Acids Sources
Authors: Khalid Mahmoud Gaafar
Abstract:
In human diets , ω-6 and ω-3 are important essential fatty acids for immunity and health. However, considerable alteration in dietary patterns and contents has resulted in change of the consumption of such fatty acids ,with subsequent increase in the consumption of ω-6 fatty acids and a marked decrease in the consumption of ω-3 fatty acids. This dietary alteration has led to an imbalance in the ratio for ω-6/ω-3, which at 20:1 now differs considerably from the original ratio (1:1). Therefore, dietary supplements such as eggs and meat enriched with omega 3 are necessary to increase the consumption of ω-3 to meet the recommended need for ω-3. Foods that supply ω-6 fatty acids include soybean, palm , sunflower, and rapeseed oils, whereas foods that supply ω-3 fatty acids such as linseed and fish oils. Lin seed oils contain Alpha – linolenic acid (ALA), which can be converted to DHA and EPA in the birds body, with linseed oil containing more than 50% ALA. On the other hand, high doses of omega 6 sources in the diet may have deleterious effects on humans. Maintaining an optimum ratio of ω-3 and ω-6fatty acids not only improves performance but also prevents these health risks. The ratio of n-6:ω-3 fatty acids also plays an important role in the immune response, production performance of broilers and designing meat enriched with ω-3 polyunsaturated fatty acids (PUFAs). Birds of three experimental groups fed on basal starter (0-2nd weeks), grower (3rd -4th weeks) and finisher (5th week) rations. The first is control group fed during the grower-finisher periods on basic diet with two replicate (one fed on basic diet contain vegetable oil and the other don’t) without any additives. The three experimental groups (T1 – T2 –T3) fed during the grower- finisher periods on diets free from vegetable oils and contain of 5% of extruded mixture of soybean and linseed (60%:40%). The second (T2) and third (T3) experimental groups supplemented with vitamin B12 and enzyme mixture. The first experimental groups don’t receive vitamins or enzymes. The obtained results showed a significant increased growth performance, immune response, highest antioxidant activity and serum HDL with lowest serum LDL and triglycerides levels in all experimental groups compared with control group, which was highly significant in group fed on vitamin B6.Keywords: omega fatty acids, broiler, feeding, human health, growth performance, immunity
Procedia PDF Downloads 113846 Production of Bricks Using Mill Waste and Tyre Crumbs at a Low Temperature by Alkali-Activation
Authors: Zipeng Zhang, Yat C. Wong, Arul Arulrajah
Abstract:
Since automobiles became widely popular around the early 20th century, end-of-life tyres have been one of the major types of waste humans encounter. Every minute, there are considerable quantities of tyres being disposed of around the world. Most end-of-life tyres are simply landfilled or simply stockpiled, other than recycling. To address the potential issues caused by tyre waste, incorporating it into construction materials can be a possibility. This research investigated the viability of manufacturing bricks using mill waste and tyre crumb by alkali-activation at a relatively low temperature. The mill waste was extracted from a brick factory located in Melbourne, Australia, and the tyre crumbs were supplied by a local recycling company. As the main precursor, the mill waste was activated by the alkaline solution, which was comprised of sodium hydroxide (8m) and sodium silicate (liquid). The introduction ratio of alkaline solution (relative to the solid weight) and the weight ratio between sodium hydroxide and sodium silicate was fixed at 20 wt.% and 1:1, respectively. The tyre crumb was introduced to substitute part of the mill waste at four ratios by weight, namely 0, 5, 10 and 15%. The mixture of mill waste and tyre crumbs were firstly dry-mixed for 2 min to ensure the homogeneity, followed by a 2.5-min wet mixing after adding the solution. The ready mixture subsequently was press-moulded into blocks with the size of 109 mm in length, 112.5 mm in width and 76 mm in height. The blocks were cured at 50°C with 95% relative humidity for 2 days, followed by a 110°C oven-curing for 1 day. All the samples were then placed under the ambient environment until the age of 7 and 28 days for testing. A series of tests were conducted to evaluate the linear shrinkage, compressive strength and water absorption of the samples. In addition, the microstructure of the samples was examined via the scanning electron microscope (SEM) test. The results showed the highest compressive strength was 17.6 MPa, found in the 28-day-old group using 5 wt.% tyre crumbs. Such strength has been able to satisfy the requirement of ASTM C67. However, the increasing addition of tyre crumb weakened the compressive strength of samples. Apart from the strength, the linear shrinkage and water absorption of all the groups can meet the requirements of the standard. It is worth noting that the use of tyre crumbs tended to decrease the shrinkage and even caused expansion when the tyre content was over 15 wt.%. The research also found that there was a significant reduction in compressive strength for the samples after water absorption tests. In conclusion, the tyre crumbs have the potential to be used as a filler material in brick manufacturing, but more research needs to be done to tackle the durability problem in the future.Keywords: bricks, mill waste, tyre crumbs, waste recycling
Procedia PDF Downloads 122845 Behavioral Analysis of Stock Using Selective Indicators from Fundamental and Technical Analysis
Authors: Vish Putcha, Chandrasekhar Putcha, Siva Hari
Abstract:
In the current digital era of free trading and pandemic-driven remote work culture, markets worldwide gained momentum for retail investors to trade from anywhere easily. The number of retail traders rose to 24% of the market from 15% at the pre-pandemic level. Most of them are young retail traders with high-risk tolerance compared to the previous generation of retail traders. This trend boosted the growth of subscription-based market predictors and market data vendors. Young traders are betting on these predictors, assuming one of them is correct. However, 90% of retail traders are on the losing end. This paper presents multiple indicators and attempts to derive behavioral patterns from the underlying stocks. The two major indicators that traders and investors follow are technical and fundamental. The famous investor, Warren Buffett, adheres to the “Value Investing” method that is based on a stock’s fundamental Analysis. In this paper, we present multiple indicators from various methods to understand the behavior patterns of stocks. For this research, we picked five stocks with a market capitalization of more than $200M, listed on the exchange for more than 20 years, and from different industry sectors. To study the behavioral pattern over time for these five stocks, a total of 8 indicators are chosen from fundamental, technical, and financial indicators, such as Price to Earning (P/E), Price to Book Value (P/B), Debt to Equity (D/E), Beta, Volatility, Relative Strength Index (RSI), Moving Averages and Dividend yields, followed by detailed mathematical Analysis. This is an interdisciplinary paper between various disciplines of Engineering, Accounting, and Finance. The research takes a new approach to identify clear indicators affecting stocks. Statistical Analysis of the data will be performed in terms of the probabilistic distribution, then follow and then determine the probability of the stock price going over a specific target value. The Chi-square test will be used to determine the validity of the assumed distribution. Preliminary results indicate that this approach is working well. When the complete results are presented in the final paper, they will be beneficial to the community.Keywords: stock pattern, stock market analysis, stock predictions, trading, investing, fundamental analysis, technical analysis, quantitative trading, financial analysis, behavioral analysis
Procedia PDF Downloads 85844 Strategies for Good Governance during Crisis in Higher Education
Authors: Naziema B. Jappie
Abstract:
Over the last 23 years leaders in government, political parties and universities have been spending much time on identifying and discussing various gaps in the system that impact systematically on students especially those from historically Black communities. Equity and access to higher education were two critical aspects that featured in achieving the transformation goals together with a funding model for those previously disadvantaged. Free education was not a feasible option for the government. Institutional leaders in higher education face many demands on their time and resources. Often, the time for crisis management planning or consideration of being proactive and preventative is not a standing agenda item. With many issues being priority in academia, people become complacent and think that crisis may not affect them or they will cross the bridge when they get to it. Historically South Africa has proven to be a country of militancy, strikes and protests in most industries, some leading to disastrous outcomes. Higher education was not different between October 2015 and late 2016 when the #Rhodes Must Fall which morphed into the # Fees Must Fall protest challenged the establishment, changed the social fabric of universities, bringing the sector to a standstill. Some institutional leaders and administrators were better at handling unexpected, high-consequence situations than others. At most crisis leadership is viewed as a situation more than a style of leadership which is usually characterized by crisis management. The objective of this paper is to show how institutions managed catastrophes of disastrous proportions, down through unexpected incidents of 2015/2016. The content draws on the vast past crisis management experience of the presenter and includes the occurrences of the recent protests giving an event timeline. Using responses from interviews with institutional leaders and administrators as well as students will ensure first-hand information on their experiences and the outcomes. Students have tasted the power of organized action and they demand immediate change, if not the revolt will continue. This paper will examine the approaches that guided institutional leaders and their crisis teams and sector crisis response. It will further expand on whether the solutions effectively changed governance in higher education or has it minimized the need for more protests. The conclusion will give an insight into the future of higher education in South Africa from a leadership perspective.Keywords: crisis, governance, intervention, leadership, strategies, protests
Procedia PDF Downloads 147843 Health Seeking Manners of Road Traffic Accident Victims: A Qualitative Study
Authors: Mohammad Mahbub Alam Talukder, Shahnewaz, Hasanat-E-Rabbi, Mohammed Nazrul Islam
Abstract:
Road traffic accident is a global problem which is severe in the developing countries like Bangladesh. In consequence, in developing countries road trauma has now been recognized as an increasing public health hazards and economic burning issue. And after road traffic accidents the lack of management and economic costs related with health seeking behavior have a disproportionate impact on lower income groups, thus contributing to the persistence of poverty in conjunction with disability. This cross sectional study, carried out during July 2012 to June 2013, aimed to explore health seeking decision and culture of handling the road traffic accident related victims, as taken from experiences of the poor disabled people of slum dwellers of Dhaka city. The present study has been designed based on qualitative techniques such as in-depth interview and case studies. Additionally, a survey questionnaire was used to collect the demographic characteristics of the study population (n=150) and to select participants purposely for in-depth interview (n=50) and case study (n=30). Content analysis of qualitative data was done through theme coding and matrix analysis of case study was done to use relevant verbatim. Most of the time the health seeking decision totally depended on the surrounded people of the accidental place, their knowledge, awareness and remaining facility and capacity regarding proper management of the victims. However, most of the cases the victims did not get any early treatment and it took 2-12 hours to get even the first aid because of distance, shortage of money, lack of availability of getting the aid, lack of mass awareness etc. Under the reality of discriminated and unaffordable health service provision better treatment could not turn out due to economic inability of the poor victims. To avoid the severe trauma, treatment delay must be reduced by providing first aid within very short time and to do so, mass awareness campaign is necessary for handing the victims. Moreover, necessary measures should be taken to ensure cost free health service provision to treat the chronic disabled condition of the road traffic accident related poor victims.Keywords: accident, injury, disabled, qualitative, slum
Procedia PDF Downloads 363842 Storage Assignment Strategies to Reduce Manual Picking Errors with an Emphasis on an Ageing Workforce
Authors: Heiko Diefenbach, Christoph H. Glock
Abstract:
Order picking, i.e., the order-based retrieval of items in a warehouse, is an important time- and cost-intensive process for many logistic systems. Despite the ongoing trend of automation, most order picking systems are still manual picker-to-parts systems, where human pickers walk through the warehouse to collect ordered items. Human work in warehouses is not free from errors, and order pickers may at times pick the wrong or the incorrect number of items. Errors can cause additional costs and significant correction efforts. Moreover, age might increase a person’s likelihood to make mistakes. Hence, the negative impact of picking errors might increase for an aging workforce currently witnessed in many regions globally. A significant amount of research has focused on making order picking systems more efficient. Among other factors, storage assignment, i.e., the assignment of items to storage locations (e.g., shelves) within the warehouse, has been subject to optimization. Usually, the objective is to assign items to storage locations such that order picking times are minimized. Surprisingly, there is a lack of research concerned with picking errors and respective prevention approaches. This paper hypothesize that the storage assignment of items can affect the probability of pick errors. For example, storing similar-looking items apart from one other might reduce confusion. Moreover, storing items that are hard to count or require a lot of counting at easy-to-access and easy-to-comprehend self heights might reduce the probability to pick the wrong number of items. Based on this hypothesis, the paper discusses how to incorporate error-prevention measures into mathematical models for storage assignment optimization. Various approaches with respective benefits and shortcomings are presented and mathematically modeled. To investigate the newly developed models further, they are compared to conventional storage assignment strategies in a computational study. The study specifically investigates how the importance of error prevention increases with pickers being more prone to errors due to age, for example. The results suggest that considering error-prevention measures for storage assignment can reduce error probabilities with only minor decreases in picking efficiency. The results might be especially relevant for an aging workforce.Keywords: an aging workforce, error prevention, order picking, storage assignment
Procedia PDF Downloads 204841 Anti-Aging Effects of Two Agricultural Plant Extracts and Their Underlying Mechanism
Authors: Shwu-Ling Peng, Chiung-Man Tsai, Chia-Jui Weng
Abstract:
Chronic micro-inflammation is a hallmark of many aging-related neurodegenerative and metabolic syndrome-driven diseases. In high glucose (HG) environment, reactive oxygen species (ROS) is generated and the ROS induced inflammation, cytokines secretion, DNA damage, and cell cycle arrest to lead to cellular senescence. Water chestnut shell (WCS) is a plant hull which containing polyphenolic compounds and showed antioxidant and anticancer activities. Orchid, which containing a natural polysaccharide compound, possesses many physiological activities including anti-inflammatory and neuroprotective effects. These agricultural plants might be able to reduce oxidative stress and inflammation. This study was used HG-induced human normal dermal fibroblasts (HG-HNDFs) as an in vitro model to disclose the effects of water extract of Phalaenopsis orchid flower (WEPF) and ethanol extract of water chestnut shell (EEWCS) on the anti-aging and their underlying molecular mechanisms. The toxicity of extracts on human normal dermal fibroblasts (HNDFs) was determined by MTT method. The senescence of cells was assayed by β-galactosidase (SA-β-gal) kit. ROS and nitrate production was analyzed by Intracellular ROS contents and ELISA, respectively. Western blotting was used to detect the proteins in cells. The results showed that the exposure of HNDFs to HG (30 mM) for 72 h were caused cellular senescence and arrested cells at G0/G1 phase. Indeed, the treatment of HG-HNDFs with WEPF (200 μg/ml) and EEWCS (10 μg/ml) significantly released cell cycle arrest and promoted cell proliferation. The G1/S phase transition regulatory proteins such as protein retinoblastoma (pRb), p53, and p16ᴵᴺᴷ⁴ᵃ depressed by WEPF and EEWCS were also observed. Additionally, the treatment of WEPF and EEWCS increased the activity of HO-1 through upregulating Nrf2 as well as decreased the ROS and NO of HG-HNDFs. Therefore, the senescence marker protein-30 (SMP30) in cells was diminished. In conclusion, the WEPF and EEWCS might inhibit HG-induced aging of HNDFs by reducing oxidative stress and free radicals.Keywords: agricultural plant extract, anti-aging, high glucose, Phalaenopsis orchid flower, water chestnut shell
Procedia PDF Downloads 154840 Electromagnetic-Mechanical Stimulation on PC12 for Enhancement of Nerve Axonal Extension
Authors: E. Nakamachi, K. Matsumoto, K. Yamamoto, Y. Morita, H. Sakamoto
Abstract:
In recently, electromagnetic and mechanical stimulations have been recognized as the effective extracellular environment stimulation technique to enhance the defected peripheral nerve tissue regeneration. In this study, we developed a new hybrid bioreactor by adopting 50 Hz uniform alternative current (AC) magnetic stimulation and 4% strain mechanical stimulation. The guide tube for nerve regeneration is mesh structured tube made of biodegradable polymer, such as polylatic acid (PLA). However, when neural damage is large, there is a possibility that peripheral nerve undergoes necrosis. So it is quite important to accelerate the nerve tissue regeneration by achieving enhancement of nerve axonal extension rate. Therefore, we try to design and fabricate the system that can simultaneously load the uniform AC magnetic field stimulation and the stretch stimulation to cells for enhancement of nerve axonal extension. Next, we evaluated systems performance and the effectiveness of each stimulation for rat adrenal pheochromocytoma cells (PC12). First, we designed and fabricated the uniform AC magnetic field system and the stretch stimulation system. For the AC magnetic stimulation system, we focused on the use of pole piece structure to carry out in-situ microscopic observation. We designed an optimum pole piece structure using the magnetic field finite element analyses and the response surface methodology. We fabricated the uniform AC magnetic field stimulation system as a bio-reactor by adopting analytically determined design specifications. We measured magnetic flux density that is generated by the uniform AC magnetic field stimulation system. We confirmed that measurement values show good agreement with analytical results, where the uniform magnetic field was observed. Second, we fabricated the cyclic stretch stimulation device under the conditions of particular strains, where the chamber was made of polyoxymethylene (POM). We measured strains in the PC12 cell culture region to confirm the uniform strain. We found slightly different values from the target strain. Finally, we concluded that these differences were allowable in this mechanical stimulation system. We evaluated the effectiveness of each stimulation to enhance the nerve axonal extension using PC12. We confirmed that the average axonal extension length of PC12 under the uniform AC magnetic stimulation was increased by 16 % at 96 h in our bio-reactor. We could not confirm that the axonal extension enhancement under the stretch stimulation condition, where we found the exfoliating of cells. Further, the hybrid stimulation enhanced the axonal extension. Because the magnetic stimulation inhibits the exfoliating of cells. Finally, we concluded that the enhancement of PC12 axonal extension is due to the magnetic stimulation rather than the mechanical stimulation. Finally, we confirmed that the effectiveness of the uniform AC magnetic field stimulation for the nerve axonal extension using PC12 cells.Keywords: nerve cell PC12, axonal extension, nerve regeneration, electromagnetic-mechanical stimulation, bioreactor
Procedia PDF Downloads 264
839 A Study on Relationship between Firm Managers Environmental Attitudes and Environment-Friendly Practices for Textile Firms in India
Authors: Anupriya Sharma, Sapna Narula
Abstract:
Over the past decade, sustainability has gone mainstream as more people are worried about environment-related issues than ever before. These issues are of even greater concern for industries that leave a significant impact on the environment. In response to these ecological issues, corporations are beginning to comprehend the impact on their business, and many initiatives have been made to address these emerging issues in the consumer-driven textile industry. Demand from customers, local communities, and government regulations are considered some of the major factors affecting environmental decision-making. Research also shows that motivations to go green are inevitably determined by the way top managers perceive environmental issues, as managers' personal values and ethical commitment act as motivating factors toward corporate social responsibility. Little empirical research has examined the relationship between top managers' personal environmental attitudes and corporate environmental behavior in the textile industry in the Indian context. The primary purpose of this study is to determine the current state of environmental management in the textile industry and whether the attitude of textile firms' top managers is significantly related to the firm's response to environmental issues and the perceived benefits of environmental management. To achieve these objectives, the authors used a structured questionnaire based on a literature review. The questionnaire consisted of six sections with a total length of eight pages. The first section collected background information on the position of the respondents in the organization, annual turnover, year of the firm's establishment, and so on. The other five sections covered drivers, attitude and awareness, sustainable business practices, barriers to implementation, and benefits achieved. The questionnaire was pretested with professionals working in corporate sustainability who had knowledge of the textile industry, and was then mailed to various stakeholders involved in textile production, covering firms' top manufacturing officers, EHS managers, textile engineers, HR personnel, and R&D managers. The results showed that most textile firms were implementing some type of environmental management practice, even though the magnitude of the firms' involvement in environmental management practices varied. The results also show that textile firms with a higher level of involvement in environmental management were more engaged in process-driven technical environmental practices. The study further identified that firms' top managers' environmental attitudes were correlated with the perceived advantages of environmental management, as top managers possess the managerial discretion to formulate and decide business policies such as environmental initiatives.
Keywords: attitude and awareness, environmental management, sustainability, textile industry
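As an illustration of the kind of correlation analysis the study describes, the minimal sketch below computes a Pearson correlation between attitude scores and perceived-benefit scores; the Likert-style values are hypothetical, not the study's data.

```python
# Illustrative sketch (hypothetical data): correlating managers' environmental-attitude
# scores with perceived benefits of environmental management, one value per firm.
from scipy.stats import pearsonr

attitude_scores = [4.2, 3.8, 4.5, 2.9, 3.3, 4.8, 3.1, 4.0]  # mean Likert scores per firm
benefit_scores  = [4.0, 3.5, 4.6, 3.0, 3.2, 4.7, 2.8, 4.1]

r, p = pearsonr(attitude_scores, benefit_scores)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```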
Procedia PDF Downloads 233
838 Well-Defined Polypeptides: Synthesis and Selective Attachment of Poly(ethylene glycol) Functionalities
Authors: Cristina Lavilla, Andreas Heise
Abstract:
The synthesis of sequence-controlled polymers has received increasing attention in recent years. Well-defined polyacrylates, polyacrylamides and styrene-maleimide copolymers have been synthesized by sequential or kinetic addition of comonomers. However, this approach has not yet been introduced to the synthesis of polypeptides, which are in fact polymers developed by nature in a sequence-controlled way. Polypeptides are natural materials that possess the ability to self-assemble into complex and highly ordered structures. Their folding and properties arise from precisely controlled sequences and compositions of their constituent amino acid monomers. So far, solid-phase peptide synthesis is the only technique that allows short peptide sequences to be prepared with excellent sequence control, but it requires extensive protection/deprotection steps and is difficult to scale up. A new strategy towards sequence control in the synthesis of polypeptides is introduced, based on the sequential addition of α-amino acid N-carboxyanhydrides (NCAs). The living ring-opening process is conducted to full conversion, and no purification or deprotection is needed before addition of a new amino acid. The length of every block is predefined by the NCA:initiator ratio in every step. This method yields polypeptides with a specific sequence and controlled molecular weights. A series of polypeptides with varying block sequences has been synthesized with the aim of identifying structure-property relationships. All of them are able to adopt secondary structures similar to natural polypeptides and display properties in the solid state and in solution that are characteristic of the primary structure. By design, the prepared polypeptides allow selective modification of individual block sequences, which has been exploited to introduce functionalities at defined positions along the polypeptide chain. Poly(ethylene glycol) (PEG) was chosen as the functionality, as it is known to enhance hydrophilicity and to yield thermoresponsive materials. After PEGylation, the hydrophilicity of the polypeptides is enhanced, and their thermal response in H2O has been studied. Noteworthy differences were found in the behavior of polypeptides with different sequences. Circular dichroism measurements confirmed that the α-helical conformation is stable over the examined temperature range (5–90 °C). It is concluded that the PEG units are mainly responsible for the changes in H-bonding interactions with H2O as temperature varies, and that the position of these functional units along the backbone is a factor of utmost importance for the resulting properties of the α-helical polypeptides.
Keywords: α-amino acid N-carboxyanhydrides, multiblock copolymers, poly(ethylene glycol), polypeptides, ring-opening polymerization, sequence control
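A minimal sketch of the design rule stated above (block length fixed by the NCA:initiator ratio at full conversion) is given below; the monomers, feed amounts, and residue masses are illustrative assumptions, not the compositions used in the study.

```python
# Sketch of the sequential-NCA design rule: at full conversion, each block's target
# degree of polymerization (DP) is set by the NCA:initiator ratio of that feed.
# Monomer names, feed amounts, and residue masses below are illustrative only.
initiator_mmol = 0.10

# (NCA monomer, mmol fed, residue molar mass in g/mol) for a hypothetical block sequence
feeds = [("Glu(OBn) NCA", 2.0, 219.2),
         ("Lys(Z) NCA",   1.0, 262.3),
         ("Glu(OBn) NCA", 2.0, 219.2)]

mn_total = 0.0
for name, mmol, residue_mass in feeds:
    dp = mmol / initiator_mmol          # DP = [NCA] / [initiator] at full conversion
    mn_total += dp * residue_mass
    print(f"{name}: target DP = {dp:.0f}")
print(f"Theoretical Mn of the block copolypeptide = {mn_total:.0f} g/mol")
```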
Procedia PDF Downloads 200
837 Evaluating the Efficacy of Tasquinimod in Covid-19
Authors: Raphael Udeh, Luis García De Guadiana Romualdo, Xenia Dolje-Gore
Abstract:
Background: The public health impact of COVID-19 is enormous: as of 25 March 2021, the global burden stood at over 123 million cases and over 2.7 million deaths worldwide. Rationale: Recent evidence points to calprotectin as a potential therapeutic target, showing that tasquinimod, a member of the quinoline-3-carboxamide family, is capable of blocking the interaction between calprotectin and TLR4, thereby preventing the cytokine release syndrome that heralds functional exhaustion in COVID-19. Early preclinical studies showed that tasquinimod inhibits tumor growth and prevents angiogenesis/cytokine storm. Phase I–III clinical studies in prostate cancer showed a good safety profile and good radiologic progression-free survival but no effect on overall survival. Hypothesis: Strategic efforts have been intensified globally to assess new therapeutic interventions for COVID-19 management, yet the clinical and antiviral efficacy of tasquinimod in COVID-19 remains to be explored. The primary objective of this trial will therefore be to evaluate the efficacy of tasquinimod in the treatment of adult patients with severe COVID-19 infection. I hypothesise that, among adults with COVID-19 infection, tasquinimod will reduce the severe respiratory distress associated with COVID-19 compared with placebo over a 28-day study period. Method: The setting is Europe. Design: a randomized, placebo-controlled, phase II, double-blinded trial. The trial lasts 28 days from randomization, with tasquinimod capsules given as 0.5 mg daily for the first fortnight and 1 mg daily for the second fortnight. The primary outcome is assessed using a six-point ordinal scale, alongside eight secondary outcomes. A total of 125 participants will be enrolled, with data collection at baseline and subsequent data points, and safety monitored via serological profile. Significance: This work could establish tasquinimod as an effective and safe therapeutic agent for COVID-19 by reducing severe respiratory distress and the related time to recovery and time on oxygen/admission. It will also drive future research, such as a larger multi-centre RCT.
Keywords: calprotectin, COVID-19, phase II trial, tasquinimod
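For illustration only, the sketch below builds a permuted-block randomization list for the 125 participants described above; the 1:1 allocation ratio, block size, and seed are assumptions not specified in the abstract.

```python
# Sketch of a permuted-block randomization list for a two-arm trial. The total of 125
# participants comes from the abstract; the 1:1 allocation and block size of 4 are assumed.
import random

def block_randomization(n_participants: int, block_size: int = 4, seed: int = 42) -> list:
    random.seed(seed)
    allocations = []
    while len(allocations) < n_participants:
        block = ["tasquinimod"] * (block_size // 2) + ["placebo"] * (block_size // 2)
        random.shuffle(block)           # permute each block independently
        allocations.extend(block)
    return allocations[:n_participants]

arms = block_randomization(125)
print(arms[:8], "| tasquinimod:", arms.count("tasquinimod"), "placebo:", arms.count("placebo"))
```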
Procedia PDF Downloads 195
836 Shaping of World-Class Delhi: Politics of Marginalization and Inclusion
Authors: Aparajita Santra
Abstract:
In the context of the government's vision of turning Delhi into a green, privatized and slum-free city with a world-class image on par with the global cities of the world, this paper investigates the processes and politics that went into defining spaces in the city and attributing an aesthetic image to it. The paper explores two cases that were forged primarily through the forces of one particular type of power relation: the modernist movement adopted by the Nehruvian government post-independence, and special periods such as the Emergency and the Commonwealth Games. The study of these cases helps in understanding the ambivalence embedded in the different rationales that the government and other powerful agencies adopted in order to build world-classness. Through the study, it becomes easier to discern how city spaces were reconfigured in the name of 'good governance'. In this process, it also became important to analyze the double nature of law, both as a protector of people's rights and as a threat to people. What was interesting to note through the study was that, in the process of nation building and creating an image for the city, the government's policies and programs were aimed mostly at the richer sections of society, while the poorer sections and lower-income groups kept getting marginalized, subdued, and pushed further away (these marginalized people were pushed away even geographically). The reconfiguration of city space and the attribution of an aesthetic character to it altered not only the way citizens perceived and engaged with these spaces but also the way they envisioned their place in the city. Ironically, it was found that every attempt to build facilities for the city's elite led in turn to an inevitable removal of the marginalized sections of society as a necessary step toward a clean, green and world-class city. The paper questions the claim made by the government of creating a just, equitable city and granting rights to all. It argues that, in the politics of redistribution of space, the city that has been designed is meant only for the aspirational middle class and the elite, who are ideally primed to live in world-class cities. Thus, the aim is to study city spaces, urban form, and the associated politics and power plays involved, and to understand whether segmented cities are being built in the name of creating sensible, inclusive cities.
Keywords: aesthetics, ambivalence, governmentality, power, world-class
Procedia PDF Downloads 117
835 Spanish Language Violence Corpus: An Analysis of Offensive Language in Twitter
Authors: Beatriz Botella-Gil, Patricio Martínez-Barco, Lea Canales
Abstract:
The Internet and ICTs are integral to and omnipresent in our daily lives. Technologies have changed the way we see the world and relate to it. The number of companies in the ICT sector increases every year, and there has also been an increase in the work that occurs online, from sending e-mails to the way companies promote themselves. In social life, ICTs have gained momentum: social networks are useful for keeping in contact with family or friends who live far away. This change in how we manage our relationships using electronic devices and social media has been experienced differently depending on a person's age. According to currently available data, people are increasingly connected to social media and other forms of online communication. It is therefore no surprise that violent content has also made its way into digital media. One important reason for this is the anonymity provided by social media, which fosters a sense of impunity. Moreover, it is not uncommon to find derogatory comments attacking a person's physical appearance, hobbies, or beliefs. This is why it is necessary to develop artificial intelligence tools that allow us to keep track of violent comments relating to violent events, so that this type of violent online behavior can be deterred. The objective of our research is to create a guide for detecting and recording violent messages. Our annotation guide begins with a study of the problem of violent messages: first, the characteristics a message should contain for it to be categorized as violent; second, the possibility of establishing different levels of aggressiveness. To download the corpus, we chose the social network Twitter for the ease of obtaining free messages. We chose two recent, highly visible violent cases that occurred in Spain, both of which received a high degree of social media coverage and user comments. Our corpus contains a total of 633 messages, manually tagged according to the characteristics we considered important, such as the verbs used, the presence of exclamations or insults, and the presence of negations. We consider it necessary to create word lists that appear in violent messages as indicators of violence, such as lists of negative verbs, insults, and negative phrases. As a final step, we will use machine learning systems to check the data obtained and the effectiveness of our guide.
Keywords: human language technologies, language modelling, offensive language detection, violent online content
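A minimal sketch of the lexicon-based flagging the annotation guide suggests is shown below; the word lists, scoring, and threshold are invented placeholders for illustration, not the study's actual resources.

```python
# Illustrative sketch: counting violence indicators (insults, negative verbs, negations,
# exclamations) in a tweet. The lexicons below are placeholders, not the study's lists.
import re

INSULTS = {"idiota", "imbécil", "escoria"}
NEGATIVE_VERBS = {"matar", "golpear", "destruir"}
NEGATIONS = {"no", "nunca", "jamás"}

def violence_score(tweet: str) -> int:
    tokens = re.findall(r"\w+", tweet.lower())
    score = sum(t in INSULTS for t in tokens)
    score += sum(t in NEGATIVE_VERBS for t in tokens)
    score += sum(t in NEGATIONS for t in tokens)
    score += tweet.count("!")           # exclamations as an aggressiveness cue
    return score

print(violence_score("¡No te soporto, idiota!"))  # counts insult + negation + exclamation
```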
Procedia PDF Downloads 131
834 Decent Work Agenda in the Philippines: A Capacity Assessment
Authors: Dianne Lyneth Alavado
Abstract:
At the turn of the millennium, development paradigms in the international scene revolved around one goal: the elimination of global poverty without compromising human rights. One measure that achieved high endorsement and visibility in the world of work is the Decent Work Agenda (DWA) championed by the United Nations' (UN) specialized agency for work, the International Labour Organization (ILO). The DWA has been thoroughly promoted and recommended as an ingredient of development planning and a poverty reduction strategy, particularly in developing countries such as the Philippines. The global imperative of economic growth is measurable not only by the numbers countries post in terms of an expanding economy but also by the development and realization of the full capacities of their people. Decent work (DW), as an outcome and not just a development approach, promises poverty eradication by providing work of both sufficient quantity and quality, accompanied by rights, representation, and protection. As a party to these international pacts, the Philippines is expected to heed the call towards a world free from poverty through well-endorsed measures such as the DWA, with the aid of multilateral and donor organizations such as the ILO. This study aims to assess the capacity and readiness of the Philippines to achieve the goals of the DWA. This is a qualitative study using a sociological and juridical lens in a desk analysis of existing Philippine laws, policies, and programs vis-à-vis the decent work indicators set forth by the ILO. Interviews with experts on the Philippine labor situation were conducted for further validation. The paper identifies gaps within the Philippine legal system and its collection of laws, acts, presidential decrees, department orders and other policy instruments aimed at achieving the goals of the DWA. Among the major findings of this paper are: the predisposition of Philippine labor laws towards the formal sector; the need for alternative solutions for the informal sector that veer away from the usual dole-outs and livelihood projects; the need for evaluation of policies and programs that are usually self-evaluated; the minimal reach of the labour inspectorate, which ensures decent work; and the lack of substantial penalties for non-compliance with labor laws. The paper concludes with policy implications and recommendations towards addressing the potholes on the road to decent work.
Keywords: decent work agenda, labor laws, millennium development goals, poverty eradication, sustainable development goal
Procedia PDF Downloads 274
833 Control of Biofilm Formation and Inorganic Particle Accumulation on Reverse Osmosis Membrane by Hypochlorite Washing
Authors: Masaki Ohno, Cervinia Manalo, Tetsuji Okuda, Satoshi Nakai, Wataru Nishijima
Abstract:
Reverse osmosis (RO) membranes have been widely used in desalination to purify water for drinking and other purposes. Although most RO membranes currently have no resistance to chlorine, chlorine-resistant membranes are being developed; direct chlorine treatment or chlorine washing will therefore be an option for preventing biofouling on chlorine-resistant membranes. Furthermore, if particle accumulation can be controlled by chlorine washing, expensive pretreatment for particle removal could be eliminated or simplified. The objective of this study was to determine the effective hypochlorite washing conditions required to control biofilm formation and inorganic particle accumulation on an RO membrane in a continuous flow channel with an RO membrane and spacer. In this study, direct chlorine washing was performed by soaking fouled RO membranes in hypochlorite solution, and fluorescence intensity was used to quantify the biofilm on the membrane surface. After 48 h of soaking the membranes in high-fouling-potential waters, the fluorescence intensity decreased from 470 to 0 under the following washing conditions: 10 mg/L chlorine concentration, a washing interval of 2 times/d, and a washing time of 30 min. The chlorine concentration required to control biofilm formation decreased as the chlorine concentration (0.5–10 mg/L), the washing interval (1–4 times/d), or the washing time (1–30 min) increased. For the sample solutions used in the study, a 10 mg/L chlorine concentration with a 2 times/d interval and a 5 min washing time was required for biofilm control. The optimum chlorine washing conditions obtained from the soaking experiments also proved applicable to controlling biofilm formation in continuous flow experiments. Moreover, chlorine washing employed to control biofilm in the presence of suspended particles resulted in lower amounts of organic (0.03 mg/cm2) and inorganic (0.14 mg/cm2) deposits on the membrane than for sample water without chlorine washing (0.14 mg/cm2 and 0.33 mg/cm2, respectively). The amount of biofilm formed was reduced by 79% by continuous washing with 10 mg/L free chlorine, and the amount of inorganic accumulation decreased by 58%, to levels similar to those for pure water with kaolin (0.17 mg/cm2) as feed water. These results confirmed that particle accumulation is accelerated by biofilm formation and that inhibiting biofilm growth can almost completely prevent further particle accumulation. In addition, effective hypochlorite washing conditions that control both biofilm formation and particle accumulation could be achieved.
Keywords: reverse osmosis, washing condition optimization, hypochlorous acid, biofouling control
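The reported reductions follow directly from the deposit masses given above; the short check below is arithmetic on the stated figures, not new data.

```python
# Quick check of the reductions reported in the abstract, using the stated deposit
# masses (mg/cm^2) with and without chlorine washing.
def reduction_percent(washed: float, unwashed: float) -> float:
    return (1 - washed / unwashed) * 100

print(f"Organic (biofilm) deposit reduction: {reduction_percent(0.03, 0.14):.0f}%")  # ~79%
print(f"Inorganic deposit reduction:         {reduction_percent(0.14, 0.33):.0f}%")  # ~58%
```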
Procedia PDF Downloads 351
832 The Influence of Activity Selection and Travel Distance on Forest Recreation Policies
Authors: Mark Morgan, Christine Li, Shuangyu Xu, Jenny McCarty
Abstract:
The National Wild and Scenic Rivers System was created by the U.S. Congress in 1968 (Public Law 90-542; 16 U.S.C. 1271 et seq.) to preserve outstanding natural, cultural, and recreational values of some U.S. rivers in a free-flowing condition for the enjoyment of present and future generations. This Act is notable for safeguarding the special character of these rivers while supporting management action that encourages public participation in co-creating river protection goals and strategies. This is not an easy task. To meet the challenges of modern ecosystem management, federal resource agencies must address many legal, environmental, economic, political, and social issues. The U.S. Forest Service manages a 44-mile section of the Eleven Point National Scenic River (EPR) in southern Missouri, mainly for outdoor recreation purposes. About half of the acreage is in private lands, while the remainder flows through the Mark Twain National Forest. Private land along the river is managed by scenic easements to ensure protection of scenic values and natural resources, without public access. A portion of the EPR lies adjacent to a 16,500-acre tract known as the Irish Wilderness. The spring-fed river has steep bluffs, deep pools, clear water, and a slow current, making it an ideal setting for outdoor enthusiasts. A 10-month visitor study was conducted at five access points along the EPR during 2019 so the U.S. Forest Service could update its river management plan. A mail-back survey was administered to 560 on-site visitors, yielding a response rate of 53%. Although different types of visitors use the EPR, boating and fishing were the predominant forms of outdoor recreation. Some river use came from locals, but other visitors came from farther away. Formulating unbiased policies for outdoor recreation is difficult because managers must assign relative values to recreational activities and travel distance. Because policymaking is a subjective process, management decisions can affect user groups in different ways (e.g., boaters vs. fishers; proximate vs. distal visitors), as seen through a GIS analysis.
Keywords: activity selection, forest recreation, policy, travel distance
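As a hedged sketch of one GIS-style step in this kind of analysis, great-circle travel distance from a visitor's home location to a river access point can be computed as below; the coordinates are placeholders, not actual study locations.

```python
# Sketch: great-circle (haversine) distance from a visitor's home ZIP-code centroid to a
# river access point. Both coordinate pairs below are hypothetical placeholders.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

visitor_home = (38.95, -92.33)   # hypothetical home ZIP centroid
access_point = (36.95, -91.25)   # hypothetical EPR access point
print(f"Travel distance = {haversine_km(*visitor_home, *access_point):.0f} km")
```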
Procedia PDF Downloads 140
831 The Effects of Ellagic Acid on Rat Lungs Induced Tobacco Smoke
Authors: Nalan Kaya, Gonca Ozan, Elif Erdem, Neriman Colakoglu, Enver Ozan
Abstract:
The toxic effects of tobacco smoke exposure have been demonstrated in numerous studies. Ellagic acid (EA) (2,3,7,8-tetrahydroxy[1]-benzopyranol[5,4,3-cde]benzopyran-5,10-dione), a natural phenolic lactone compound, is found in various plant species including pomegranate, grape, strawberries, blackberries and raspberries. Like other effective antioxidants, EA can safely interact with free radicals and reduces oxidative stress through the phenolic ring and hydroxyl components in its structure. The aim of the present study was to examine the protective effects of ellagic acid against oxidative damage in the lung tissue of rats exposed to tobacco smoke. Twenty-four adult male (8-week-old) Sprague-Dawley rats were divided randomly into 4 equal groups: group I (control), group II (tobacco smoke), group III (tobacco smoke + corn oil) and group IV (tobacco smoke + ellagic acid). The rats in groups II, III and IV were exposed to tobacco smoke for 1 hour twice a day for 12 weeks. In addition to tobacco smoke exposure, 12 mg/kg ellagic acid (dissolved in corn oil) was administered to the rats in group IV by oral gavage. An equal amount of the corn oil used to dissolve the ellagic acid was administered by oral gavage to the rats in group III. At the end of the experimental period, the rats were decapitated, and lung tissues and blood samples were taken. The lung slides were stained by the H&E and Masson's trichrome methods, and galectin-3 staining was also applied. Biochemical analyses were performed. Histological examination of the tobacco smoke group revealed vascular congestion and inflammatory cell infiltration in the pulmonary interstitium, thickening of the interalveolar septa, cytoplasmic vacuolation in some macrophages, and galectin-3-positive cells. In addition to these findings, hemorrhage in the pulmonary interstitium and bronchial lumen was detected in the tobacco smoke + corn oil group. Reduced vascular congestion and hemorrhage in the pulmonary interstitium and only occasional thickening of the interalveolar septa were observed in the tobacco smoke + EA group. Compared with group I, the GSH level in group II was significantly decreased and the MDA level significantly increased, whereas the GSH level in group IV was higher and the MDA level lower than in group II. The results indicate that ellagic acid could protect lung tissue from the harmful effects of tobacco smoke.
Keywords: ellagic acid, lung, rat, tobacco smoke
Procedia PDF Downloads 214