Search results for: alternative energy
1471 Overcoming the Challenges of Subjective Truths in the Post-Truth Age Through a Critical-Ethical English Pedagogy
Authors: Farah Vierra
Abstract:
Following the 2016 US presidential election and the Brexit referendum, the concept of "post-truth", defined by Oxford Dictionaries as "relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief", came into prominent use in public, political and educational circles. What this essentially entails is that, in this age, individuals are increasingly confronted with subjective perpetuations of truth in their discourse spheres, informed by beliefs and opinions rather than by any coherence with the reality of those whom these truth claims concern. In principle, a subjective delineation of truth is progressive and liberating, especially considering its potential to provide marginalised groups in the diverse communities of our globalised world with the voice to articulate truths that are representative of themselves and their experiences. However, any form of human flourishing promised here collapses as the tenets of subjective truth, initially in place to liberate, have been distorted through post-truth to allow individuals to purport selective and individualistic truth claims that further oppress and silence certain groups within society without due accountability. Evidence of this is prevalent in the conception of terms such as "alternative facts" and "fake news", which we observe individuals declare when their problematic truth claims are questioned.
Considering the pervasiveness of post-truth and the ethical issues that accompany it, educators and scholars alike have increasingly noted the need to adapt educational practices and pedagogies to account for the diminishing objectivity of truth in the twenty-first century, especially because students, as digital natives, find themselves in the firing line of post-truth, engulfed in digital societies that proliferate post-truth through the surge of truth claims allowed on various media sites. In an attempt to equip students with the vital skills to navigate the post-truth age and oppose its proliferation of social injustices, English educators find themselves having to devise instructional strategies that not only teach students how to critically and ethically scrutinise truth claims but also teach them to mediate the subjectivity of truth in a manner that does not undermine the voices of diverse communities. In hopes of providing educators with a roadmap to do so, this paper will first examine the challenges that confront students as a result of post-truth. Following this, the paper will elucidate the role English education can play in helping students overcome the complex ramifications of post-truth. Scholars have consistently touted the affordances of literary texts in providing students with imagined spaces to explore societal issues through a critical discernment of language and an ethical engagement with narrative developments. Therefore, this paper will explain and demonstrate how literary texts, when used alongside a critical-ethical post-truth pedagogy that equips students with interpretive strategies informed by literary traditions such as literary and ethical criticism, can be effective in helping students develop the pertinent skills to comprehensively examine truth claims and overcome the challenges of the post-truth age.
Keywords: post-truth, pedagogy, ethics, English, education
Procedia PDF Downloads 71
1470 Photocatalytic Hydrogen Production: Effect of Metal Particle Size and Their Electronic/Optical Properties on the Reaction
Authors: Hicham Idriss
Abstract:
Hydrogen production from water is one of the most promising methods to secure renewable sources or vectors of energy for societies in general and for chemical industries in particular. At present, over 90% of the total amount of hydrogen produced in the world is made from non-renewable fossil fuels (via methane reforming). There are many methods for producing hydrogen from water, and these include reducible oxide materials (solar thermal production), combined PV/electrolysis, artificial photosynthesis and photocatalysis. The most promising of these processes is the one relying on photocatalysis, yet serious challenges have so far hindered its success. In order to make this process viable, considerable improvement of the photon conversion is needed. Among the key studies our group has conducted in recent years are those focusing on synergism between semiconductor phases, photonic band gap materials, p-n junctions, plasmonic resonance responses, charge transfer to metal cations, metal dispersion and band gap engineering. In this work, results related to the anatase-to-rutile phase transformation of TiO2 (synergism), to Au and Ag dispersion (electron trapping and hydrogen-hydrogen recombination centers), and to their plasmon resonance response (visible light conversion) are presented and discussed. It is found, for example, that synergism between the two common phases of TiO2 (anatase and rutile) is sensitive to the initial particle size. It is also found, in agreement with previous results, that the rate is very sensitive to the amount of metal (at similar particle size) on the surface, unlike the case of thermal heterogeneous catalysis.
Keywords: photo-catalysis, hydrogen production, water splitting, plasmonic
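As a rough back-of-the-envelope check on why plasmonic Au and Ag responses matter for visible light conversion, the sketch below compares photon energies with the commonly cited 3.2 eV band gap of anatase TiO2 (an assumed literature value, not taken from this abstract): a plasmon band near 520 nm lies well below the gap, so bare anatase cannot harvest those photons on its own.

```python
# Photon energy at a given wavelength vs. an assumed 3.2 eV anatase TiO2 band gap
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electronvolt

def photon_energy_ev(wavelength_nm):
    """Energy of a single photon, in eV, for a wavelength given in nm."""
    return H * C / (wavelength_nm * 1e-9) / EV

anatase_gap_ev = 3.2  # commonly cited value (assumption)
for nm in (380, 520):
    print(nm, round(photon_energy_ev(nm), 2), photon_energy_ev(nm) >= anatase_gap_ev)
```

Under these assumptions, only near-UV photons (~380 nm, ~3.26 eV) exceed the anatase gap, while a 520 nm plasmon-band photon carries only ~2.38 eV.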
Procedia PDF Downloads 251
1469 Experimental Research of Smoke Impact on the Performance of Cylindrical Eight Channel Cyclone
Authors: Pranas Baltrėnas, Dainius Paliulis
Abstract:
Cyclones are widely used for separating particles from gas in energy production facilities. The efficiency of conventional centrifugal air cleaning devices ranges from 85 to 90%, but a weakness of many cyclones is their low collection efficiency for particles less than 10 μm in diameter. Many factors affect cyclone efficiency: humidity, temperature, gas (air) composition, airflow velocity, etc. Many researchers have evaluated only the effect of the origin and size of particulate matter (PM) on cyclone efficiency; the effect of gas (air) composition and temperature still demands further study. Complex experimental research on the efficiency of a cylindrical eight-channel system with adjustable half-rings for removing fine dispersive particles (< 20 μm) was carried out. The impact of gaseous smoke components on the removal of wood ashes was analyzed. Gaseous components present in the smoke mixture with a dynamic viscosity lower than that of air at the same temperature decrease the d50 value, simultaneously increasing the overall particulate matter removal efficiency of the cyclone; this effect is attributed to CO2 and CO, while O2 and NO have the opposite effect. Air temperature influences the d50 value: an increase in air temperature yields an increase in d50, i.e. the overall particulate matter removal efficiency declines, the reason being the increasing dynamic viscosity of air. At 120 °C the d50 value is approximately 11.8% higher than at an air temperature of 20 °C. With an increase in smoke (gas) temperature from 20 °C to 50 °C, the aerodynamic resistance drops from 1605 to 1380 Pa in a 1-tier eight-channel cylindrical cyclone, from 1660 to 1420 Pa in a 2-tier eight-channel cylindrical cyclone, and from 1715 to 1450 Pa in a 3-tier eight-channel cylindrical cyclone. The reason for the decline in aerodynamic resistance is the declining gas density.
The aim of the paper is to analyze the impact of gaseous smoke components on the eight-channel cyclone with tangential inlet.
Keywords: cyclone, adjustable half-rings, particulate matter, efficiency, gaseous compounds, smoke
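The reported temperature sensitivity of d50 can be rationalised with a textbook cut-diameter model. The sketch below combines the classic Lapple cut-size formula with Sutherland's law for air viscosity; the cyclone geometry and operating values are hypothetical placeholders, not the authors' eight-channel design, so only the viscosity-driven ratio between temperatures is meaningful.

```python
import math

def air_viscosity(t_kelvin):
    """Dynamic viscosity of air (Pa*s) via Sutherland's law; reference constants assumed."""
    mu0, t0, s = 1.716e-5, 273.15, 110.4
    return mu0 * (t_kelvin / t0) ** 1.5 * (t0 + s) / (t_kelvin + s)

def lapple_d50(mu, inlet_width, n_turns, rho_p, v_inlet):
    """Lapple cut diameter (m): particle size collected with 50% efficiency."""
    return math.sqrt(9.0 * mu * inlet_width / (2.0 * math.pi * n_turns * rho_p * v_inlet))

# Hypothetical geometry/operating values, for illustration only:
# inlet width 0.05 m, 5 effective turns, ash density 2500 kg/m^3, inlet velocity 12 m/s
W, N, RHO_P, V = 0.05, 5, 2500.0, 12.0

d50_cold = lapple_d50(air_viscosity(293.15), W, N, RHO_P, V)   # 20 C
d50_hot = lapple_d50(air_viscosity(393.15), W, N, RHO_P, V)    # 120 C
print(d50_hot / d50_cold)  # > 1: hotter, more viscous gas raises d50 and lowers efficiency
```

Because d50 scales with the square root of viscosity, these assumptions predict roughly a 12% increase in d50 between 20 °C and 120 °C, broadly in line with the 11.8% reported above.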
Procedia PDF Downloads 288
1468 A Visual Analytics Tool for the Structural Health Monitoring of an Aircraft Panel
Authors: F. M. Pisano, M. Ciminello
Abstract:
Aerospace, mechanical, and civil engineering infrastructures can benefit from damage detection and identification strategies in terms of maintenance cost reduction and operational life improvement, as well as for safety purposes. The challenge is to detect so-called "barely visible impact damage" (BVID), due to low/medium energy impacts, which can progressively compromise structural integrity. The occurrence of any local change in material properties that can degrade the structure's performance is monitored using Structural Health Monitoring (SHM) systems, which compare the structure's states before and after damage occurs. SHM searches for any "anomalous" response collected by means of sensor networks and then analyzed using appropriate algorithms. Independently of the specific analysis approach adopted for structural damage detection and localization, textual reports, tables and graphs describing possible outlier coordinates and damage severity are usually provided as artifacts to be elaborated for information extraction about the current health conditions of the structure under investigation. Visual Analytics can support the processing of monitored measurements, offering data navigation and exploration tools that leverage the native human capability of understanding images faster than texts and tables. Herein, the enrichment of an SHM system by integration of a Visual Analytics component is investigated. Analytical dashboards have been created by combining worksheets, so that a useful Visual Analytics tool is provided to structural analysts for exploring the structure's health conditions examined by a Principal Component Analysis based algorithm.
Keywords: interactive dashboards, optical fibers, structural health monitoring, visual analytics
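The PCA-based detection step mentioned above can be illustrated with a minimal sketch: fit a principal subspace on healthy-state sensor readings, then flag any reading whose residual (the Q or SPE statistic) exceeds a baseline control limit. The synthetic data, the number of retained components, and the 99th-percentile limit are all illustrative assumptions, not the tool's actual algorithm or sensor layout.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic baseline: 8 strain sensors driven by 3 latent load modes (illustrative only)
mixing = rng.normal(size=(3, 8))
healthy = rng.normal(size=(200, 3)) @ mixing + 0.05 * rng.normal(size=(200, 8))

# "Train" on the healthy state: centre the data, keep the top-3 principal directions
mean = healthy.mean(axis=0)
_, _, vt = np.linalg.svd(healthy - mean, full_matrices=False)
components = vt[:3]

def q_statistic(x):
    """Squared prediction error (Q/SPE) of one reading w.r.t. the healthy subspace."""
    r = x - mean
    return float(np.sum((r - components.T @ (components @ r)) ** 2))

# Control limit: 99th percentile of Q over the baseline readings (a common heuristic)
threshold = np.percentile([q_statistic(x) for x in healthy], 99)

# A localised shift on one sensor (a crude stand-in for BVID) leaves the healthy subspace
damaged = healthy[0].copy()
damaged[5] += 5.0
print(q_statistic(damaged) > threshold)
```

A dashboard layer would then map each flagged reading back to sensor coordinates and severity, which is the kind of artifact the Visual Analytics component is meant to make explorable.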
Procedia PDF Downloads 123
1467 The Potential of Sown Pastures as Feedstock for Biofuels in Brazil
Authors: Danilo G. De Quadros
Abstract:
Biofuels are a priority in the renewable energy agenda. The utilization of tropical grasses for ethanol production is a real opportunity for Brazil to reach world leadership in biofuels production, because the country has 100 million hectares of sown pastures, which represent 20% of all land and 80% of agricultural areas. Nowadays, tropical grasses are basically used to raise livestock. The results obtained in this research could bring tremendous advances not only to national technology and the economy but also to social and environmental aspects. Thus, the objective of this work was to estimate, through well-established international models, the potential of biofuel production using sown tropical pastures as feedstock and to compare the results with sugarcane ethanol, considering the state of the art of conversion technology and its advantages and limiting factors. Data from the national and international literature on forage yield and biochemical conversion yield were used. Several scenarios were studied to evaluate potential advantages and limitations for cellulosic ethanol production, since non-food feedstocks appeal to conversion strategies, covering harvest, densification, logistics, environmental impacts (carbon and water cycles, nutrient recycling and biodiversity), and social aspects. If Brazil used only 1% of its sown pastures for ethanol production by the biochemical pathway, with an average dry matter yield of 15 metric tons per hectare per year (there are results of up to 40 tons), the result would be 721 billion liters annually, which represents 10 times more than the sugarcane ethanol projected by the Government for 2030. However, more research is necessary to take the results to commercial scale at competitive costs, considering the many strategies and methods applied in ethanol production from cellulosic feedstock.
Keywords: biofuels, biochemical pathway, cellulosic ethanol, sustainability
Procedia PDF Downloads 259
1466 Experimental and Modelling Performances of a Sustainable Integrated System of Conditioning for Bee-Pollen
Authors: Andrés Durán, Brian Castellanos, Marta Quicazán, Carlos Zuluaga-Domínguez
Abstract:
Bee-pollen is an apiculture-derived food product with a growing appreciation among consumers, given its remarkable nutritional and functional composition, in particular protein (24%), dietary fiber (15%), phenols (15 – 20 GAE/g) and carotenoids (600 – 900 µg/g). These properties are given by the geographical and climatic characteristics of the region where it is collected. Several countries are recognized for their pollen production, e.g. China, the United States, Japan and Spain, among others. Beekeepers use traps at the entrance of the hive, where bee-pollen is collected. After the removal of foreign particles and drying, the product is ready to be marketed. However, in countries located along the equator, the absence of seasons and a constant tropical climate throughout the year favor more rapid spoilage of foods with elevated water activity. The climatic conditions also trigger the proliferation of microorganisms and insects. This, added to the fact that beekeepers usually do not have adequate processing systems for bee-pollen, leads to deficiencies in the quality and safety of the product. In contrast, the Andean region of South America, lying on the equator, typically has a high production of bee-pollen of up to 36 kg/year/hive, four times higher than in countries with marked seasons. This region is also located at altitudes above 2500 meters above sea level and receives extreme solar ultraviolet radiation all year long. As a defense mechanism against radiation, plants produce more secondary metabolites acting as antioxidant agents; hence, plant products such as bee-pollen contain remarkably more phenolics and carotenoids than those collected in other places. Considering this, the improvement of bee-pollen processing facilities by technical modifications and the implementation of an integrated cleaning and drying system for the product in an apiary in the area was proposed.
The beehives were modified through the installation of alternative bee-pollen traps to avoid sources of contamination. The processing facility was modified according to Good Manufacturing Practices, implementing the combined use of a cabin dryer with temperature control and forced airflow and a greenhouse-type solar drying system. Additionally, for the separation of impurities, a cyclone-type system was implemented, complementary to screening equipment. With these modifications, a decrease in the content of impurities and in the microbiological load of bee-pollen was seen from the first stages, principally a reduction in the presence of molds and yeasts and in the number of impurities of animal origin. The use of the greenhouse solar dryer integrated with the cabin dryer allowed the processing of larger quantities of product with shorter waiting times in storage, reaching a moisture content of about 6% and a water activity lower than 0.6, appropriate for the conservation of bee-pollen. Additionally, the contents of functional and nutritional compounds were not affected; an increase of up to 25% in phenol content and a non-significant decrease in carotenoid content and antioxidant activity were even observed.
Keywords: beekeeping, drying, food processing, food safety
Procedia PDF Downloads 103
1465 An Efficient Process Analysis and Control Method for Tire Mixing Operation
Authors: Hwang Ho Kim, Do Gyun Kim, Jin Young Choi, Sang Chul Park
Abstract:
Since the tire production process is very complicated, company-wide management of it is very difficult, necessitating considerable amounts of capital and labor. Thus, productivity should be enhanced and kept competitive by developing and applying effective production plans. Among the major processes for tire manufacturing, consisting of mixing, component preparation, building and curing, the mixing process is an essential and important step because the main component of the tire, called the compound, is formed at this step. The compound, a rubber synthesis with various characteristics, plays its own role required for a tire as a finished product. Meanwhile, scheduling the tire mixing process is similar to the flexible job shop scheduling problem (FJSSP) because various kinds of compounds have their unique orders of operations, and a set of alternative machines can be used to process each operation. In addition, the setup time required for different operations may differ due to the alteration of additives. In other words, each operation of the mixing process requires a different setup time depending on the previous one, and this kind of feature, called sequence dependent setup time (SDST), is a very important issue in traditional scheduling problems such as flexible job shop scheduling problems. However, despite its importance, there exist few research works dealing with the tire mixing process. Thus, in this paper, we consider the scheduling problem for the tire mixing process and suggest an efficient particle swarm optimization (PSO) algorithm to minimize the makespan for completing all the required jobs belonging to the process. Specifically, we design a particle encoding scheme for the considered scheduling problem, including a processing sequence for compounds and machine allocation information for each job operation, and a method for generating a tire mixing schedule from a given particle.
At each iteration, the coordinates and velocities of the particles are updated, and the current solution is compared with the new solution. This procedure is repeated until a stopping condition is satisfied. The performance of the proposed algorithm is validated through a numerical experiment using some small-sized problem instances representing the tire mixing process. Furthermore, we compare the solution of the proposed algorithm with that obtained by solving a mixed integer linear programming (MILP) model developed in previous research work. As a performance measure, we define an error rate which evaluates the difference between the two solutions. As a result, we show that the PSO algorithm proposed in this paper outperforms the MILP model with respect to effectiveness and efficiency. As a direction for future work, we plan to consider scheduling problems in other processes such as building and curing. We can also extend our current work by considering other performance measures, such as weighted makespan, or processing times affected by aging or learning effects.
Keywords: compound, error rate, flexible job shop scheduling problem, makespan, particle encoding scheme, particle swarm optimization, sequence dependent setup time, tire mixing process
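The encode-decode-update loop described above can be sketched on a toy instance: each particle is a continuous vector whose first half holds operation-sequencing keys and second half holds machine-choice keys, decoded into a feasible schedule whose makespan is the fitness. This is a generic global-best PSO for FJSSP, not the authors' exact encoding; the three-compound instance and the PSO parameters are assumptions, and sequence-dependent setup times are omitted for brevity.

```python
import random

random.seed(1)

# Toy FJSSP instance (hypothetical data): each job is a list of operations,
# each operation maps eligible machine -> processing time.
JOBS = [
    [{0: 4, 1: 6}, {1: 3, 2: 5}],  # compound A
    [{0: 5, 2: 4}, {0: 6, 1: 4}],  # compound B
    [{1: 5, 2: 3}, {0: 4, 2: 6}],  # compound C
]
OPS = [(j, o) for j, job in enumerate(JOBS) for o in range(len(job))]
DIM = 2 * len(OPS)  # first half: sequencing keys, second half: machine choices

def decode(pos):
    """Turn a continuous particle into a feasible schedule; return its makespan."""
    keys, mach_keys = pos[:len(OPS)], pos[len(OPS):]
    next_op = [0] * len(JOBS)        # next unscheduled operation of each job
    job_ready = [0.0] * len(JOBS)    # finish time of each job's last operation
    mach_ready = {}                  # finish time of each machine's last operation
    for _ in range(len(OPS)):
        # among each job's next operation, schedule the one with the smallest key
        cand = [j for j in range(len(JOBS)) if next_op[j] < len(JOBS[j])]
        j = min(cand, key=lambda j: keys[OPS.index((j, next_op[j]))])
        op = JOBS[j][next_op[j]]
        machines = sorted(op)
        k = OPS.index((j, next_op[j]))
        m = machines[min(int(mach_keys[k] * len(machines)), len(machines) - 1)]
        start = max(job_ready[j], mach_ready.get(m, 0.0))
        job_ready[j] = start + op[m]
        mach_ready[m] = job_ready[j]
        next_op[j] += 1
    return max(job_ready)

def pso(n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5):
    """Standard global-best PSO over the continuous encoding; returns best makespan."""
    X = [[random.random() for _ in range(DIM)] for _ in range(n_particles)]
    V = [[0.0] * DIM for _ in range(n_particles)]
    P, Pf = [x[:] for x in X], [decode(x) for x in X]  # personal bests
    g = min(range(n_particles), key=lambda i: Pf[i])
    gbest, gf = P[g][:], Pf[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(DIM):
                V[i][d] = (w * V[i][d]
                           + c1 * random.random() * (P[i][d] - X[i][d])
                           + c2 * random.random() * (gbest[d] - X[i][d]))
                X[i][d] = min(1.0, max(0.0, X[i][d] + V[i][d]))
            f = decode(X[i])
            if f < Pf[i]:
                P[i], Pf[i] = X[i][:], f
                if f < gf:
                    gbest, gf = X[i][:], f
    return gf

print(pso())  # best makespan found on the toy instance
```

Handling SDST would only change the `start` computation in `decode`, adding a machine-specific setup term that depends on the previously scheduled compound.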
Procedia PDF Downloads 265
1464 Green Synthesis and Characterisation of Gold Nanoparticles from the Stem Bark and Leaves of Khaya Senegalensis and Its Cytotoxicity on MCF7 Cell Lines
Authors: Stephen Daniel Iduh, Evans Chidi Egwin, Oluwatosin Kudirat Shittu
Abstract:
The development of reliable and eco-friendly processes for metallic nanoparticles is an important step in the field of nanotechnology for biomedical application. To achieve this, the use of natural sources such as biological systems becomes essential. In the present work, the extracellular biosynthesis of gold nanoparticles using aqueous leaf and stem bark extracts of K. senegalensis was attempted. The gold nanoparticles produced were characterized using high-resolution scanning electron microscopy, ultraviolet-visible spectroscopy, a Zetasizer Nano, energy-dispersive X-ray (EDAX) spectroscopy and Fourier transform infrared (FTIR) spectroscopy. The cytotoxicity of the synthesized gold nanoparticles on the MCF-7 cell line was evaluated using the MTT assay. The results showed a rapid development of nano-sized and shaped particles within 5 minutes of reaction, with surface plasmon resonance at 520 and 525 nm, respectively. An average particle size of 20-90 nm was confirmed. The amount of the extracts determines the core size of the AuNPs: the core size decreases as the amount of extract increases, causing a shift of the surface plasmon resonance band. FTIR confirms the presence of biomolecules serving as reducing and capping agents on the synthesised gold nanoparticles. The MTT assay shows a significant, concentration-dependent effect of the gold nanoparticles. This environment-friendly method of biological gold nanoparticle synthesis has potential and can be directly applied in cancer therapy.
Keywords: biosynthesis, gold nanoparticles, characterization, calotropis procera, cytotoxicity
Procedia PDF Downloads 490
1463 Exploring 1,2,4-Triazine-3(2H)-One Derivatives as Anticancer Agents for Breast Cancer: A QSAR, Molecular Docking, ADMET, and Molecular Dynamics
Authors: Said Belaaouad
Abstract:
This study aimed to explore the quantitative structure-activity relationship (QSAR) of 1,2,4-Triazine-3(2H)-one derivatives as potential anticancer agents against breast cancer. The electronic descriptors were obtained using the Density Functional Theory (DFT) method, and a multiple linear regression technique was employed to construct the QSAR model. The model exhibited favorable statistical parameters, including R2=0.849, R2adj=0.656, MSE=0.056, R2test=0.710, and Q2cv=0.542, indicating its reliability. Among the descriptors analyzed, absolute electronegativity (χ), total energy (TE), number of hydrogen bond donors (NHD), water solubility (LogS), and shape coefficient (I) were identified as influential factors. Furthermore, leveraging the validated QSAR model, new derivatives of 1,2,4-Triazine-3(2H)-one were designed, and their activity and pharmacokinetic properties were estimated. Subsequently, molecular docking and molecular dynamics (MD) simulations were employed to assess the binding affinity of the designed molecules. The tubulin colchicine binding site, which plays a crucial role in cancer treatment, was chosen as the target protein. Through a simulation trajectory spanning 100 ns, the binding affinity was calculated using the MMPBSA script. As a result, fourteen novel tubulin-colchicine inhibitors with promising pharmacokinetic characteristics were identified. Overall, this study provides valuable insights into the QSAR of 1,2,4-Triazine-3(2H)-one derivatives as potential anticancer agents, along with the design of new compounds and their assessment through molecular docking and dynamics simulations targeting the tubulin-colchicine binding site.
Keywords: QSAR, molecular docking, ADMET, 1,2,4-triazin-3(2H)-ones, breast cancer, anticancer, molecular dynamic simulations, MMPBSA calculation
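The multiple-linear-regression and internal-validation steps above can be sketched on synthetic data: fit an intercept-plus-five-descriptor model by least squares, then report the training R² and a leave-one-out Q². The descriptor values and the response are simulated for illustration only; nothing here reproduces the study's dataset, and only the descriptor names mirror those identified above.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic training set (illustrative only): 20 compounds x 5 descriptors,
# columns standing in for [chi, TE, NHD, LogS, I]
X = rng.normal(size=(20, 5))
true_coef = np.array([0.8, -0.5, 0.3, 0.6, -0.2])  # invented ground truth
y = X @ true_coef + 0.1 * rng.normal(size=20)      # pIC50-like response + noise

def fit(X, y):
    """Least-squares fit of an intercept + linear-descriptor model."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, X):
    return np.column_stack([np.ones(len(X)), X]) @ coef

def r2(y, yhat):
    return 1.0 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

coef = fit(X, y)
print(round(r2(y, predict(coef, X)), 3))  # training R^2

# Leave-one-out cross-validated Q^2: refit without compound i, predict it, accumulate PRESS
press = sum((y[i] - predict(fit(np.delete(X, i, 0), np.delete(y, i)),
                            X[i:i + 1])[0]) ** 2 for i in range(len(X)))
q2 = 1.0 - press / np.sum((y - y.mean()) ** 2)
print(round(q2, 3))
```

The gap between R² and Q²cv (0.849 vs 0.542 in the study) is exactly what this kind of leave-one-out check is meant to expose: fit quality on the training set versus predictive ability on held-out compounds.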
Procedia PDF Downloads 94
1462 Innovative Technologies for Aeration and Feeding of Fish in Aquaculture with Minimal Impact on the Environment
Authors: Vasile Caunii, Andreea D. Serban, Mihaela Ivancia
Abstract:
The paper presents a new approach, in terms of the circular economy, to technologies for the feeding and aeration of water basins and accumulations for fish farming and aquaculture. Because fish is and will remain one of the main foods on the planet, the use of bio-eco-technologies is a priority for all producers. The technologies proposed in the paper aim to reduce by a substantial percentage the operating costs of ponds and water accumulations, using non-polluting technologies with minimal impact on the environment. The paper proposes two innovative, intelligent and fully automated systems that share a common, completely eco-friendly platform. One system is intended to aerate the water of the fish pond, and the second is intended to feed the fish by dispersing an optimal amount of fodder depending on population size, age and habits. Both systems use a floating platform and regenerative energy sources; they are equipped with intelligent and innovative subsystems and, in addition to fully automated operation, significantly reduce the costs of aerating water accumulations (natural or artificial) and feeding fish. The intelligent feeding system additionally optimizes the amount of food, thus preventing water pollution and the development of bacteria and other microorganisms, while reducing operating costs. The advantages of the systems are: they increase the yield of fish production; they are green installations with zero pollutant emissions; they can be placed anywhere on the water surface, depending on the user's needs; they can operate autonomously or be remotely controlled; in case of a component failure, the system provides the operator with accurate data on the issue, significantly reducing maintenance costs; and they transmit data about the water's physical and chemical parameters.
Keywords: bio-eco-technologies, economy, environment, fish
Procedia PDF Downloads 148
1461 Evaluation: Developing an Appropriate Survey Instrument for E-Learning
Authors: Brenda Ravenscroft, Ulemu Luhanga, Bev King
Abstract:
A comprehensive evaluation of online learning needs to include a blend of educational design, technology use, and online instructional practices that integrate technology appropriately for developing and delivering quality online courses. Research shows that classroom-based evaluation tools do not adequately capture the dynamic relationships between content, pedagogy, and technology in online courses. Furthermore, studies suggest that using classroom evaluations for online courses yields lower than normal scores for instructors and may affect faculty negatively in terms of administrative decisions. In 2014, the Faculty of Arts and Science at Queen's University responded to this evidence by seeking an alternative to the university-mandated evaluation tool, which is designed for classroom learning. The Faculty is deeply engaged in e-learning, offering a large variety of online courses and programs in the sciences, social sciences, humanities and arts. This paper describes the process by which a new student survey instrument for online courses was developed and piloted, the methods used to analyze the data, and the ways in which the instrument was subsequently adapted based on the results. It concludes with a critical reflection on the challenges of evaluating e-learning. The Student Evaluation of Online Teaching Effectiveness (SEOTE), developed by Arthur W. Bangert in 2004 to assess constructivist-compatible online teaching practices, provided the starting point. Modifications were made to allow the instrument to serve the two functions required by the university: student survey results provide the instructor with feedback to enhance their teaching, and also provide the institution with evidence of teaching quality in personnel processes.
Changes were therefore made to the SEOTE to distinguish more clearly between evaluation of the instructor's teaching and evaluation of the course design, since, in the online environment, the instructor is not necessarily the course designer. After the first pilot phase, involving 35 courses, the results were analyzed using Stobart's validity framework as a guide. This process included statistical analyses of the data to test for reliability and validity, student and instructor focus groups to ascertain the tool's usefulness in terms of the feedback it provided, and an assessment of the utility of the results by the Faculty's e-learning unit responsible for supporting online course design. A set of recommendations led to further modifications to the survey instrument prior to a second pilot phase involving 19 courses. Following the second pilot, statistical analyses were repeated, and more focus groups were held, this time involving deans and other decision makers, to determine the usefulness of the survey results in personnel processes. As a result of this inclusive process and robust analysis, the modified SEOTE instrument is currently being considered for adoption as the standard evaluation tool for all online courses at the university. Audience members at this presentation will be stimulated to consider factors that differentiate effective evaluation of online courses from that of classroom-based teaching. They will gain insight into strategies for introducing a new evaluation tool in a unionized institutional environment, and into methodologies for evaluating the tool itself.
Keywords: evaluation, online courses, student survey, teaching effectiveness
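One standard statistic for the kind of reliability testing mentioned above is Cronbach's alpha over the survey items; whether the team used exactly this statistic is an assumption, and the Likert responses below are toy data for illustration only.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) matrix of Likert-scale scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                               # number of items
    item_vars = scores.var(axis=0, ddof=1).sum()      # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)        # variance of respondents' totals
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Toy responses: 6 students x 4 survey items on a 1-5 scale (illustrative only)
responses = [
    [4, 4, 5, 4],
    [3, 3, 3, 2],
    [5, 4, 5, 5],
    [2, 2, 1, 2],
    [4, 5, 4, 4],
    [3, 3, 2, 3],
]
print(round(cronbach_alpha(responses), 2))
```

Values above roughly 0.7-0.8 are conventionally read as acceptable internal consistency, which is the sort of evidence a validity framework like Stobart's would ask the pilot data to supply.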
Procedia PDF Downloads 265
1460 Forgeability Study of Medium Carbon Micro-Alloyed Forging Steel
Authors: M. I. Equbal, R. K. Ohdar, B. Singh, P. Talukdar
Abstract:
Micro-alloyed steel components are used in the automotive industry out of the necessity to make manufacturing process cycles shorter compared to conventional steel by eliminating heat treatment cycles, so an important saving of costs and energy can be reached by reducing the number of operations. Micro-alloying elements like vanadium, niobium or titanium have been added to medium carbon steels to achieve grain refinement, with or without precipitation strengthening, along with a uniform microstructure throughout the matrix. The present study reports the applicability of medium carbon vanadium micro-alloyed steel in hot forging. Forgeability has been determined with respect to different cooling rates after forging in a hydraulic press at 50% diameter reduction in the temperature range of 900-1100 °C. Final microstructures, hardness, tensile strength, and impact strength have been evaluated. The friction coefficients of different lubricating conditions, viz. graphite in hydraulic oil, graphite in furnace oil, DF 150 (graphite, water-based) die lubricant, and dry (without any lubrication), were obtained from the ring compression test for the above micro-alloyed steel. Results of the ring compression tests indicate that the graphite in hydraulic oil lubricant is preferred for free forging and the dry condition is preferred for the die forging operation. Exceptionally good forgeability and high resistance to fracture, especially at faster cooling rates, have been observed for fine equiaxed ferrite-pearlite grains, some amount of bainite and fine precipitates of vanadium carbides and carbonitrides. The results indicate that the cooling rate has a remarkable effect on the microstructure and on mechanical properties at room temperature.
Keywords: cooling rate, hot forging, micro-alloyed, ring compression
Procedia PDF Downloads 359
1459 Comparative Research on Culture-Led Regeneration across Cities in China
Authors: Fang Bin Guo, Emma Roberts, Haibin Du, Yonggang Wang, Yu Chen, Xiuli Ge
Abstract:
This paper explores the findings so far from a major externally-funded project which operates internationally in China, Germany and the UK. The research team is working in the context of the redevelopment of post-industrial sites in China and how these might be platforms for creative enterprises, allowing the economy and welfare to flourish. Results from the project are anticipated to inform urban design policies in China and possibly farther afield. The research has utilised ethnographic studies and participatory design methods to investigate alternative strategies for the sustainable urban renewal of China's post-industrial areas. Additionally, it has undertaken comparative studies of successful examples of European and Chinese urban regeneration. The international cross-disciplinary team has been seeking different opportunities for developing relevant creative industries whilst retaining cultural and industrial heritage. This paper will explore the research conducted so far by the team and offer initial findings. The findings point to the challenge cities face in developing while protecting local culture and heritage, the history of their industries, and the transformation of their local economies. The preliminary results and pilot analysis of the current research have demonstrated that local government policymakers, business investors/developers and creative industry practitioners are the three major stakeholders that impact city revitalisation. These groups are expected to work together with a synchronised vision in order for redevelopments to be successful. Meanwhile, local geography, history, culture, politics, economy and ethnography have been identified as important factors that impact project design and development during urban transformations. Data is being processed from the team's research conducted across the focal Western and Chinese cities.
This has provided theoretical guidance and practical support to the development of significant experimental projects. Many were re-examined with a more international perspective, and adjustments have been based on the conclusions of the research. The observations and research are already generating design solutions in terms of ascertaining essential site components, layouts, visual design and practical facilities for regenerated sites. Two significant projects undertaken by this project team have been nominated by the central Chinese government as the most successful exemplars. They have been listed as outstanding national industry heritage projects; in particular, one of them was nominated by ArchDaily as Building of the Year 2019, and so this project outcome has made a substantial contribution to research and innovation. In summary, this paper will outline the funded project, discuss the work conducted so far, and pinpoint the initial discoveries. It will detail the future steps and indicate how these will impact national and local governments in China, designers, local citizens and building users.
Keywords: cultural & industrial heritages, ethnographic research, participatory design, regeneration of post-industrial sites, sustainable
Procedia PDF Downloads 146
1458 Optimal Framework of Policy Systems with Innovation: Use of Strategic Design for Evolution of Decisions
Authors: Yuna Lee
Abstract:
In the current policy process, there has been a growing interest in more open approaches that incorporate creativity and innovation, based on forecasting groups composed of the public and experts together, into scientific data-driven foresight methods in order to implement more effective policymaking. In particular, citizen participation as collective intelligence in policymaking, with design and a deep scale of innovation at the global level, has been developed, and human-centred design thinking is considered one of the most promising methods for strategic foresight. Yet, there is a lack of a common theoretical foundation for a comprehensive approach to the current situation and the post-COVID-19 era, and substantial changes in policymaking practice remain insignificant and proceed by trial and error. This project hypothesized that rigorously developed policy systems and tools that support strategic foresight by considering public understanding could maximize ways to create new possibilities for a preferable future; however, this must involve a better understanding of behavioural insights, including individual and cultural values, profit motives and needs, and psychological motivations, in order to implement holistic and multilateral foresight and create more positive possibilities. To what extent is a policymaking system that incorporates holistic and comprehensive foresight into policy implementation theoretically possible, given that theory and practice are, in reality, different and not connected? What components and environmental conditions should be included in a strategic foresight system to enhance policymakers' capacity to predict alternative futures, or to detect uncertainties of the future more accurately? And, compared to the required environmental conditions, what are the environmental vulnerabilities of the current policymaking system? 
In this light, this research contemplates the question of how effectively policymaking practices have been implemented through the synthesis of scientific, technology-oriented innovation with strategic design for tackling complex societal challenges and devising more significant insights to make society greener and more liveable. Here, this study conceptualizes the notion of a new collaborative form of strategic foresight that aims to maximize mutual benefits between policy actors and citizens through cooperation stemming from evolutionary game theory. The study applies a mixed methodology, including interviews with policy experts, to cases in which digital transformation and strategic design provided future-oriented solutions or directions to cities’ sustainable development goals and society-wide urgent challenges such as COVID-19. As a result, artistic and sensual interpreting capabilities fostered through strategic design promote a concrete form of ideas, toward a stable connection from the present to the future, and enhance understanding and active cooperation among decision-makers, stakeholders, and citizens. Ultimately, the improved theoretical foundation proposed in this study is expected to help strategically respond to the highly interconnected future changes of the post-COVID-19 world.
Keywords: policymaking, strategic design, sustainable innovation, evolution of cooperation
Procedia PDF Downloads 194
1457 Identification and Characterization of Small Peptides Encoded by Small Open Reading Frames using Mass Spectrometry and Bioinformatics
Authors: Su Mon Saw, Joe Rothnagel
Abstract:
Short open reading frames (sORFs) located in the 5’UTRs of mRNAs are known as uORFs. Characterization of uORF-encoded peptides (uPEPs), i.e., a subset of short open reading frame encoded peptides (sPEPs), and of their translational regulation leads to an understanding of the causes of genetic disease, of proteome complexity and of the development of treatments. The existence of uORFs within the cellular proteome can be detected by LC-MS/MS. Demonstrating that a uORF is translated into a uPEP, and identifying that uPEP, allows characterization of its structure, function, subcellular localization, evolutionary maintenance (conservation in human and other species) and abundance in cells. It is hypothesized that a subset of sORFs are translatable and that their encoded sPEPs are functional and endogenously expressed, contributing to eukaryotic cellular proteome complexity. This project aimed to investigate whether sORFs encode functional peptides. Liquid chromatography-mass spectrometry (LC-MS) and bioinformatics were thus employed. Because sPEPs are probably of low abundance and small size, efficient peptide enrichment strategies that enrich small proteins and deplete the sub-proteome of large, abundant proteins are crucial for identifying sPEPs. Low molecular weight proteins were extracted by SDS-PAGE from Human Embryonic Kidney (HEK293) cells and by strong cation exchange chromatography (SCX) from the secreted fraction of HEK293 cells. Extracted proteins were digested by trypsin into peptides, which were detected by LC-MS/MS. The MS/MS data obtained were searched against Swiss-Prot using MASCOT version 2.4 to filter out known proteins, and all unmatched spectra were re-searched against the human RefSeq database. ProteinPilot v5.0.1 was used to identify sPEPs by searching against the human RefSeq, Vanderperre and Human Alternative Open Reading Frame (HaltORF) databases. Potential sPEPs were analyzed by bioinformatics. 
Since SDS-PAGE could not separate proteins <20 kDa, this route could not identify sPEPs. All MASCOT-identified peptide fragments were parts of main open reading frames (mORFs) according to ORF Finder and blastp searches. No sPEP was detected, and the existence of sPEPs could not be confirmed in this study. Thirteen sORFs shown in previous studies to be translated in HEK293 cells by mass spectrometry were characterized by bioinformatics. The sPEPs identified in those studies were <100 amino acids and <15 kDa. Bioinformatics results showed that sORFs are translated into sPEPs and contribute to proteome complexity. The uPEP translated from the uORF of SLC35A4 was strongly conserved in human and mouse, while the uPEP translated from the uORF of MKKS was strongly conserved in human and Rhesus monkey. Cross-species conserved uORFs in association with protein translation strongly suggest evolutionary maintenance of the coding sequence and indicate probable functional expression of the peptides encoded within these uORFs. Translation of sORFs was confirmed by mass spectrometry, and sPEPs were characterized with bioinformatics.
Keywords: bioinformatics, HEK293 cells, liquid chromatography-mass spectrometry, ProteinPilot, Strong Cation Exchange Chromatography, SDS-PAGE, sPEPs
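As a concrete illustration of what an ORF Finder-style scan looks for, the minimal sketch below lists candidate sORFs in a nucleotide sequence. It is a simplification, not the tool used in the study: forward strand only, standard ATG start and TAA/TAG/TGA stop codons, and an illustrative 100-codon cutoff matching the <100 amino acid sPEPs reported above.

```python
import re

STOP_CODONS = {"TAA", "TAG", "TGA"}

def find_sorfs(seq, max_aa=100):
    """Return (start, end, n_codons) for each ATG...stop ORF of at most
    max_aa codons (stop codon excluded), scanning the forward strand."""
    seq = seq.upper()
    orfs = []
    for start in (m.start() for m in re.finditer("ATG", seq)):
        # Walk in-frame codons until the first stop codon.
        for pos in range(start + 3, len(seq) - 2, 3):
            if seq[pos:pos + 3] in STOP_CODONS:
                n_codons = (pos - start) // 3
                if n_codons <= max_aa:
                    orfs.append((start, pos + 3, n_codons))
                break
    return orfs

# Tiny made-up sequence: one 3-codon ORF (ATG AAA TTT TAG) starting at index 2.
print(find_sorfs("GGATGAAATTTTAGCC"))
```

A real pipeline would additionally translate each ORF, scan both strands, and restrict to the annotated 5’UTR when hunting uORFs specifically.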
Procedia PDF Downloads 186
1456 The Biosphere as a Supercomputer Directing and Controlling Evolutionary Processes
Authors: Igor A. Krichtafovitch
Abstract:
The evolutionary processes are not linear. Long periods of quiet and slow development turn into rather rapid emergences of new species and even phyla. During the Cambrian explosion, 22 new phyla were added to the 3 previously existing phyla. Contrary to common belief, natural selection, or survival of the fittest, cannot account for the dominant evolutionary vector: the steady and accelerating advent of more complex and more intelligent living organisms. Neither Darwinism nor alternative concepts, including panspermia and intelligent design, offers a satisfactory explanation of these phenomena. The proposed hypothesis offers a logical and plausible explanation of the evolutionary processes in general. It is based on two postulates: a) the Biosphere is a single living organism, all parts of which are interconnected, and b) the Biosphere acts as a giant biological supercomputer, storing and processing information in digital and analog forms. Such a supercomputer surpasses all human-made computers by many orders of magnitude. Living organisms are the product of the intelligent creative action of the biosphere supercomputer. Biological evolution is driven by the growing amount of information stored in living organisms and the increasing complexity of the biosphere as a single organism. The main evolutionary vector is not survival of the fittest but an accelerated growth of the computational complexity of living organisms. The following postulates may summarize the proposed hypothesis: biological evolution as a natural life origin and development is a reality. Evolution is a coordinated and controlled process. One of evolution’s main development vectors is a growing computational complexity of the living organisms and the biosphere’s intelligence. The intelligent matter which conducts and controls global evolution is a gigantic bio-computer combining all living organisms on Earth. The information acts like software stored in and controlled by the biosphere. 
Random mutations trigger this software, as stipulated by Darwinian evolution theories, and it is further stimulated by the growing demand for the Biosphere’s global memory storage and computational complexity. A greater memory volume requires a greater number of more intellectually advanced organisms for storing and handling it. More intricate organisms require greater computational complexity of the biosphere in order to keep control over the living world. This is an endless recursive endeavor with an accelerating evolutionary dynamic. New species emerge when two conditions are met: a) crucial environmental changes occur and/or the global memory storage volume reaches its limit, and b) the biosphere’s computational complexity reaches a critical mass capable of producing more advanced creatures. The hypothesis presented here is a naturalistic concept of life creation and evolution. It logically resolves many puzzling problems of the current state of evolutionary theory: speciation, as a result of GM purposeful design; the evolutionary development vector, as a need for growing global intelligence; punctuated equilibrium, happening when the two conditions a) and b) above are met; the Cambrian explosion; and mass extinctions, happening when more intelligent species should replace outdated creatures.
Keywords: supercomputer, biological evolution, Darwinism, speciation
Procedia PDF Downloads 164
1455 Structural Evolution of Electrodeposited Ni Coating on Ti-6Al-4V Alloy during Heat Treatment
Authors: M. Abdoos, A. Amadeh, M. Adabi
Abstract:
In recent decades, the use of titanium and its alloys in military and industrial applications has increased owing to their high mechanical properties, light weight and corrosion resistance. However, their poor surface properties can limit their widespread use. Many studies have been carried out to improve their surface properties. The most effective technique is based on solid-state diffusion of elements that can form intermetallic compounds with the substrate. In the present work, the inter-diffusion of nickel and titanium and the formation of Ni-Ti intermetallic compounds in nickel-coated Ti-6Al-4V alloy have been studied. Initially, nickel was electrodeposited on the alloy using a Watts bath at a current density of 20 mA/cm2 for 1 hour. The coated specimens were then heat treated in a tubular furnace under an argon atmosphere at different temperatures near the Ti β-transus, for various durations, in order to maximize the diffusion rate and improve the surface properties of the Ti-6Al-4V alloy. The effect of temperature and time on the thickness of the diffusion layer and the characteristics of the intermetallic phases was studied by means of a scanning electron microscope (SEM) equipped with an energy dispersive X-ray spectrometer (EDS) and by microhardness testing. The results showed that a multilayer structure formed after heat treatment: an outer layer of remaining nickel, an area of intermetallic layers with different compositions, and a Ni-Ti solid solution. Three intermetallic layers were detected by EDS analysis, namely an outer layer with about 75 at.% Ni (Ni3Ti), an intermediate layer with 50 at.% Ni (NiTi) and finally an inner layer with 36 at.% Ni (NiTi2). It was also observed that an increase in time or temperature led to the formation of thicker intermetallic layers. Meanwhile, the microhardness of heat-treated samples increased with the formation of Ni-Ti intermetallics; its value depended on the heat treatment parameters.
Keywords: heat treatment, microhardness, Ni coating, Ti-6Al-4V
Procedia PDF Downloads 433
1454 Deep Reinforcement Learning-Based Computation Offloading for 5G Vehicle-Aware Multi-Access Edge Computing Network
Authors: Ziying Wu, Danfeng Yan
Abstract:
Multi-Access Edge Computing (MEC) is one of the key technologies of the future 5G network. By deploying edge computing centers at the edge of the wireless access network, computation tasks can be offloaded to edge servers rather than to the remote cloud server, meeting the requirements of 5G low-latency and high-reliability application scenarios. Meanwhile, with the development of IoV (Internet of Vehicles) technology, various delay-sensitive and compute-intensive in-vehicle applications continue to appear. Compared with traditional internet services, these computation tasks have higher processing priority and lower delay requirements. In this paper, we design a 5G-based Vehicle-Aware Multi-Access Edge Computing Network (VAMECN) and propose a joint optimization problem of minimizing the total system cost. To address this problem, a deep reinforcement learning-based joint computation offloading and task migration optimization (JCOTM) algorithm is proposed, considering the influence of multiple factors such as concurrent computation tasks, the distribution of system computing resources, and network communication bandwidth. The mixed-integer nonlinear programming problem is formulated as a Markov decision process. Experiments show that our proposed algorithm can effectively reduce task processing delay and equipment energy consumption, optimize the computation offloading and resource allocation schemes, and improve system resource utilization, compared with other computation offloading policies.
Keywords: multi-access edge computing, computation offloading, 5th generation, vehicle-aware, deep reinforcement learning, deep q-network
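The JCOTM algorithm itself is not reproduced in the abstract. As a rough illustration of the underlying idea only, the toy sketch below uses made-up compute speeds, bandwidths and setup latencies, and a single-step tabular Q-learner in place of a deep Q-network, to learn where a task of a given size should run by trading processing delay against transmission delay.

```python
import random

# Illustrative, made-up relative speeds, bandwidths and setup latencies
# (not values from the paper).
F_LOCAL, F_EDGE, F_CLOUD = 1.0, 5.0, 50.0   # compute speeds
BW_EDGE, BW_CLOUD = 10.0, 10.0              # link bandwidths
OH_EDGE, OH_CLOUD = 0.5, 2.0                # fixed connection overheads
ACTIONS = ("local", "edge", "cloud")
SIZES = (0.5, 4.0, 64.0)                    # task-size "states"

def cost(size, action):
    """Single-step cost: processing delay plus any transmission delay."""
    if action == "local":
        return size / F_LOCAL
    if action == "edge":
        return size / F_EDGE + size / BW_EDGE + OH_EDGE
    return size / F_CLOUD + size / BW_CLOUD + OH_CLOUD

random.seed(0)
Q = {s: {a: 0.0 for a in ACTIONS} for s in SIZES}
alpha, eps = 0.1, 0.2
for _ in range(30000):
    s = random.choice(SIZES)
    # Epsilon-greedy over estimated costs (lower is better).
    if random.random() < eps:
        a = random.choice(ACTIONS)
    else:
        a = min(Q[s], key=Q[s].get)
    Q[s][a] += alpha * (cost(s, a) - Q[s][a])  # running average of the cost

policy = {s: min(Q[s], key=Q[s].get) for s in SIZES}
print(policy)
```

In the paper's setting, the lookup table would be replaced by a deep Q-network so that large continuous state spaces (task queues, resource distribution, bandwidth) can be handled; the learning loop is conceptually the same.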
Procedia PDF Downloads 116
1453 Plasma Spraying of 316 Stainless Steel on Aluminum and Investigation of Coat/Substrate Interface
Authors: P. Abachi, T. W. Coyle, P. S. Musavi Gharavi
Abstract:
By applying a coating onto a structural component, the corrosion and/or wear resistance requirements of the surface can be fulfilled. Since the layer adhesion of the coating influences the mechanical integrity of the coat/substrate interface during service, it should be examined accurately. In the present work, the tensile bonding strength of a 316 stainless steel plasma-sprayed coating on an aluminum substrate was determined using a tensile adhesion test (TAT) specimen. The interfacial fracture toughness was determined using a four-point bend specimen containing a saw notch and a modified chevron-notched short-bar (SB) specimen. The coating microstructure and fractured specimen surfaces were examined by scanning electron and optical microscopy. The investigation of the coated surface after the tensile adhesion test indicates that the failure mechanism is mostly cohesive and rarely of the adhesive type. The calculated value of the critical strain energy release rate suggests a relatively sound interface. It seems that the four-point bend test offers a potentially more sensitive means of evaluating the mechanical integrity of coating/substrate interfaces than the tensile test. The fracture toughness value reported for the modified chevron-notched short-bar specimen cannot be taken as absolute because its calculation is based on the minimum stress intensity coefficient value suggested for the fracture toughness determination of homogeneous parts in the ASTM E1304-97 standard.
Keywords: bonding strength, four-point bend test, interfacial fracture toughness, modified chevron-notched short-bar specimen, plasma sprayed coating, tensile adhesion test
Procedia PDF Downloads 259
1452 In Silico Study of Antiviral Drugs Against Three Important Proteins of Sars-Cov-2 Using Molecular Docking Method
Authors: Alireza Jalalvand, Maryam Saleh, Somayeh Behjat Khatouni, Zahra Bahri Najafi, Foroozan Fatahinia, Narges Ismailzadeh, Behrokh Farahmand
Abstract:
Objective: The recent outbreak of coronavirus (SARS-CoV-2) imposed a global pandemic on the world. Despite the increasing prevalence of the disease, there are no effective drugs to treat it. A suitable and rapid way to find an effective drug against the global pandemic is a computational drug study. This study used molecular docking methods to examine the potential inhibition of over 50 antiviral drugs against three fundamental proteins of SARS-CoV-2. Methods: Through a literature review, three important proteins (the main protease, the RNA-dependent RNA polymerase (RdRp), and the spike protein) were selected as drug targets. Three-dimensional (3D) structures of the protease, spike, and RdRp proteins were obtained from the Protein Data Bank and were energy-minimized. Over 50 antiviral drugs were considered candidates for protein inhibition, and their 3D structures were obtained from drug banks. The AutoDock 4.2 software was used to define the molecular docking settings and run the algorithm. Results: Five drugs, namely indinavir, lopinavir, saquinavir, nelfinavir, and remdesivir, exhibited the highest inhibitory potency against all three proteins, based on the binding energies and drug binding positions deduced from docking and hydrogen-bonding analysis. Conclusions: According to the results, among the drugs mentioned, saquinavir and lopinavir showed the highest inhibitory potency against all three proteins compared to the other drugs. They may enter laboratory-phase studies as a dual-drug treatment to inhibit SARS-CoV-2.
Keywords: covid-19, drug repositioning, molecular docking, lopinavir, saquinavir
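The docking scores themselves are not given in the abstract. Using made-up binding energies purely for illustration (NOT the study's values), the sketch below shows the kind of consensus ranking used to shortlist inhibitors across the three targets; more negative binding energy means stronger predicted binding.

```python
# Hypothetical binding energies in kcal/mol; illustrative only.
energies = {
    "indinavir":  {"protease": -8.9, "RdRp": -8.1, "spike": -8.0},
    "lopinavir":  {"protease": -9.3, "RdRp": -8.6, "spike": -8.7},
    "saquinavir": {"protease": -9.5, "RdRp": -8.8, "spike": -8.9},
    "nelfinavir": {"protease": -8.7, "RdRp": -8.0, "spike": -7.9},
    "remdesivir": {"protease": -8.2, "RdRp": -9.0, "spike": -7.5},
}

def rank_for(target):
    """Drugs sorted from strongest to weakest predicted binding."""
    return sorted(energies, key=lambda drug: energies[drug][target])

def consensus(top_k=2):
    """Drugs appearing in the top_k of every per-target ranking."""
    shortlists = [set(rank_for(t)[:top_k]) for t in ("protease", "RdRp", "spike")]
    return set.intersection(*shortlists)

print(consensus(2))
```

With these illustrative numbers the consensus shortlist contains saquinavir, mirroring the structure of the study's conclusion; the real analysis also weighs binding poses and hydrogen bonding, not energies alone.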
Procedia PDF Downloads 86
1451 Multi Universe Existence Based-On Quantum Relativity using DJV Circuit Experiment Interpretation
Authors: Muhammad Arif Jalil, Somchat Sonasang, Preecha Yupapin
Abstract:
This study hypothesizes that the universe lies at the center between the white and black holes, which are entangled pairs. The coupling between them, in terms of spacetime, forms the universe and things. The birth of things is based on energy exchange between the white and black sides. That is, the transition from the white side to the black side is called a wave-matter, which has a speed faster than light with positive gravity. The transition from the black side to the white side has a speed faster than light with negative gravity and is called a wave-particle. Where the speed is equal to that of light, the particle rest mass is formed, and things can appear to take shape here; thus, the gravity is zero because it is the center. The gravitational force belongs to the Earth itself because it is in a position that is twisted towards the white hole; therefore, it is negative. The coupling of black-white holes occurs directly on both sides. Mass is formed at saturation and will create universes and other things. Therefore, there can be hundreds of thousands of universes on both sides of the black and white holes before reaching the saturation point of the multi-universe. This work uses the DJV circuit, made by the research team as an entangled, or two-level, system circuit that has been experimentally demonstrated; therefore, this principle allows for interpretation. This work explains the emergence of multiple universes and can be applied as a practical guideline for searching for universes in the future. Moreover, the results indicate that the DJV circuit can create the elementary particles according to Feynman's diagram with rest-mass conditions, which will be discussed for fission and fusion applications.
Keywords: multi-universes, feynman diagram, fission, fusion
Procedia PDF Downloads 62
1450 Performance and Limitations of Likelihood Based Information Criteria and Leave-One-Out Cross-Validation Approximation Methods
Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer
Abstract:
Model assessment, in the Bayesian context, involves evaluation of the goodness-of-fit and the comparison of several alternative candidate models for predictive accuracy and improvements. In posterior predictive checks, the data simulated under the fitted model is compared with the actual data. Predictive model accuracy is estimated using information criteria such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), the deviance information criterion (DIC), and the Watanabe-Akaike information criterion (WAIC). The goal of an information criterion is to obtain an unbiased measure of out-of-sample prediction error. Since posterior checks use the data twice, once for model estimation and once for testing, a bias correction which penalises model complexity is incorporated in these criteria. Cross-validation (CV) is another method used for examining out-of-sample prediction accuracy. Leave-one-out cross-validation (LOO-CV) is the most computationally expensive variant among the CV methods, as it fits as many models as there are observations. Importance sampling (IS), truncated importance sampling (TIS) and Pareto-smoothed importance sampling (PSIS) are generally used as approximations to the exact LOO-CV; they utilise the existing MCMC results and avoid the expensive recomputation. The reciprocals of the predictive densities calculated over posterior draws for each observation are treated as the raw importance weights. These are in turn used to calculate the approximate LOO-CV of the observation as a weighted average of posterior densities. In IS-LOO, the raw weights are used directly. In contrast, the larger weights are replaced by their modified truncated weights in calculating TIS-LOO and PSIS-LOO. Although information criteria and LOO-CV are unable to reflect goodness-of-fit in an absolute sense, the differences can be used to measure the relative performance of the models of interest. 
However, the use of these measures is only valid under specific circumstances. This study developed 11 models using normal, log-normal, gamma, and Student's t distributions to improve PCR stutter prediction with forensic data. These models comprise four with profile-wide variances, four with locus-specific variances, and three two-component mixture models. The mean stutter ratio in each model is modelled as a locus-specific simple linear regression against a feature of the alleles under study known as the longest uninterrupted sequence (LUS). The use of AIC, BIC, DIC, and WAIC in model comparison has some practical limitations. Even though IS-LOO, TIS-LOO, and PSIS-LOO are considered approximations of the exact LOO-CV, the study observed some drastic deviations in the results. However, there are some interesting relationships among the logarithms of the pointwise predictive densities (lppd) calculated under WAIC and the LOO approximation methods. The estimated overall lppd is a relative measure that reflects the overall goodness-of-fit of the model. Parallel log-likelihood profiles for the models, conditional on equal posterior variances in lppds, were observed. This study illustrates the limitations of the information criteria in practical model comparison problems. In addition, the relationships among the LOO-CV approximation methods and WAIC, with their limitations, are discussed. Finally, useful recommendations that may help in practical model comparisons with these methods are provided.
Keywords: cross-validation, importance sampling, information criteria, predictive accuracy
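The raw IS-LOO computation described above can be sketched in a few lines of numpy. The normal model, posterior draws and seed below are stand-ins for real MCMC output, and the truncation (TIS) and Pareto-smoothing (PSIS) steps are omitted. Because the raw weights are reciprocal densities, the weighted average collapses to a harmonic mean of the per-draw densities, which is never larger than the arithmetic-mean lppd used by WAIC.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for MCMC output: S posterior draws of (mu, sigma) for n observations.
n, S = 20, 4000
y = rng.normal(0.0, 1.0, size=n)
mu = rng.normal(y.mean(), 0.2, size=S)
sigma = np.abs(rng.normal(1.0, 0.1, size=S))

def normal_logpdf(x, m, s):
    return -0.5 * np.log(2 * np.pi) - np.log(s) - 0.5 * ((x - m) / s) ** 2

# S x n matrix of pointwise log predictive densities.
loglik = normal_logpdf(y[None, :], mu[:, None], sigma[:, None])

def logsumexp(a, axis=0):
    """Numerically stable log(sum(exp(a))) along an axis."""
    m = a.max(axis=axis, keepdims=True)
    return np.squeeze(m, axis) + np.log(np.exp(a - m).sum(axis=axis))

# lppd_i = log (1/S) sum_s p(y_i | theta_s): the WAIC building block.
lppd = logsumexp(loglik) - np.log(S)

# Raw IS-LOO with weights 1/p(y_i | theta_s), i.e. a harmonic mean:
# elpd_loo_i = -log (1/S) sum_s 1 / p(y_i | theta_s).
elpd_loo = -(logsumexp(-loglik) - np.log(S))
```

The arithmetic mean dominates the harmonic mean, so elpd_loo is pointwise no larger than lppd, reflecting LOO's more conservative out-of-sample estimate; in practice the raw reciprocal weights can have infinite variance, which is exactly what the truncation and Pareto smoothing discussed above are designed to control.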
Procedia PDF Downloads 391
1449 The Gender Criteria of Film Criticism: Creating the ‘Big’, Avoiding the Important
Authors: Eleni Karasavvidou
Abstract:
Social and anthropological research, parallel to Gender Studies, has highlighted the relationship between social structures and symbolic forms as an important field of interaction and a record of 'social trends', since the study of representations can contribute to the understanding of the social functions and power relations they encompass. This 'mirage', however, has not only to do with the representations themselves but also with the ways they are received and with the film or critical narratives that are established as dominant or alternative. Cinema and the criticism of its cultural products are no exception. Even in the rapidly changing media landscape of the 21st century, movies remain an integral and widespread part of popular culture, making films an extremely powerful means of 'legitimizing' or 'delegitimizing' visions of domination and commonsensical gender stereotypes throughout society. And yet it is film criticism, the 'language per se', that legitimizes, reinforces, rewards and reproduces (or at least ignores) the stereotypical depictions of female roles that remain common in the realm of film images. Hence the need for this issue to be raised in academic research questioning the gender criteria of film reviews, as part of the effort for an inclusive art and society. Qualitative content analysis is used to examine female roles in selected Oscar-nominated films against their reviews from leading websites and newspapers. This method was chosen because of the complex nature of the depictions in the films and the narratives they evoke. The films were divided into basic scenes depicting social functions, such as love and work relationships and positions of power and their function, which were analyzed by content analysis, with borrowings from structuralism (Genette) and the local/universal images of intercultural philology (Wierlacher). 
In addition to the measurement of overall 'representation time' by gender, other qualitative characteristics were analyzed, such as speaking time, sayings or key actions, the overall quality of the character's action in relation to the development of the scenario, and social representations in general, as well as quantitative ones (the insufficient number of female lead roles, fewer key supporting roles, relatively few female directors and people in the production chain) and how these might affect screen representations. The quantitative analysis in this study was used to complement the qualitative content analysis. The focus then shifted to the criteria of film criticism and to the rhetorical narratives that exclude or highlight in relation to gender identities and functions. In the criteria and language of film criticism, stereotypes are often reproduced or allegedly overturned within the framework of an apolitical "identity politics," which mainly addresses the surface of a self-referential cultural-consumer product without connecting it more deeply with material and cultural life. One of the prime examples of this failure is the Bechdel Test, which tracks whether female characters speak in a film regardless of whether women's stories are represented or not in the films analyzed. If supposedly unbiased male filmmakers still fail to tell truly feminist stories, the same is the case with the criteria of criticism and the related interventions.
Keywords: representations, context analysis, reviews, sexist stereotypes
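The criterion the Bechdel Test mechanises is simple enough to state as code, which is precisely the complaint above: the sketch below (an illustrative data shape, not a real annotation format) passes a film as soon as two named women share a conversation that is not about a man, regardless of whether women's stories are actually represented.

```python
def passes_bechdel(conversations):
    """conversations: iterable of (participants, topic) pairs, where
    participants is a list of (name, gender) tuples and topic is a label."""
    for participants, topic in conversations:
        named_women = [name for name, gender in participants if gender == "F"]
        # The test's entire criterion: two named women, topic not "a man".
        if len(named_women) >= 2 and topic != "a man":
            return True
    return False

# A film with one qualifying two-line exchange passes, whatever else it depicts.
film = [
    ([("Ann", "F"), ("Tom", "M")], "a man"),
    ([("Ann", "F"), ("Bea", "F")], "work"),
]
print(passes_bechdel(film))
```

That a single exchange flips the result illustrates why such a check measures presence of speech rather than quality of representation.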
Procedia PDF Downloads 82
1448 Use of Recycled Vegetable Oil in the Diet of Lactating Sows
Authors: Juan Manuel Uriarte Lopez, Hector Raul Guemez Gaxiola, Javier Alonso Romo Rubio, Juan Manuel Romo Valdez
Abstract:
The objective of this investigation was to determine the influence of recycled vegetable oil from restaurants on the productive performance of lactating sows. Twenty-four hybrid lactating sows (Landrace x Yorkshire) were divided into three treatments with eight sows per treatment. On day 107 of gestation, the sows were moved to environmentally regulated maternity cages with mesh floors (2.4 × 0.6 m), each containing an area (2.4 × 0.5 m) for newborn piglets on each side. All diets were provided as a dry powder, and the sows received free access to water throughout the experimental period. After farrowing, the sows were fasted for 12 hours, the daily feed ration was gradually increased, and the sows had ad libitum access to feed from the fourth day. The diets were corn-soybean meal-based, containing 0 % (CONT), 1.0 % (RVOL) or 1.5 % (RVOH) recycled vegetable oil, fed for 30 days. The diets contained similar calculated levels of crude protein and metabolizable energy and contained vitamins and minerals that exceeded National Research Council (1998) recommendations; sows were fed three times daily. On day 30, piglets were weaned, and the performances of lactating sows and nursery piglets were recorded. Results indicated that the average daily feed intake of sows (5.58, 5.55, and 5.49 kg for CONT, RVOL, and RVOH, respectively) was not affected (P > 0.05) by the dietary treatments. There was no difference in the average body weight of piglets on the day of birth, at 1.33, 1.36, and 1.35 kg, respectively (P > 0.05). 
There was no difference in the average body weight of piglets on day 30, at 6.91, 6.75, and 7.05 kg, respectively (P > 0.05), between treatments, and the numbers of weaned piglets per sow (9.95, 9.80, and 9.80) were not affected by treatments (P > 0.05). In conclusion, substituting recycled vegetable oil for virgin vegetable oil in the diet does not affect the productive performance of lactating sows.
Keywords: lactating, sow, vegetable, oil
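The repeated P > 0.05 comparisons above rest on standard tests of group means across the three treatments. As a minimal sketch with illustrative numbers (not the trial data), the one-way ANOVA F statistic underlying such comparisons can be computed as:

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA across a list of numeric groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    # Between-group and within-group sums of squares.
    ss_between = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Three illustrative treatment groups (e.g. a per-sow response variable).
f_stat = one_way_anova_f([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0], [3.0, 4.0, 5.0]])
```

The F value is then compared against an F distribution with (k-1, n-k) degrees of freedom to obtain a P value; P > 0.05, as reported above, means the treatment means are not distinguishable at the 5 % level.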
Procedia PDF Downloads 298
1447 Production of Composite Materials by Mixing Chromium-Rich Ash and Soda-Lime Glass Powder: Mechanical Properties and Microstructure
Authors: Savvas Varitis, Panagiotis Kavouras, George Vourlias, Eleni Pavlidou, Theodoros Karakostas, Philomela Komninou
Abstract:
A chromium-loaded ash originating from the incineration of tannery sludge under anoxic conditions was mixed with low-grade soda-lime glass powder coming from commercial glass bottles. The relative weight proportions of ash over glass powder tested were 30/70, 40/60 and 50/50. The solid mixtures, formed into green-state compacts, were sintered in the temperature range of 800 °C up to 1200 °C. The resulting products were characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), energy dispersive X-ray spectrometry (EDXS) and micro-indentation. The above methods were employed to characterize the various phases, the microstructure and the hardness of the produced materials. Thermal treatment at 800 °C and 1000 °C produced opaque ceramic products composed of a variety of chromium-containing and chromium-free crystalline phases. Thermal treatment at 1200 °C gave rise to composite products, where only chromium-containing crystalline phases were detected. Hardness results suggest that specific products are serious candidates for structural applications. Acknowledgement: This research has been co-financed by the European Union (European Social Fund – ESF) and Greek national funds through the Operational Program “Education and Lifelong Learning” of the National Strategic Reference Framework (NSRF) – Research Funding Program: THALES “WasteVal”: Reinforcement of the interdisciplinary and/or inter-institutional research and innovation.
Keywords: chromium-rich tannery residues, glass-ceramic materials, mechanical properties, microstructure
Procedia PDF Downloads 338
1446 Overcoming the Challenges of Subjective Truths in the Post-Truth Age Through a Critical-Ethical English Pedagogy
Authors: Farah Vierra
Abstract:
Following the 2016 US presidential election and the advancement of the Brexit referendum, the concept of “post-truth,” defined by the Oxford Dictionary as “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief,” came into prominent use in public, political and educational circles. What this essentially entails is that in this age, individuals are increasingly confronted with subjective perpetuations of truth in their discourse spheres that are informed by beliefs and opinions as opposed to any form of coherence to the reality of those whom these truth claims concern. In principle, a subjective delineation of truth is progressive and liberating – especially considering its potential to provide marginalised groups in the diverse communities of our globalised world with the voice to articulate truths that are representative of themselves and their experiences. However, any form of human flourishing that seems to be promised here collapses as the tenets of subjective truths initially in place to liberate have been distorted through post-truth to allow individuals to purport selective and individualistic truth claims that further oppress and silence certain groups within society without due accountability. The evidence of this is prevalent in the conception of terms such as "alternative facts" and "fake news," which we observe individuals invoke when their problematic truth claims are questioned.
Considering the pervasiveness of post-truth and the ethical issues that accompany it, educators and scholars alike have increasingly noted the need to adapt educational practices and pedagogies to account for the diminishing objectivity of truth in the twenty-first century, especially because students, as digital natives, find themselves in the firing line of post-truth, engulfed in digital societies that proliferate post-truth through the surge of truth claims allowed on various media sites. In an attempt to equip students with the vital skills to navigate the post-truth age and oppose its proliferation of social injustices, English educators find themselves having to contend with a complex question: how can the teaching of English equip students with the ability to critically and ethically scrutinise truth claims whilst also mediating the subjectivity of truth in a manner that does not undermine the voices of diverse communities? In order to address this question, this paper will first examine the challenges that confront students as a result of post-truth. Following this, the paper will elucidate the role English education can play in helping students overcome the complex demands of the post-truth age. Scholars have consistently touted the affordances of literary texts in providing students with imagined spaces to explore societal issues through a critical discernment of language and an ethical engagement with its narrative developments. Therefore, this paper will explain and demonstrate how literary texts, when used alongside a critical-ethical post-truth pedagogy that equips students with interpretive strategies informed by literary traditions such as literary and ethical criticism, can be effective in helping students develop the pertinent skills to comprehensively examine truth claims and overcome the challenges of the post-truth age.Keywords: post-truth, pedagogy, ethics, English, education
Procedia PDF Downloads 66
1445 Understanding the Factors Influencing Urban Ethiopian Consumers’ Consumption Intention of Spirulina-Supplemented Bread
Authors: Adino Andaregie, Isao Takagi, Hirohisa Shimura, Mitsuko Chikasada, Shinjiro Sato, Solomon Addisu
Abstract:
Context: The prevalence of undernutrition in developing countries like Ethiopia has become a significant issue. In this regard, finding alternative nutritional supplements seems to be a practical solution. Spirulina, a highly nutritious microalga, offers a valuable option as it is a rich source of various essential nutrients. The study aimed to establish the factors affecting urban Ethiopian consumers' consumption intention of Spirulina-fortified bread. Research Aim: The primary purpose of this research is to identify the behavioral and socioeconomic factors impacting the intention of urban Ethiopian consumers to eat Spirulina-fortified bread. Methodology: The research utilized a quantitative approach wherein a structured questionnaire was created and distributed among 361 urban consumers via an online platform. The theory of planned behavior (TPB) was used as a conceptual framework, and confirmatory factor analysis (CFA) and structural equation modelling (SEM) were employed for data analysis. Findings: The study results revealed that attitude towards the supplement, subjective norms, and perceived behavioral control were the critical factors influencing the consumption intention of Spirulina-fortified bread. Moreover, age, physical exercise, and prior knowledge of Spirulina as a food ingredient were also found to have a significant influence. Theoretical Importance: The study contributes towards the understanding of consumer behavior and factors affecting the purchase intentions of Spirulina-fortified bread in urban Ethiopia. The use of TPB as a theoretical framework adds a vital aspect to the study as it provides helpful insights into the factors affecting intentions towards this functional food. Data Collection and Analysis Procedures: The data collection process involved the creation of a structured questionnaire, which was distributed online to urban Ethiopian consumers.
Once data were collected, CFA and SEM were utilized to analyze the data and identify the factors impacting consumer behavior. Questions Addressed: The study aimed to address the following questions: (1) What are the behavioral and socioeconomic factors impacting urban Ethiopian consumers' consumption intention of Spirulina-fortified bread? (2) To what extent do attitude towards the supplement, subjective norms, and perceived behavioral control affect the purchase intention of Spirulina-fortified bread? (3) What role do age, education, income, physical exercise, and prior knowledge of Spirulina as a food ingredient play in the purchase intention of Spirulina-fortified bread among urban Ethiopian consumers? Conclusion: The study concludes that attitude towards the supplement, subjective norms, and perceived behavioral control are significant factors influencing urban Ethiopian consumers’ consumption intention of Spirulina-fortified bread. Moreover, age, education, income, physical exercise, and prior knowledge of Spirulina as a food ingredient also play a significant role in determining purchase intentions. The findings provide valuable insights for developing effective marketing strategies for Spirulina-fortified functional foods targeted at different consumer segments.Keywords: spirulina, consumption, factors, intention, consumers, behavior
Procedia PDF Downloads 82
1444 An Eco-Systemic Typology of Fashion Resale Business Models in Denmark
Authors: Mette Dalgaard Nielsen
Abstract:
The paper serves the purpose of providing an eco-systemic typology of fashion resale business models in Denmark while pointing to possibilities to learn from its wisdom during a time when a fundamental break with the dominant linear fashion paradigm has become inevitable. As we transgress planetary boundaries and can no longer continue the unsustainable path of over-exploiting the Earth’s resources, the global fashion industry faces a tremendous need for change. One of the preferred answers to the fashion industry’s sustainability crises lies in the circular economy, which aims to maximize the utilization of resources by keeping garments in use for longer. Thus, in the context of fashion, resale business models that allow pre-owned garments to change hands with the purpose of being reused in continuous cycles are considered to be among the most efficient forms of circularity. Methodologies: The paper is based on empirical data from an ongoing project and a series of qualitative pilot studies that have been conducted on the Danish resale market over a 2-year period from Fall 2021 to Fall 2023. The methodological framework comprises (n)ethnography and fieldwork in selected resale environments, as well as semi-structured interviews and a workshop with eight business partners from the Danish fashion and textiles industry. By focusing on the real-world circulation of pre-owned garments, which is enabled by the identified resale business models, the research lets go of simplistic hypotheses to the benefit of dynamic, vibrant and non-linear processes. As such, the paper contributes to the emerging research field of circular economy and fashion, which finds itself in critical need of moving from non-verified concepts and theories to empirical evidence. Findings: Based on the empirical data and anchored in the business partners, the paper analyses and presents five distinct resale business models with different product, service and design characteristics.
These are 1) branded resale, 2) trade-in resale, 3) peer-2-peer resale, 4) resale boutiques and consignment shops and 5) resale shelf/square meter stores and flea markets. Together, the five business models represent a plurality of resale-promoting business model design elements that have been found to contribute to the circulation of pre-owned garments in various ways for different garments, users and businesses in Denmark. Hence, the provided typology points to the necessity of prioritizing several rather than single resale business model designs, services and initiatives for the resale market to help reconfigure the linear fashion model and create a circular-ish future. Conclusions: The article represents a twofold research ambition by 1) presenting an original, up-to-date eco-systemic typology of resale business models in Denmark and 2) using the typology and its eco-systemic traits as a tool to understand different business model design elements and possibilities to help fashion grow out of its linear growth model. By basing the typology on eco-systemic mechanisms and actual exemplars of resale business models, it becomes possible to envision the contours of a genuine alternative to business as usual that ultimately helps bend the linear fashion model towards circularity.Keywords: circular business models, circular economy, fashion, resale, strategic design, sustainability
Procedia PDF Downloads 57
1443 Relocation of Plastic Hinge of Interior Beam Column Connections with Intermediate Bars in Reinforced Concrete and T-Section Steel Inserts in Precast Concrete Frames
Authors: P. Wongmatar, C. Hansapinyo, C. Buachart
Abstract:
Failure of typical seismic frames has been found to involve plastic hinges occurring in beam sections near column faces. Past research has shown that the seismic capacity of the frames can be enhanced if the plastic hinges of the beams are shifted away from the column faces. This paper presents detailing of reinforcements in the interior beam–column connections aiming to relocate the plastic hinge of reinforced concrete and precast concrete frames. Four specimens were tested under quasi-static cyclic load, including two monolithic specimens and two precast specimens. For one monolithic specimen, typical seismic reinforcement was provided; it was considered the reference specimen, named M1. The other reinforced concrete frame, M2, contained additional intermediate steel in the connection area compared with specimen M1. For the precast specimens, embedded T-section steels were provided in the joint, with and without diagonal bars in the connection area for specimens P1 and P2, respectively. The test results indicated ductile beam flexural failure in monolithic specimen M1, and the intermediate steel increased strength and improved joint performance of specimen M2. For the precast specimens, cracks formed at the ends of the steel inserts. However, slipping of the reinforcing steel lapped in the top of the beams was seen before yielding of the main bars, leading to brittle failure. The diagonal bars in precast specimen P2 improved the connection stiffness and the energy dissipation capacity.Keywords: relocation, plastic hinge, intermediate bar, T-section steel, precast concrete frame
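In quasi-static cyclic tests like these, "energy dissipation capacity" is commonly quantified as the area enclosed by each load-displacement hysteresis loop. A minimal sketch of that computation with the shoelace formula, on a hypothetical loop (the paper's actual test data are not reproduced here):

```python
def loop_energy(points):
    """Enclosed area of one closed load-displacement hysteresis loop,
    computed with the shoelace formula.
    points: list of (displacement, load) pairs tracing the loop in order."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]   # wrap around to close the loop
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# Hypothetical parallelogram-shaped loop: displacement in mm, load in kN
cycle = [(-10.0, -40.0), (10.0, 20.0), (10.0, 40.0), (-10.0, -20.0)]
energy = loop_energy(cycle)   # dissipated energy per cycle, in kN*mm
```

Summing this area over all cycles gives the cumulative dissipated energy used to compare specimens such as P1 and P2.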
Procedia PDF Downloads 272
1442 Free Vibration Analysis of Timoshenko Beams at Higher Modes with Central Concentrated Mass Using Coupled Displacement Field Method
Authors: K. Meera Saheb, K. Krishna Bhaskar
Abstract:
Complex structures used in many fields of engineering are made up of simple structural elements like beams, plates, etc. These structural elements sometimes carry concentrated masses at discrete points, and when subjected to severe dynamic environments, they tend to vibrate with large amplitudes. The frequency-amplitude relationship is essential in determining the response of these structural elements subjected to dynamic loads. For Timoshenko beams, the effects of shear deformation and rotary inertia must be considered to evaluate the fundamental linear and nonlinear frequencies. A commonly used method for solving vibration problems is the energy method, or a finite element analogue of the same. In the present Coupled Displacement Field method, the number of undetermined coefficients is reduced to half compared with the well-known Rayleigh–Ritz method, which significantly simplifies the procedure for solving the vibration problem. This is accomplished by using a coupling equation derived from the static equilibrium of the shear flexible structural element. The prime objective of the present paper is to study, in detail, the effect of a central concentrated mass on the large amplitude free vibrations of uniform shear flexible beams. Accurate closed-form expressions for the linear frequency parameter of uniform shear flexible beams with a central concentrated mass were developed, and the results are presented in digital form.Keywords: coupled displacement field, coupling equation, large amplitude vibrations, moderately thick plates
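For orientation, the coupling idea can be sketched in standard Timoshenko-beam notation (this is textbook form, not the paper's exact derivation; w is the transverse deflection, θ the bending rotation, M the central concentrated mass):

```latex
% Free-vibration equations of a Timoshenko beam with a concentrated mass
% M at midspan (standard form; symbols assumed, not taken from the paper):
EI\,\frac{\partial^2 \theta}{\partial x^2}
  + kGA\left(\frac{\partial w}{\partial x} - \theta\right)
  = \rho I\,\frac{\partial^2 \theta}{\partial t^2},
\qquad
kGA\left(\frac{\partial^2 w}{\partial x^2} - \frac{\partial \theta}{\partial x}\right)
  = \left[\rho A + M\,\delta\!\left(x - \tfrac{L}{2}\right)\right]
    \frac{\partial^2 w}{\partial t^2}.

% Coupling equation: static moment equilibrium (rotary inertia dropped),
EI\,\frac{\partial^2 \theta}{\partial x^2}
  + kGA\left(\frac{\partial w}{\partial x} - \theta\right) = 0,
% which expresses \theta through w, so only the coefficients of the
% assumed w-field remain undetermined -- half as many as in a
% Rayleigh--Ritz solution that treats w and \theta independently.
```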
Procedia PDF Downloads 225