Search results for: Fourier Galerkin approach
13852 Multiscale Modeling of Damage in Textile Composites
Authors: Jaan-Willem Simon, Bertram Stier, Brett Bednarcyk, Evan Pineda, Stefanie Reese
Abstract:
Textile composites, in which the reinforcing fibers are woven or braided, have become very popular in numerous applications in the aerospace, automotive, and maritime industries. These textile composites are advantageous due to their ease of manufacture, damage tolerance, and relatively low cost. However, physics-based modeling of the mechanical behavior of textile composites is challenging. Compared to their unidirectional counterparts, textile composites introduce additional geometric complexities, which cause significant local stress and strain concentrations. Since these internal concentrations are primary drivers of nonlinearity, damage, and failure within textile composites, they must be taken into account in order for the models to be predictive. The macro-scale approach to modeling textile-reinforced composites treats the whole composite as an effective, homogenized material. This approach is very computationally efficient, but it cannot be considered predictive beyond the elastic regime because the complex microstructural geometry is not considered. Further, this approach can, at best, offer a phenomenological treatment of nonlinear deformation and failure. In contrast, the mesoscale approach to modeling textile composites explicitly considers the internal geometry of the reinforcing tows, so their interaction and the effects of their curved paths can be modeled. The tows are treated as effective (homogenized) materials, requiring the use of anisotropic material models to capture their behavior. Finally, the micro-scale approach goes one level lower, modeling the individual filaments that constitute the tows. This paper will compare meso- and micro-scale approaches to modeling the deformation, damage, and failure of textile-reinforced polymer matrix composites. For the mesoscale approach, the woven composite architecture will be modeled using the finite element method, and an anisotropic damage model for the tows will be employed to capture the local nonlinear behavior. For the micro-scale, two different models will be used: one based on the finite element method, and the other making use of an embedded semi-analytical approach. The goal will be the comparison and evaluation of these approaches to modeling textile-reinforced composites in terms of accuracy, efficiency, and utility.
Keywords: multiscale modeling, continuum damage model, damage interaction, textile composites
Procedia PDF Downloads 354
13851 An Evaluation of the Impact of Epoxidized Neem Seed Azadirachta indica Oil on the Mechanical Properties of Polystyrene
Authors: Salihu Takuma
Abstract:
Neem seed oil has a high content of unsaturated fatty acids, which can be converted to epoxy fatty acids. Vegetable oil-based epoxy materials are sustainable, renewable, and biodegradable materials replacing petrochemical-based epoxy materials in some applications. Polystyrene is highly brittle, with limited mechanical applications. Raw neem seed oil was obtained from the National Research Institute for Chemical Technology (NARICT), Zaria, Nigeria. The oil was epoxidized at 60 °C for three hours using formic acid generated in situ. The epoxidized oil was characterized using Fourier transform infrared (FTIR) spectroscopy. The disappearance of the C=C stretching peak around 3011.7 cm⁻¹ and the formation of a new absorption peak around 943 cm⁻¹ indicate the success of the epoxidation. The epoxidized oil was blended with pure polystyrene in different weight percent compositions using solution casting in chloroform. The tensile properties of the blends demonstrated that the addition of 5 wt% epoxidized neem oil (ENO) to polystyrene (PS) led to an increase in elongation at break but a decrease in tensile strength and modulus. This is in accordance with the common rule that plasticizers decrease the tensile strength of a polymer.
Keywords: biodegradable, elongation at break, epoxidation, epoxy fatty acids, sustainable, tensile strength and modulus
Procedia PDF Downloads 235
13850 Information and Cooperativity in Fiction: The Pragmatics of David Baboulene’s “Knowledge Gaps”
Authors: Cara DiGirolamo
Abstract:
In his 2017 Ph.D. thesis, script doctor David Baboulene presented a theory of fiction in which differences in the knowledge states of participants in a literary experience, including reader, author, and characters, create many story elements, among them suspense, expectations, subtext, theme, metaphor, and allegory. This theory can be adjusted and modeled by incorporating a formal pragmatic approach that understands narrative as a speech act with a conversational function. This approach requires both the Speaker and the Listener to be understood as participants in the discourse. It also uses theories of cooperativity and the Question Under Discussion (QUD) to identify the existence of implicit questions. This approach predicts what an effective literary narrative must do: provide a conversational context early in the story so the reader can engage with the text as a conversational participant. In addition, this model incorporates schema theory, a cognitive model for learning and processing information about the world and transforming it into functional knowledge. Using this approach can extend the QUD model. Instead of describing conversation as a form of information gathering restricted to question-answer sets, the QUD can include knowledge modeling and understanding as a possible outcome of a conversation. With this model, Baboulene’s “Knowledge Gaps” can provide real insight into storytelling as a conversational move, and the QUD can be extended to apply simply and effectively to a more diverse set of conversational interactions as well as to narrative texts.
Keywords: literature, speech acts, QUD, literary theory
Procedia PDF Downloads 19
13849 Big Classes, Bigger Ambitions: A Participatory Approach to the Multiple-Choice Exam
Authors: Melanie Adrian, Elspeth McCulloch, Emily-Jean Gallant
Abstract:
Resources (financial, physical, and human) are increasingly constrained in higher education. University classes are getting bigger, and the concomitant grading burden on faculty is growing rapidly. Multiple-choice exams are seen by some as one solution to these changes. How much students retain, however, and what their testing experience is, continues to be debated. Are multiple-choice exams serving students well, or are students bearing the burden of these developments? Is there a way to address the resource constraints and make these types of exams more meaningful? In short, how do we engender evaluation methods for large-scale classes that provide opportunities for heightened student learning and enrichment? The following article lays out a testing approach we have employed in four iterations of the same third-year law class. We base our comments in this paper on our initial observations as well as data gathered from an ethics-approved study looking at student experiences. This testing approach provides students with multiple opportunities for revision (thus increasing the chances of long-term retention), is both individually and collaboratively driven (thus reflecting individual and group effort), and is automatically graded (thus sparing limited institutional resources). We found that overall students appreciated the approach and found it more ‘humane’; it notably reduced pre-exam and intra-exam stress levels, increased ease, and lowered nervousness.
Keywords: exam, higher education, multiple-choice, law
Procedia PDF Downloads 128
13848 25 Years of the Neurolinguistic Approach: Origin, Outcomes, Expansion and Current Experiments
Authors: Steeve Mercier, Joan Netten, Olivier Massé
Abstract:
The traditional lack of success of most Canadian students in the regular French program in attaining the ability to communicate spontaneously led to the conceptualization of a modified program. This program, called Intensive French, introduced and evaluated as an experiment in several school districts, formed the basis for the creation of a more effective approach for the development of skills in a second/foreign language and literacy: the Neurolinguistic Approach (NLA). The NLA expresses the major change in the understanding of how communication skills are developed: learning to communicate spontaneously in a second language depends on the reuse of structures in a variety of cognitive situations to express authentic messages rather than on knowledge of the way a language functions. Put differently, it prioritises the acquisition of implicit competence over the learning of grammatical knowledge. This is achieved by the adoption of a literacy-based approach and an increase in the intensity of instruction. Besides having strong empirical support from numerous experiments, the NLA has a sound theoretical foundation, as it conforms to research in neurolinguistics. The five pedagogical principles that define the approach will be explained, as well as the differences between the NLA and the paradigm on which most current resources and teaching strategies are based. It is now 25 years since the original research occurred. The use of the NLA, as will be shown, has expanded widely. With some adaptations, it is used for other languages and in other milieus. In Canada, classes are offered in Mandarin, Ukrainian, Spanish, and Arabic, amongst others. It has also been used in several indigenous communities, for example to restore the use of Mohawk, Cree, and Dene. Its use has expanded throughout the world, including China, Japan, France, Germany, Belgium, Poland, and Russia, as well as Mexico. The Intensive French program originally focussed on students in grades 5 or 6 (ages 10-12); nowadays, programs based on the approach include adults, particularly immigrants entering new countries. With the increasing interest in inclusion and cultural diversity, there is a demand for language learning amongst pre-school and primary children that can be successfully addressed by the NLA. Other current experiments target trilingual schools and work with the Inuit communities of Nunavik in the province of Quebec.
Keywords: neuroeducation, neurolinguistic approach, literacy, second language acquisition, plurilingualism, foreign language teaching and learning
Procedia PDF Downloads 73
13847 Whether Chaos Theory Could Reconstruct the Ancient Societies
Authors: Zahra Kouzehgari
Abstract:
Since its early emergence in mathematics and physical science in the 1970s, chaos theory has increasingly been developed and adapted in the social sciences as well. The non-linear and dynamic characteristics of the theory make it a useful conceptual framework for interpreting the behavior of complex social systems. Drawing on the principles of the chaotic approach (sensitivity to initial conditions, dynamic adaptation, strange attractors, and unpredictability), this paper aims to examine whether the chaos approach can interpret ancient social changes. To do this, a brief history of chaos theory, its development and application in the social sciences, the principles constituting the theory, and its application in archaeological science are first reviewed. The study demonstrates that although existing archaeological records cannot reconstruct the whole social system of the human past, non-linear approaches to studying complex social systems would be of great help in finding the general order of ancient societies and would enable us to shed light on some of the social phenomena in human history or to make sense of them.
Keywords: archaeology, non-linear approach, chaos theory, ancient social systems
Procedia PDF Downloads 285
13846 Identification of EEG Attention Level Using Empirical Mode Decompositions for BCI Applications
Authors: Chia-Ju Peng, Shih-Jui Chen
Abstract:
This paper proposes a method to discriminate between electroencephalogram (EEG) signals recorded in different concentration states using empirical mode decomposition (EMD). A brain-computer interface (BCI), also called a brain-machine interface, is a direct communication pathway between the brain and an external device that bypasses the usual pathways such as the peripheral nervous system and skeletal muscles. Attention level is a common index used as a control signal in BCI systems. The EEG signals acquired from people paying attention or in relaxation, respectively, are decomposed into a set of intrinsic mode functions (IMFs) by EMD. Fast Fourier transform (FFT) analysis is then applied to each IMF to obtain its frequency spectrum. By observing the power spectrums of the IMFs, the proposed method identifies the EEG attention level between different concentration states better than the original EEG signals do. The band power of IMF3 is the most discriminative, especially in the β band, which corresponds to being fully awake and generally alert. The signal processing method and the results of this experiment pave a new way for BCI robotic systems using an attention-level control strategy. The integrated signal processing method reveals appropriate information for discriminating attention from relaxation, contributing to enhanced BCI performance.
Keywords: biomedical engineering, brain computer interface, electroencephalography, rehabilitation
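The pipeline described in this abstract (EMD into IMFs, then FFT band power per IMF) can be sketched in a few lines. The sketch below is illustrative only: the PyEMD package, the sampling rate, the synthetic signal, and the 13-30 Hz β band are assumptions, since the abstract does not specify them.

```python
# Hedged sketch of the pipeline above: EMD decomposition of an EEG channel
# followed by FFT band-power estimation per IMF. PyEMD, the 256 Hz sampling
# rate, and the beta band limits are assumptions for illustration.
import numpy as np
from PyEMD import EMD  # assumed third-party package (pip install EMD-signal)

fs = 256                                   # assumed sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 20 * t) + 0.5 * np.random.randn(t.size)  # synthetic EEG

imfs = EMD()(eeg)                          # intrinsic mode functions, one per row

def band_power(x, fs, lo, hi):
    """Mean spectral power of x inside the [lo, hi] Hz band."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spec[mask].mean()

# Beta-band (13-30 Hz) power per IMF; the abstract reports IMF3 as the most
# discriminative feature between attention and relaxation states.
for i, imf in enumerate(imfs, start=1):
    print(f"IMF{i}: beta power = {band_power(imf, fs, 13, 30):.3f}")
```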
Procedia PDF Downloads 392
13845 Taguchi-Based Six Sigma Approach to Optimize Surface Roughness for Milling Processes
Authors: Sky Chou, Joseph C. Chen
Abstract:
This paper focuses on using Six Sigma methodologies to improve the surface roughness of a part produced on a CNC milling machine. It presents a case study in which the surface roughness of milled aluminum must be improved to reduce or eliminate defects and to raise the process capability indices Cp and Cpk of the CNC milling process. The Six Sigma DMAIC (define, measure, analyze, improve, and control) approach was applied in this study to improve the process, reduce defects, and ultimately reduce costs. The Taguchi-based Six Sigma approach was applied to identify the optimized processing parameters that achieve the target surface roughness specified by our customer. An L9 orthogonal array was applied in the Taguchi experimental design, with four controllable factors and one non-controllable/noise factor. The four controllable factors identified consist of feed rate, depth of cut, spindle speed, and surface roughness. The noise factor is the difference between the old cutting tool and the new cutting tool. A confirmation run with the optimal parameters verified that the new parameter settings are correct. The new settings also improved the process capability index. This study shows that the Taguchi-based Six Sigma approach can be used efficiently to phase out defects and improve the process capability index of the CNC milling process.
Keywords: CNC machining, six sigma, surface roughness, Taguchi methodology
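For readers unfamiliar with the mechanics, a minimal sketch of a Taguchi L9 analysis follows. The orthogonal array is the standard L9(3^4); the roughness readings, the smaller-is-better signal-to-noise ratio, and the main-effects computation are illustrative assumptions, not the paper's data.

```python
# Minimal Taguchi sketch: L9 orthogonal array, smaller-is-better S/N ratio
# for surface roughness, and main-effect averages per factor level.
import numpy as np

# L9(3^4) orthogonal array: 9 runs, 4 factors, 3 levels (coded 0, 1, 2).
L9 = np.array([[0,0,0,0],[0,1,1,1],[0,2,2,2],
               [1,0,1,2],[1,1,2,0],[1,2,0,1],
               [2,0,2,1],[2,1,0,2],[2,2,1,0]])

# Two replicate roughness readings per run (old tool / new tool as the noise
# factor) -- hypothetical values for illustration only.
y = np.array([[1.8,1.9],[1.5,1.6],[1.2,1.4],[1.7,1.6],[1.1,1.3],
              [1.4,1.5],[1.3,1.2],[1.6,1.7],[1.0,1.1]])

# Smaller-is-better S/N ratio: SN = -10 * log10(mean(y^2)).
sn = -10 * np.log10((y ** 2).mean(axis=1))

# Main effect of each factor = mean S/N at each of its three levels;
# the level with the highest S/N is the preferred setting.
for f in range(4):
    effects = [sn[L9[:, f] == lvl].mean() for lvl in (0, 1, 2)]
    print(f"factor {f + 1}: mean S/N by level = {np.round(effects, 2)}")
```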
Procedia PDF Downloads 243
13844 Study of Thermal and Mechanical Properties of Ethylene/1-Octene Copolymer Based Nanocomposites
Authors: Sharmila Pradhan, Ralf Lach, George Michler, Jean Mark Saiter, Rameshwar Adhikari
Abstract:
An ethylene/1-octene copolymer (EOC) was modified by incorporating three types of nanofillers differing in their dimensionality in order to investigate the effect of filler dimensionality on mechanical properties such as tensile strength and microhardness. The samples were prepared by melt mixing followed by compression molding. The microstructure of the novel material was characterized by Fourier transform infrared spectroscopy (FTIR), X-ray diffraction (XRD), and transmission electron microscopy (TEM). Other important properties, such as melting, crystallization, and thermal stability, were also investigated via differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA). The FTIR and XRD results showed that the composites were formed by physical mixing. The TEM results supported the homogeneous dispersion of the nanofillers in the matrix. The mechanical characterization performed by tensile testing showed that the composites with the 1D nanofiller effectively reinforced the polymer. TGA results revealed that the thermal stability of pure EOC is marginally improved by the addition of nanofillers. Likewise, the melting and crystallization properties of the composites are not much different from those of the pure copolymer.
Keywords: copolymer, differential scanning calorimetry, nanofiller, tensile strength
Procedia PDF Downloads 249
13843 Analysis of Detection of Concealed Objects Based on Multispectral and Hyperspectral Signatures
Authors: M. Kastek, M. Kowalski, M. Szustakowski, H. Polakowski, T. Sosnowski
Abstract:
Development of highly efficient security systems is one of the most urgent topics for science and engineering. There are many kinds of threats and many methods of prevention. It is very important to detect a threat as early as possible in order to neutralize it. One very challenging problem is the detection of dangerous objects hidden under a person's clothing. This problem is particularly important for the safety of airport passengers. In order to develop methods and algorithms to detect hidden objects, it is necessary to determine the thermal signatures of the objects of interest. Laboratory measurements were conducted to determine the thermal signatures of dangerous tools hidden under various clothes in different ambient conditions. The cameras used for the measurements worked in the spectral range 0.6-12.5 μm. An infrared imaging Fourier transform spectroradiometer was also used, working in the spectral range 7.7-11.7 μm. Analysis of the registered thermograms and hyperspectral datacubes yielded the thermal signatures of two types of guns, two types of knives, and home-made explosive bombs. The determined thermal signatures will be used in the development of the image analysis methods and algorithms implemented in the proposed monitoring systems.
Keywords: hyperspectral detection, multispectral detection, image processing, monitoring systems
Procedia PDF Downloads 349
13842 Optimization of the Numerical Fracture Mechanics
Authors: H. Hentati, R. Abdelmoula, Li Jia, A. Maalej
Abstract:
In this work, we present numerical simulations of quasi-static crack propagation based on the variational approach. We perform numerical simulations of a piece of brittle material without an initial crack. An alternate minimization algorithm is used. Based on these numerical results, we determine the influence of numerical parameters on the location of the crack. We show the importance of trying to optimize the numerical computation time and present a first attempt to develop a simple numerical method to optimize this time.
Keywords: fracture mechanics, optimization, variational approach, mechanics
Procedia PDF Downloads 607
13841 Evaluation of Model-Based Code Generation for Embedded Systems – Mature Approach for Development in Evolution
Authors: Nikolay P. Brayanov, Anna V. Stoynova
Abstract:
The model-based development approach is gaining more support and acceptance. Its higher abstraction level brings a simplification of the system description that allows domain experts to do their best without particular knowledge of programming. The different levels of simulation support rapid prototyping and verifying and validating the product even before it exists physically. Nowadays, the model-based approach is beneficial for modelling complex embedded systems as well as for generating code for many different hardware platforms. Moreover, it can be applied in safety-relevant industries like automotive, bringing extra automation to the expensive device certification process, especially in software qualification. Using it, some companies report cost savings and quality improvements, but others claim no major changes or even cost increases. This publication examines the level of maturity and autonomy of the model-based approach for code generation. It is based on a real-life automotive seat heater (ASH) module developed using tools from The MathWorks, Inc. The model, created with Simulink, Stateflow, and Matlab, is used for automatic generation of C code with Embedded Coder. To prove the maturity of the process, the Code Generation Advisor is used for automatic configuration. All additional configuration parameters are set to auto, when applicable, leaving the generation process to function autonomously. As a result of the investigation, the publication compares the quality of the generated embedded code with that of manually developed code. The measurements show that, in general, the code generated by the automatic approach is no worse than the manual one. A deeper analysis of the technical parameters enumerates the disadvantages, some of which are identified as topics for our future work.
Keywords: embedded code generation, embedded C code quality, embedded systems, model-based development
Procedia PDF Downloads 244
13840 Sensitivity Analysis of Movable Bed Roughness Formula in Sandy Rivers
Authors: Mehdi Fuladipanah
Abstract:
Sensitivity analysis is a technique applied to determine which input factors influence model output. The variance-based sensitivity analysis method has wider application than other methods because it handles both linear and non-linear models. In this paper, van Rijn's movable bed roughness formula was selected for evaluation because of its reasonable results in sandy rivers. This equation contains four variables: flow depth, sediment size, bed form height, and bed form length. The importance of these variables was determined using the first-order Fourier Amplitude Sensitivity Test (FAST). Sensitivity indices were applied to evaluate the importance of the factors. The first-order FAST sensitivity indices explain 90% of the total variance, which satisfies the acceptance criterion for FAST application. A higher index value indicates a more influential variable. The results show that bed form height, bed form length, sediment size, and flow depth are the most influential factors, with sensitivity indices of 32%, 24%, 19%, and 15%, respectively.
Keywords: sensitivity analysis, variance, movable bed roughness formula, sandy rivers
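A hedged sketch of a first-order FAST computation of this kind is shown below, using the SALib package (an assumption; the paper does not name its software). The model function is a placeholder standing in for van Rijn's formula, and the variable bounds are invented for illustration; only the four input variables come from the abstract.

```python
# Hedged sketch of first-order FAST indices with SALib. The model below is
# a stand-in surrogate, NOT van Rijn's formula; bounds are assumed.
import numpy as np
from SALib.sample import fast_sampler
from SALib.analyze import fast

problem = {
    "num_vars": 4,
    "names": ["flow_depth", "sediment_size", "bedform_height", "bedform_length"],
    "bounds": [[0.5, 5.0], [0.0002, 0.002], [0.05, 0.5], [1.0, 10.0]],  # assumed
}

X = fast_sampler.sample(problem, 1000)     # FAST search-curve sample

def roughness(h, d, delta, lam):
    # Placeholder surrogate combining grain and bed-form contributions so
    # that all four inputs influence the output.
    return 3.0 * d + 1.1 * delta * (1.0 - np.exp(-25.0 * delta / lam)) + 0.01 * h

Y = roughness(X[:, 0], X[:, 1], X[:, 2], X[:, 3])

Si = fast.analyze(problem, Y)              # first-order index S1 per input
for name, s1 in zip(problem["names"], Si["S1"]):
    print(f"{name}: S1 = {s1:.3f}")
```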
Procedia PDF Downloads 261
13839 Characteristic Function in Estimation of Probability Distribution Moments
Authors: Vladimir S. Timofeev
Abstract:
In this article, the problem of estimating distributional moments is considered. A new approach to moment estimation based on the characteristic function is proposed. Using a statistical simulation technique, the author shows that the new approach has robust properties. Numerical differentiation is used to calculate the derivatives of the characteristic function. The obtained results confirm that the idea works efficiently and can be recommended for statistical applications.
Keywords: characteristic function, distributional moments, robustness, outlier, statistical estimation problem, statistical simulation
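The core idea (moments from derivatives of the characteristic function, with the derivatives computed numerically) can be illustrated in a few lines. The sketch below uses the empirical characteristic function, a central-difference scheme, and a synthetic normal sample, all of which are assumptions; the article's exact estimator is not given in the abstract.

```python
# Minimal sketch: estimate moments from numerically differentiated values of
# the empirical characteristic function phi(t) = E[exp(i t X)], using
# E[X] = phi'(0)/i and E[X^2] = -phi''(0).
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=10_000)   # synthetic sample

def ecf(t, x):
    """Empirical characteristic function at t."""
    return np.exp(1j * t * x).mean()

h = 1e-3                                          # differentiation step
d1 = (ecf(h, x) - ecf(-h, x)) / (2 * h)           # central difference, phi'(0)
d2 = (ecf(h, x) - 2 * ecf(0.0, x) + ecf(-h, x)) / h**2  # phi''(0)

mean_est = (d1 / 1j).real                         # E[X]   = phi'(0) / i
second_moment = (-d2).real                        # E[X^2] = -phi''(0)
var_est = second_moment - mean_est**2

print(f"mean ~ {mean_est:.3f}, variance ~ {var_est:.3f}")  # ~2.0 and ~2.25
```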
Procedia PDF Downloads 506
13838 The Impact of Female Education on Fertility: A Natural Experiment from Egypt
Authors: Fatma Romeh, Shiferaw Gurmu
Abstract:
This paper examines the impact of female education on fertility, using the change in the length of primary schooling in Egypt in 1988-89 as a source of exogenous variation in schooling. In particular, beginning in 1988, children had to attend primary school for only five years rather than six. This change applied to all individuals born on or after October 1977. Using a nonparametric regression discontinuity approach, we compare the education and fertility of women born just before and after October 1977. The results show that female education significantly reduces the number of children born per woman and delays the time until first birth. Under a robust regression discontinuity approach, however, the impact of education on the number of children is no longer significant. The impact on the timing of first birth remains significant under the robust approach. Each year of female education postponed childbearing by three months, on average.
Keywords: Egypt, female education, fertility, robust regression discontinuity
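A minimal sketch of the local regression discontinuity comparison at the October 1977 cutoff is given below. The data are synthetic, and the bandwidth and local-linear specification are assumptions for illustration; the paper works with real survey data and nonparametric and robust RD estimators.

```python
# Hedged RD sketch: women born just before vs. just after the cutoff, with
# a local-linear fit on each side. All data below are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
birth = rng.uniform(-24, 24, n)        # months relative to October 1977 cutoff
treated = (birth >= 0).astype(int)     # exposed to the shorter 5-year track
# Synthetic outcome: children ever born, with a -0.3 jump at the cutoff.
children = 3.0 - 0.01 * birth - 0.3 * treated + rng.normal(0, 1, n)

bw = 12.0                              # assumed bandwidth (months)
def local_linear(side):
    """Fitted outcome at the cutoff from a local-linear fit on one side."""
    m = (np.abs(birth) <= bw) & (treated == side)
    b = np.polyfit(birth[m], children[m], 1)
    return np.polyval(b, 0.0)

effect = local_linear(1) - local_linear(0)
print(f"estimated jump at cutoff: {effect:.3f} children")  # ~ -0.3
```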
Procedia PDF Downloads 338
13837 Fuzzy Logic Modeling of Evaluation the Urban Skylines by the Entropy Approach
Authors: Murat Oral, Seda Bostancı, Sadık Ata, Kevser Dincer
Abstract:
When evaluating the aesthetics of cities, the development of urban form is analyzed in terms of design properties and a variety of other factors, together with the effects of this appearance on human beings. Different methods are used when making an aesthetic evaluation of a city. Entropy, in its original meaning, is a mathematical representation of thermodynamic results. Measuring entropy relates, from the probability standpoint, to the distribution of the symbols making up a message or item of information. In this study, the evaluation of urban skylines by the entropy approach was modelled with the Rule-Based Mamdani-Type Fuzzy (RBMTF) modelling technique. Input-output parameters were described by RBMTF if-then rules. Numerical parameters of the input and output variables were fuzzified into linguistic classes: Very Very Low (L1), Very Low (L2), Low (L3), Negative Medium (L4), Medium (L5), Positive Medium (L6), High (L7), Very High (L8), and Very Very High (L9). The comparison between the application data and the RBMTF results was made using the absolute fraction of variance (R²). The actual values and the RBMTF results indicated that RBMTF can be successfully used for the analysis of urban skyline evaluation by the entropy approach. The RBMTF model showed a satisfying relation with the experimental results, suggesting an alternative method for evaluating urban skylines by the entropy approach.
Keywords: urban skylines, entropy, rule-based Mamdani type, fuzzy logic
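As a simple illustration of the entropy side of this analysis, the sketch below computes the Shannon entropy of a skyline's building-height distribution. The height profile, bin count, and normalization are assumptions; the paper's exact formulation is not given in the abstract.

```python
# Minimal sketch: Shannon entropy of a skyline height distribution,
# normalized to [0, 1]. Heights and bin count are illustrative assumptions.
import numpy as np

heights = np.array([12, 15, 15, 40, 38, 12, 90, 85, 20, 18, 16, 55])  # meters

counts, _ = np.histogram(heights, bins=5)
p = counts / counts.sum()
p = p[p > 0]                                  # drop empty bins (0 log 0 = 0)

H = -(p * np.log2(p)).sum()                   # Shannon entropy (bits)
H_norm = H / np.log2(5)                       # normalize by maximum entropy

print(f"skyline entropy = {H:.3f} bits (normalized {H_norm:.2f})")
```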
Procedia PDF Downloads 290
13836 Limit-Cycles Method for the Navigation and Avoidance of Any Form of Obstacles for Mobile Robots in Cluttered Environment
Authors: F. Boufera, F. Debbat
Abstract:
This paper deals with an approach based on the limit-cycles method for obstacle avoidance by mobile robots in unknown environments, applicable to obstacles of any shape. The purpose of this approach is to improve the limit-cycles method in order to obtain safe and flexible navigation. The proposed algorithm has been successfully tested in different configurations in simulation.
Keywords: mobile robot, navigation, avoidance of obstacles, limit-cycles method
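A minimal sketch of the underlying limit-cycle idea is given below: a planar vector field whose trajectories converge onto a circle around an obstacle, which the robot can then follow. This is the textbook formulation under assumed parameters, not necessarily the authors' improved variant.

```python
# Limit-cycle sketch: trajectories of
#   dx/dt =  y + x (r^2 - x^2 - y^2)
#   dy/dt = -x + y (r^2 - x^2 - y^2)
# converge onto a circle of radius r around the obstacle, letting the robot
# skirt it smoothly. Parameters and integration scheme are illustrative.
import numpy as np

r = 1.0                 # radius of the avoidance circle around the obstacle
x, y = 2.5, 0.1         # robot start, in obstacle-centred coordinates
dt = 0.01

for _ in range(2000):   # simple explicit Euler integration
    s = r**2 - x**2 - y**2
    dx = y + x * s
    dy = -x + y * s
    x, y = x + dt * dx, y + dt * dy

print(f"final radius = {np.hypot(x, y):.3f} (target {r})")  # ~1.0
```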
Procedia PDF Downloads 429
13835 Investigation of Alfa Fibers Reinforced Epoxy-Amine Composites Properties
Authors: Amar Boukerrou, Ouerdia Belhadj, Dalila Hammiche, Jean Francois Gerard, Jannick Rumeau
Abstract:
The main goal of this study is to investigate the effect of alfa fiber content, with the fibers given an alkali treatment, on the thermal and mechanical properties of epoxy-amine matrix-based composites. The fibers were treated with a 5% sodium hydroxide solution, and their content was varied between 10% and 30% by weight. Tensile, flexural, and hardness tests were carried out to investigate the mechanical properties of the composites. The results show that the composites' mechanical properties are higher than those of the neat epoxy-amine resin. It was noticed that the alkali treatment is more effective for the tensile and flexural moduli than for the tensile and flexural strengths. The decline in both the tensile and flexural behavior of all composites with increasing filler content was probably due to the random dispersion of the fibers in the epoxy resin. Fourier transform infrared (FTIR) spectroscopy was employed to analyze the chemical structure of the epoxy resin before and after curing with the amine hardener. FTIR and DSC analysis confirmed that the epoxy resin was completely cured with the amine hardener at room temperature. SEM analysis highlighted the microstructure of the epoxy matrix and its composites.
Keywords: alfa fiber, epoxy resin, alkali treatment, mechanical properties
Procedia PDF Downloads 110
13834 Obtaining High Purity Hydroxyapatite from Bovine Bone: Effect of Chemical and Thermal Treatments
Authors: Hernandez Pardo Diego F., Guiza Arguello Viviana R., Coy Echeverria Ana, Viejo Abrante Fernando
Abstract:
Biological hydroxyapatite obtained from bovine bone arouses great interest for application as a bone regeneration material due to its better bioactive behavior compared with synthetic hydroxyapatite. For this reason, the objective of the present investigation was to determine the effect of chemical and thermal treatments on obtaining biological bovine hydroxyapatite of high purity and crystallinity. Two different chemical reagents were evaluated (NaOH and HCl) with the aim of removing the organic matrix of the bovine cortical bone. To analyze the effect of the thermal treatment, the temperature was varied between 500 and 1000 °C for a holding time of 4 hours. To accomplish the above, the materials before and after the chemical and thermal treatments were characterized by elemental compositional analysis (CHN), Fourier transform infrared spectroscopy (FTIR), Raman spectroscopy, scanning electron microscopy (SEM), thermogravimetric analysis (TGA), X-ray diffraction (XRD), and energy-dispersive X-ray spectroscopy (EDS). The results established that NaOH is more effective than HCl in removing the organic matrix of the bone, while a thermal treatment at 700 °C for 4 hours was enough to obtain biological hydroxyapatite of high purity and crystallinity.
Keywords: bovine bone, hydroxyapatite, biomaterials, thermal treatment
Procedia PDF Downloads 117
13833 Sorption of Crystal Violet from Aqueous Solution Using Chitosan−Charcoal Composite
Authors: Kingsley Izuagbe Ikeke, Abayomi O. Adetuyi
Abstract:
The study investigated the removal efficiency of crystal violet from aqueous solution using a chitosan-charcoal composite as adsorbent. Deproteination was carried out by placing 200 g of powdered snail shell in 4% w/v NaOH for 2 hours. The sample was then placed in 1% HCl for 24 hours to remove CaCO3. Deacetylation was done by boiling in 50% NaOH for 2 hours. 10% oxalic acid was used to dissolve the chitosan before mixing it with charcoal at 55 °C to form the composite. The composite was characterized by Fourier transform infrared spectroscopy and scanning electron microscopy measurements. The efficiency of adsorption was evaluated by varying the pH of the solution, the contact time, the initial concentration, and the adsorbent dose. Maximum removal of crystal violet by the composite and by activated charcoal was attained at pH 10, while maximum removal by chitosan was achieved at pH 8. The results showed that the adsorption of both dyes followed the pseudo-second-order rate equation and fit the Langmuir and Freundlich isotherms. The data showed that the composite was best suited for crystal violet removal and also performed relatively well in the removal of alizarin red. Thermodynamic parameters, namely the enthalpy change (ΔH°), free energy change (ΔG°), and entropy change (ΔS°), indicate that the adsorption of crystal violet was endothermic, spontaneous, and feasible.
Keywords: crystal violet, chitosan-charcoal composite, extraction process, sorption
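As an illustration of the isotherm fitting mentioned above, the sketch below fits the Langmuir and Freundlich models to equilibrium data by nonlinear least squares. The (Ce, qe) values are hypothetical placeholders; the study's measured data are not reproduced in the abstract.

```python
# Hedged sketch: nonlinear fits of the Langmuir and Freundlich isotherms
# to hypothetical equilibrium adsorption data.
import numpy as np
from scipy.optimize import curve_fit

Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])     # equilibrium conc. (mg/L)
qe = np.array([8.2, 13.5, 19.8, 25.1, 28.4])     # uptake (mg/g), hypothetical

def langmuir(C, qmax, KL):
    return qmax * KL * C / (1.0 + KL * C)

def freundlich(C, KF, n):
    return KF * C ** (1.0 / n)

(qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=(30.0, 0.05))
(KF, n), _ = curve_fit(freundlich, Ce, qe, p0=(3.0, 2.0))

print(f"Langmuir:   qmax = {qmax:.1f} mg/g, KL = {KL:.3f} L/mg")
print(f"Freundlich: KF = {KF:.2f}, n = {n:.2f}")
```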
Procedia PDF Downloads 440
13832 Investigating the Systematic Implications of Plastic Waste Additions to Concrete Taking a Circular Approach
Authors: Christina Cheong, Naomi Keena
Abstract:
In the face of growing urbanization, the construction of new buildings is inevitable, and with current construction methods leading to environmental degradation, much questioning is needed around reducing the environmental impact of buildings. This paper explores the global environmental issue of concrete production in parallel with the problem of plastic waste, and asks whether adding plastic waste to concrete is a viable, sustainable solution with positive systemic implications for living systems, both human and non-human. We investigate how certification programs can be used to assess the sustainability of the new concrete composition. With this classification, we look to the health impacts as well as the reusability of such concrete in a second or third life cycle. We conclude that such an approach has benefits for the environment and that taking a circular approach to its development, in terms of the overall life cycle of the new concrete product, can help in understanding the nuances of the material's environmental and human health impacts.
Keywords: concrete, plastic waste additions to concrete, sustainability ratings, sustainable materials
Procedia PDF Downloads 150
13831 A Learning Automata Based Clustering Approach for Underwater Sensor Networks to Reduce Energy Consumption
Authors: Motahareh Fadaei
Abstract:
Wireless sensor networks that are used to monitor a particular environment are formed from a large number of sensor nodes. The role of these sensors is to sense specific parameters of the environment and to communicate them. In these networks, the most important challenge is the management of energy usage. Clustering is one of the methods broadly used to face this challenge. In this paper, a distributed clustering protocol based on learning automata is proposed for underwater wireless sensor networks. The proposed algorithm, called LA-Clustering, forms clusters at the same energy level, based on the energy levels of the nodes and the connection radius, regardless of the size and structure of the sensor network. The proposed approach is simulated and compared with some other protocols using metrics such as network lifetime, number of alive nodes, and amount of transmitted data. The simulation results demonstrate the efficiency of the proposed approach.
Keywords: clustering, energy consumption, learning automata, underwater sensor networks
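A minimal sketch of the learning-automaton mechanism that protocols like LA-Clustering build on is given below: a linear reward-inaction automaton whose action probabilities drift toward rewarded choices. The environment, reward probabilities, and step size are illustrative assumptions; the paper's exact automaton is not specified in the abstract.

```python
# Linear reward-inaction (L_R-I) automaton sketch: action probabilities
# shift toward choices (e.g., cluster-head candidates) that the environment
# rewards; on penalty, probabilities are left unchanged.
import numpy as np

rng = np.random.default_rng(0)
n_actions = 4                              # e.g., candidate cluster heads
p = np.full(n_actions, 1.0 / n_actions)    # action probabilities
a = 0.05                                   # learning (reward) step size

# Hypothetical environment: action 2 is rewarded most often (say, the
# candidate with the best energy level / connection radius).
reward_prob = np.array([0.2, 0.4, 0.9, 0.3])

for _ in range(2000):
    i = rng.choice(n_actions, p=p)
    if rng.random() < reward_prob[i]:      # reward: reinforce action i
        p = p - a * p                      # scale all probabilities by (1-a)...
        p[i] += a                          # ...and move the freed mass to i

print(np.round(p, 3))   # probability mass concentrates on action 2
```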
Procedia PDF Downloads 317
13830 Forecasting Etching Behavior Silica Sand Using the Design of Experiments Method
Authors: Kefaifi Aissa, Sahraoui Tahar, Kheloufi Abdelkrim, Anas Sabiha, Hannane Farouk
Abstract:
The aim of this study is to show how the Design of Experiments (DOE) method can be put into use as a practical approach for modeling the etching behavior of silica sand during the primary leaching step. In the present work, we studied the effect of etching on particle size during the primary leaching of Algerian silica sand with hydrofluoric acid (HF) at 20% and 30% for 4 and 8 hours. A new purity of the sand is obtained depending on the leaching time. The study was extended by a numerical approach using the DOE method, which shows the influence of each parameter and the interactions between them in the process and confirmed the experimental results. This model is a predictive approach implemented in software. Based on the parameters measured experimentally inside the model's domain, the DOE method makes it possible to predict the parameters outside the model in question and can give us the optimized response without carrying out the experimental measurement.
Keywords: acid leaching, design of experiments method (DOE), purity silica, silica etching
Procedia PDF Downloads 286
13829 Rethinking Sustainability: Towards an Open System Approach
Authors: Fatemeh Yazdandoust
Abstract:
Sustainability is a growing concern in architecture and urban planning due to the environmental impact of the built environment. Ecological challenges persist despite the proliferation of sustainable design strategies, prompting a critical reevaluation of existing approaches. This study examines sustainable design practices, focusing on the origins and processes of production, environmental impact, and socioeconomic dimensions. It also discusses ‘cleantech’ initiatives, which often prioritize profitability over ecological stewardship. The study advocates for a paradigm shift in urban design towards greater adaptability, complexity, and inclusivity, embracing porosity, incompleteness, and seed planning. This holistic approach emphasizes citizen participation and bottom-up interventions, reimagining urban spaces as evolving ecosystems. The study calls for a reimagining of sustainability that transcends conventional green design concepts, promoting a more resilient and inclusive built environment through an open system approach grounded in adaptability, diversity, and equity principles.
Keywords: sustainability, clean-tech, open system design, sustainable design
Procedia PDF Downloads 63
13828 On Supporting a Meta-Design Approach in Socio-Technical Ontology Engineering
Authors: Mesnan Silalahi, Dana Indra Sensuse, Indra Budi
Abstract:
Much research has revealed the complexity of the ontology-building process and the consequent need for a new approach that addresses the socio-technical aspects of collaborating to reach a consensus. The meta-design approach is considered applicable as a method within the methodological model of socio-technical ontology engineering. Principles of the meta-design framework are applied in the construction phases of the ontology. A portal is developed to support the requirements of the meta-design principles. To validate the methodological model, semantic web applications were developed, integrated into the portal, and used to show the usefulness of the ontology. The knowledge-based system will be filled with data on Indonesian medicinal plants. By showing the usefulness of the developed ontology in a semantic web application, we motivate all stakeholders to participate in the development of a knowledge-based system of medicinal plants in Indonesia.
Keywords: socio-technical, metadesign, ontology engineering methodology, semantic web application
Procedia PDF Downloads 439
13827 Integer Programming: Domain Transformation in Nurse Scheduling Problem
Authors: Geetha Baskaran, Andrzej Barjiela, Rong Qu
Abstract:
Motivation: Nurse scheduling is a complex combinatorial optimization problem; it is known to be NP-hard. It needs efficient re-scheduling to minimize some trade-off among the measures of violation, reducing selected constraints to soft constraints whose violations are measured. Problem Statement: In this paper, we extend our novel approach to solving the nurse scheduling problem by transforming it through information granulation. Approach: This approach satisfies the rules of a typical hospital environment based on a standard benchmark problem. Generating good work schedules has a great influence on nurses' working conditions, which are strongly related to the level of quality of health care. Domain transformation, which combines the strengths of operations research and artificial intelligence, was proposed for the solution of the problem. Compared to conventional methods, our approach involves judicious grouping (information granulation) of shift types that transforms the original problem into a smaller solution domain. The schedules from the smaller problem domain are later converted back into the original problem domain by taking into account the constraints that could not be represented in the smaller domain. An Integer Programming (IP) package is used to solve the transformed scheduling problem by applying the branch-and-bound algorithm. We used GNU Octave for Windows to solve this problem. Results: The scheduling problem has been solved in the proposed formalism, resulting in a high-quality schedule. Conclusion: Domain transformation represents a departure from the conventional one-shift-at-a-time scheduling approach. It offers the advantage of efficient and easily understandable solutions as well as deterministic reproducibility of the results. We note, however, that it does not guarantee the global optimum.
Keywords: domain transformation, nurse scheduling, information granulation, artificial intelligence, simulation
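A toy sketch of the integer-programming stage is given below, using the PuLP package in Python rather than GNU Octave (a substitution for illustration). Shift types are granulated into two groups, and coverage is treated as a soft constraint whose violations are minimized; the staffing numbers are hypothetical, not the benchmark data.

```python
# Toy IP sketch: granulated shift groups with soft coverage constraints,
# solved by branch-and-bound via PuLP's default solver. Data are hypothetical.
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, value

nurses = ["n1", "n2", "n3", "n4"]
days = range(7)
groups = ["day", "night"]             # granulated shift types
demand = {"day": 2, "night": 1}       # required nurses per group per day

prob = LpProblem("granulated_nurse_scheduling", LpMinimize)
x = LpVariable.dicts("x", (nurses, days, groups), cat="Binary")
slack = LpVariable.dicts("under", (days, groups), lowBound=0)  # soft violation

# Objective: minimize total under-staffing (the measured soft violations).
prob += lpSum(slack[d][g] for d in days for g in groups)

for d in days:
    for g in groups:                  # coverage as a soft constraint
        prob += lpSum(x[n][d][g] for n in nurses) + slack[d][g] >= demand[g]
    for n in nurses:                  # at most one shift group per day (hard)
        prob += lpSum(x[n][d][g] for g in groups) <= 1

for n in nurses:                      # hard limit: at most 5 shifts per week
    prob += lpSum(x[n][d][g] for d in days for g in groups) <= 5

prob.solve()                          # branch-and-bound in the default solver
print("total violation =", value(prob.objective))
```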
Procedia PDF Downloads 397
13826 Cybersecurity Protection Structures: The Case of Lesotho
Authors: N. N. Mosola, K. F. Moeketsi, R. Sehobai, N. Pule
Abstract:
The Internet brings increasing use of Information and Communications Technology (ICT) services and facilities. Consequently, new computing paradigms emerge to provide services over the Internet. Although several benefits stem from these services, they pose several risks inherited from the Internet, for example cybercrime, identity theft, and malware. To thwart these risks, this paper proposes a holistic approach involving multidisciplinary interactions. The paper proposes a combined top-down and bottom-up approach to deal with the cybersecurity concerns of developing countries. These concerns range from regulatory and legislative areas and cyber awareness to research and development and technical dimensions. The main focus areas are highlighted, and a cybersecurity model solution is proposed. The paper concludes by combining all relevant solutions into a proposed cybersecurity model to assist developing countries in building a cyber-safe environment and in instilling and promoting a culture of cybersecurity.
Keywords: cybercrime, cybersecurity, computer emergency response team, computer security incident response team
Procedia PDF Downloads 157
13825 Solid Dispersions of Cefixime Using β-Cyclodextrin: Characterization and in vitro Evaluation
Authors: Nagasamy Venkatesh Dhandapani, Amged Awad El-Gied
Abstract:
Cefixime, a BCS class II drug, is insoluble in water but freely soluble in acetone and alcohol. Its aqueous solubility is poor, and it exhibits an exceptionally slow intrinsic dissolution rate. In the present study, cefixime-β-cyclodextrin (β-CD) solid dispersions were prepared with a view to studying the influence of β-CD on the solubility and dissolution rate of this poorly water-soluble drug. The phase solubility profile revealed that the solubility of cefixime increased in the presence of β-CD and was classified as AL-type. The effect of variables such as the drug:carrier ratio was studied. Physical characterization of the solid dispersions was carried out by Fourier transform infrared spectroscopy (FT-IR) and differential scanning calorimetry (DSC). These studies revealed a distinct loss of drug crystallinity in the solid molecular dispersions, which ostensibly accounts for the enhanced dissolution rate in distilled water. The drug release from the prepared solid dispersions exhibited first-order kinetics. The solid dispersions of cefixime showed a 6.77-fold increase in dissolution rate over the pure drug.
Keywords: β-cyclodextrin, cefixime, dissolution, kneading method, solid dispersions, release kinetics
Procedia PDF Downloads 316
13824 Depolymerization of Lignin in Sugarcane Bagasse by Hydrothermal Liquefaction to Optimize Catechol Formation
Authors: Nirmala Deenadayalu, Kwanele B. Mazibuko, Lethiwe D. Mthembu
Abstract:
Sugarcane bagasse is the residue obtained after the extraction of sugar from sugarcane. The main aim of this work was to produce catechol from sugarcane bagasse. The optimization of catechol production was investigated using a Box-Behnken design of experiments. The sugarcane bagasse was heated in a Parr reactor at a set temperature. The reactions were carried out at different temperatures (100-250 °C), catalyst loadings (1-10% KOH (m/v)), and reaction times (60-240 min) at 17 bar pressure. The solid and liquid fractions were then separated by vacuum filtration. The liquid fraction was analyzed for catechol using high-pressure liquid chromatography (HPLC) and characterized for functional groups using Fourier transform infrared spectroscopy (FTIR). The optimized condition for catechol production was 175 °C, 240 min, and 10% KOH, with a catechol yield of 79.11 ppm. Since the optimum lay at the maximum time (240 min) and catalyst loading (10% KOH), a further series of experiments was conducted at 175 °C, 260 min, and 20% KOH; this yielded 2.46 ppm catechol, a large reduction in the catechol produced. The HPLC peak for catechol was obtained at 2.5 min for both the standards and the samples. The FTIR peak at 1750 cm⁻¹ was due to the C=C vibration band of the aromatic ring in the catechol present in both the standard and the samples. The peak at 3325 cm⁻¹ was due to the hydrogen-bonded phenolic OH vibration bands of the catechol. An ANOVA was also performed on the experimental data to identify the factors that most affected the amount of catechol produced.
Keywords: catechol, sugarcane bagasse, lignin, hydrothermal liquefaction
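As an illustration of the Box-Behnken design mentioned above, the sketch below generates the design points over the stated factor ranges. The pyDOE2 package and the single center point are assumptions; no yield values are fabricated.

```python
# Hedged sketch: three-factor Box-Behnken design over the stated ranges
# (temperature 100-250 C, time 60-240 min, KOH loading 1-10%).
import numpy as np
from pyDOE2 import bbdesign   # assumed third-party package (pip install pyDOE2)

coded = bbdesign(3, center=1)          # 13 runs in coded units (-1, 0, +1)

lows = np.array([100.0, 60.0, 1.0])    # temperature (C), time (min), KOH (%)
highs = np.array([250.0, 240.0, 10.0])
centers = (lows + highs) / 2.0
spans = (highs - lows) / 2.0

runs = centers + coded * spans         # design in engineering units
for T, t, koh in runs:
    print(f"T = {T:5.1f} C, time = {t:5.1f} min, KOH = {koh:4.1f} %")
```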
Procedia PDF Downloads 102
13823 Deciding on Customary International Law: The ICJ's Approach Using Induction, Deduction, and Assertion
Authors: Maryam Nimehforush, Hamid Vahidkia
Abstract:
The International Court of Justice, as well as international law in general, may not excel in methodology. In contrast to how it interprets treaties, the Court rarely explains how it determines the existence, content, and scope of the customary international law rules it applies. The Court's jurisprudence mentions the inductive and deductive methods of law determination only sporadically. Neither the Court nor the legal literature has extensively discussed the Court's approach to determining customary international law. Surprisingly, the question of the Court's methodology has not garnered much attention, despite the fact that interpreting and shaping the law have always been intertwined. This article seeks to redirect focus to the method used by the Court in determining the customary international law it enforces, emphasizing the importance of methodology in the evolution of customary international law. The text begins by explaining the concepts of ‘induction’ and ‘deduction’ and explores how the Court utilizes them. It then examines when the Court employs inductive and deductive reasoning, the varied types and purposes of deduction, and the connection between the two approaches. The text questions the different conceptions of the inductive and deductive traditions and argues that the primary approach utilized by the Court is neither induction nor deduction but assertion.
Keywords: ICJ, law, international, induction, deduction, assertion
Procedia PDF Downloads 18