Search results for: path process chart technique
4376 The Role of Identity Politics in the 2023 General Election in Nigeria: An Overview
Authors: Adekunle Saheed Ajisebiyawo
Abstract:
This paper examines the influence of identity politics on the development of electoral democracy in Nigeria. The paper was anchored on a theory of African democracy, adopted a qualitative methodology, and deployed data from secondary sources to evaluate the 2023 presidential election; it found that ethnicity, religion, and regional sentiments played a major role in the election. The practical implications of this paper are that while Nigeria's democracy is tending towards consolidation, barring the unexpected (e.g., a military takeover), religious and ethnic identities can mar the country's development, as competent candidates with good policies will be voted out based on religious and ethnic sentiments. Thus, there is a need to de-emphasize religion and ethnicity in the Nigerian polity. Candidates and parties that campaign based on racial or religious narratives should be barred from contesting elective positions. The paper concluded that identity politics is inimical to Nigeria's democratization process as well as to efforts aimed at uniting and integrating the country; it therefore recommended that, to establish a sound electoral democracy and a strong, united country, the menace of ethnic, religious, and regional cleavages should be addressed. To achieve this, efforts should be intensified towards providing a set of principles for nation-building, which should be included in the constitution. In addition, the paper urges the media to support the formation of an inclusive government, cutting across tribes and religions, to reduce the negative impact of ethnicity and religion in the country.
Keywords: cleavages, democracy, ethnicity, election, identity politics, religion
Procedia PDF Downloads 60
4375 Rapid and Efficient Removal of Lead from Water Using Chitosan/Magnetite Nanoparticles
Authors: Othman M. Hakami, Abdul Jabbar Al-Rajab
Abstract:
The occurrence of heavy metals in water resources has increased in recent years, albeit at low concentrations. Lead (Pb(II)) is among the most important inorganic pollutants in ground and surface water, and its efficient removal from water is of public and scientific concern. In this study, we developed a rapid and efficient method for removing lead from water using chitosan/magnetite nanoparticles. A simple and effective process was used to prepare chitosan/magnetite nanoparticles (CS/Mag NPs) with a suitable saturation magnetization value; the particles were strongly responsive to an external magnetic field, making separation from solution possible in less than 2 minutes using a permanent magnet, and the total Fe in solution was below the detection limit of ICP-OES (<0.19 mg L-1). The hydrodynamic particle size distribution increased from an average diameter of ~60 nm for Fe3O4 NPs to ~75 nm after chitosan coating. The feasibility of the prepared NPs for the adsorption and desorption of Pb(II) from water was evaluated; the CS/Mag NPs showed a high removal efficiency for Pb(II) uptake, with 90% of Pb(II) removed during the first 5 minutes and equilibrium reached in less than 10 minutes. The maximum adsorption capacity for Pb(II), which occurred at pH 6.0 and room temperature, was as high as 85.5 mg g-1 according to the Langmuir isotherm model. Desorption of the adsorbed Pb on CS/Mag NPs was evaluated using deionized water at pH values ranging from 1 to 7, which was an effective eluent and did not result in the destruction of the NPs; they could subsequently be reused without any loss of activity in further adsorption tests. Overall, our results showed the high efficiency of chitosan/magnetite nanoparticles in lead removal from water under controlled conditions; further studies should be carried out under real field conditions.
Keywords: chitosan, magnetite, water, treatment
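The reported maximum capacity (85.5 mg g-1) follows from fitting the Langmuir isotherm. A minimal sketch of the model is below; note that the affinity constant `k_l` is a hypothetical value chosen for illustration, not a parameter reported in the abstract:

```python
def langmuir_q(c_e, q_max=85.5, k_l=0.5):
    """Langmuir isotherm: adsorbed amount q_e (mg/g) at equilibrium
    concentration c_e (mg/L). q_max is the reported maximum capacity;
    k_l (L/mg) is a hypothetical affinity constant for illustration."""
    return q_max * k_l * c_e / (1.0 + k_l * c_e)

# q_e approaches q_max as c_e grows; at low c_e the isotherm is nearly linear.
```

Fitting `q_max` and `k_l` to measured (c_e, q_e) pairs is what yields the reported 85.5 mg g-1 plateau.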
Procedia PDF Downloads 404
4374 Steam Reforming of Acetic Acid over Microwave-Synthesized Ce0.75Zr0.25O2 Supported Ni Catalysts
Authors: Panumard Kaewmora, Thirasak Rirksomboon, Vissanu Meeyoo
Abstract:
Due to the globally growing demand for petroleum and fossil fuels, the scarcity or even depletion of fossil fuel sources could be inevitable. Alternatively, the utilization of renewable sources, such as biomass, has become attractive to the community. Biomass can be converted into bio-oil by fast pyrolysis. In the water phase of bio-oil, acetic acid, one of its main components, can be converted to hydrogen with high selectivity over effective catalysts in a steam reforming process. Steam reforming of acetic acid as a model compound has been intensively investigated for hydrogen production using various metal oxide supported nickel catalysts, and yet these seem to be rapidly deactivated depending on the support utilized. A catalyst support such as Ce1-xZrxO2 mixed oxide was proposed for alleviating this problem with the anticipation of enhancing hydrogen yield. However, catalyst preparation methods play a significant role in the catalytic activity and performance of the catalysts. In this work, a Ce0.75Zr0.25O2 mixed oxide solid solution support was prepared by urea hydrolysis using microwave heating. Nickel metal was then incorporated at 15 wt% by the incipient wetness impregnation method. The catalysts were characterized by several techniques, including BET, XRD, H2-TPR, XRF, SEM, and TEM, and tested for the steam reforming of acetic acid at various operating conditions. Preliminary results showed that a hydrogen yield of ca. 32% with a relatively high acetic acid conversion was attained at 650°C.
Keywords: acetic acid, steam reforming, microwave, nickel, ceria, zirconia
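For context, the ideal steam reforming stoichiometry is CH3COOH + 2H2O → 2CO2 + 4H2, so each mole of acetic acid can yield at most four moles of H2. A minimal sketch of the yield calculation (function and variable names are ours, not the paper's):

```python
def h2_yield(mol_h2_produced, mol_acetic_fed):
    """Hydrogen yield as a fraction of the stoichiometric maximum
    (4 mol H2 per mol acetic acid: CH3COOH + 2H2O -> 2CO2 + 4H2)."""
    return mol_h2_produced / (4.0 * mol_acetic_fed)

# e.g. 1.28 mol H2 per mol acetic acid fed corresponds to the ~32% yield reported.
```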
Procedia PDF Downloads 174
4373 Investigation of Alumina Membrane Coated Titanium Implants on Osseointegration
Authors: Pinar Erturk, Sevde Altuntas, Fatih Buyukserin
Abstract:
In order to obtain effective integration between an implant and a bone, implant surfaces should have properties similar to those of bone tissue. In particular, mimicry of the chemical, mechanical, and topographic properties of the bone is crucial for fast and effective osseointegration. Titanium-based biomaterials are preferred in clinical use, and there are studies on coating these implants with oxide layers whose chemical/nanotopographic properties stimulate cell interactions for enhanced osseointegration. Current implantations have low success rates, especially in craniofacial applications, which involve large and vital zones; an oxide layer coating increases bone-implant integration, providing long-lasting implants without requiring revision surgery. Our aim in this study is to examine bone-cell behavior on titanium implants coated with an anodic aluminum oxide (AAO) layer, for effective osseointegration in large defect zones where spontaneous healing is difficult. In our study, aluminum-coated titanium surfaces were anodized in sulfuric, phosphoric, and oxalic acid, the most commonly used AAO anodization electrolytes. After morphologic, chemical, and mechanical tests on the AAO-coated Ti substrates, the viability, adhesion, and mineralization of adult bone cells on these substrates were analyzed. In addition, using atomic layer deposition (ALD) as a sensitive and conformal technique, these surfaces were coated with pure alumina (5 nm); thus, cell studies were also performed on ALD-coated nanoporous oxide layers with suppressed ionic content. Lastly, in order to investigate the effect of topography on cell behavior, flat non-porous alumina layers formed on silicon wafers by ALD were compared with the porous ones. The cell viability ratio was similar among the anodized surfaces, but pure alumina-coated titanium and anodized surfaces showed a higher viability ratio compared to bare titanium and bare anodized ones.
Alumina-coated titanium surfaces anodized in phosphoric acid showed significantly different mineralization ratios after 21 days compared with bare titanium and the titanium surfaces anodized in the other electrolytes. Bare titanium had the second-highest mineralization ratio, whereas titanium anodized in oxalic acid electrolyte showed the lowest mineralization. No significant difference was observed between bare titanium and the anodized surfaces except for the AAO titanium surface anodized in phosphoric acid. Currently, the osteogenic activities of these cells at the genetic level are being investigated by quantitative real-time polymerase chain reaction (qRT-PCR) analysis of the RUNX-2, VEGF, OPG, and osteopontin genes. In addition, as a result of the activities of these genes, Western blot will be used for protein detection. Acknowledgment: The project is supported by The Scientific and Technological Research Council of Turkey.
Keywords: alumina, craniofacial implant, MG-63 cell line, osseointegration, oxalic acid, phosphoric acid, sulphuric acid, titanium
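The abstract does not state how the qRT-PCR data will be quantified; a common choice is the Livak 2^-ΔΔCt relative expression calculation, sketched below. The Ct values in the example are purely hypothetical, for illustration only:

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative expression by the 2^-ddCt (Livak) method.
    ct_ref_* are the reference (housekeeping) gene Ct values."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(d_ct_treated - d_ct_control)

# Hypothetical example: a target gene Ct drops by 2 cycles relative to the
# reference gene on the coated surface -> ~4-fold up-regulation.
```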
Procedia PDF Downloads 131
4372 Machine Learning Strategies for Data Extraction from Unstructured Documents in Financial Services
Authors: Delphine Vendryes, Dushyanth Sekhar, Baojia Tong, Matthew Theisen, Chester Curme
Abstract:
Much of the data that inform the decisions of governments, corporations and individuals are harvested from unstructured documents. Data extraction is defined here as a process that turns non-machine-readable information into a machine-readable format that can be stored, for instance, in a database. In financial services, introducing more automation in data extraction pipelines is a major challenge. Information sought by financial data consumers is often buried within vast bodies of unstructured documents, which have historically required thorough manual extraction. Automated solutions provide faster access to non-machine-readable datasets, in a context where untimely information quickly becomes irrelevant. Data quality standards cannot be compromised, so automation requires high data integrity. This multifaceted task is broken down into smaller steps: ingestion, table parsing (detection and structure recognition), text analysis (entity detection and disambiguation), schema-based record extraction, user feedback incorporation. Selected intermediary steps are phrased as machine learning problems. Solutions leveraging cutting-edge approaches from the fields of computer vision (e.g. table detection) and natural language processing (e.g. entity detection and disambiguation) are proposed.
Keywords: computer vision, entity recognition, finance, information retrieval, machine learning, natural language processing
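To make the "schema-based record extraction" step concrete, here is a minimal sketch, not the authors' pipeline: the schema, regexes, and sample text are our own illustration of turning a snippet of unstructured text into a machine-readable record:

```python
import re

SCHEMA = {  # hypothetical target schema for one record type
    "company": r"([A-Z][A-Za-z]+ (?:Inc|Corp|Ltd))",
    "revenue_usd_m": r"revenue of \$([0-9.]+) million",
    "fiscal_year": r"fiscal year (\d{4})",
}

def extract_record(text):
    """Apply one regex per schema field; unmatched fields stay None."""
    record = {}
    for field, pattern in SCHEMA.items():
        m = re.search(pattern, text)
        record[field] = m.group(1) if m else None
    return record

doc = "Acme Corp reported revenue of $12.5 million for fiscal year 2021."
print(extract_record(doc))
# -> {'company': 'Acme Corp', 'revenue_usd_m': '12.5', 'fiscal_year': '2021'}
```

In the paper's setting, the regexes would be replaced by learned entity detectors, but the output contract, a record conforming to a fixed schema, is the same.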
Procedia PDF Downloads 113
4371 The Implementation of Entrepreneurial Marketing in Small Business Enterprise
Authors: Iin Mayasari
Abstract:
This study aims at exploring the influence of aspects of entrepreneurial marketing on a firm's performance. Entrepreneurs should be supported not only by control of resources to obtain sustainable competitive advantage, but also by intangible resources. Entrepreneurial marketing provides the opportunity for entrepreneurs to proactively find better ways to create value for desired customers, to create innovation, and to build customer equity. Entrepreneurial marketing occupies the middle ground between entrepreneurship and marketing, and serves as an umbrella for many of the emergent perspectives on marketing. It has eight underlying dimensions: proactiveness, calculated risk-taking, innovativeness, an opportunity focus, entrepreneurial orientation, resource leveraging, customer intensity, and value creation. The research method of the study was qualitative, based on interviews with 8 small companies in the Kudus Region, Central Java, Indonesia. The interviewees were the owners and managers of small business enterprises in the wood crafting industry, and the interviews concerned the implementation of the elements of entrepreneurial marketing. The results showed that the small business enterprises had implemented the elements of entrepreneurial marketing in their daily activities, and that the owners and managers executed them well relative to theory. The problems in managing the small business enterprises were related to obtaining full government support and to branding management. Furthermore, the innovation process should be improved, especially the use of the internet to promote products, expand the market, and increase the firm's performance.
Keywords: entrepreneurial marketing, innovativeness, risk taking, opportunity focus
Procedia PDF Downloads 298
4370 Improving Fingerprinting-Based Localization (FPL) System Using Generative Artificial Intelligence (GAI)
Authors: Getaneh Berie Tarekegn, Li-Chia Tai
Abstract:
With the rapid advancement of artificial intelligence, low-power built-in sensors on Internet of Things devices, and communication technologies, location-aware services have become increasingly popular and have permeated every aspect of people’s lives. Global navigation satellite systems (GNSSs) are the default method of providing continuous positioning services for ground and aerial vehicles, as well as consumer devices (smartphones, watches, notepads, etc.). However, the environment affects satellite positioning systems, particularly indoors, in dense urban and suburban cities enclosed by skyscrapers, or when deep shadows obscure satellite signals. This is because (1) indoor environments are more complicated due to the presence of many objects surrounding them; (2) reflection within the building is highly dependent on the surrounding environment, including the positions of objects and human activity; and (3) satellite signals cannot be reached in an indoor environment, and GNSS doesn't have enough power to penetrate building walls. GPS is also highly power-hungry, which poses a severe challenge for battery-powered IoT devices. Due to these challenges, IoT applications are limited. Consequently, precise, seamless, and ubiquitous Positioning, Navigation and Timing (PNT) systems are crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities. Their applications include traffic monitoring, emergency alarming, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. In this article, we presented a novel semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization. 
We also employed a reliable signal fingerprint feature extraction method with t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduced the workload of site surveying required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of GAILoc is less than 0.39 m, and more than 90% of the errors are less than 0.82 m. According to numerical results, SRCLoc improves positioning performance and reduces radio map construction costs significantly compared to traditional methods.
Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine
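Independent of the GAN-based radio map construction, the online phase of fingerprinting localization is typically a nearest-neighbour match between the measured signal vector and the stored radio map. A minimal weighted k-NN sketch follows; the tiny radio map and RSS values are invented for illustration:

```python
import math

# Hypothetical radio map: location (x, y) -> RSS fingerprint from 3 transmitters (dBm)
RADIO_MAP = {
    (0.0, 0.0): [-40, -70, -80],
    (5.0, 0.0): [-70, -42, -75],
    (0.0, 5.0): [-72, -78, -41],
}

def locate(rss, k=2):
    """Weighted k-nearest-neighbour position estimate in signal space."""
    nearest = sorted((math.dist(rss, fp), loc) for loc, fp in RADIO_MAP.items())[:k]
    weights = [1.0 / (d + 1e-9) for d, _ in nearest]  # closer fingerprints weigh more
    total = sum(weights)
    x = sum(w * loc[0] for w, (_, loc) in zip(weights, nearest)) / total
    y = sum(w * loc[1] for w, (_, loc) in zip(weights, nearest)) / total
    return (x, y)
```

In the paper's scheme, the GAN densifies `RADIO_MAP` so fewer survey points are needed; the matching step itself stays this simple.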
Procedia PDF Downloads 47
4369 Carbon Coated Silicon Nanoparticles Embedded MWCNT/Graphene Matrix Anode Material for Li-Ion Batteries
Authors: Ubeyd Toçoğlu, Miraç Alaf, Hatem Akbulut
Abstract:
We present work conducted to improve the cycle life of silicon-based lithium-ion battery anodes by utilizing a novel composite structure. In this study, carbon-coated nano-sized (50-100 nm) silicon particles were embedded into a graphene/MWCNT matrix to produce free-standing silicon-based electrodes. Conventional Si powder anodes were also produced from Si powder slurry on copper current collectors in order to compare the composite and conventional anode structures. The free-standing composite anodes (binder-free) were produced via vacuum filtration from a well-dispersed mixture of graphene, MWCNTs, and carbon-coated silicon powders. The carbon coating of the silicon powders was carried out in a microwave reaction system: a quantity of silicon powder and glucose was mixed under ultrasonication, and coating was then conducted at 200 °C for two hours in a Teflon-lined autoclave reaction chamber. The graphene used in this study was synthesized by the well-known Hummers method followed by hydrazine reduction of graphene oxide. X-ray diffraction analysis and Raman spectroscopy were used for phase characterization of the anodes, and scanning electron microscopy for morphological characterization. The electrochemical performance tests were carried out by means of galvanostatic charge/discharge, cyclic voltammetry, and electrochemical impedance spectroscopy.
Keywords: graphene, Li-Ion, MWCNT, silicon
Procedia PDF Downloads 256
4368 Enzymatic Saccharification of Dilute Alkaline Pre-treated Microalgal (Tetraselmis suecica) Biomass for Biobutanol Production
Authors: M. A. Kassim, R. Potumarthi, A. Tanksale, S. C. Srivatsa, S. Bhattacharya
Abstract:
Enzymatic saccharification of biomass for reducing sugar production is one of the crucial processes in biofuel production through biochemical conversion. In this study, enzymatic saccharification of dilute potassium hydroxide (KOH) pre-treated Tetraselmis suecica biomass was carried out using cellulase from Trichoderma longibrachiatum. Initially, the pre-treatment conditions were optimised by varying the alkali reagent concentration, reaction retention time, and temperature. The T. suecica biomass after pre-treatment was also characterized using Fourier transform infrared spectroscopy and scanning electron microscopy. These analyses revealed that functional groups such as acetyl and hydroxyl groups, as well as the structure and surface of the T. suecica biomass, were changed by pre-treatment, which is favourable for the enzymatic saccharification process. Comparison of enzymatic saccharification of untreated and pre-treated microalgal biomass indicated that a higher level of reducing sugar can be obtained from pre-treated T. suecica. Enzymatic saccharification of pre-treated T. suecica biomass was optimised by varying temperature, pH, and enzyme-to-solid ratio ([E]/[S]). The highest conversion of carbohydrate into reducing sugar, 95%, corresponding to a reducing sugar yield of 20 wt%, was obtained from saccharification of pre-treated T. suecica at a temperature of 40°C, pH 4.5, and [E]/[S] of 0.1 after 72 h of incubation. The hydrolysate obtained from enzymatic saccharification of pre-treated T. suecica biomass was further fermented into biobutanol using Clostridium saccharoperbutyliticum as the biocatalyst. The results from this study demonstrate a positive prospect for the application of dilute alkaline pre-treatment to enhance enzymatic saccharification and biobutanol production from microalgal biomass.
Keywords: microalgal biomass, enzymatic saccharification, biobutanol, fermentation
Procedia PDF Downloads 385
4367 The Relevance of the U-Shaped Learning Model to the Acquisition of the Difference between C'est and Il Est in the English Learners of French Context
Authors: Pooja Booluck
Abstract:
A U-shaped learning curve entails a three-step process: good performance, followed by bad performance, followed by good performance again. U-shaped curves have been observed not only in language acquisition but also in various fields such as temperature, face recognition, and object permanence, to name a few. Building on previous studies of the curve in child language acquisition and second language acquisition, this empirical study investigates the relevance of the U-shaped learning model to the acquisition of the difference between c'est and il est in the English learners of French context. The present study was developed to assess whether older learners of French in this context follow the same acquisition pattern. The empirical study was conducted on 15 English learners of French and lasted six weeks. Compositions and questionnaires were collected from each subject at three time intervals (after one week, after three weeks, and after six weeks), after which the students' work was graded as either correct or incorrect. The data indicate that there is evidence of a U-shaped learning curve in the acquisition of c'est and il est, and that the students followed the same acquisition pattern as children with regard to rote-learned terms and subject clitics. This paper also discusses the need to introduce modules on the U-shaped learning curve into teaching curricula, as many teachers are unaware of the trajectory learners undertake while acquiring core components of grammar. In addition, this study addresses the need for more research on the acquisition of rote-learned terms and subject clitics in SLA.
Keywords: child language acquisition, rote-learning, subject clitics, u-shaped learning model
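The three-interval design maps directly onto a simple test for U-shaped performance. A minimal sketch (the accuracy scores below are hypothetical, not the study's data):

```python
def is_u_shaped(scores):
    """True if accuracy dips in the middle and recovers: good -> bad -> good."""
    first, middle, last = scores
    return middle < first and last > middle

# Accuracy at week 1, week 3, week 6 for one (hypothetical) learner:
print(is_u_shaped([0.80, 0.55, 0.75]))  # dip then recovery -> True
print(is_u_shaped([0.50, 0.60, 0.70]))  # monotone improvement -> False
```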
Procedia PDF Downloads 293
4366 Simulation and Assessment of Carbon Dioxide Separation by Piperazine Blended Solutions Using E-NRTL and Peng-Robinson Models: Study of Regeneration Heat Duty
Authors: Arash Esmaeili, Zhibang Liu, Yang Xiang, Jimmy Yun, Lei Shao
Abstract:
A high-pressure carbon dioxide (CO₂) absorption from a specific off-gas in a conventional column has been evaluated for the environmental concerns by the Aspen HYSYS simulator using a wide range of single absorbents and piperazine (PZ) blended solutions to estimate the outlet CO₂ concentration, CO₂ loading, reboiler power supply, and regeneration heat duty to choose the most efficient solution in terms of CO₂ removal and required heat duty. The property package, which is compatible with all applied solutions for the simulation in this study, estimates the properties based on the electrolyte non-random two-liquid (E-NRTL) model for electrolyte thermodynamics and Peng-Robinson equation of state for vapor phase and liquid hydrocarbon phase properties. The results of the simulation indicate that piperazine, in addition to the mixture of piperazine and monoethanolamine (MEA), demands the highest regeneration heat duty compared with other studied single and blended amine solutions, respectively. The blended amine solutions with the lowest PZ concentrations (5wt% and 10wt%) were considered and compared to reduce the cost of the process, among which the blended solution of 10wt%PZ+35wt%MDEA (methyldiethanolamine) was found as the most appropriate solution in terms of CO₂ content in the outlet gas, rich-CO₂ loading, and regeneration heat duty.
Keywords: absorption, amine solutions, aspen HYSYS, CO₂ loading, piperazine, regeneration heat duty
Procedia PDF Downloads 188
4365 Voice Liveness Detection Using Kolmogorov Arnold Networks
Authors: Arth J. Shah, Madhu R. Kamble
Abstract:
Voice biometric liveness detection aims to certify that the voice data presented in an authentication process is genuine and not a recording or synthetic voice. With the rise of deepfakes and other similarly sophisticated spoofing generation techniques, it is becoming challenging to ensure that the person on the other end is a live speaker. A Voice Liveness Detection (VLD) system is a group of security measures that detect and prevent voice spoofing attacks. Motivated by the recent development of the Kolmogorov-Arnold Network (KAN), based on the Kolmogorov-Arnold representation theorem, we propose KAN for the VLD task. To date, multilayer perceptron (MLP) based classifiers have been used for such classification tasks. We aim to capture not only the compositional structure of the model but also to optimize the values of its univariate functions. This study presents a mathematical as well as experimental analysis of KAN for VLD tasks, thereby opening a new perspective for scientists working on speech and signal processing-based tasks. This study combines traditional signal processing with new deep learning models, which proved to be a better combination for VLD tasks. The experiments are performed on the POCO and ASVspoof 2017 V2 databases. We used constant-Q transform (CQT), Mel, and short-time Fourier transform (STFT) based front-end features, and CNN, BiLSTM, and KAN back-end classifiers. The best accuracy is 91.26% on the POCO database, using STFT features with the KAN classifier. On the ASVspoof 2017 V2 database, the lowest EER we obtained was 26.42%, using CQT features and KAN as the classifier.
Keywords: Kolmogorov Arnold networks, multilayer perceptron, pop noise, voice liveness detection
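The structural contrast with an MLP is that a KAN places learnable univariate functions on edges and only sums at nodes, following the representation f(x) = Σq Φq(Σp φq,p(xp)). A minimal forward-pass sketch with piecewise-linear univariate functions (the spline grids and values are illustrative; a real KAN learns the knot heights by gradient descent):

```python
import numpy as np

class Edge:
    """One learnable univariate function, here a piecewise-linear spline."""
    def __init__(self, grid, values):
        self.grid = np.asarray(grid, float)      # fixed knot positions
        self.values = np.asarray(values, float)  # trainable knot heights

    def __call__(self, x):
        return np.interp(x, self.grid, self.values)

def kan_node(x, edges):
    """A KAN node only sums the outputs of its incoming edge functions."""
    return sum(edge(xi) for edge, xi in zip(edges, x))

# Two inputs with identity splines on [-1, 1]: the node computes x0 + x1.
edges = [Edge([-1, 0, 1], [-1, 0, 1]), Edge([-1, 0, 1], [-1, 0, 1])]
print(kan_node([0.25, 0.5], edges))  # -> 0.75
```

An MLP, by contrast, fixes the nonlinearity (e.g. ReLU) and learns only linear weights; here the nonlinearities themselves (the knot heights) are the trainable parameters.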
Procedia PDF Downloads 41
4364 Easymodel: Web-based Bioinformatics Software for Protein Modeling Based on Modeller
Authors: Alireza Dantism
Abstract:
Presently, determining the function of a protein sequence is one of the most common problems in biology. Usually, this problem can be approached by studying the three-dimensional structure of the protein. In the absence of an experimental protein structure, comparative modeling often provides a useful three-dimensional model of the protein based on at least one known protein structure. Comparative modeling predicts the three-dimensional structure of a given protein sequence (target) mainly based on its alignment with one or more proteins of known structure (templates). Comparative modeling consists of five main steps: 1. Identification of similarity between the target sequence and at least one known template structure. 2. Alignment of the target sequence and template(s). 3. Building a model based on the alignment with the selected template(s). 4. Prediction of model errors. 5. Optimization of the built model. There are many computer programs and web servers that automate the comparative modeling process. One of their most important advantages is that they make comparative modeling available to both experts and non-experts, who can easily do their own modeling without the need for programming knowledge; some experts, however, prefer to do their modeling manually using programming, because this lets them maximize the accuracy of the modeling. In this study, a web-based tool has been designed to predict the tertiary structure of proteins using the PHP and Python programming languages. This tool is called EasyModel.
According to the user's inputs, EasyModel receives the unknown sequence of interest (the target), the protein structure file of a template that shares a degree of similarity with the target sequence, etc.; it then predicts the tertiary structure of the unknown sequence and presents the results in the form of graphs and constructed protein files.
Keywords: structural bioinformatics, protein tertiary structure prediction, modeling, comparative modeling, modeller
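Step 1 of the pipeline, assessing target-template similarity, is often summarized as percent sequence identity over the aligned positions. A minimal sketch (not EasyModel's actual code; the sequences are invented):

```python
def percent_identity(aligned_target, aligned_template):
    """Percent identity over aligned, non-gap columns ('-' marks a gap)."""
    pairs = [(a, b) for a, b in zip(aligned_target, aligned_template)
             if a != "-" and b != "-"]
    return 100.0 * sum(a == b for a, b in pairs) / len(pairs)

# 6 identical residues out of 7 aligned (non-gap) columns:
print(percent_identity("MKT-AVLQ", "MKTSAVIQ"))
```

In practice a template with sufficiently high identity to the target (commonly above roughly 30%) is what makes the subsequent model-building step reliable.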
Procedia PDF Downloads 97
4363 Antimicrobial Efficacy of Some Antibiotics Combinations Tested against Some Molecular Characterized Multiresistant Staphylococcus Clinical Isolates, in Egypt
Authors: Nourhan Hussein Fanaki, Hoda Mohamed Gamal El-Din Omar, Nihal Kadry Moussa, Eva Adel Edward Farid
Abstract:
The resistance of staphylococci to various antibiotics has become a major concern for health care professionals. The efficacy of the combinations of selected glycopeptides (vancomycin and teicoplanin) with gentamicin or rifampicin, as well as that of gentamicin/rifampicin combination, was studied against selected pathogenic staphylococcus isolated from Egypt. The molecular distribution of genes conferring resistance to these four antibiotics was detected among tested clinical isolates. Antibiotic combinations were studied using the checkerboard technique and the time-kill assay (in both the stationary and log phases). Induction of resistance to glycopeptides in staphylococci was tried in the absence and presence of diclofenac sodium as inducer. Transmission electron microscopy was used to study the effect of glycopeptides on the ultrastructure of the cell wall of staphylococci. Attempts were made to cure gentamicin resistance plasmids and to study the transfer of these plasmids by conjugation. Trials for the transformation of the successfully isolated gentamicin resistance plasmid to competent cells were carried out. The detection of genes conferring resistance to the tested antibiotics was performed using the polymerase chain reaction. The studied antibiotic combinations proved their efficacy, especially when tested during the log phase. Induction of resistance to glycopeptides in staphylococci was more promising in presence of diclofenac sodium, compared to its absence. Transmission electron microscopy revealed the thickening of bacterial cell wall in staphylococcus clinical isolates due to the presence of tested glycopeptides. Curing of gentamicin resistance plasmids was only successful in 2 out of 9 tested isolates, with a curing rate of 1 percent for each. Both isolates, when used as donors in conjugation experiments, yielded promising conjugation frequencies ranging between 5.4 X 10-2 and 7.48 X 10-2 colony forming unit/donor cells. 
Plasmid isolation was only successful in one out of the two tested isolates. However, low transformation efficiency (59.7 transformants/microgram plasmid DNA) of such plasmids was obtained. Negative regulators of autolysis, such as arlR, lytR and lrgB, as well as cell-wall associated genes, such as pbp4 and/or pbp2, were detected in staphylococcus isolates with reduced susceptibility to the tested glycopeptides. Concerning rifampicin resistance genes, rpoBstaph was detected in 75 percent of the tested staphylococcus isolates. It could be concluded that in vitro studies emphasized the usefulness of the combination of vancomycin or teicoplanin with gentamicin or rifampicin, as well as that of gentamicin with rifampicin, against staphylococci showing varying resistance patterns. However, further in vivo studies are required to ensure the safety and efficacy of such combinations. Diclofenac sodium can act as an inducer of resistance to glycopeptides in staphylococci. Cell-wall thickness is a major contributor to such resistance among them. Gentamicin resistance in these strains could be chromosomally or plasmid mediated. Multiple mutations in the rpoB gene could mediate staphylococcus resistance to rifampicin.
Keywords: glycopeptides, combinations, induction, diclofenac, transmission electron microscopy, polymerase chain reaction
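The two reported quantities, transformation efficiency and conjugation frequency, are simple ratios; a sketch of both calculations (the reported figures come from the abstract, the variable names and example counts are ours):

```python
def transformation_efficiency(transformant_colonies, micrograms_dna):
    """Transformants obtained per microgram of plasmid DNA."""
    return transformant_colonies / micrograms_dna

def conjugation_frequency(transconjugants, donor_cfu):
    """Transconjugants per donor colony-forming unit."""
    return transconjugants / donor_cfu

# The reported 59.7 transformants/ug corresponds to e.g. 59.7 colonies per 1 ug DNA;
# the reported conjugation frequencies fell between 5.4e-2 and 7.48e-2 per donor CFU.
```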
Procedia PDF Downloads 293
4362 Synthesis and Characterisation of Starch-PVP as Encapsulation Material for Drug Delivery System
Authors: Nungki Rositaningsih, Emil Budianto
Abstract:
Starch has been widely used as an encapsulation material for drug delivery systems. However, starch hydrogel is very easily degraded during metabolism in the human stomach. Modification of this material is needed to improve the encapsulation process in drug delivery systems, especially for gastrointestinal drugs. In this research, three modified starch-based hydrogels were synthesized: a crosslinked starch hydrogel, and semi- and full-interpenetrating polymer network (IPN) starch hydrogels using poly(N-vinyl-pyrrolidone). A non-modified starch hydrogel was also synthesized as a control. All of these samples were compared as biomaterials and as floating drug delivery systems, and their drug loading ability was tested. Biomaterial characterizations comprised a swelling test, stereomicroscopy observation, differential scanning calorimetry (DSC), and Fourier transform infrared spectroscopy (FTIR). A buoyancy test and stereomicroscopy scanning were performed for floating drug delivery characterization. Lastly, amoxicillin was used as the test drug and characterized with UV-Vis spectroscopy for drug loading observation. Preliminary observation showed that the full-IPN had the densest and most elastic texture, followed by the semi-IPN, the crosslinked, and the non-modified hydrogel in the last position. The semi-IPN and crosslinked starch hydrogels have the most suitable properties and will not be degraded easily during metabolism; therefore, both hydrogels could be considered promising candidates for encapsulation materials. Further analysis and issues will be discussed in the paper.
Keywords: biomaterial, drug delivery system, interpenetrating polymer network, poly(N-vinyl-pyrrolidone), starch hydrogel
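Among the listed characterizations, the swelling test is commonly quantified as the percent weight gain of the swollen gel over its dry weight. A minimal sketch (the formula is the standard gravimetric definition; the weights are hypothetical, not the paper's data):

```python
def swelling_ratio(wet_weight_g, dry_weight_g):
    """Percent swelling: water uptake relative to the dry hydrogel weight."""
    return 100.0 * (wet_weight_g - dry_weight_g) / dry_weight_g

print(swelling_ratio(3.0, 1.0))  # a gel that triples in weight swells by 200%
```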
Procedia PDF Downloads 251
4361 Study of Pseudomonas as Biofertiliser in Salt-Affected Soils of the Northwestern Algeria: Solubilisation of Calcium Phosphate and Growth Promoting of Broad Bean (Vicia faba)
Authors: A. Djoudi, R. Djibaou, H. A. Reguieg Yssaad
Abstract:
Our study focuses on bacteria belonging to the genus Pseudomonas that solubilize tricalcium phosphate. They were isolated from the rhizosphere of a variety of broad bean grown in salt-affected soils (electrical conductivity between 4 and 8 mmhos/cm) of the irrigated perimeter of Mina in northwestern Algeria. Isolates with advantageous results in the calcium phosphate solubilization index test were identified using API20 and then used to re-inoculate the same soil in pot experiments to assess the effects of inoculation on the growth of the broad bean (Vicia faba). Based on the results of the in-vitro tests, two isolates, P5 and P8, showed a significant effect on the solubilization of tricalcium phosphate, with an index I estimated at 314% and 283%, respectively. According to the results of the in-vivo tests, inoculation of the soil with P5 and P8 significantly and positively influenced the growth of the broad bean in its biometric parameters. Inoculation with strain P5 promoted the growth of the broad bean in stem height, stem fresh weight and stem dry weight by 108.59%, 115.28% and 104.33%, respectively. Inoculation with strain P8 fostered the growth of the broad bean in stem fresh weight by 112.47%. The effect of Pseudomonas on the development of Vicia faba is considered an interesting process by which PGPR can increase biological production and crop protection.
Keywords: Pseudomonas, Vicia faba, promoting of plant growth, solubilization tricalcium phosphate
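The solubilization index reported above (314% and 283% for P5 and P8) is commonly computed from plate assays as the ratio of the clearing-halo diameter to the colony diameter. A minimal sketch, assuming that ratio-times-100 definition; the plate measurements below are hypothetical, since the abstract does not give raw diameters:

```python
def solubilization_index(halo_diameter_mm, colony_diameter_mm):
    # Index I as a percentage: halo diameter relative to colony diameter.
    return (halo_diameter_mm / colony_diameter_mm) * 100.0

# Hypothetical plate measurements chosen to reproduce the reported indices:
i_p5 = solubilization_index(22.0, 7.0)   # ~314%
i_p8 = solubilization_index(17.0, 6.0)   # ~283%
print(round(i_p5), round(i_p8))
```

Whether the halo diameter includes the colony itself varies between protocols; the definition here is one common convention, not necessarily the one used in the study.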
Procedia PDF Downloads 329
4360 Investigating the Environmental Impact of Additive Manufacturing Compared to Conventional Manufacturing through Life Cycle Assessment
Authors: Gustavo Menezes De Souza Melo, Arnaud Heitz, Johannes Henrich Schleifenbaum
Abstract:
Additive manufacturing is a growing market that is taking over in many industries, as it offers numerous advantages like new design possibilities, weight-saving solutions, ease of manufacture, and simplification of assemblies. These are all unquestionable technical or financial assets. As to the environmental aspect, it is often debated whether additive manufacturing is the best solution to decarbonize our industries or whether conventional manufacturing remains cleaner. This work presents a life cycle assessment (LCA) comparison based on the technological case of a motorbike swing-arm. We compare the original equipment manufacturer part, made with conventional manufacturing (CM) methods, to an additive manufacturing (AM) version printed using the laser powder bed fusion process. The AM version has been modified and optimized to achieve better dynamic performance without any regard to weight saving. Lightweighting not being a priority in the creation of the 3D-printed part gives this study a unique perspective. To carry out the LCA, we are using the open-source life cycle assessment and sustainability software OpenLCA combined with the ReCiPe 2016 method at midpoint and endpoint level. This allows the calculation and presentation of the results through indicators such as global warming, water use, resource scarcity, etc. The results then show the relative impact of the AM version compared to the CM one and give us a key to understanding and answering questions about the environmental sustainability of additive manufacturing.
Keywords: additive manufacturing, environmental impact, life cycle assessment, laser powder bed fusion
Procedia PDF Downloads 263
4359 University Curriculum Policy Processes in Chile: A Case Study
Authors: Victoria C. Valdebenito
Abstract:
Located within the context of accelerating globalization in the 21st-century knowledge society, this paper focuses, as part of a larger investigation, on one selected university in Chile at which radical curriculum policy changes have been taking place, diverging from the traditional undergraduate curriculum in Chile. Using a ‘policy trajectory’ framework, and guided by the interpretivist approach to research, interview transcripts and institutional documents were analyzed in relation to the meso (university administration) and micro (academics) levels, revealing the major themes emerging from the data. Within the case study, participants from the university administration and academic levels were selected both via the snowball technique and purposive selection; thus, they had different levels of seniority, with some having participated actively in the curriculum reform processes. A further ‘bigger picture’ analysis guided by critical theory was then undertaken, involving interrogation of underlying ideologies and of how political and economic interests influence the cultural production of policy. The case-study university was selected because it represents a traditional and old university setting in the country, undergoing curriculum changes based on international trends such as the competency model and the liberal arts. Also, it is representative of a particular socioeconomic sector of the country. Access to the university was gained through email contact. Qualitative research methods were used, namely interviews and analysis of institutional documents. In all, 18 people were interviewed; the number was defined by when the saturation criterion was met. Semi-structured interview schedules were based on the four research questions about influences, policy texts, policy enactment and longer-term outcomes.
Triangulation of information was used for the analysis. While there was no intention to generalize the specific findings of the case study, the results of the research were used as a focus for engagement with broader themes, often evident in global higher education policy developments. The research results were organized around major themes in three of the four contexts of the ‘policy trajectory’. Regarding the context of influences and the context of policy text production, themes relate to the hegemony exercised by first-world countries’ universities in the higher education field, its associated neoliberal ideology, with accountability and the discourse of continuous improvement, the local responses to those pressures, and the value of interdisciplinarity. Finally, regarding the context of policy practices and effects (enactment), themes emerged around the impacts of the curriculum changes on university staff and students, and resistance amongst academics. The research concluded with a few recommendations that potentially provide ‘food for thought’ beyond the localized settings of this study, as well as possibilities for further research.
Keywords: curriculum, global-local dynamics, higher education, policy, sociology of education
Procedia PDF Downloads 78
4358 Development of a Dairy Drink Made of Cocoa, Coffee and Orange By-Products with Antioxidant Activity
Authors: Gianella Franco, Karen Suarez, María Quijano, Patricia Manzano
Abstract:
Agro-industries generate large amounts of waste, which is mostly untapped. This research was carried out to use cocoa, coffee and orange industrial by-products to develop a dairy drink. The product was prepared by making a 10% aqueous extract of a mixture of cocoa bean shells, coffee bean shells and orange peel. An Extreme Vertices Mixture Design was applied to vary the proportions of the ingredients of the aqueous extract, yielding 13 formulations. Each formulation was mixed with skim milk and pasteurized. The attributes of taste, smell, color and appearance were evaluated by a semi-trained panel using a multiple comparisons test, comparing the formulations against a standard marked as "R", which consisted of a commercial coffee drink. The formulations with the highest scores were selected to maximize the Total Polyphenol Content (TPC) through a linear optimization process, resulting in the formulation 80.5%:18.37%:1.13% of cocoa bean shell, coffee bean shell and orange peel, respectively. The Total Polyphenol Content was 4.99 ± 0.34 mg GAE/g of drink, DPPH radical scavenging activity was 80.14 ± 0.05%, and the caffeine concentration was 114.78 mg/L, while the commercial coffee drink presented 3.93 ± 0.84 mg GAE/g of drink, 55.54 ± 0.03% and 47.44 mg/L of TPC, DPPH radical scavenging activity and caffeine content, respectively. The results show that it is possible to prepare an antioxidant-rich drink with good sensorial attributes made of industrial by-products.
Keywords: DPPH, polyphenols, waste, food science
Procedia PDF Downloads 468
4357 The Role of Public Management Development in Enhancing Public Service Delivery in the South African Local Government
Authors: Andrew Enaifoghe
Abstract:
The study examined the role of public management development in enhancing public service delivery in the South African local government. The study holds that the empowerment of the third-tier sphere of government in South Africa remains the instrument required to enhance both national and continental development. Over the years, this sphere has been overwhelmed with problems and imbalances related to ethical practice, accountability and the functioning of the local government system and its machinery. The study finds that these imbalances are reinforced by a lack of understanding and unanimity as to what public management development in a democratic system is and how it should work to achieve the dividends of democracy in delivering public goods. Studies indicated that the consequences are widespread corruption and misrepresentation of government priorities, both of which weaken the ability of governments to enhance broad-based economic growth and the social well-being of the people. This study addressed the problem of public management and accountable local government. The study indicates the need for citizens’ participation in the decision-making process in delivering public services in South Africa and shows how its accountability mechanism supports good governance. The study concludes that ethical failings in South Africa have reached such proportions that social pressure, pressure from the government and various institutions must prompt a reconsideration of where actors stand regarding ethics, ethical behaviour, accountability and professionalism in delivering public goods to the people at the local municipal government level.
Keywords: accountability, development, democratic system, South Africa
Procedia PDF Downloads 125
4356 An Automatic Bayesian Classification System for File Format Selection
Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan
Abstract:
This paper presents an approach for the classification of unstructured format descriptions for the identification of file formats. The main contribution of this work is the employment of data mining techniques to support file format selection with just the unstructured text description that comprises the most important format features for a particular organisation. Subsequently, the file format identification method employs a file format classifier and associated configurations to support digital preservation experts with an estimation of the required file format. Our goal is to make use of a format specification knowledge base aggregated from different Web sources in order to select a file format for a particular institution. Using the naive Bayes method, the decision support system recommends to an expert the file format for his institution. The proposed methods facilitate the selection of file formats and improve the quality of the digital preservation process. The presented approach is meant to facilitate decision making for the preservation of digital content in libraries and archives using domain expert knowledge and specifications of file formats. To facilitate decision-making, the aggregated information about the file formats is presented as a file format vocabulary that comprises the most common terms that are characteristic of all researched formats. The goal is to suggest a particular file format based on this vocabulary for analysis by an expert. A sample file format calculation and the calculation results, including probabilities, are presented in the evaluation section.
Keywords: data mining, digital libraries, digital preservation, file format
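The naive Bayes recommendation step described above can be sketched as a small multinomial classifier over the words of an unstructured format description. This is an illustrative from-scratch sketch, not the paper's implementation; the toy training descriptions and format labels are invented for the example:

```python
import math
from collections import Counter, defaultdict

def train(docs):
    # docs: list of (description_text, format_label) pairs.
    counts = defaultdict(Counter)                    # label -> word counts
    labels = Counter(label for _, label in docs)     # label -> doc count
    for text, label in docs:
        counts[label].update(text.lower().split())
    vocab = {w for c in counts.values() for w in c}
    return labels, counts, vocab

def classify(text, labels, counts, vocab):
    n_docs = sum(labels.values())
    best, best_lp = None, float("-inf")
    for label in labels:
        lp = math.log(labels[label] / n_docs)        # class prior
        total = sum(counts[label].values())
        for w in text.lower().split():
            # Laplace-smoothed word likelihood; unseen words get count 1.
            lp += math.log((counts[label][w] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Toy unstructured format descriptions (invented, not from the paper's knowledge base):
training = [
    ("lossless raster image transparency chunks", "PNG"),
    ("portable document fixed layout fonts embedded", "PDF"),
    ("lossy compressed photographic image", "JPEG"),
]
model = train(training)
print(classify("raster image with transparency", *model))  # PNG
```

With a real format vocabulary aggregated from Web sources, the training set would be far larger; the smoothing and log-probability mechanics stay the same.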
Procedia PDF Downloads 499
4355 Study on Developmental and Pathogenesis Related Genes Expression Deregulation in Brassica compestris Infected with 16Sr-IX Associated Phytoplasma
Authors: Samina Jam Nazeer Ahmad, Samia Yasin, Ijaz Ahmad, Muhammad Tahir, Jam Nazeer Ahmad
Abstract:
Phytoplasmas are phloem-limited plant pathogenic bacteria that are transmitted by insect vectors. Among biotic factors, phytoplasma infection induces abnormalities influencing the physiology as well as the morphology of plants. In 16Sr-IX group phytoplasma-infected Brassica campestris, flower abnormalities have been associated with changes in the expression of floral development genes. To determine whether methylation was involved in the down-regulation of flower development, DNA methylation and demethylation were investigated as a possible mechanism for the regulation of floral gene expression in phytoplasma-infected Brassica transmitted by the Orosious orientalis vector, using RT-PCR, MSRE-PCR, Southern blotting, bisulfite sequencing, etc. Transcriptional expression of methylated genes was found to be globally down-regulated in plants infected with phytoplasma, but not as severely in those infested by insect vectors alone, and variation in expression was found in genes involved in methylation. These results also showed that genes orthologous to Arabidopsis APETALA3, involved in petal formation and flower development, were severely down-regulated in phytoplasma-infected Brassica, consistent with the finding that phytoplasma and insect infestation induce variation in developmental gene expression. Treatment of phytoplasma-infected plants with 5-azacytidine restored the expression of the flower developmental gene, strongly suggesting that DNA methylation was involved in the down-regulation of floral development genes in phytoplasma-infected Brassica.
Keywords: genes expression, phytoplasma, DNA methylation, flower development
Procedia PDF Downloads 374
4354 Leptin Levels in Cord Blood and Their Associations with the Birth of Small, Large and Appropriate for Gestational Age Infants in Southern Sri Lanka
Authors: R. P. Hewawasam, M. H. A. D. de Silva, M. A. G. Iresha
Abstract:
In recent years, childhood obesity has increased to pan-epidemic proportions, along with a concomitant increase in obesity-associated morbidity. Birth weight is an important determinant of later adult health, with neonates at both ends of the birth weight spectrum at risk of future health complications. Consequently, infants who are born large for gestational age (LGA) are more likely to be obese in childhood and adolescence and are at risk of cardiovascular and metabolic complications later in life. Adipose tissue plays a role in linking events in fetal growth to the subsequent development of adult diseases. In addition to its role as a storage depot for fat, adipose tissue produces and secretes a number of hormones of importance in modulating metabolism and energy homeostasis. Cord blood leptin level has been positively correlated with fetal adiposity at birth. It is established that Asians have lower skeletal muscle mass, low bone mineral content and excess body fat for a given body mass index, indicating a genetic predisposition to the occurrence of obesity. To our knowledge, studies have never been conducted in Sri Lanka to determine the relationship between the adipocytokine profile in cord blood and anthropometric parameters in newborns. Thus, the objective of this study is to establish the above relationship for the Sri Lankan population in order to implement awareness programs to minimize childhood obesity in the future. Umbilical cord blood was collected from 90 newborns (40 male, 50 female; gestational age 35-42 weeks) after double-clamping the umbilical cord before separation of the placenta, and the concentration of leptin was measured by the ELISA technique. Anthropometric parameters of the newborn, such as birth weight, length, ponderal index, and occipital frontal, chest, hip and calf circumferences, were measured.
Pearson’s correlation was used to assess the relationship between leptin and anthropometric parameters, while the Mann-Whitney U test was used to assess the differences in cord blood leptin levels between small for gestational age (SGA), appropriate for gestational age (AGA) and LGA infants. There was a significant difference (P < 0.05) between the cord blood leptin concentrations of LGA infants (12.67 ± 2.34 ng/mL) and AGA infants (7.10 ± 0.90 ng/mL). However, a significant difference was not observed between the leptin levels of SGA infants (8.86 ± 0.70 ng/mL) and AGA infants. In both male and female neonates, umbilical leptin levels showed significant positive correlations (P < 0.05) with the birth weight of the newborn, pre-pregnancy maternal weight and pre-pregnancy BMI between the infants of large and appropriate gestational ages. Increased concentrations of leptin in the cord blood of large for gestational age infants suggest that leptin may be involved in regulating fetal growth. The leptin concentration of the Sri Lankan population did not deviate significantly from published data on Asian populations. Fetal leptin may be an important predictor of neonatal adiposity; however, interventional studies are required to assess its impact on the possible risk of childhood obesity.
Keywords: appropriate for gestational age, childhood obesity, leptin, anthropometry
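The Mann-Whitney U comparison used above can be reproduced with a short rank-based computation: pool the two samples, assign ranks (averaging across ties), and subtract the minimum possible rank sum from the first sample's rank sum. A sketch with illustrative leptin values, not the study's raw data:

```python
def mann_whitney_u(a, b):
    # U statistic for sample a vs sample b, with average ranks for ties.
    pooled = sorted((v, i) for i, v in enumerate(a + b))
    ranks = [0.0] * len(pooled)
    i = 0
    while i < len(pooled):
        j = i
        while j + 1 < len(pooled) and pooled[j + 1][0] == pooled[i][0]:
            j += 1                       # extend over a run of tied values
        avg_rank = (i + j) / 2 + 1       # average 1-based rank of the tied run
        for k in range(i, j + 1):
            ranks[pooled[k][1]] = avg_rank
        i = j + 1
    rank_sum_a = sum(ranks[: len(a)])
    return rank_sum_a - len(a) * (len(a) + 1) / 2

# Illustrative cord-blood leptin values (ng/mL), invented for the example:
lga = [12.1, 13.4, 11.8, 14.0]
aga = [7.2, 6.9, 8.1, 7.5]
print(mann_whitney_u(lga, aga))  # 16.0: every LGA value exceeds every AGA value
```

Converting U to a P value additionally requires the null distribution (exact tables for small samples, a normal approximation for larger ones), which libraries such as SciPy provide.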
Procedia PDF Downloads 188
4353 The Romero-System Clarinet: A Milestone in the 19th Century Clarinet Manufacture
Authors: Pedro Rubio
Abstract:
Antonio Romero y Andía was one of the most active and interesting figures in 19th-century Spanish music. He was not only an exceptional clarinetist; he was also a publisher, a brilliant oboist and a music critic, and he revitalized Madrid’s musical scene by promoting orchestras and a national opera. In 1849, Romero was appointed Professor of Clarinet at the Conservatory of Madrid. Shortly after, Romero introduced to Spain the Boehm-System clarinet, which had recently appeared in France. However, when initial interest in that system waned, he conceived his own system in 1853. The clarinet was manufactured in Paris by Lefêvre, who registered its first patent in 1862. In 1867 a second version was patented, and a year earlier, in 1866, the Romero clarinet was adopted as an official instrument for teaching the clarinet at the Conservatory of Madrid. The Romero-System clarinet mechanism incorporated numerous additional devices and several extra keys; their skillful combination in a single instrument represents not only one of the pinnacles of 19th-century musical instrument manufacture, but also an authentic synthesis of knowledge and practice in an era in which woodwind instruments took the shape we know today. Through the description and analysis of data related to the aforementioned historical period, this lecture will show a crucial time in the history of all woodwind instruments, a period of technological effervescence in which the Romero-System clarinet emerged. The different stages of the conception of the clarinet will be described, as well as its manufacturing and marketing process. Romero played his system clarinet for over twenty-five years. The research has identified the repertoire associated with this instrument, and its conclusions will be presented at the Congress.
Keywords: Antonio Romero, clarinet, keywork, 19th century
Procedia PDF Downloads 126
4352 Studying the Relationship Between Washback Effects of IELTS Test on Iranian Language Teachers, Teaching Strategies and Candidates
Authors: Afsaneh Jasmine Majidi
Abstract:
Language testing is an important part of the language teaching experience and the language learning process, as it provides assessment strategies for teachers to evaluate the efficiency of teaching and for learners to examine their outcomes. However, language testing is demanding and challenging because it should provide the opportunity for proper and objective decisions. In addition to all the effort test designers put into designing valid and reliable tests, there are other determining factors that are even more complex and complicated. These factors affect the educational system, individuals, and society, and the impact of a test varies according to its scope. Seemingly, the impact of a simple classroom assessment is not the same as that of high-stakes tests such as the International English Language Testing System (IELTS). As the importance of the test increases, it affects a wider domain. Accordingly, the impacts of high-stakes tests are reflected not only in teaching and learning strategies but also in society. Testing experts use the terms ‘washback’ or ‘impact’ to define the different effects of a test on teaching, learning, and the community. This paper first looks at the theoretical background of ‘washback’ and ‘impact’ in language testing by reviewing relevant literature in the field and then investigates the washback effects of the IELTS test on Iranian IELTS teachers and students. The study found a significant relationship between the washback effect of the IELTS test and the teaching strategies of Iranian IELTS teachers, as well as the performance of Iranian IELTS candidates and their community.
Keywords: high stake tests, IELTS, Iranian Candidates, language testing, test impact, washback
Procedia PDF Downloads 327
4351 Technologies in Municipal Solid Waste Management in Indian Towns
Authors: Gargi Ghosh
Abstract:
Municipal solid waste management (MSWM) is an obligatory function of local self-government under the Indian constitution, and this paper gives a glimpse of the system in Indian towns, focusing on its present state and the use of technology in the system. The paper analyses MSWM characteristics in 35 towns in the southern state of Karnataka. The lifestyle in these towns was found to be very sustainable, with minimal disposal and considerable reuse. Average per capita waste generated in the towns ranged from 300 g/person to 500 g/person. Waste collection efficiency varied from 60% to 80%. The waste shows an equal share of organic and non-organic composition, with a low calorific value. Lack of capacity of the municipal body, in terms of manpower, assets and knowledge, and low social consciousness were found to be the two major issues in the system. Technical solutions in use in India at present are composting, organic re-processing, bio-methanation, waste-to-energy, etc. The tonnage of waste generated ranged from 8 TPD to 80 TPD. The feasibility of each technology has been analysed in the context of the above characteristics. It was found that the low calorific value and mixed nature of the waste made the waste-to-energy and bio-methanation processes unsuitable. Composting, both windrow and closed-door, was found best to treat the bulk of the waste. Organic re-processors were planned for phase 2 of the MSWM program in the towns, with effective implementation of segregation at source. GPS and RFID technologies were recommended for monitoring the collection process and increasing the accountability of citizens for effective implementation.
Keywords: solid waste management, Indian towns, waste management technology, waste characteristics
Procedia PDF Downloads 321
4350 Sustainable Reconstruction: Towards Guidelines of Post-Disaster Vulnerability Reduction for Permanent Informal Housing in Malaysia Due to Flooding
Authors: Ruhizal Roosli, Julaihi Wahid, Abu Hassan Abu Bakar, Faizal Baharum
Abstract:
This paper reports on the progress of a study on the reconstruction project after the ‘Yellow Flood’ disaster in Kelantan, Malaysia. Malaysia still does not have guidelines for building housing after a disaster, especially in disaster-prone areas. At the international level, many guidelines have been prepared that are suitable for post-disaster housing. Which of these guidelines can be adapted to best describe the situation in Malaysia? It was reported that the houses should be built on stilts, which can withstand a certain level of impact during flooding. Unfortunately, until today no specific guideline has been available to assist homeowners to rebuild their homes after a disaster. In addition, there is no clear operational procedure to monitor the progress of this construction work. This research is an effort to promote resilient housing; safety and security; and secure tenure in a flood-prone area. At the end of this study, key lessons will emerge from the review process and data analysis. These inputs will then influence the content that will be developed and presented as guidelines. The overall objective is to support humanitarian responses to disasters and conflicts through resilient house construction in flood-prone areas. Interviews were conducted with field-based staff from the recent post-disaster housing workforce (the disaster management mechanism in Malaysia, especially in Kelantan). The respondents were selected based on their experience in disaster response, particularly related to housing provision. These key lessons are perhaps the best practical (operational and technical) guidelines, compared to other international cases, to be adapted to the national situation.
Keywords: disaster, guideline, housing, Malaysia, reconstruction
Procedia PDF Downloads 521
4349 Light Sensitive Plasmonic Nanostructures for Photonic Applications
Authors: Istvan Csarnovics, Attila Bonyar, Miklos Veres, Laszlo Himics, Attila Csik, Judit Kaman, Julia Burunkova, Geza Szanto, Laszlo Balazs, Sandor Kokenyesi
Abstract:
In this work, the performance of gold nanoparticles was investigated for the stimulation of photosensitive materials for photonic applications. Gold nanoparticles are widely used in surface plasmon resonance experiments, not least because of the manifestation of optical resonances in the visible spectral region. Localized surface plasmon resonance is rather easily observed in nanometer-sized metallic structures and widely used for measurements, sensing, in semiconductor devices and even in optical data storage. Firstly, gold nanoparticles on a silica glass substrate satisfy the conditions for surface plasmon resonance in the green-red spectral range, where chalcogenide glasses have their highest sensitivity. The gold nanostructures influence and enhance the optical, structural and volume changes and promote exciton generation in the gold nanoparticle/chalcogenide layer structure. The experimental results support the importance of localized electric fields in the photo-induced transformation of chalcogenide glasses and suggest new approaches to improve the performance of these optical recording media. The results may be utilized for direct, micrometre- or submicron-size geometrical and optical pattern formation and also for further development of the explanation of these effects in chalcogenide glasses. Besides that, gold nanoparticles can be added to organic light-sensitive materials. Acrylate-based materials are frequently used for optical, holographic recording of optoelectronic elements due to photo-stimulated structural transformations. The holographic recording process and the photo-polymerization effect could be enhanced by the localized plasmon field of the created gold nanostructures. Finally, gold nanoparticles are widely used for electrochemical and optical sensor applications.
Although these NPs can be synthesized in several ways, perhaps one of the simplest methods is thermal annealing of pre-deposited thin films on glass or silicon surfaces. With this method, the parameters of the annealing process (time, temperature) and the pre-deposited thin film thickness influence and define the resulting size and distribution of the NPs on the surface. Localized surface plasmon resonance (LSPR) is a very sensitive optical phenomenon and can be utilized for a large variety of sensing purposes (chemical sensors, gas sensors, biosensors, etc.). Surface-enhanced Raman spectroscopy (SERS) is an analytical method which can significantly increase the yield of Raman scattering of target molecules adsorbed on the surface of metallic nanoparticles. The sensitivity of LSPR- and SERS-based devices strongly depends on the material used and also on the size and geometry of the metallic nanoparticles. By controlling these parameters, the plasmon absorption band can be tuned and the sensitivity optimized. The technological parameters of the generated gold nanoparticles were investigated, and their influence on the SERS and LSPR sensitivity was established. The LSPR sensitivity was simulated for gold nanocubes and nanospheres with the MNPBEM Matlab toolbox. It was found that the enhancement factor (which characterizes the increase in the peak shift for multi-particle arrangements compared to single-particle models) depends on the size of the nanoparticles and on the distance between the particles. This work was supported by the GINOP-2.3.2-15-2016-00041 project, which is co-financed by the European Union and the European Social Fund.
Istvan Csarnovics is grateful for the support through the New National Excellence Program of the Ministry of Human Capacities, supported by ÚNKP-17-4. Attila Bonyár and Miklós Veres are grateful for the support of the János Bolyai Research Scholarship of the Hungarian Academy of Sciences.
Keywords: light sensitive nanocomposites, metallic nanoparticles, photonic application, plasmonic nanostructures
Procedia PDF Downloads 306
4348 In-Flight Aircraft Performance Model Enhancement Using Adaptive Lookup Tables
Authors: Georges Ghazi, Magali Gelhaye, Ruxandra Botez
Abstract:
Over the years, the Flight Management System (FMS) has experienced continuous improvement of its many features, to the point of becoming the pilot’s primary interface for flight planning operations on the airplane. With the assistance of the FMS, the concept of distance and time has been completely revolutionized, providing the crew members with the determination of the optimized route (or flight plan) from the departure airport to the arrival airport. To accomplish this function, the FMS needs an accurate Aircraft Performance Model (APM) of the aircraft. In general, the APMs that equip most modern FMSs are established before the entry into service of an individual aircraft and result from the combination of a set of ordinary differential equations and a set of performance databases. Unfortunately, an aircraft in service is constantly exposed to dynamic loads that degrade its flight characteristics. These degradations have two main origins: airframe deterioration (control surfaces rigging, seals missing or damaged, etc.) and engine performance degradation (fuel consumption increase for a given thrust). Thus, after several years of service, the performance databases and the APM associated with a specific aircraft are no longer representative enough of the actual aircraft performance. It is important to monitor the trend of the performance deterioration and correct the uncertainties of the aircraft model in order to improve the accuracy of the flight management system predictions. The basis of this research lies in the new ability to continuously update an Aircraft Performance Model (APM) during flight using an adaptive lookup table technique. This methodology was developed and applied to the well-known Cessna Citation X business aircraft. For the purpose of this study, a level D Research Aircraft Flight Simulator (RAFS) was used as a test aircraft.
According to the Federal Aviation Administration, level D is the highest certification level for flight dynamics modeling. Basically, using data available in the Flight Crew Operating Manual (FCOM), a first APM describing the variation of the engine fan speed and aircraft fuel flow with respect to flight conditions was derived. This model was then improved using the proposed methodology. To do that, several cruise flights were performed using the RAFS. An algorithm was developed to frequently sample the aircraft sensor measurements during the flight and compare the model predictions with the actual measurements. Based on these comparisons, a correction was applied to the current APM in order to minimize the error between the predicted data and the measured data. In this way, as the aircraft flies, the APM is continuously enhanced, making the FMS more and more precise and the prediction of trajectories more realistic and reliable. The results obtained are very encouraging. Indeed, using the tables initialized with the FCOM data, only a few iterations were needed to reduce the fuel flow prediction error from an average relative error of 12% to 0.3%. Similarly, the FCOM prediction error for the engine fan speed was reduced from a maximum deviation of 5.0% to 0.2% after only ten flights.
Keywords: aircraft performance, cruise, trajectory optimization, adaptive lookup tables, Cessna Citation X
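The correction loop described above — sample a sensor measurement, compare it with the table's prediction, and nudge the matching cell — can be sketched as a learning-rate update on a fuel-flow lookup table. The grid values, the nearest-cell lookup, and the blending factor `alpha` are illustrative assumptions, not the paper's actual tables or algorithm:

```python
# Grid of flight conditions: altitude (ft) x Mach -> predicted fuel flow (kg/h).
# All numbers are invented for illustration.
altitudes = [30000, 35000, 40000]
machs = [0.70, 0.75, 0.80]
table = [[1450.0, 1380.0, 1420.0],
         [1300.0, 1250.0, 1290.0],
         [1200.0, 1160.0, 1210.0]]

def nearest(grid, value):
    # Index of the grid point closest to the measured flight condition.
    return min(range(len(grid)), key=lambda i: abs(grid[i] - value))

def update(alt, mach, measured, alpha=0.3):
    # Blend the stored prediction toward the in-flight measurement.
    i, j = nearest(altitudes, alt), nearest(machs, mach)
    table[i][j] += alpha * (measured - table[i][j])
    return table[i][j]

# Repeated samples at 35,000 ft / Mach 0.75 pull the cell toward ~1310 kg/h.
for measured in [1310.0, 1312.0, 1308.0]:
    cell = update(35000, 0.75, measured)
print(round(cell, 1))  # ≈ 1289.2 after three corrections
```

A production scheme would interpolate between cells and spread each correction over neighbouring grid points; this sketch only shows the core predict-compare-correct cycle.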
Procedia PDF Downloads 264
4347 Mastering Test Automation: Bridging Gaps for Seamless QA
Authors: Rohit Khankhoje
Abstract:
The rapid evolution of software development practices has given rise to an increasing demand for efficient and effective test automation. The paper titled "Mastering Test Automation: Bridging Gaps for Seamless QA" delves into the crucial aspects of test automation, addressing the obstacles faced by organizations in achieving flawless quality assurance. The paper highlights the importance of bridging knowledge gaps within organizations, emphasizing the necessity for management to acquire a deeper comprehension of test automation scenarios, coverage, report trends, and the importance of communication. To tackle these challenges, this paper introduces innovative solutions, including the development of an automation framework that seamlessly integrates with test case management and reporting tools like TestRail and Jira. This integration facilitates the automatic recording of bugs in Jira, enhancing bug reporting and communication between the manual QA and automation teams, while TestRail automatically receives all newly added automated test cases as soon as they become part of the automation suite. The paper demonstrates how this framework empowers management by providing clear insights into ongoing automation activities, bug origins, trend analysis, and test case specifics. "Mastering Test Automation" serves as a comprehensive guide for organizations aiming to enhance their quality assurance processes through effective test automation. It not only identifies the common pitfalls and challenges but also offers practical solutions to bridge the gaps, resulting in a more streamlined and efficient QA process.
Keywords: automation framework, API integration, test automation, test management tools
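The automatic recording of bugs in Jira described above typically amounts to translating a failed test result into a create-issue request body for Jira's REST API (`POST /rest/api/2/issue`). A minimal sketch of the payload-building step; the project key, labels, and field choices are placeholders, not the paper's configuration:

```python
import json

def jira_bug_payload(test_name, error_message, project_key="QA"):
    # Shape follows Jira's REST create-issue body; "QA" and the
    # "automation" label are hypothetical placeholder values.
    return {
        "fields": {
            "project": {"key": project_key},
            "issuetype": {"name": "Bug"},
            "summary": f"Automated test failed: {test_name}",
            "description": error_message,
            "labels": ["automation"],
        }
    }

payload = jira_bug_payload("test_login_redirect", "Expected 302, got 500")
print(json.dumps(payload, indent=2))
# A framework hook would POST this JSON to <jira-base-url>/rest/api/2/issue
# with authentication, then attach the returned issue key to the test report.
```

Keeping the payload construction separate from the HTTP call makes the mapping between test failures and bug tickets easy to unit-test without a live Jira instance.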
Procedia PDF Downloads 73