9136 Positive Politeness in Writing Centre Consultations with an Emphasis on Praise
Authors: Avasha Rambiritch, Adelia Carstens
Abstract:
In the context of a writing center especially, learning takes place during, and as part of, the conversations between the writing center tutor and the student. This interaction or dialogue is an integral part of writing center research and is the focus of this largely qualitative study, which employs a politeness lens. While there is some research on positive politeness strategies employed by writing center tutors, there is very little research specifically on praise as a positive politeness strategy. This study attempts to fill this gap by analyzing a corpus of 10 video-recorded consultations to determine how tutors in a writing center utilize the positive politeness strategy of praise. Findings indicate that while tutors exploit a range of politeness strategies, praise is used more often than any other strategy. The research indicates that praise as a politeness strategy is utilized significantly more when commenting on higher-order concerns, in line with the writing center literature. The benefits of this study include insights into how such analyses can be used to better prepare and equip the tutors (usually postgraduate students appointed as part-time tutors in the writing center) for the work they do on a daily basis.
Keywords: writing center, academic writing, positive politeness, tutor
Procedia PDF Downloads 221
9135 A Thermal Analysis Based Approach to Obtain High Carbonaceous Fibers from Chicken Feathers
Authors: Y. Okumuş, A. Tuna, A. T. Seyhan, H. Çelebi
Abstract:
Useful carbon fibers were derived from poultry chicken feathers (PCFs) using a two-step pyrolysis method. The collected PCFs were cleaned and categorized as black, white, and brown. A differential scanning calorimeter (DSC) and a thermo-gravimetric analyzer (TGA) were used systematically to design the pyrolysis steps. Depending on their color, feathers exhibit different glass transition (Tg) temperatures. Long-duration heat treatment of the feathers proved influential on the surface quality of the resulting carbon fibers. Fourier Transform Infrared (FTIR) examination revealed that the extent of disulfide bond cleavage is highly associated with the feathers' melting stability. Scanning electron microscopy (SEM) was employed to evaluate the morphological changes of the feathers after pyrolysis. Of all, brown feathers were found to be the most promising for conversion into useful carbon fibers, without any trace of melting or shape distortion, when pyrolysis was carried out at 230°C for 24 hours and at 450°C for 1 hour.
Keywords: poultry chicken feather, keratin protein fiber, pyrolysis, high carbonaceous fibers
Procedia PDF Downloads 333
9134 Identifying Dominant Anaerobic Microorganisms for Degradation of Benzene
Authors: Jian Peng, Wenhui Xiong, Zheng Lu
Abstract:
An optimal recipe of amendments (nutrients and electron acceptors) was developed, and dominant indigenous benzene-degrading microorganisms were characterized in this study. Lessons were learnt from the development of the optimal amendment recipe: (1) salinity and a substantial initial concentration of benzene were detrimental to benzene biodegradation; (2) a large dose of amendments can shorten the lag time before benzene biodegradation occurs; (3) toluene was an essential co-substrate for promoting benzene degradation activity. The stable isotope probing study identified the incorporation of 13C from 13C-benzene into microorganisms, which can be considered direct evidence of benzene biodegradation. The dominant mechanism for benzene removal was identified by quantitative polymerase chain reaction analysis to be nitrate reduction. Microbial analyses (denaturing gradient gel electrophoresis and 16S ribosomal RNA) demonstrated that members of the genera Dokdonella, Pusillimonas, and Advenella were predominant within the microbial community and involved in the anaerobic benzene bioremediation.
Keywords: benzene, enhanced anaerobic bioremediation, stable isotope probing, biosep biotrap
Procedia PDF Downloads 345
9133 Rawson vs. Kerlogue: Two Views on Southeast Asian Art History
Authors: Rin Li Si Samantha
Abstract:
The arts and cultures of Southeast Asia, particularly ancient or precolonial Southeast Asia, are commonly understood via two distinct theories: Indianisation and localisation. Indianisation takes Southeast Asia to be a region of cultural satellites, or even colonies, of a great Indian civilisation; Philip Rawson, in his 1967 book The Art of Southeast Asia, is to a large degree a proponent of this perspective. Localisation, a theory which has gained much traction in contemporary discourse, chooses instead to privilege local continuities and agencies in selectively accepting and adapting foreign influences to give form to new, syncretised traditions. The art historian Fiona Kerlogue's similarly named Arts of Southeast Asia, published in 2004, takes this perspective as its bedrock. This essay compares the opposing ideological commitments of Rawson and Kerlogue: Indianisation versus localisation, evaluation versus explanation, and antiquity versus entirety. In the end, it reconciles the two as hallmarks of their respective periods, complementary in the pursuit of a holistic study of the art history of Southeast Asia.
Keywords: art history, Southeast Asia, Indianisation, localisation, precolonial, orientalism, comparative analysis, text
Procedia PDF Downloads 152
9132 The Problem of the Use of Learning Analytics in Distance Higher Education: An Analytical Study of the Open and Distance University System in Mexico
Authors: Ismene Ithai Bras-Ruiz
Abstract:
Learning Analytics (LA) is employed by universities not only as a tool but as a specialized field to support students and professors. However, not all academic programs apply LA with the same goals or the same tools. In fact, LA comprises five main fields of study (academic analytics, action research, educational data mining, recommender systems, and personalized systems). These fields can help not only to inform academic authorities about the state of a program but also to detect at-risk students, professors in need of support, or general problems. At the highest level, LA applies Artificial Intelligence techniques to support learning practices. LA has adopted different techniques: statistics, ethnography, data visualization, machine learning, natural language processing, and data mining. Each academic program is expected to decide which field to utilize on the basis of its academic interests as well as its capacities in terms of professors, administrators, systems, logistics, data analysts, and academic goals. The Open and Distance University System (SUAYED in Spanish) of the National Autonomous University of Mexico (UNAM) has operated for forty years as an alternative to traditional programs; one of its main supports has been the use of new information and communication technologies (ICT). Today, UNAM runs one of the largest networked higher education programs, with twenty-six academic programs across different faculties. This means that every faculty works with heterogeneous populations and academic problems, and every program has developed its own Learning Analytics techniques to address academic issues. In this context, an investigation was carried out to assess the application of LA across the academic programs in the different faculties.
The premise of the study was that not all faculties have utilized advanced LA techniques, and it is probable that they do not know which field of study is closest to their program goals. Consequently, not all programs are familiar with LA, but this does not mean they do not work with LA in a veiled or less explicit sense. It is very important to gauge the degree of knowledge about LA for two reasons: 1) it allows the administration's work to improve the quality of teaching to be appreciated, and 2) it shows whether other LA techniques could be adopted. For this purpose, three instruments were designed to determine experience and knowledge of LA. These were applied to ten faculty coordinators and their personnel; thirty members were consulted (academic secretaries, systems managers or data analysts, and program coordinators). The final report showed that almost all programs work with basic statistical tools and techniques, which helps the administration only to know what is happening inside the academic program; they are not ready to move up to the next level, that is, applying Artificial Intelligence or Recommender Systems to reach a personalized learning system. This situation is related not to knowledge of LA but to the clarity of long-term goals.
Keywords: academic improvements, analytical techniques, learning analytics, personnel expertise
Procedia PDF Downloads 130
9131 Modeling of Hot Casting Technology of Beryllium Oxide Ceramics with Ultrasonic Activation
Authors: Zamira Sattinova, Tassybek Bekenov
Abstract:
The article is devoted to modeling the technology of hot casting of beryllium oxide ceramics. The stages described are the ultrasonic activation of beryllium oxide slurry in the plant vessel to improve its rheological properties, and hot casting into the moulding cavity with cooling and solidification of the casting. The thermoplastic slurry (hereinafter referred to as the slurry) shows the rheology of a non-Newtonian fluid with a yield stress and a plastic viscosity. Cooling and solidification of the slurry in the forming cavity proceed from the liquid state, taking into account crystallization, to the solid state. This work presents a method of calculating the hot casting of the slurry using the effective molecular viscosity of a viscoplastic fluid. It is shown that the slurry near the cooled wall is in a state of crystallization and plasticity, while the rest may still be in the liquid phase. A nonuniform distribution of temperature, density, and concentration of kinetically free binder arises along the cavity cross-section. This leads to compensation of shrinkage by the influx of slurry from the liquid zone into the crystallization and plasticity zones of the casting. In the plasticity zone, the shrinkage determined by the concentration of kinetically free binder is compensated under the action of the pressure gradient. The solidification mechanism, the mechanical behavior of the casting mass during casting, and the rheological and thermophysical properties of the thermoplastic BeO slurry under ultrasound exposure have not been well studied. Nevertheless, experimental data allow us to conclude that the effect of ultrasonic vibrations on the slurry mass leads to a change in structure, an improvement in technological properties, a decrease in heterogeneity, and a change in rheological properties.
In the course of the experiments, the effect of ultrasonic treatment and its duration on the change in viscosity and ultimate shear stress of the slurry has been studied as a function of temperature (55-75℃) and the mass fraction of the binder (10-11.7%). At the same time, changes in these properties before and after ultrasound exposure have been analyzed, as well as the nature of the flow in the system under study. Experience operating the unit with ultrasonic treatment has shown that the casting capacity of the slurry increases by an average of 15%, while the viscosity decreases by more than half. An experimental study of the physicochemical properties and phase change, with simultaneous consideration of all factors affecting product quality in continuous casting, is labor-intensive. Therefore, an effective way to control the physical processes occurring in the formation of articles with predetermined properties and shapes is to simulate the process and determine its basic characteristics. The results of the calculations cover the whole stage of hot casting of beryllium oxide slurry, taking into account the change in its state of aggregation. Ultrasonic treatment improves the rheological properties and increases the fluidity of the slurry in the forming cavity. The calculations show the influence of velocity, temperature, and the structural data of the cavity on the cooling-solidification process of the casting. In the calculations, conditions for molding with shrinkage of the slurry by hot casting have been found, which makes it possible to obtain a solidifying product with a uniform beryllium oxide structure at the outlet of the cavity.
Keywords: hot casting, thermoplastic slurry molding, shrinkage, beryllium oxide
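The slurry rheology described above, a yield stress plus a plastic viscosity, is that of a Bingham-type viscoplastic fluid, and an effective-viscosity treatment of it is commonly implemented with a regularized expression; the following is a minimal sketch with illustrative (not measured) parameters:

```python
import math

def effective_viscosity(gamma_dot, tau_y, mu_p, m=100.0):
    """Papanastasiou-regularized effective viscosity of a Bingham fluid:
    mu_eff = mu_p + tau_y * (1 - exp(-m * gamma_dot)) / gamma_dot.
    gamma_dot: shear rate (1/s); tau_y: yield stress (Pa);
    mu_p: plastic viscosity (Pa.s); m: regularization parameter (s)."""
    if gamma_dot == 0.0:
        # analytical limit of the regularized expression as gamma_dot -> 0
        return mu_p + tau_y * m
    return mu_p + tau_y * (1.0 - math.exp(-m * gamma_dot)) / gamma_dot

# Illustrative parameters only, not the measured BeO slurry values
tau_y, mu_p = 30.0, 2.5
for rate in (0.1, 1.0, 10.0, 100.0):
    print(rate, round(effective_viscosity(rate, tau_y, mu_p), 3))
```

At high shear rates the effective viscosity tends to mu_p + tau_y / gamma_dot, recovering the plastic-viscosity plateau, while at low shear rates the regularization caps the otherwise unbounded yield-stress term.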
Procedia PDF Downloads 33
9130 The Solvent Extraction of Uranium, Plutonium and Thorium from Aqueous Solution by 1-Hydroxyhexadecylidene-1,1-Diphosphonic Acid
Authors: M. Bouhoun Ali, A. Y. Badjah Hadj Ahmed, M. Attou, A. Elias, M. A. Didi
Abstract:
In this paper, the solvent extraction of uranium(VI), plutonium(IV) and thorium(IV) from aqueous solutions using 1-hydroxyhexadecylidene-1,1-diphosphonic acid (HHDPA) in treated kerosene has been investigated. The HHDPA was previously synthesized and characterized by FT-IR, 1H NMR and 31P NMR spectroscopy and elemental analysis. The effects of contact time, initial pH, initial metal concentration, aqueous/organic phase ratio, extractant concentration, and temperature on the extraction process have been studied. Empirical modelling was performed using a 2⁵ full factorial design, and regression equations for metal extraction were determined from the data. The conventional log-log analysis of the extraction data reveals that the ratios of extractant to extracted U(VI), Pu(IV) and Th(IV) are 1:1, 1:2 and 1:2, respectively. Thermodynamic parameters showed that the extraction process was exothermic and spontaneous. The optimal parameters obtained were applied to real effluents containing uranium(VI), plutonium(IV) and thorium(IV) ions.
Keywords: solvent extraction, uranium, plutonium, thorium, 1-hydroxyhexadecylidene-1-1-diphosphonic acid, aqueous solution
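The conventional log-log (slope) analysis referred to above infers stoichiometry from the slope of log D versus log [extractant]; the following is a minimal sketch on synthetic (not experimental) data generated with a slope of 2:

```python
import numpy as np

# Synthetic data: distribution ratio D proportional to [extractant]^n with
# n = 2, mimicking the log-log analysis used to infer the number of
# extractant molecules involved per extracted metal species.
conc = np.array([0.01, 0.02, 0.05, 0.1, 0.2])  # extractant concentration, mol/L
true_n, k = 2.0, 50.0
D = k * conc**true_n                            # noise-free for clarity

slope, intercept = np.polyfit(np.log10(conc), np.log10(D), 1)
print(f"fitted slope ~ {slope:.2f}")            # the slope estimates n
```

With real extraction data the fitted slope is rounded to the nearest integer to read off the stoichiometric coefficient, and the intercept carries the equilibrium constant.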
Procedia PDF Downloads 289
9129 Preparation and Properties of PP/EPDM Reinforced with Graphene
Authors: M. Haghnegahdar, G. Naderi, M. H. R. Ghoreishy
Abstract:
Polypropylene (PP)/Ethylene Propylene Diene Monomer (EPDM) samples (80/20) containing 0, 0.5, 1, 1.5, 2, 2.5, and 3 (expressed in mass fraction) graphene were prepared using a melt compounding method to investigate the microstructure, mechanical properties, and thermal stability, as well as the electrical resistance of the samples. X-ray diffraction data confirmed that the graphene platelets are well dispersed in PP/EPDM. Mechanical properties such as tensile strength, impact strength, and hardness demonstrated an increasing trend with graphene loading, which exemplifies the substantial reinforcing nature of this kind of nanofiller and its good interaction with the polymer chains. At the same time, it was found that thermo-oxidative degradation of the PP/EPDM nanocomposites is noticeably retarded with increasing graphene content. The electrical surface resistivity of the nanocomposite changed dramatically upon formation of an electrical percolation network, shifting the electrical behavior from insulator to semiconductor. Furthermore, these results were confirmed by scanning electron microscopy (SEM), dynamic mechanical thermal analysis (DMTA), and transmission electron microscopy (TEM).
Keywords: nanocomposite, graphene, microstructure, mechanical properties
Procedia PDF Downloads 332
9128 An Evaluation Model for Enhancing Flexibility in Production Systems through Additive Manufacturing
Authors: Angela Luft, Sebastian Bremen, Nicolae Balc
Abstract:
Additive manufacturing processes have entered large parts of industry, and their range of application has grown significantly over time. A major advantage of additive manufacturing is the innate flexibility of the machines. This correlates with the ongoing demand for highly flexible production environments. However, the potential of additive manufacturing technologies to enhance the flexibility of production systems has not yet been fully considered and quantified in a systematic way. In order to determine the potential of additive manufacturing technologies with regard to the strategic flexibility design of production systems, an integrated evaluation model has been developed that allows for the simultaneous consideration of both conventional and additive production resources. With the described model, an operational scope of action can be identified and quantified in terms of mix and volume flexibility, process complexity, and machine capacity that goes beyond current cost-oriented approaches and offers a much broader and more holistic view of the potential of additive manufacturing. The evaluation model is presented in this paper.
Keywords: additive manufacturing, capacity planning, production systems, strategic production planning, flexibility enhancement
Procedia PDF Downloads 161
9127 Momentum Profits and Investor Behavior
Authors: Aditya Sharma
Abstract:
Profits earned from the relative strength strategy of a zero-cost portfolio, i.e., taking long positions in winner stocks and short positions in loser stocks from the recent past, are termed momentum profits. In recent times, there has been a lot of controversy and concern about the sources of momentum profits, since the existence of these profits acts as evidence of earning non-normal returns from publicly available information, directly contradicting the Efficient Market Hypothesis. A literature review reveals conflicting theories and differing evidence on the sources of momentum profits. This paper re-examines the sources of momentum profits in Indian capital markets. The study focuses on assessing the effect of fundamental as well as behavioral sources in order to understand the role of investor behavior in stock returns and to suggest improvements, if any, to existing behavioral asset pricing models. The paper adopts a calendar-time methodology to calculate momentum profits for 6 different strategies, with and without skipping a month between the ranking and holding periods. For each J/K strategy, under this methodology, at the beginning of each month t stocks are ranked on the past j months' average returns and sorted in descending order. Stocks in the upper decile are termed winners and those in the bottom decile losers. After ranking, long and short positions are taken in winner and loser stocks, respectively, and both portfolios are held for the next k months, such that at any given point in time there are k overlapping long portfolios and k overlapping short portfolios, formed from month t-1 back to month t-k. At the end of the period, the returns of the long and short portfolios are calculated as equally weighted averages across all months. The long-minus-short (LMS) returns are the momentum profits for each strategy. After testing for momentum profits, CAPM- and Fama-French three-factor-model-adjusted LMS returns are calculated to study the role market risk plays in momentum profits.
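The J/K portfolio construction described above can be sketched in code; the following is a minimal illustration on simulated (not actual Indian market) returns, using a top/bottom decile split and k overlapping monthly portfolios:

```python
import numpy as np
import pandas as pd

def momentum_lms(returns: pd.DataFrame, j: int = 6, k: int = 6) -> pd.Series:
    """Calendar-time J/K momentum: each month, rank stocks on the past
    j-month mean return, go long the top decile and short the bottom decile,
    hold k months, and average the k overlapping portfolios each month."""
    formation = returns.rolling(j).mean().shift(1)   # ranking-period signal
    lms = {}
    for t in range(j + k, len(returns)):
        legs = []
        for lag in range(1, k + 1):                  # k overlapping portfolios
            sig = formation.iloc[t - lag + 1].dropna()
            winners = sig[sig >= sig.quantile(0.9)].index
            losers = sig[sig <= sig.quantile(0.1)].index
            legs.append(returns.iloc[t][winners].mean()
                        - returns.iloc[t][losers].mean())
        lms[returns.index[t]] = np.mean(legs)        # equally weighted average
    return pd.Series(lms)

# Toy data: 60 months x 50 hypothetical stocks
rng = np.random.default_rng(1)
rets = pd.DataFrame(rng.normal(0.01, 0.05, (60, 50)),
                    index=pd.period_range("2015-01", periods=60, freq="M"))
print(momentum_lms(rets).mean())
```

On independent simulated returns the LMS series should average near zero; momentum, if present in real data, shows up as a significantly positive mean, which is then risk-adjusted against the CAPM and Fama-French factors.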
In the final phase of studying the sources, a decomposition methodology has been used to break the profits into unconditional means, serial correlations, and cross-serial correlations. This methodology is unbiased, can be used with the decile-based methodology, and makes it possible to test the effects of behavioral and fundamental sources together. The analysis found that momentum profits do exist in Indian capital markets, with market risk playing little role in explaining them. It was also observed that although momentum profits have multiple sources (risk, serial correlations, and cross-serial correlations), cross-serial correlations, i.e., the effect of the returns of other stocks, play the major role. This means that in addition to studying investors' reactions to information about a given firm, it is also important to study how they react to information about other firms. The analysis confirms that investor behavior does play an important role in stock returns and that incorporating both aspects of investors' reactions into behavioral asset pricing models helps make them better.
Keywords: investor behavior, momentum effect, sources of momentum, stock returns
Procedia PDF Downloads 308
9126 Problems Arising in Visual Perception: A Philosophical and Epistemological Analysis
Authors: K. A. Tharanga, K. H. H. Damayanthi
Abstract:
Perception is an epistemological concept discussed in philosophy. Perception, in other words vision, is one of the ways in which human beings acquire empirical knowledge through the five senses. However, we face innumerable problems when deriving knowledge from perception, and therefore the knowledge gained through perception is uncertain: what we see in the external world is not necessarily real. These are the major issues that we face when receiving knowledge through perception. Sometimes there is no physical existence of what we really see; in such cases, the perception is relative. The following frames will be taken into consideration when perception is analyzed: illusions and delusions; the figure of a physical object; the appearance and reality of a physical object; the time factor; and the colour of a physical object. Seeing and knowing vary according to the above conceptual frames. We cannot come to a proper conclusion about what we see in the empirical world, because the things that we see are not really there. Hence the scientific knowledge gained from observation is doubtful. All the factors discussed in science remain in the physical world. There is a leap from one's existence to the existence of a world outside one's mind. Indeed, one can suppose that what he or she takes to be real is just a massive deception. However, depending on the above facts, if someone begins to doubt the whole world, it is unavoidable that his or her view becomes scepticism or nihilism. This is a certain reality.
Keywords: empirical, perception, scepticism, nihilism
Procedia PDF Downloads 145
9125 Automation of Finite Element Simulations for the Design Space Exploration and Optimization of Type IV Pressure Vessel
Authors: Weili Jiang, Simon Cadavid Lopera, Klaus Drechsler
Abstract:
Fuel cell vehicles have become the most competitive solution for the transportation sector in the hydrogen economy. The Type IV pressure vessel is currently the most popular and widely developed technology for on-board storage, owing to its high reliability and relatively low cost. Due to the stringent requirements on mechanical performance, the pressure vessel requires a great amount of composite material, a major cost driver for hydrogen tanks. Evidently, the optimization of the composite layup design shows great potential for reducing overall material usage, yet it requires a comprehensive understanding of the underlying mechanisms as well as the influence of different design parameters on mechanical performance. Given the type of materials and manufacturing processes by which Type IV pressure vessels are manufactured, the design and optimization are a nuanced subject. The manifold of possible stacking sequences and fiber orientation variations has an outstanding effect on vessel strength due to the anisotropic properties of carbon fiber composites, which makes the design space high dimensional. Each variation of design parameters requires computational resources. Using finite element analysis to evaluate different designs is the most common method; however, the modeling, setup, and simulation process can be very time consuming and result in high computational cost. For this reason, it is necessary to build a reliable automation scheme to set up and analyze the diverse composite layups. In this research, the simulation process for different tank designs with various parameters is conducted and automated in the commercial finite element analysis framework Abaqus. Notably, the modeling of the composite overwrap is automatically generated using an Abaqus-Python scripting interface.
The prediction of the winding angle of each layer and the corresponding thickness variation in the dome region is the most crucial step of the modeling; it is calculated and implemented using analytical methods. Subsequently, the different composite layups are simulated as axisymmetric models to limit computational complexity and reduce calculation time. Finally, the results are evaluated and compared with regard to ultimate tank strength. By automatically modeling, evaluating, and comparing various composite layups, this system is applicable to the optimization of tank structures. As mentioned above, the mechanical properties of the pressure vessel are highly dependent on the composite layup, which requires a large number of simulations. Consequently, automating the simulation process provides a rapid way to compare the various designs and indicate the optimum one. Moreover, this automation process can also be used to create a data bank of layups and corresponding mechanical properties, with few preliminary configuration steps, for further case analysis, subsequently using, e.g., machine learning to find the optimum directly from the data pool without the simulation process.
Keywords: type IV pressure vessels, carbon composites, finite element analysis, automation of simulation process
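The analytical winding-angle and dome-thickness prediction mentioned above is commonly based on Clairaut's relation for geodesic winding on a surface of revolution, together with a fiber-continuity thickness build-up; the sketch below uses these standard relations with illustrative dimensions (the polar opening, cylinder radius, and cylinder-section thickness are assumptions, not values from the study):

```python
import math

def geodesic_winding(r, r_polar):
    """Clairaut's relation for a geodesic path on a dome of revolution:
    r * sin(alpha) = r_polar, hence alpha(r) = asin(r_polar / r)."""
    return math.asin(r_polar / r)

def layer_thickness(r, r_cyl, r_polar, t_cyl):
    """Fiber-continuity thickness build-up toward the polar opening:
    t(r) = t_cyl * (r_cyl * cos(alpha_cyl)) / (r * cos(alpha(r)))."""
    a_cyl = geodesic_winding(r_cyl, r_polar)
    a_r = geodesic_winding(r, r_polar)
    return t_cyl * (r_cyl * math.cos(a_cyl)) / (r * math.cos(a_r))

r_cyl, r_polar, t_cyl = 0.175, 0.03, 0.002   # metres, illustrative only
for r in (0.175, 0.12, 0.06, 0.04):
    a = math.degrees(geodesic_winding(r, r_polar))
    t = layer_thickness(r, r_cyl, r_polar, t_cyl)
    print(f"r={r:.3f} m  alpha={a:5.1f} deg  t={t*1000:.2f} mm")
```

As the parallel radius shrinks toward the polar opening, the winding angle steepens toward 90 degrees and the layer thickness builds up, which is exactly the dome-region variation that must be fed into the axisymmetric FE model.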
Procedia PDF Downloads 139
9124 The Use of Substances and Sports Performance among Youth: Implications for Lagos State Sports
Authors: Osifeko Olalekan Remigious, Adesanya Adebisi Joseph, Omolade Akinmade Olatunde
Abstract:
The focus of this study was to determine the factors associated with the use of substances for sports performance among youth in Lagos State sport. A questionnaire was the instrument used for the study, and a descriptive research method was adopted. The estimated population for the study was 2000 sportsmen and sportswomen, and the sample size was 200 respondents, selected using purposive sampling techniques. The instrument was validated for its content and construct value and was administered with the assistance of the coaches. All 200 copies administered were returned. The data obtained were analysed using simple percentages and chi-square (x2) for the stated hypotheses at the 0.05 level of significance. The findings reveal that sports injuries, exercise-induced anaphylaxis and asthma, and feelings of loss of efficacy are associated with alcohol use and sports performance among users of substances. Alcohol users are recommended to partake in sports like swimming, basketball, and volleyball because these allow periods of rest during play. Government should be fully in charge of the health of sportsmen and sportswomen.
Keywords: implications, Lagos state, substances, sports performance, youth
Procedia PDF Downloads 587
9123 Dual Mode “Turn On-Off-On” Photoluminescence Detection of EDTA and Lead Using Moringa Oleifera Gum-Derived Carbon Dots
Authors: Anisha Mandal, Swambabu Varanasi
Abstract:
Lead is one of the most prevalent toxic heavy metal ions, and its pollution poses a significant threat to the environment and human health. Ethylenediaminetetraacetic acid (EDTA), on the other hand, is a widely used metal-chelating agent that, owing to its poor biodegradability, is a persistent pollutant in the environment. For the first time, a green, simple, and cost-effective single-step hydrothermal approach is used to synthesise photoluminescent carbon dots from Moringa Oleifera Gum. A photoluminescent "ON-OFF-ON" mechanism for dual-mode detection of trace Pb2+ and EDTA is then proposed using the Moringa Oleifera Gum-derived carbon dots (MOG-CDs). The MOG-CDs detect Pb2+ selectively and sensitively through a photoluminescence quenching mechanism, with a limit of detection (LOD) of 0.000472 ppm (1.24 nM). The quenched photoluminescence can be restored by adding EDTA to the MOG-CD+Pb2+ system; this strategy is used to quantify EDTA with a LOD of 0.0026 ppm (8.9 nM). The quantification of Pb2+ and EDTA in real samples demonstrated the applicability and reliability of the proposed photoluminescent probe.
Keywords: carbon dots, photoluminescence, sensor, moringa oleifera gum
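Detection limits quoted in ppm (approximately mg/L in dilute aqueous solution) convert to molar units via the molar mass; the quick sketch below checks the EDTA figure above (the 292.24 g/mol value is for free-acid EDTA, an assumption, and the molar mass underlying the quoted Pb2+ conversion is not stated in the abstract):

```python
def ppm_to_nM(ppm, molar_mass):
    """ppm ~ mg/L in dilute water: mg/L -> g/L (x 1e-3)
    -> mol/L (/ molar mass in g/mol) -> nmol/L (x 1e9)."""
    return ppm * 1e-3 / molar_mass * 1e9

M_EDTA = 292.24  # g/mol, free-acid EDTA (assumed species)
print(round(ppm_to_nM(0.0026, M_EDTA), 1))  # EDTA LOD in nM
```

The 0.0026 ppm EDTA figure comes out at about 8.9 nM, matching the value reported in the abstract.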
Procedia PDF Downloads 121
9122 Emerging Threats and Adaptive Defenses: Navigating the Future of Cybersecurity in a Hyperconnected World
Authors: Olasunkanmi Jame Ayodeji, Adebayo Adeyinka Victor
Abstract:
In a hyperconnected world, cybersecurity faces a continuous evolution of threats that challenge traditional defence mechanisms. This paper explores emerging cybersecurity threats like malware, ransomware, phishing, social engineering, and the Internet of Things (IoT) vulnerabilities. It delves into the inadequacies of existing cybersecurity defences in addressing these evolving risks and advocates for adaptive defence mechanisms that leverage AI, machine learning, and zero-trust architectures. The paper proposes collaborative approaches, including public-private partnerships and information sharing, as essential to building a robust defence strategy to address future cyber threats. The need for continuous monitoring, real-time incident response, and adaptive resilience strategies is highlighted to fortify digital infrastructures in the face of escalating global cyber risks.
Keywords: cybersecurity, hyperconnectivity, malware, adaptive defences, zero-trust architecture, internet of things vulnerabilities
Procedia PDF Downloads 32
9121 Improving the Budget Distribution Procedure to Ensure Smooth and Efficient Public Service Delivery
Authors: Rizwana Tabassum
Abstract:
Introductory Statement: Delay in budget releases is often cited as one of the biggest bottlenecks to smooth and efficient service delivery. While budget release from the Ministry of Finance to the line ministries has been expedited by simplifying the procedure, budget distribution within the line ministries remains one of the major causes of slow budget utilization. While budget preparation is a bottom-up process in which all drawing and disbursing officers (DDOs) submit their proposals to their controlling officers (for example, an Upazila Civil Surgeon sends them to the Director General of Health), who consolidate the budget proposals in the iBAS++ budget preparation module, the approved budget is not disaggregated by DDO. Instead, it is left to the discretion of the controlling officers to distribute the approved budget to their subordinate offices over the course of the year. Though there are some need-based criteria/formulae for distributing the approved budget among DDOs in some sectors, there is little evidence that these criteria are actually used. This means that the majority of DDOs do not know their yearly allocations upfront, which would enable yearly planning of activities and expenditures. This delays the implementation of critical activities and the payment of suppliers of goods and services, and sometimes leads to undocumented arrears to suppliers of essential goods and services. In addition, social sector budgets are fragmented because of vertical programs and externally financed interventions, which pose several management challenges for budget holders and frontline service providers. Slow procurement processes further delay the provision of necessary goods and services. For example, it takes an average of 15–18 months for drugs to reach the Upazila Health Complex and below, while procuring and distributing them should take no more than 9 months. Aim of the Study: This paper aims to investigate the budget distribution practices of an emerging economy, Bangladesh.
The paper identifies the challenges of timely distribution and ways to address them. Methodology: The study draws its conclusions from document analysis, a qualitative research method. Major Findings: Upon approval of the national budget, the Ministry of Finance is required to distribute the budget to budget holders at the department level; however, the budget reaches the drawing and disbursing officers much later. Conclusions: Timely and predictable budget releases assist the completion of development schemes on time and on budget, with sufficient recurrent resources for effective operation. ADP implementation is usually very low at the beginning of the fiscal year and is expedited dramatically during the last few months, leading to inefficient use of resources. Timely budget release will resolve this issue and deliver economic benefits faster, better, and more reliably. It will also give project directors/DDOs the freedom to plan budget execution in a predictable manner, thereby ensuring value for money by reducing time overruns, expediting the completion of capital investments, and improving infrastructure utilization through timely payment of recurrent costs.
Keywords: budget distribution, challenges, digitization, emerging economy, service delivery
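A need-based distribution formula of the kind referred to above can be as simple as a weighted proportional split of the approved budget across DDOs; the following sketch is purely hypothetical (the office names, need scores, and the population-based criterion are illustrative, not drawn from the study):

```python
def distribute_budget(total, needs):
    """Proportional, need-based split of an approved budget across DDOs.
    `needs` maps each DDO to a need score (e.g. catchment population);
    largest-remainder rounding keeps the allocations summing to `total`."""
    total_need = sum(needs.values())
    raw = {ddo: total * n / total_need for ddo, n in needs.items()}
    alloc = {ddo: int(v) for ddo, v in raw.items()}
    # hand out the rounding remainder, largest fractional part first
    leftover = total - sum(alloc.values())
    for ddo in sorted(raw, key=lambda d: raw[d] - alloc[d], reverse=True)[:leftover]:
        alloc[ddo] += 1
    return alloc

# Hypothetical upazila offices with population-based need scores
needs = {"Upazila A": 120_000, "Upazila B": 80_000, "Upazila C": 50_000}
print(distribute_budget(1_000_000, needs))
```

Publishing such a split at the start of the fiscal year would give every DDO its allocation upfront, which is exactly the predictability the paper argues is missing.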
Procedia PDF Downloads 85
9120 MEIOSIS: Museum Specimens Shed Light in Biodiversity Shrinkage
Authors: Zografou Konstantina, Anagnostellis Konstantinos, Brokaki Marina, Kaltsouni Eleftheria, Dimaki Maria, Kati Vassiliki
Abstract:
Body size is crucial to ecology, influencing everything from individual reproductive success to the dynamics of communities and ecosystems. Understanding how temperature affects variations in body size is vital for both theoretical and practical purposes, as changes in size can modify trophic interactions by altering predator-prey size ratios and changing the distribution and transfer of biomass, which ultimately impacts food web stability and ecosystem functioning. Notably, a decrease in body size is frequently mentioned as the third "universal" response to climate warming, alongside shifts in distribution and changes in phenology. This trend is backed by ecological theories like the temperature-size rule (TSR) and Bergmann's rule, which have been observed in numerous species, indicating that many species are likely to shrink in size as temperatures rise. However, the thermal responses related to body size are still contradictory, and further exploration is needed. To tackle this challenge, we developed the MEIOSIS project, aimed at providing valuable insights into the relationship between the body size of species, species' traits, environmental factors, and their response to climate change. We combined a digitized butterfly collection from the Swiss Federal Institute of Technology in Zürich with our newly digitized butterfly collection from the Goulandris Natural History Museum in Greece to analyse trends over time. For a total of 23868 images, the length of the right forewing was measured using ImageJ software. Each forewing was measured from the point at which the wing meets the thorax to the apex of the wing. The forewing length of museum specimens has been shown to have a strong correlation with wing surface area and has been utilized in prior studies as a proxy for overall body size. Temperature data corresponding to the years of collection were also incorporated into the datasets.
A second dataset was generated when a custom computer vision tool was implemented for the automated morphological measurement of samples from the digitized collection in Zürich. Using this second dataset, we cross-checked and corrected the manual ImageJ measurements, and a final dataset containing 31922 samples was used for analysis. Setting time as a smooth term, species identity as a random factor, and the length of the right forewing (a proxy for body size) as the response variable, we ran a global model for a maximum period of 110 years (1900–2010). We then investigated functional variability between different terrestrial biomes in a second model. Both models confirmed our initial hypothesis and revealed a decreasing trend in body size over the years. We expect that this first output can serve as baseline data for the next challenge, i.e., to identify the ecological traits that influence species' temperature-size responses, enabling us to predict the direction and intensity of a species' reaction to rising temperatures more accurately. Keywords: butterflies, shrinking body size, museum specimens, climate change
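As a simplified illustration of the trend analysis, the sketch below fits an ordinary least-squares slope of forewing length against collection year on synthetic, noise-free data; the study itself used a GAM with a time smoother and species as a random factor, so this is only the core idea, not the paper's model.

```python
# Illustrative sketch: linear trend of forewing length vs. collection year.
# Synthetic data only; a negative slope corresponds to shrinking body size.

def linear_trend(years, lengths):
    """Ordinary least-squares slope (mm per year) and intercept."""
    n = len(years)
    my = sum(years) / n
    ml = sum(lengths) / n
    sxy = sum((y - my) * (l - ml) for y, l in zip(years, lengths))
    sxx = sum((y - my) ** 2 for y in years)
    slope = sxy / sxx
    return slope, ml - slope * my

# Hypothetical specimens: wing length shrinking 0.01 mm per year since 1900
years = list(range(1900, 2011, 10))
lengths = [22.0 - 0.01 * (y - 1900) for y in years]

slope, intercept = linear_trend(years, lengths)
print(round(slope, 4))  # -0.01, i.e. a decreasing trend
```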
Procedia PDF Downloads 19
9119 Scalable Performance Testing: Facilitating the Assessment of Application Performance under Substantial Loads and Mitigating the Risk of System Failures
Authors: Solanki Ravirajsinh
Abstract:
In the software testing life cycle, failing to conduct thorough performance testing can result in significant losses for an organization due to application crashes and improper behavior under high user loads in production. Simulating large volumes of requests, such as 5 million within 5-10 minutes, is challenging without a scalable performance testing framework. Leveraging cloud services to implement a performance testing framework makes it feasible to handle 5-10 million requests in just 5-10 minutes, helping organizations ensure their applications perform reliably under peak conditions. Implementing a scalable performance testing framework using cloud services and tools like JMeter, EC2 instances (virtual machines), CloudWatch logs (for monitoring errors and logs), EFS (a shared file storage system), and security groups offers several key benefits for organizations. Building a performance testing framework this way helps optimize resource utilization, enables effective benchmarking, increases reliability, and saves costs by resolving performance issues before the application is released. In performance testing, a master-slave framework facilitates distributed testing across multiple EC2 instances to emulate many concurrent users and efficiently handle high loads. The master node orchestrates the test execution by coordinating with multiple slave nodes to distribute the workload. Slave nodes execute the test scripts provided by the master node, with each node handling a portion of the overall user load and generating requests to the target application or service. By leveraging JMeter's master-slave framework in conjunction with cloud services like EC2 instances, EFS, CloudWatch logs, security groups, and command-line tools, organizations can achieve superior scalability and flexibility in their performance testing efforts. In this master-slave framework, JMeter must be installed on both the master and each slave EC2 instance.
The master EC2 instance functions as the "brain," while the slave instances operate as the "body parts." The master directs each slave to execute a specified number of requests. Upon completion of the execution, the slave instances transmit their results back to the master. The master then consolidates these results into a comprehensive report detailing metrics such as the number of requests sent, encountered errors, network latency, response times, server capacity, throughput, and bandwidth. Leveraging cloud services, the framework benefits from automatic scaling based on the volume of requests. Notably, integrating cloud services allows organizations to handle 5-10 million requests or more within 5 minutes, depending on the server capacity of the hosted website or application. Keywords: identifying crashes of applications under heavy load, JMeter with cloud services, scalable performance testing, JMeter master and slave using cloud services
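The master/slave division of load described above can be mimicked in a short, hypothetical Python sketch; it counts mock requests rather than driving JMeter, and all names and numbers are illustrative:

```python
# Hedged sketch of master/slave load splitting: the "master" divides a
# total request budget across worker "slaves"; mock workers count their
# share instead of issuing real HTTP requests.
from concurrent.futures import ThreadPoolExecutor

def split_load(total_requests, n_slaves):
    """Master-side arithmetic: near-equal share per slave."""
    base, rem = divmod(total_requests, n_slaves)
    return [base + (1 if i < rem else 0) for i in range(n_slaves)]

def slave_run(share):
    # A real slave would execute its test plan and report full metrics.
    return {"sent": share, "errors": 0}

shares = split_load(5_000_000, 8)
with ThreadPoolExecutor(max_workers=8) as pool:
    reports = list(pool.map(slave_run, shares))

# Master-side consolidation of slave reports
total_sent = sum(r["sent"] for r in reports)
print(total_sent)  # 5000000
```

In an actual JMeter deployment the equivalent step is launching the controller in non-GUI mode with the remote hosts listed, e.g. `jmeter -n -t plan.jmx -R slave1,slave2`.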
Procedia PDF Downloads 34
9118 The U.S. Missile Defense Shield and Global Security Destabilization: An Inconclusive Link
Authors: Michael A. Unbehauen, Gregory D. Sloan, Alberto J. Squatrito
Abstract:
Missile proliferation and global stability are intrinsically linked. Missile threats continually appear at the forefront of global security issues. North Korea’s recently demonstrated nuclear and intercontinental ballistic missile (ICBM) capabilities, for the first time since the Cold War, renewed public interest in strategic missile defense capabilities. To protect from limited ICBM attacks from so-called rogue actors, the United States developed the Ground-based Midcourse Defense (GMD) system. This study examines if the GMD missile defense shield has contributed to a safer world or triggered a new arms race. Based upon increased missile-related developments and the lack of adherence to international missile treaties, it is generally perceived that the GMD system is a destabilizing factor for global security. By examining the current state of arms control treaties as well as existing missile arsenals and ongoing efforts in technologies to overcome U.S. missile defenses, this study seeks to analyze the contribution of GMD to global stability. A thorough investigation cannot ignore that, through the establishment of this limited capability, the U.S. violated longstanding, successful weapons treaties and caused concern among states that possess ICBMs. GMD capability contributes to the perception that ICBM arsenals could become ineffective, creating an imbalance in favor of the United States, leading to increased global instability and tension. While blame for the deterioration of global stability and non-adherence to arms control treaties is often placed on U.S. missile defense, the facts do not necessarily support this view. The notion of a renewed arms race due to GMD is supported neither by current missile arsenals nor by the inevitable development of new and enhanced missile technology, to include multiple independently targeted reentry vehicles (MIRVs), maneuverable reentry vehicles (MaRVs), and hypersonic glide vehicles (HGVs). 
The methodology in this study covers the periods before and after GMD's introduction, analyzing international treaty adherence, missile counts and types, and research into new missile technologies. The decline in international treaty adherence, coupled with a measurable increase in the number and types of missiles or research in new missile technologies during the period after the introduction of GMD, could be perceived as a clear indicator of GMD contributing to global instability. However, research into improved technology (MIRV, MaRV and HGV) prior to GMD, as well as a decline of various global missile inventories and testing of systems during this same period, would seem to invalidate this theory. U.S. adversaries have exploited the perception of the U.S. missile defense shield as a destabilizing factor as a pretext to strengthen and modernize their militaries and justify their policies. As a result, it can be concluded that global stability has not significantly decreased due to GMD; rather, the natural progression of technological and missile development would inherently include innovative and dynamic approaches to target engagement, deterrence, and national defense. Keywords: arms control, arms race, global security, GMD, ICBM, missile defense, proliferation
Procedia PDF Downloads 147
9117 Investigation of the Variables Affecting the Use of Charcoal to Delay Fermentation in Wet Beans Slurry Using Chemical and Physical Analysis
Authors: Anuoluwapo O. Adewole
Abstract:
Fermentation is the conversion of monomeric sugars into ethanol and carbon dioxide in the presence of microorganisms under anaerobic conditions. In line with the aim and objective of this research project, which is to investigate the variables affecting the use of charcoal to delay fermentation in wet beans slurry, some physical and chemical analyses were carried out on the wet beans slurry using a pH meter with an integrated thermometer, and a measuring cylinder was used for the foam level test. About 250 grams of the ground beans slurry was divided into two portions for testing. The sample with charcoal was labeled sample 'A' while the sample without charcoal was labeled sample 'B'. The experiment lasted for a period of 41.15 hours (i.e., forty-one hours and nine minutes). During the fourth round of testing, both samples could not be tested as the laboratory had become saturated with foul odor, and both samples were packed and sealed in a polythene bag for disposal in the trash can. It was generally observed that the sample with charcoal lasted longer than the one without before total spoilage occurred. Keywords: fermentation, monomeric sugars, beans slurry, charcoal, anaerobic conditions
Procedia PDF Downloads 338
9116 Prediction of the Thermal Parameters of a High-Temperature Metallurgical Reactor Using Inverse Heat Transfer
Authors: Mohamed Hafid, Marcel Lacroix
Abstract:
This study presents an inverse analysis for predicting the thermal conductivities and the heat flux of a high-temperature metallurgical reactor simultaneously. Once these thermal parameters are predicted, the time-varying thickness of the protective phase-change bank that covers the inside surface of the brick walls of a metallurgical reactor can be calculated. The enthalpy method is used to solve the melting/solidification process of the protective bank. The inverse model rests on the Levenberg-Marquardt Method (LMM) combined with the Broyden method (BM). A statistical analysis for the thermal parameter estimation is carried out. The effect of the position of the temperature sensors, total number of measurements and measurement noise on the accuracy of inverse predictions is investigated. Recommendations are made concerning the location of temperature sensors. Keywords: inverse heat transfer, phase change, metallurgical reactor, Levenberg–Marquardt method, Broyden method, bank thickness
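To illustrate the inverse-analysis idea in miniature, the sketch below recovers a single thermal conductivity from synthetic temperature readings with a damped Gauss-Newton (Levenberg-Marquardt-style) loop. The geometry, sensor depths, hot-face temperature, and heat flux are invented, and the paper's actual model (enthalpy method, Broyden updates, multiple parameters) is far richer.

```python
# Minimal Levenberg-Marquardt-style sketch: estimate thermal conductivity k
# from temperature readings in a steady 1-D conducting wall. Illustrative
# values only, not the paper's reactor model.

def model(k, q, x):
    """Steady 1-D conduction: hot-face 1000 degC minus drop q*x/k."""
    return [1000.0 - q * xi / k for xi in x]

def fit_k(x, t_meas, q, k0=1.0, lam=1e-3, iters=50):
    k = k0
    for _ in range(iters):
        r = [tm - tp for tm, tp in zip(t_meas, model(k, q, x))]
        J = [q * xi / k ** 2 for xi in x]        # d(model)/dk
        jtj = sum(j * j for j in J)
        jtr = sum(j * ri for j, ri in zip(J, r))
        k += jtr / (jtj + lam * jtj)             # damped Gauss-Newton step
    return k

x = [0.05, 0.10, 0.15, 0.20]    # hypothetical sensor depths (m)
q = 5000.0                      # hypothetical heat flux (W/m^2)
true_k = 2.5                    # W/(m K)
t_meas = model(true_k, q, x)    # noise-free synthetic "measurements"

print(round(fit_k(x, t_meas, q), 3))  # recovers 2.5
```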
Procedia PDF Downloads 336
9115 The Stock Price Effect of Apple Keynotes
Authors: Ethan Petersen
Abstract:
In this paper, we analyze the volatility of Apple's stock from January 3, 2005 through October 9, 2014, then focus on a range from 30 days prior to each product announcement until 30 days after. Product announcements are filtered; announcements whose 60-day range is devoid of other events are separated. This filtration is chosen to isolate, and study, a potential cross-effect. Concerning Apple keynotes, there are two significant dates: the day the invitations to the event are received and the day of the event itself. As such, the statistical analysis is conducted for both invite-centered and event-centered time frames. A comparison to the VIX is made to determine if the trend is simply following the market or deviating from it. Regardless of the filtration, we find that there is a clear deviation from the market. Comparing these data sets, there are significantly different trends: isolated events show a constantly decreasing, erratic trend in volatility, whereas an increasing, linear trend is observed for clustered events. According to the Efficient Market Hypothesis, we would expect a change when new information becomes publicly known, and the results of this study support this claim. Keywords: efficient market hypothesis, event study, volatility, VIX
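The core event-window computation can be sketched as follows; the return series is synthetic and the 30-day window mirrors the paper's setup, but nothing here uses actual Apple or VIX data:

```python
# Sketch of an event-study volatility comparison: realised volatility
# (sample std. dev. of daily returns) in the 30 days before vs. after an
# event. Synthetic returns only.
import math

def stdev(xs):
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

def event_window_vol(returns, event_idx, window=30):
    pre = returns[event_idx - window:event_idx]
    post = returns[event_idx + 1:event_idx + 1 + window]
    return stdev(pre), stdev(post)

# Hypothetical series: calm before the keynote (index 60), noisier after.
returns = [0.001 * (-1) ** i for i in range(60)] + \
          [0.02 * (-1) ** i for i in range(61)]
pre_vol, post_vol = event_window_vol(returns, event_idx=60)

print(post_vol > pre_vol)  # volatility rises after the announcement: True
```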
Procedia PDF Downloads 283
9114 An Integrated Emergency Management System for the Tourism Industry in Oman
Authors: Majda Al Salti
Abstract:
The tourism industry is considered globally as one of the leading industries due to its noticeable contribution to countries' gross domestic product (GDP) and job creation. However, tourism is vulnerable to crises and disasters, which requires preparedness. With its limited capabilities, there is a need to improve links and understanding between the tourism industry and the emergency services, thus facilitating future emergency response to any potential incident. This study aims to develop the concept of an integrated emergency management system for the tourism industry. The study used face-to-face semi-structured interviews to evaluate the level of crisis and disaster preparedness of the tourism industry in Oman. The findings suggested that there is a lack of understanding of crisis and disaster management, and hence the preparedness level among Oman's tourism authorities appears to be below expectations. Therefore, inter- and intra-sector integration and collaboration in tourism is important in the pre-disaster stage. Such integration can help the tourism industry in Oman prepare for future incidents as well as identify its requirements in times of crisis for an effective response. Keywords: tourism, emergency services, crisis, disaster
Procedia PDF Downloads 122
9113 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading
Authors: Robert Caulk
Abstract:
A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contains enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training dataset and using the parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed in the presentation. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g. TA-lib, pandas-ta).
The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, scikit-learn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides the road map for future development in FreqAI. Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration
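A minimal sketch of the parameter-space outlier check described above might look like this; FreqAI's real implementation is more sophisticated, and the bounds-based rule here is only one simple possibility (all data and names are invented):

```python
# Hedged sketch: discard a prediction point when any feature falls outside
# the range spanned by the recent training window. This is one simple
# stand-in for the parameter-space outlier removal described in the text.

def feature_bounds(train_rows):
    """Per-feature (min, max) over the training dataset."""
    cols = list(zip(*train_rows))
    return [(min(c), max(c)) for c in cols]

def is_inlier(point, bounds, tol=0.0):
    return all(lo - tol <= v <= hi + tol
               for v, (lo, hi) in zip(point, bounds))

train = [[1.0, 10.0], [2.0, 12.0], [1.5, 11.0]]  # toy training features
bounds = feature_bounds(train)

print(is_inlier([1.2, 11.5], bounds))  # inside the trained space: True
print(is_inlier([5.0, 11.0], bounds))  # feature 0 out of range: False
```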
Procedia PDF Downloads 96
9112 Using New Machine Algorithms to Classify Iranian Musical Instruments According to Temporal, Spectral and Coefficient Features
Authors: Ronak Khosravi, Mahmood Abbasi Layegh, Siamak Haghipour, Avin Esmaili
Abstract:
In this paper, a study on the classification of musical woodwind instruments was carried out using a small set of features selected from a broad range of extracted ones by the sequential forward selection method. Firstly, we extracted 42 features for each record in the music database of 402 sound files belonging to five different groups: Flutes (end-blown and internal duct), Single-reed, Double-reed (exposed and capped), Triple-reed and Quadruple-reed. Then, the sequential forward selection method was adopted to choose the best feature set in order to achieve very high classification accuracy. Two different classification techniques, support vector machines and relevance vector machines, were tested, and an accuracy of up to 96% can be achieved by using 21 time, frequency and coefficient features and a relevance vector machine with the Gaussian kernel function. Keywords: coefficient features, relevance vector machines, spectral features, support vector machines, temporal features
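Sequential forward selection itself is easy to sketch: greedily add the single feature that most improves a classifier's accuracy. The toy example below uses a nearest-centroid classifier standing in for the SVM/RVM classifiers actually used, on invented data, not the 42 audio features of the study.

```python
# Illustrative sequential forward selection (SFS) with a nearest-centroid
# classifier on toy data; a stand-in sketch, not the paper's SVM/RVM setup.

def centroid_accuracy(X, y, feats):
    """Training accuracy of a nearest-centroid rule on selected features."""
    labels = sorted(set(y))
    cents = {}
    for lab in labels:
        rows = [[x[f] for f in feats] for x, yy in zip(X, y) if yy == lab]
        cents[lab] = [sum(c) / len(c) for c in zip(*rows)]
    correct = 0
    for x, yy in zip(X, y):
        p = [x[f] for f in feats]
        pred = min(labels, key=lambda l: sum((a - b) ** 2
                                             for a, b in zip(p, cents[l])))
        correct += pred == yy
    return correct / len(y)

def forward_select(X, y, n_feats):
    selected, remaining = [], list(range(len(X[0])))
    while len(selected) < n_feats and remaining:
        best = max(remaining,
                   key=lambda f: centroid_accuracy(X, y, selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy data: feature 1 separates the classes; features 0 and 2 are noise.
X = [[0.1, 0.0, 5.0], [0.9, 0.1, 5.1], [0.2, 1.0, 5.0], [0.8, 1.1, 4.9]]
y = [0, 0, 1, 1]

print(forward_select(X, y, 1))  # picks the informative feature: [1]
```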
Procedia PDF Downloads 324
9111 Pesticides Monitoring in Surface Waters of the São Paulo State, Brazil
Authors: Fabio N. Moreno, Letícia B. Marinho, Beatriz D. Ruiz, Maria Helena R. B. Martins
Abstract:
Brazil is a top consumer of pesticides worldwide, and the São Paulo State is one of the highest consumers among the Brazilian federative states. However, representative data about the occurrence of pesticides in surface waters of the São Paulo State are scarce. This paper aims to present the results of pesticide monitoring executed within the Water Quality Monitoring Network of CETESB (the Environmental Agency of the São Paulo State) during the 2018-2022 period. Surface water sampling points (21 to 25) were selected within basins of predominantly agricultural land use (5 to 85% cultivated areas). The samples were collected throughout the year, including high-flow and low-flow conditions. The frequency of sampling varied between 4 and 6 times per year. Selection of pesticide molecules for monitoring followed a prioritization process based on EMBRAPA (Brazilian Agricultural Research Corporation) databases of pesticide use. Pesticide extractions from aqueous samples were performed according to USEPA 3510C and 3546 methods, following quality assurance and quality control procedures. Determination of pesticides in water extracts (ng L-1) was performed by high-performance liquid chromatography coupled with mass spectrometry (HPLC-MS) and by gas chromatography with nitrogen-phosphorus (GC-NPD) and electron capture detectors (GC-ECD). The results showed higher frequencies (20-65%) in surface water samples for Carbendazim (fungicide), Diuron/Tebuthiuron (herbicides) and Fipronil/Imidaclopride (insecticides). The frequencies of observation for these pesticides were generally higher at monitoring points located in sugarcane-cultivated areas. The following pesticides were most frequently quantified above the Aquatic Life Benchmarks for freshwater (USEPA Office of Pesticide Programs, 2023) or Brazilian federal regulatory standards (CONAMA Resolution no. 357/2005): Atrazine, Imidaclopride, Carbendazim, 2,4-D, Fipronil, and Chlorpyrifos.
Higher median concentrations for Diuron and Tebuthiuron in the rainy months (October to March) indicated pesticide transport through surface runoff. However, measurable concentrations of Fipronil and Imidaclopride in the dry season (April to September) also indicate pathways related to subsurface or base-flow discharge after pesticide soil infiltration and leaching, or dry deposition following pesticide air spraying. With the exception of Diuron, no temporal trends in the median concentrations of the most frequently quantified pesticides were observed. These results are important to assist policymakers in the development of strategies aimed at reducing pesticide migration from agricultural areas to surface waters. Further studies will be carried out at selected points to investigate potential risks resulting from pesticide exposure of aquatic biota. Keywords: pesticide monitoring, São Paulo State, water quality, surface waters
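The two summary statistics discussed here, detection frequency and seasonal medians, can be sketched as follows; the monitoring records are invented tuples of (month, pesticide, concentration in ng/L, with None for non-detects), not CETESB data:

```python
# Sketch of detection frequency and rainy- vs. dry-season medians for a
# pesticide monitoring dataset. All records below are invented.
from statistics import median

records = [
    (1, "Diuron", 120.0), (2, "Diuron", 95.0), (7, "Diuron", 20.0),
    (8, "Diuron", None), (11, "Fipronil", 15.0), (5, "Fipronil", 18.0),
]

def detection_frequency(records, pesticide):
    hits = [r for r in records if r[1] == pesticide]
    return sum(1 for r in hits if r[2] is not None) / len(hits)

def seasonal_median(records, pesticide, rainy=True):
    months = {10, 11, 12, 1, 2, 3} if rainy else {4, 5, 6, 7, 8, 9}
    vals = [c for m, p, c in records
            if p == pesticide and c is not None and m in months]
    return median(vals) if vals else None

print(detection_frequency(records, "Diuron"))          # 3 of 4 -> 0.75
print(seasonal_median(records, "Diuron", rainy=True))  # 107.5 ng/L
```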
Procedia PDF Downloads 63
9110 Computer-Aided Teaching of Transformers for Undergraduates
Authors: Rajesh Kumar, Roopali Dogra, Puneet Aggarwal
Abstract:
In the era of technological advancement, the use of computer technology has become inevitable. Hence, it has become the need of the hour to integrate software methods into the engineering curriculum to boost pedagogy techniques. Simulation software is a great help to graduates of disciplines such as electrical engineering. Since electrical engineering deals with high voltages and heavy instruments, extra care must be taken while operating them. The viable solution is to have appropriate control, which can be well designed if engineers have knowledge of the kinds of waveforms associated with the system. Though these waveforms can be plotted manually, doing so consumes a lot of time; hence, simulation helps in understanding the steady state of the system, resulting in better performance. In this paper, computer-aided teaching of transformers is carried out using MATLAB/Simulink. The tests carried out on a transformer include the open-circuit test and the short-circuit test. The parameters of the transformer are then calculated in Simulink using the values obtained from the open-circuit and short-circuit tests. Keywords: computer aided teaching, open circuit test, short circuit test, simulink, transformer
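The standard parameter calculations behind these two tests can be sketched numerically (the paper performs them in Simulink); the instrument readings below are invented example values, referred to the supply side:

```python
# Transformer equivalent-circuit parameters from open-circuit (OC) and
# short-circuit (SC) test readings. Example values are hypothetical.
import math

def open_circuit_params(v0, i0, w0):
    """Core-loss resistance R0 and magnetising reactance X0 from OC test."""
    cos_phi0 = w0 / (v0 * i0)                 # no-load power factor
    ic = i0 * cos_phi0                        # core-loss current component
    im = i0 * math.sqrt(1 - cos_phi0 ** 2)    # magnetising component
    return v0 / ic, v0 / im

def short_circuit_params(vsc, isc, wsc):
    """Equivalent resistance, impedance and leakage reactance from SC test."""
    zeq = vsc / isc
    req = wsc / isc ** 2
    xeq = math.sqrt(zeq ** 2 - req ** 2)
    return req, zeq, xeq

r0, x0 = open_circuit_params(v0=230.0, i0=0.5, w0=50.0)
req, zeq, xeq = short_circuit_params(vsc=20.0, isc=10.0, wsc=100.0)

print(round(req, 3), round(zeq, 3), round(xeq, 3))  # 1.0 2.0 1.732
```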
Procedia PDF Downloads 381
9109 Mechanical Simulation with Electrical and Dimensional Tests for AISHa Containment Chamber
Authors: F. Noto, G. Costa, L. Celona, F. Chines, G. Ciavola, G. Cuttone, S. Gammino, O. Leonardi, S. Marletta, G. Torrisi
Abstract:
At Istituto Nazionale di Fisica Nucleare – Laboratorio Nazionale del Sud (INFN-LNS), broad experience in the design, construction and commissioning of ECR and microwave ion sources is available. The AISHa ion source has been designed by taking into account the typical requirements of hospital-based facilities, where the minimization of the mean time between failures (MTBF) is a key point, together with the maintenance operations, which should be fast and easy. It is intended to be a multipurpose device, operating at 18 GHz in order to achieve higher plasma densities. It should provide enough versatility for future needs of hadron therapy, including the ability to run at larger microwave power to produce different species and highly charged ion beams. The source is potentially interesting for any hadron therapy facility using heavy ions. In this paper, we present the dimensional and electrical tests of an innovative solution for the containment chamber that allows us to solve our insulation and structural problems. Keywords: FEM analysis, electron cyclotron resonance ion source, dielectric measurement, hadron therapy
Procedia PDF Downloads 294
9108 Performance of Neural Networks vs. Radial Basis Functions When Forming a Metamodel for Residential Buildings
Authors: Philip Symonds, Jon Taylor, Zaid Chalabi, Michael Davies
Abstract:
With the world climate projected to warm and major cities in developing countries becoming increasingly populated and polluted, governments are tasked with the problem of overheating and air quality in residential buildings. This paper presents the development of an adaptable model of these risks. Simulations are performed using the EnergyPlus building physics software. An accurate metamodel is formed by randomly sampling building input parameters and training on the outputs of EnergyPlus simulations. Metamodels are used to vastly reduce the amount of computation time required when performing optimisation and sensitivity analyses. Neural Networks (NNs) are compared to a Radial Basis Function (RBF) algorithm when forming a metamodel. These techniques were implemented using the PyBrain and scikit-learn Python libraries, respectively. NNs are shown to perform around 15% better than RBFs when estimating overheating and air pollution metrics modelled by EnergyPlus. Keywords: neural networks, radial basis functions, metamodelling, Python machine learning libraries
Procedia PDF Downloads 453
9107 The Role of Neuroserpin in Rheumatoid Arthritis Patients
Authors: Sevil Arabaci Tamer, Gonul Gurol, Ibrahim Tekeoglu, Halil Harman, Ihsan Hakki Ciftci
Abstract:
Neuroserpin (NSP) is a serine protease inhibitor and a member of the serpin family. It is expressed in developing and adult nervous systems and acts as an inhibitor of the protease tissue plasminogen activator (tPA) and a regulator of neuronal growth and plasticity. NSP also displays anti-inflammatory activity, but its role in rheumatoid arthritis had never been studied before. So, the aim of the present study was to investigate the effect of neuroserpin in patients with RA. A total of 50 frozen (-20 °C) serum samples, 40 from patients with RA and 10 from healthy subjects, were enrolled prospectively. We used DAS-28 to evaluate disease activity. Clinical data were gathered from the original patients' charts. Serum neuroserpin levels were measured by enzyme-linked immunosorbent assay. Our preliminary study results demonstrate, for the first time, that NSP levels are significantly different in RA patients relative to healthy subjects (P = 0.014), so NSP may contribute to the pathological condition of RA. Thus, we believe that serum NSP levels can serve as a marker in patients with RA; however, other inflammatory diseases should be investigated further. Keywords: neuroserpin, rheumatoid arthritis, tPA, tPA inhibitor
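The kind of two-group comparison reported above is typically done with a two-sample test; the sketch below computes Welch's t statistic on invented NSP values, not the study's data:

```python
# Welch's t statistic for comparing two independent groups with possibly
# unequal variances. All values below are hypothetical, for illustration.
import math

def welch_t(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

ra = [4.1, 3.8, 4.5, 4.0, 4.2]   # hypothetical NSP levels, RA group
healthy = [3.1, 2.9, 3.3, 3.0]   # hypothetical NSP levels, controls

t = welch_t(ra, healthy)
print(t > 0)  # RA group mean exceeds the control mean: True
```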
Procedia PDF Downloads 475