Search results for: modern information technologies
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15395

7505 Confidence Envelopes for Parametric Model Selection Inference and Post-Model Selection Inference

Authors: I. M. L. Nadeesha Jayaweera, Adao Alex Trindade

Abstract:

In choosing a candidate model in likelihood-based modeling via an information criterion, the practitioner is often faced with the difficult task of deciding just how far up the ranked list to look. Motivated by this pragmatic necessity, we construct an uncertainty band for a generalized (model selection) information criterion (GIC), defined as a criterion for which the limit in probability is identical to that of the normalized log-likelihood. This includes common special cases such as AIC and BIC. The method starts from the asymptotic normality of the GIC for the joint distribution of the candidate models in an independent and identically distributed (IID) data framework and proceeds by deriving the (asymptotically) exact distribution of the minimum. The calculation of an upper quantile for its distribution then involves the computation of multivariate Gaussian integrals, which is amenable to efficient implementation via the R package "mvtnorm". The performance of the methodology is tested on simulated data by checking the coverage probability of nominal upper quantiles and compared to the bootstrap. Both methods give coverages close to nominal for large samples, but the bootstrap is two orders of magnitude slower. The methodology is subsequently extended to two other commonly used model structures: regression and time series. In the regression case, we derive the corresponding asymptotically exact distribution of the minimum GIC by invoking Lindeberg-Feller type conditions for triangular arrays and are thus able to similarly calculate upper quantiles for its distribution via multivariate Gaussian integration. The bootstrap once again provides a default competing procedure, and we find that similar comparative performance metrics hold as for the IID case. The time series case is complicated by a far more intricate asymptotic regime for the joint distribution of the model GIC statistics.
Under a Gaussian likelihood, the default in most packages, one needs to derive the limiting distribution of a normalized quadratic form for a realization from a stationary series. Under conditions on the process satisfied by ARMA models, a multivariate normal limit is once again achieved. The bootstrap can, however, be employed for its computation, whence we are once again in the multivariate Gaussian integration paradigm for upper quantile evaluation. Comparisons of this bootstrap-aided semi-exact method with the full-blown bootstrap once again reveal a similar performance but faster computation speeds. One of the most difficult problems in contemporary statistical methodological research is to account for the extra variability introduced by model selection uncertainty, the so-called post-model selection inference (PMSI). We explore ways in which the GIC uncertainty band can be inverted to make inferences on the parameters. This is attempted in the IID case by pivoting the CDF of the asymptotically exact distribution of the minimum GIC. For inference on one parameter at a time and a small number of candidate models, this works well, and the attained PMSI confidence intervals are wider than the MLE-based Wald intervals, as expected.
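The upper-quantile step described above can be sketched numerically. A minimal Python illustration using SciPy's multivariate normal in place of R's "mvtnorm" (the two-model setup in the usage note, and the bracketing strategy, are our own assumptions, not the paper's implementation):

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import multivariate_normal


def min_gic_upper_quantile(mu, Sigma, level=0.95):
    """Upper quantile of min_k Z_k for Z ~ N(mu, Sigma), the joint
    asymptotic law of the candidate-model GIC statistics.

    Uses P(min Z <= q) = 1 - P(Z_1 > q, ..., Z_K > q); the joint
    survival probability is a multivariate Gaussian integral.
    """
    mu = np.asarray(mu, float)
    Sigma = np.asarray(Sigma, float)
    neg = multivariate_normal(mean=-mu, cov=Sigma)  # law of -Z

    def cdf_min(q):
        # P(all Z_k > q) = P(all -Z_k < -q) = CDF of -Z at (-q, ..., -q)
        return 1.0 - neg.cdf(np.full(len(mu), -q))

    # Root-find the quantile on a wide bracket around the means.
    spread = 10.0 * np.sqrt(np.max(np.diag(Sigma)))
    lo, hi = mu.min() - spread, mu.max() + spread
    return brentq(lambda q: cdf_min(q) - level, lo, hi)
```

For two independent standard-normal statistics this reproduces the closed form 1 - (1 - Φ(q))² = level, which provides a quick sanity check on the multivariate integration.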

Keywords: model selection inference, generalized information criteria, post-model selection, asymptotic theory

Procedia PDF Downloads 82
7504 An Algorithm to Compute the State Estimation of Bilinear Dynamical Systems

Authors: Abdullah Eqal Al Mazrooei

Abstract:

In this paper, we introduce a mathematical algorithm for estimating the states of bilinear systems. The algorithm uses a special linearization of the second-order term based on the best available information about the state of the system. This technique makes our algorithm a generalization of the well-known Kalman estimators. The system considered here is of the bilinear class; the evolution of this model is linear-bilinear in the state of the system. Our algorithm can be used with both linear and bilinear systems. We also present a real application of the new algorithm to demonstrate its feasibility and efficiency.
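As a rough illustration of the idea (not the author's exact algorithm), consider an input-bilinear model x_{k+1} = A x_k + u_k(N x_k) + B u_k + w_k. Folding the bilinear term into a time-varying transition matrix evaluated with the current input reduces each step to a standard Kalman recursion; all matrices below are hypothetical:

```python
import numpy as np


def bilinear_kf_step(x, P, u, y, A, N, B, C, Q, R):
    """One predict/update step for the bilinear model
    x_{k+1} = A x + u*(N x) + B u + w,   y = C x + v.
    The second-order (bilinear) term is absorbed into a time-varying
    transition matrix A + u*N, so the usual Kalman equations apply."""
    Ak = A + u * N                          # linearized transition for this step
    x_pred = Ak @ x + B * u                 # state prediction
    P_pred = Ak @ P @ Ak.T + Q              # covariance prediction
    S = C @ P_pred @ C.T + R                # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ (y - C @ x_pred)   # measurement update
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new
```

With u fixed to zero the step collapses to the ordinary linear Kalman filter, which is the sense in which such schemes generalize the classical estimator.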

Keywords: estimation algorithm, bilinear systems, Kalman filter, second-order linearization

Procedia PDF Downloads 481
7503 A Nucleic Acid Extraction Method for High-Viscosity Floricultural Samples

Authors: Harunori Kawabe, Hideyuki Aoshima, Koji Murakami, Minoru Kawakami, Yuka Nakano, David D. Ordinario, C. W. Crawford, Iri Sato-Baran

Abstract:

With the recent advances in gene editing technologies allowing the rewriting of genetic sequences, additional market growth in the global floriculture market beyond previous trends is anticipated through increasingly sophisticated plant breeding techniques. As a prerequisite for gene editing, the gene sequence of the target plant must first be identified. This necessitates the genetic analysis of plants with unknown gene sequences, the extraction of RNA, and comprehensive expression analysis. Consequently, a technology capable of consistently and effectively extracting high-purity DNA and RNA from plants is of paramount importance. Although model plants, such as Arabidopsis and tobacco, have established methods for DNA and RNA extraction, floricultural species such as roses present unique challenges. Different techniques to extract DNA and RNA from various floricultural species were investigated. Upon sampling and grinding the petals of several floricultural species, it was observed that nucleic acid extraction from the ground petal solutions of low viscosity was straightforward; solutions of high viscosity presented a significant challenge. It is postulated that the presence of substantial quantities of polysaccharides and polyphenols in the plant tissue was responsible for the inhibition of nucleic acid extraction. Consequently, attempts were made to extract high-purity DNA and RNA by improving the CTAB method and combining it with commercially available nucleic acid extraction kits. The quality of the total extracted DNA and RNA was evaluated using standard methods. Finally, the effectiveness of the extraction method was assessed by determining whether it was possible to create a library that could be applied as a suitable template for a next-generation sequencer. In conclusion, a method was developed for consistent and accurate nucleic acid extraction from high-viscosity floricultural samples. 
These results demonstrate improved techniques for DNA and RNA extraction from flowers, help facilitate gene editing of floricultural species, and expand the boundaries of research and commercial opportunities.

Keywords: floriculture, gene editing, next-generation sequencing, nucleic acid extraction

Procedia PDF Downloads 10
7502 Adsorptive Removal of Methylene Blue Dye from Aqueous Solutions by Leaf and Stem Biochar Derived from Lantana camara: Adsorption Kinetics, Equilibrium, Thermodynamics and Possible Mechanism

Authors: Deepa Kundu, Prabhakar Sharma, Sayan Bhattacharya, Jianying Shang

Abstract:

The discharge of dye-containing effluents into water bodies has raised concern due to the potential hazards related to their toxicity in the environment. There are various treatment technologies available for the removal of dyes from wastewaters. The use of biosorbents to remove dyes from wastewater is an effective and inexpensive technique. In this study, the adsorption of the phenothiazine dye methylene blue onto a biosorbent prepared from Lantana camara L. was studied in aqueous solutions. Batch adsorption experiments were conducted, and the effects of various parameters such as pH (3-12), contact time, adsorbent dose (100-400 mg/L), initial dye concentration (5-20 mg/L), and temperature (303, 313 and 323 K) were investigated. The prepared leaf (BCL600) and shoot (BCS600) biochars of Lantana were characterized using FTIR, SEM, elemental analysis, and zeta potential (pH~7). The adsorption potentials of the two biosorbents were also compared. The results indicated that the amount of methylene blue dye (mg/g) adsorbed onto the surface of the biochar was highly dependent on the pH of the dye solution, increasing with an increase in pH from 3 to 12. It was observed that dye treated with BCS600 and BCL600 attained equilibrium within 60 and 100 minutes, respectively. The rate of the adsorption process was determined by fitting the Lagergren pseudo-first-order and pseudo-second-order kinetic models. It was found that dye treated with both BCS600 and BCL600 followed pseudo-second-order kinetics, implying the multi-step nature of the adsorption process involving external adsorption and diffusion of dye molecules into the interior of the adsorbents. The data obtained from the batch experiments were fitted well by the Langmuir and Freundlich isotherms (R² > 0.98), indicating multilayer adsorption of the dye over the biochar surfaces.
The thermodynamic studies revealed that the adsorption process is favourable, spontaneous, and endothermic in nature. Based on the results, the inexpensive and easily available Lantana camara biomass can be used to remove methylene blue dye from wastewater. It can also help in managing the growth of the notorious weed in the environment.
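The pseudo-second-order model referred to above has the closed form q(t) = k₂qₑ²t / (1 + k₂qₑt). A brief, generic sketch of fitting it to contact-time data by nonlinear least squares (the numbers below are synthetic, not the paper's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit


def pseudo_second_order(t, qe, k2):
    """Adsorbed amount q(t) in mg/g under pseudo-second-order kinetics:
    q(t) = k2 * qe^2 * t / (1 + k2 * qe * t)."""
    return k2 * qe**2 * t / (1.0 + k2 * qe * t)


# Synthetic contact-time data generated with qe = 25 mg/g, k2 = 0.004 g/(mg*min)
t = np.array([5, 10, 20, 40, 60, 90, 120], dtype=float)
q = pseudo_second_order(t, 25.0, 0.004)

# Recover the equilibrium capacity and rate constant from the data
(qe_fit, k2_fit), _ = curve_fit(pseudo_second_order, t, q, p0=[q.max(), 0.01])
```

A pseudo-first-order fit, q(t) = qₑ(1 - e^(-k₁t)), can be compared the same way; the model with the better R² is taken to describe the governing kinetics.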

Keywords: adsorption kinetics, biochar, Lantana camara, methylene blue dye, possible mechanism, thermodynamics

Procedia PDF Downloads 131
7501 A System Dynamics Approach for Assessing Policy Impacts on Closed-Loop Supply Chain Efficiency: A Case Study on Electric Vehicle Batteries

Authors: Guannan Ren, Thomas Mazzuchi, Shahram Sarkani

Abstract:

Electric vehicle battery recycling has emerged as a critical process in the transition toward sustainable transportation. As the demand for electric vehicles continues to rise, so does the need to address the end-of-life management of their batteries. Electric vehicle battery recycling benefits resource recovery and supply chain stability by reclaiming valuable metals like lithium, cobalt, nickel, and graphite. The reclaimed materials can then be reintroduced into the battery manufacturing process, reducing the reliance on raw material extraction and the environmental impacts of waste. Current battery recycling rates are insufficient to meet the growing demands for raw materials. While significant progress has been made in electric vehicle battery recycling, there is still room for improvement in many areas. Standardization of battery designs, increased collection and recycling infrastructure, and improved efficiency in recycling processes are essential for scaling up recycling efforts and maximizing material recovery. This work delves into key factors, such as regulatory frameworks, economic incentives, and technological processes, that influence the cost-effectiveness and efficiency of battery recycling systems. A system dynamics model that considers variables such as battery production rates, demand and price fluctuations, recycling infrastructure capacity, and the effectiveness of recycling processes is created to study how these variables are interconnected, forming feedback loops that affect overall supply chain efficiency. Such a model can also help simulate the effects of stricter regulations on battery disposal, incentives for recycling, or investments in research and development for battery designs and advanced recycling technologies. By using the developed model, policymakers, industry stakeholders, and researchers may gain insights into the effects of applying different policies or process updates on electric vehicle battery recycling rates.
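A toy stock-flow sketch of the kind of feedback structure described above (illustrative only; the stocks, lifetime, growth rate, and recycling-fraction lever are invented, not the authors' calibrated model):

```python
import numpy as np


def virgin_material_demand(years=20.0, dt=0.25, recycle_frac=0.3,
                           demand_growth=0.15, lifetime=10.0):
    """Euler-integrated stock-flow model with three stocks: batteries in
    use, an end-of-life (EOL) pool, and recovered material. The policy
    lever `recycle_frac` drains the EOL pool; recovered feedstock offsets
    demand for virgin raw material."""
    in_use, eol, recovered = 1.0, 0.0, 0.0
    production = 1.0                            # new battery packs per year
    demand = []
    for _ in range(int(years / dt)):
        retire = in_use / lifetime              # outflow: retirements
        recycle = recycle_frac * eol            # policy-dependent recycling flow
        in_use += (production - retire) * dt
        eol += (retire - recycle) * dt
        recovered += recycle * dt
        demand.append(max(production - recycle, 0.0))
        production *= 1.0 + demand_growth * dt  # rising EV demand feedback
    return np.array(demand)
```

Comparing runs with, say, recycle_frac=0 and recycle_frac=0.5 shows how a recycling incentive bends the virgin-material trajectory downward over time, which is the sort of policy experiment the full model supports.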

Keywords: environmental engineering, modeling and simulation, circular economy, sustainability, transportation science, policy

Procedia PDF Downloads 87
7500 Streamflow Modeling Using the PyTOPKAPI Model with Remotely Sensed Rainfall Data: A Case Study of Gilgel Ghibe Catchment, Ethiopia

Authors: Zeinu Ahmed Rabba, Derek D Stretch

Abstract:

Remote sensing contributes valuable information to streamflow estimates. Usually, streamflow is directly measured through ground-based hydrological monitoring stations. However, in many developing countries like Ethiopia, ground-based hydrological monitoring networks are either sparse or nonexistent, which limits water resources management and hampers early flood-warning systems. In such cases, satellite remote sensing is an alternative means of acquiring such information. This paper discusses the application of remotely sensed rainfall data for streamflow modeling in the Gilgel Ghibe basin in Ethiopia. Ten years (2001-2010) of two satellite-based precipitation products (SBPPs), TRMM and WaterBase, were used. These products were combined with the PyTOPKAPI hydrological model to generate daily streamflows. The results were compared with streamflow observations at the Gilgel Ghibe Nr. Assendabo gauging station using four statistical tools (Bias, R², NS, and RMSE). The statistical analysis indicates that the bias-adjusted SBPPs agree well with gauged rainfall compared to bias-unadjusted ones. The SBPPs with no bias adjustment tend to overestimate (high Bias and high RMSE) the extreme precipitation events and the corresponding simulated streamflow outputs, particularly during wet months (June-September), and underestimate the streamflow prediction over a few dry months (January and February). This shows that bias adjustment can be important for improving the performance of the SBPPs in streamflow forecasting. We further conclude that the general streamflow patterns were well captured at daily time scales when using SBPPs after bias adjustment. However, the overall results demonstrate that the simulated streamflow using the gauged rainfall is superior to those obtained from remotely sensed rainfall products, including bias-adjusted ones.
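The four evaluation scores named above have standard definitions, sketched here for reference (assumed forms: mean-error Bias, squared Pearson correlation for R², Nash-Sutcliffe efficiency for NS; the paper does not spell them out):

```python
import numpy as np


def flow_metrics(obs, sim):
    """Return (Bias, R^2, NSE, RMSE) comparing simulated to observed flow.
    Bias is the mean error, R^2 the squared Pearson correlation, NSE the
    Nash-Sutcliffe efficiency, and RMSE the root-mean-square error."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    err = sim - obs
    bias = np.mean(err)
    r2 = np.corrcoef(obs, sim)[0, 1] ** 2
    nse = 1.0 - np.sum(err**2) / np.sum((obs - obs.mean())**2)
    rmse = np.sqrt(np.mean(err**2))
    return bias, r2, nse, rmse
```

A perfect simulation gives Bias = 0, R² = 1, NSE = 1, and RMSE = 0; systematic overestimation of wet-season peaks shows up as positive Bias and inflated RMSE, matching the behaviour reported for the unadjusted products.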

Keywords: Ethiopia, PyTOPKAPI model, remote sensing, streamflow, Tropical Rainfall Measuring Mission (TRMM), WaterBase

Procedia PDF Downloads 275
7499 Reference Management Software: Comparative Analysis of RefWorks and Zotero

Authors: Sujit K. Basak

Abstract:

This paper presents a comparison of the reference management software packages RefWorks and Zotero. The results were drawn by comparing the two packages, and the novelty of this paper is its comparative analysis, which shows that RefWorks can import more information from Google Scholar. This finding could help researchers choose a reference management tool.

Keywords: analysis, comparative analysis, reference management software, researchers

Procedia PDF Downloads 535
7498 A Sustainable and Low-Cost Filter to Treat Pesticides in Water

Authors: T. Abbas, J. McEvoy, E. Khan

Abstract:

Pesticide contamination of the water supply is a common environmental problem in rural agricultural communities. Advanced water treatment processes such as membrane filtration and adsorption on activated carbon only remove pesticides from water without degrading them into less toxic or more easily degradable compounds, leaving behind contaminated brine and activated carbon that need to be managed. Rural communities, which normally cannot afford expensive water treatment technologies, need an economical and sustainable filter that not only removes pesticides from water but also degrades them into benign products. In this study, iron turning waste was tested as a potential point-of-use filtration medium for the removal/degradation of a mixture of six chlorinated pesticides (lindane, heptachlor, endosulfan, dieldrin, endrin, and DDT) in water. As a common and traditional medium for water filtration, sand was also tested along with the iron turning waste. The iron turning waste was characterized using scanning electron microscopy and an energy dispersive X-ray analyzer. Four glass columns with different filter media layer configurations were set up: (1) only sand, (2) only iron turnings, (3) sand and iron turnings (two separate layers), and (4) sand, iron turnings, and sand (three separate layers). The initial pesticide concentration and flow rate were 2 μg/L and 10 mL/min. Results indicate that sand filtration was effective only for the removal of DDT (100%) and endosulfan (94-96%). The iron turning filtration column effectively removed endosulfan, endrin, and dieldrin (85-95%), whereas lindane and DDT removal were 79-85% and 39-56%, respectively. The removal efficiencies for heptachlor, endosulfan, endrin, dieldrin, and DDT were 90-100% when sand and iron turning waste (two separate layers) were used. However, better removal efficiencies (93-100%) for five out of the six pesticides were achieved when sand, iron turnings, and sand (three separate layers) were used as the filtration media.
Moreover, the effects of water pH, amounts of media, and minerals present in water, such as magnesium, sodium, calcium, and nitrate, on the removal of pesticides were examined. Results demonstrate that iron turning waste efficiently removed all the pesticides under the studied parameters. It also completely dechlorinated all the pesticides studied, and based on the detection of by-products, degradation mechanisms for all six pesticides were proposed.

Keywords: pesticide contamination, rural communities, iron turning waste, filtration

Procedia PDF Downloads 251
7497 Features of Formation and Development of Possessory Risk Management Systems of Organization in the Russian Economy

Authors: Mikhail V. Khachaturyan, Inga A. Koryagina, Maria Nikishova

Abstract:

The study investigates the impact of the ongoing financial crisis, which started in the second half of 2014, on the marketing budgets spent by fast-moving consumer goods (FMCG) companies. In these conditions, special importance is given to efficient possessory risk management systems. The main objective in establishing and developing possessory risk management systems for FMCG companies in a crisis is to analyze the data relating to the external environment and consumer behavior during the crisis. Another important objective of possessory risk management systems in FMCG companies is to develop measures and mechanisms to maintain and stimulate sales. In this regard, analysis of the risks and threats that consumers cite as the main reasons affecting their level of consumption becomes important. It is obvious that in crisis conditions, effective risk management systems responsible for the development and implementation of strategies for stimulating consumer demand, as well as for the identification, analysis, assessment, and management of other risks to economic security, will be key to the sustainability of a company. In terms of a financial and economic crisis, the problem of forming and developing possessory risk management systems becomes critical not only in the context of the management models of FMCG companies but for all companies operating in other sectors of the Russian economy. This study attempts to analyze the specifics of the formation and development of company possessory risk management systems. In the modern economy, among all types of owner's risks, the risk of reduced consumer activity is of special importance. This type of risk is common not only to the consumer goods trade. The study of declining consumer activity is especially important for Russia because the domestic market for consumer goods is still in the development stage, despite its significant growth. In this regard, it is especially important to form and develop possessory risk management systems for FMCG companies.
The authors offer their own interpretation of the process of forming and developing possessory risk management systems within the owner's management models of FMCG companies as well as in the Russian economy in general. The proposed methods and mechanisms for analyzing the formation and development of possessory risk management systems in FMCG companies, and the results received, can be helpful for researchers interested in the problems of consumer goods market development in Russia and overseas.

Keywords: FMCG companies, marketing budget, risk management, owner, Russian economy, organization, formation, development, system

Procedia PDF Downloads 372
7496 Analysis and Design of Offshore Met Mast Supported on Jacket Substructure

Authors: Manu Manu, Pardha J. Saradhi, Ramana M. V. Murthy

Abstract:

Wind energy is accepted as one of the most developed, cost-effective, and proven renewable energy technologies to meet increasing electricity demands in a sustainable manner. Preliminary assessment studies along the Indian coastline by the Ministry of New and Renewable Energy have indicated prospects for the development of offshore wind power along the Tamil Nadu coast, India. The commercial viability of a wind project mainly depends on the wind characteristics on site. Hence, it is internationally recommended to perform a site-specific wind resource assessment based on two years of wind profile data as part of the feasibility study. Conventionally, guyed met masts are used onshore for the collection of wind profiles. Installation of a similar structure offshore requires a complex marine spread and is very expensive. In the present study, an attempt is made to develop a 120 m tall lattice tower supported on a jacket piled to the seabed at Rameshwaram, Tamil Nadu, India. Offshore met masts are subjected to combined wind and hydrodynamic loads, and these lateral loads should be safely transferred to the soil. The wind loads are estimated based on the gust factor method, and the hydrodynamic loads are estimated by Morison's equation along with a suitable wave theory. The soil is modeled as three nonlinear orthogonal springs based on API standards. The structure configuration and optimum member sizes are obtained for extreme cyclone events. The dynamic behavior of the mast under coupled wind and wave loads is also studied. The static responses of the mast with a jacket-type offshore platform have been studied using a frame model in SESAM. It is found from the study that the maximum displacement at the top of the mast is 0.003 m for the random wave and 0.08 m for wind during the steady state. The dynamic analysis results indicate that the structure is safe against coupled wind and wave loading.
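For reference, the hydrodynamic load estimate via Morison's equation combines an inertia term and a drag term; a minimal sketch with typical assumed coefficient values (not the paper's design figures):

```python
import math


def morison_force_per_length(u, dudt, D, rho=1025.0, Cd=1.0, Cm=2.0):
    """In-line wave force per unit length (N/m) on a slender vertical
    cylinder from Morison's equation:
      f = rho*Cm*(pi*D^2/4)*du/dt  +  0.5*rho*Cd*D*u*|u|
    u, dudt: water particle velocity (m/s) and acceleration (m/s^2)
    D: member diameter (m); rho: seawater density (kg/m^3);
    Cd, Cm: drag and inertia coefficients (typical assumed values)."""
    inertia = rho * Cm * math.pi * D**2 / 4.0 * dudt
    drag = 0.5 * rho * Cd * D * u * abs(u)
    return inertia + drag
```

In a design calculation, u and du/dt at each elevation would come from the chosen wave theory (e.g. Airy or Stokes kinematics), and the force is integrated over the submerged members of the jacket.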

Keywords: offshore wind, mast, static, aerodynamic load, hydrodynamic load

Procedia PDF Downloads 211
7495 Caregiver Training Results in Accurate Reporting of Stool Frequency

Authors: Matthew Heidman, Susan Dallabrida, Analice Costa

Abstract:

Background: The accuracy of caregiver-reported outcomes is essential for the success of infant growth and tolerability studies. Crying/fussiness, stool consistency, and other gastrointestinal characteristics are important parameters regarding tolerability, and inter-caregiver reporting can involve a significant amount of subjectivity and vary greatly within a study, compromising data. This study sought to elucidate how caregiver-reported questions related to stool frequency are answered before and after a short amount of training, and how training impacts caregivers' understanding and how they would answer the question. Methods: A digital survey was issued for 90 days in the US (n=121) and 30 days in Mexico (n=88), targeting respondents with children ≤4 years of age. Respondents were asked a question in two formats, first without a line of training text and second with a line of training text. The question set was as follows: "If your baby had stool in his/her diaper and you changed the diaper and 10 min later there was more stool in the diaper, how many stools would you report this as?" followed by the same question beginning with "If you were given the instruction that IF there are at least 5 minutes in between stools, then it counts as two (2) stools…". Four response items were provided for both questions: 1) 2 stools, 2) 1 stool, 3) it depends on how much stool was in the first versus the second diaper, 4) there is not enough information to be able to answer the question. Response frequencies between questions were compared. Results: Responses to the question without training saw some variability in the US, with 69% selecting "2 stools", 11% selecting "1 stool", 14% selecting "it depends on how much stool was in the first versus the second diaper", and 7% selecting "there is not enough information to be able to answer the question"; in Mexico, respondents selected 9%, 78%, 13%, and 0%, respectively.
However, responses to the question after training saw more consolidation in the US, with 85% of respondents selecting "2 stools", representing an increase in those selecting the correct answer. Additionally, in Mexico, 84% of respondents selected "1 episode", representing an increase in those selecting the correct response. Conclusions: Caregiver-reported outcomes are critical for infant growth and tolerability studies; however, they can be highly subjective and see high variability of responses without guidance. Training is critical to standardize all caregivers' perspectives on how to answer questions accurately in order to provide an accurate dataset.

Keywords: infant nutrition, clinical trial optimization, stool reporting, decentralized clinical trials

Procedia PDF Downloads 90
7494 Derivation of Bathymetry from High-Resolution Satellite Images: Comparison of Empirical Methods through Geographical Error Analysis

Authors: Anusha P. Wijesundara, Dulap I. Rathnayake, Nihal D. Perera

Abstract:

Bathymetric information is of fundamental importance to coastal and marine planning and management, nautical navigation, and scientific studies of marine environments. Satellite-derived bathymetry data provide detailed information in areas where conventional sounding data are lacking and conventional surveys are inaccessible. The two empirical approaches of the log-linear bathymetric inversion model and the non-linear bathymetric inversion model are applied for deriving bathymetry from high-resolution multispectral satellite imagery. This study compares these two approaches by means of geographical error analysis for the site Kankesanturai using WorldView-2 satellite imagery. The parameters of the non-linear inversion model were calibrated using the Levenberg-Marquardt method, and multiple linear regression was applied to calibrate the log-linear inversion model. In order to calibrate both models, Single Beam Echo Sounding (SBES) data from the study area were used as reference points. Residuals were calculated as the difference between the derived depth values and the validation echo sounder bathymetry data, and the geographical distribution of model residuals was mapped. The spatial autocorrelation of the residuals was calculated to compare the performance of the bathymetric models, and the results show the geographic errors for both models. A spatial error model was constructed from the initial bathymetry estimates and the estimates of autocorrelation. This spatial error model is used to generate more reliable estimates of bathymetry by quantifying the autocorrelation of the model error and incorporating this into an improved regression model. The log-linear model (R²=0.846) performs better than the non-linear model (R²=0.692). Finally, the spatial error models improved the bathymetric estimates derived from the linear and non-linear models up to R²=0.854 and R²=0.704, respectively. The Root Mean Square Error (RMSE) was calculated for all reference points in various depth ranges.
The magnitude of the prediction error increases with depth for both the log-linear and the non-linear inversion models. Overall RMSE for log-linear and the non-linear inversion models were ±1.532 m and ±2.089 m, respectively.
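The log-linear inversion amounts to a multiple linear regression of depth on log-band reflectances (a Lyzenga-type form; the band count, coefficients, and omission of the deep-water correction below are our simplifying assumptions):

```python
import numpy as np


def fit_log_linear(reflectance, depth):
    """Calibrate z = a0 + sum_i a_i * ln(R_i) by ordinary least squares
    against echo-sounder reference depths. `reflectance` is (n_points,
    n_bands); `depth` is (n_points,)."""
    X = np.column_stack([np.ones(len(depth)), np.log(reflectance)])
    coeffs, *_ = np.linalg.lstsq(X, depth, rcond=None)
    return coeffs


def predict_depth(reflectance, coeffs):
    """Apply the calibrated log-linear model to new reflectance values."""
    X = np.column_stack([np.ones(reflectance.shape[0]), np.log(reflectance)])
    return X @ coeffs
```

Mapping `depth - predict_depth(...)` at the SBES reference points yields the residual surface from which the spatial error model described above can be built.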

Keywords: log-linear model, multi spectral, residuals, spatial error model

Procedia PDF Downloads 292
7493 Game “EZZRA” as an Innovative Solution

Authors: Mane Varosyan, Diana Tumanyan, Agnesa Martirosyan

Abstract:

There are many catastrophic events that end with dire consequences, and to avoid them, people should be well armed with the necessary information about these situations. In recent years, serious games have increasingly gained popularity for training people for different types of emergencies. The main problem discussed is the use of gamification in education; moreover, it is essential to understand how, and what kind of, gamified e-learning modules promote engagement. As the theme is emergency preparedness, we also studied people's behavior to shape the final approach. Our proposed solution is an educational video game, "EZZRA".

Keywords: gamification, education, emergency, serious games, game design, virtual reality, digitalisation

Procedia PDF Downloads 72
7492 Development of a Computer Based, Nutrition and Fitness Programme and Its Effect on Nutritional Status and Fitness of Obese Adults

Authors: Richa Soni, Vibha Bhatnagar, N. K. Jain

Abstract:

This study was conducted to develop a computer-mediated programme for weight management and physical fitness and to examine its efficacy in reducing weight and improving physical fitness in obese adults. A user-friendly, computer-based programme was developed to provide a simple, quick, and easy method of assessing energy balance at the individual level. The programme had four main sections, viz. Personal Profile, Know About Your Weight, Fitness, and Food Exchange List. The computer programme was developed to provide facilities for creating an individual profile, tracking meals and physical activities, suggesting nutritional and exercise requirements, planning calorie-specific menus, keeping food diaries, and revising the diet and exercise plans if needed. The programme also provided information on obesity, underweight, and physical fitness. An exhaustive food exchange list was also given in the programme to assist users in making the right food choice decisions. The developed programme was evaluated by a panel of 15 experts comprising endocrinologists, nutritionists, and diet counselors. Suggestions given by the experts were noted down, the entire programme was modified in light of the suggestions given by the panel members, and it was re-evaluated by the same panel of experts. To assess the impact of the programme, 22 obese subjects were selected purposively and randomly assigned to an intervention group (n=12) and a no-information control group (n=10). The programme group was asked to strictly follow the programme for one month. Significant reduction in the intake of energy, fat, and carbohydrates was observed, while the intake of fruits and green leafy vegetables increased. The programme was also found to be effective in reducing body weight, body fat percentage, and body fat mass, whereas total body water and physical fitness scores improved significantly. There was no significant alteration observed in any parameter in the control group.
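The energy-balance assessment such a programme performs is presumably of the standard intake-versus-expenditure form; a minimal sketch using the Mifflin-St Jeor resting-energy equation (our assumption; the paper does not state which formula it used):

```python
def bmr_mifflin_st_jeor(weight_kg, height_cm, age_yr, sex):
    """Resting energy expenditure (kcal/day), Mifflin-St Jeor equation:
    10*weight + 6.25*height - 5*age, plus 5 for men or minus 161 for women."""
    base = 10.0 * weight_kg + 6.25 * height_cm - 5.0 * age_yr
    return base + (5.0 if sex == "male" else -161.0)


def energy_balance(intake_kcal, weight_kg, height_cm, age_yr, sex,
                   activity_factor=1.4):
    """Daily energy surplus (+) or deficit (-): intake minus estimated
    expenditure (BMR scaled by an assumed activity factor)."""
    tdee = bmr_mifflin_st_jeor(weight_kg, height_cm, age_yr, sex) * activity_factor
    return intake_kcal - tdee
```

A sustained negative balance is the condition a weight-management plan targets; any real programme would also layer food-exchange and activity tracking on top of this core calculation.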

Keywords: body composition, body weight, computer programme, physical fitness

Procedia PDF Downloads 282
7491 Perception of Eco-Music From the Contents the Earth’s Sound Ecosystem

Authors: Joni Asitashvili, Eka Chabashvili, Maya Virsaladze, Alexander Chokhonelidze

Abstract:

Studying the soundscape is a major challenge in many countries of the civilized world today. The sound environment and music itself are part of the Earth's ecosystem. Therefore, researching its positive or negative impact is important for a clean and healthy environment. The acoustics of nature gave people many musical ideas, and people enriched musical features and performance skills with the ability to imitate the surrounding sound. For example, a population surrounded by mountains invented the technique of antiphonal singing, which mimics the effect of an echo. Canadian composer Raymond Murray Schafer viewed the world as a kind of musical instrument with ever-renewing tuning. He coined the term "Soundscape" as a name of a natural environmental sound, including the sound field of the Earth. It can be said that from which the “music of nature” is constructed. In the 21st century, a new field–Ecomusicology–has emerged in the field of musical art to study the sound ecosystem and various issues related to it. Ecomusicology considers the interconnections between music, culture, and nature–According to the Aaron Allen. Eco-music is a field of ecomusicology concerning with the depiction and realization of practical processes using modern composition techniques. Finding an artificial sound source (instrumental or electronic) for the piece that will blend into the soundscape of Sound Oases. Creating a composition, which sounds in harmony with the vibrations of human, nature, environment, and micro- macrocosm as a whole; Currently, we are exploring the ambient sound of the Georgian urban and suburban environment to discover “Sound Oases" and compose Eco-music works. We called “Sound Oases" an environment with a specific sound of the ecosystem to use in the musical piece as an instrument. The most interesting examples of Eco-music are the round dances, which were already created in the BC era. In round dances people would feel the united energy. 
This urge to unite reveals itself in our age too, manifesting in a variety of social media. The virtual world, however, is not enough for healthy interaction, so we created a plan for a "contemporary round dance" in a sound oasis found during an expedition in Georgian caves, where people interact with the cave's soundscape and eco-music, feel each other's shared energy, and listen to the sound of the earth. This project could be considered a contemporary round dance, a long improvisation, and a particular type of art therapy in which everyone can participate in an artistic process. We present the research results of our experimental eco-music performance.

Keywords: eco-music, environment, sound, oasis

Procedia PDF Downloads 58
7490 Identity of Cultural Food: A Case Study of Traditional Mon Cuisine in Bangkok, Thailand

Authors: Saruda Nitiworakarn

Abstract:

This research aims to identify traditional Mon cuisines as well as gather and classify the traditional cuisines of Mon communities in Bangkok. The research uses a quantitative methodology: a questionnaire was used to collect information from a sample of 450 persons, analyzed via frequency, percentage, and mean values. The results showed that the variety of traditional Mon cuisines of Bangkok could be split into 6 categories of meat dishes with 54 items and 6 categories of desserts with 19 items.

Keywords: cultural identity, traditional food, Mon cuisine, Thailand

Procedia PDF Downloads 304
7489 A Cross-Sectional Study Assessing Communication Practices among Doctors at a University Hospital in Pakistan

Authors: Muhammad Waqas Baqai, Noman Shahzad, Rehman Alvi

Abstract:

Communication among health care givers is the essence of quality patient care, and any compromise results in errors and inefficiency leading to cumbersome outcomes. The use of smartphones among health professionals has increased tremendously. Almost every health professional carries one, and the majority of them use a third-party communication application called WhatsApp for work-related communications. It gives instant access to the person responsible for any particular query and therefore helps in efficient and timely decision making. It is also an easy way of sharing medical documents and multimedia, and it provides a platform for consensual decision making through group discussions. However, clinical communication through WhatsApp has some demerits too, including reduction in verbal communication, worsening professional relations, unprofessional behavior, risk of confidentiality breach, and threats from cyber-attacks. On the other hand, the traditional pager device used in many health care systems is a unidirectional communication tool that lacks the ability to convey any information other than the number to which the receiver has to respond. Our study focused on these two widely used modalities of communication among doctors of the largest tertiary care center of Pakistan, i.e., The Aga Khan University Hospital. Our aim was to note which modality is considered better and has fewer threats to medical data. Approval from the ethical review committee of the institute was taken prior to the conduct of this study. We submitted an online survey form to all the interns and residents working at our institute and collected their responses in a month's time. 162 submissions were recorded and analyzed using descriptive statistics. Only 20% of respondents were comfortable with using pagers exclusively, 52% with WhatsApp, and 28% with both. 65% think that WhatsApp is time-saving and quicker than the pager. 54% of them considered WhatsApp a nuisance because of work-related notifications in their off-work hours. 60% think that they are more likely to miss information through the pager system because of its unidirectional nature. Almost all (96%) residents and interns found WhatsApp useful for saving information for future reference. For urgent issues, the majority (70%) preferred the pager over WhatsApp, and the pager was also considered more valid in terms of hospital policies and legal issues. Among the major advantages of WhatsApp they listed were easy mass communication, sharing of clinical pictures, universal access, and no need to carry an additional device. However, the major drawback of using WhatsApp for clinical communication that everyone shared was the threat to patients' confidentiality, as clinicians usually share pictures of wounds, clinical documents, etc. Lastly, we asked whether they think there is a need for a separate instant-messaging application dedicated to clinical communication only, and 90% responded positively. We therefore concluded that both modalities have their merits and demerits, but the greatest drawbacks with WhatsApp are the risk of a breach in patients' confidentiality and off-work disturbance. Hence, we recommend a more secure, institute-run application for all intra-hospital communications, where documents, pictures, etc. can be shared easily in a controlled environment.

Keywords: WhatsApp, pager, clinical communication, confidentiality

Procedia PDF Downloads 141
7488 Rapid Building Detection in Population-Dense Regions with Overfitted Machine Learning Models

Authors: V. Mantey, N. Findlay, I. Maddox

Abstract:

The quality and quantity of global satellite data have been increasing exponentially in recent years as spaceborne systems become more affordable and the sensors themselves become more sophisticated. This is a valuable resource for many applications, including disaster management and relief. However, while more information can be valuable, the volume of data available is impossible to manually examine. Therefore, the question becomes how to extract as much information as possible from the data with limited manpower. Buildings are a key feature of interest in satellite imagery with applications including telecommunications, population models, and disaster relief. Machine learning tools are fast becoming one of the key resources to solve this problem, and models have been developed to detect buildings in optical satellite imagery. However, by and large, most models focus on affluent regions where buildings are generally larger and constructed further apart. This work is focused on the more difficult problem of detection in populated regions. The primary challenge with detecting small buildings in densely populated regions is both the spatial and spectral resolution of the optical sensor. Densely packed buildings with similar construction materials will be difficult to separate due to a similarity in color and because the physical separation between structures is either non-existent or smaller than the spatial resolution. This study finds that training models until they are overfitting the input sample can perform better in these areas than a more robust, generalized model. An overfitted model takes less time to fine-tune from a generalized pre-trained model and requires fewer input data. The model developed for this study has also been fine-tuned using existing, open-source, building vector datasets. This is particularly valuable in the context of disaster relief, where information is required in a very short time span. 
Leveraging existing datasets means that little to no manpower or time is required to collect data in the region of interest. The training period itself is also shorter for smaller datasets. Requiring less data means that only a few quality areas are necessary, and so any weaknesses or underpopulated regions in the data can be skipped over in favor of areas with higher quality vectors. In this study, a landcover classification model was developed in conjunction with the building detection tool to provide a secondary source to quality check the detected buildings. This has greatly reduced the false positive rate. The proposed methodologies have been implemented and integrated into a configurable production environment and have been employed for a number of large-scale commercial projects, including continent-wide DEM production, where the extracted building footprints are being used to enhance digital elevation models. Overfitted machine learning models are often considered too specific to have any predictive capacity. However, this study demonstrates that, in cases where input data is scarce, overfitted models can be judiciously applied to solve time-sensitive problems.

Keywords: building detection, disaster relief, mask-RCNN, satellite mapping

Procedia PDF Downloads 166
7487 Constructing Digital Memory for Chinese Ancient Village: A Case on Village of Gaoqian

Authors: Linqing Ma, Huiling Feng, Jihong Liang, Yi Qian

Abstract:

In China, some villages have survived the long history of changes and remain until today with the unique styles and featured culture they developed in the past. Those ancient villages, usually hundreds or thousands of years old, are a mirror for traditional Chinese culture, especially the farming-studying culture represented by Confucianism. Gaoqian, an ancient village with a population of 3,000 in Zhejiang province, is such a case. With a history dating back to the Yuan Dynasty, Gaoqian Village has 13 well-preserved traditional Chinese courtyard houses, built in the Ming and Qing Dynasties. It is a fine specimen for studying traditional rural China. The project will first collect multimedia resources documenting the Village; a repository for its memory will then be completed by arranging and describing these resources, such as texts, photos, and videos. The production of creative products with digital technologies is also possible, based on a thorough understanding of the cultural features of Gaoqian Village, using research tools from literature and history studies and a method of comparative study. Finally, the project will construct an exhibition platform for the Village and its culture, telling its stories through completed structures and threads.

Keywords: ancient villages, digital exhibition, multimedia, traditional culture

Procedia PDF Downloads 581
7486 Cellular Automata Using Fractional Integral Model

Authors: Yasser F. Hassan

Abstract:

In this paper, a proposed model of cellular automata is studied by means of a fractional integral function. A cellular automaton is a decentralized computing model providing an excellent platform for performing complex computation with the help of only local information. The paper discusses how a fractional integral function can be used to represent cellular automata memory or state. The architecture of the computing and learning model is given, and the results of calibrating the approach are also presented.
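The abstract does not give the model's equations, so the following Python sketch is only one plausible reading of the idea: an elementary cellular automaton whose update rule acts on a fractionally weighted (power-law) memory of each cell's history, in the spirit of a fractional integral of the state. The rule (rule 90), the kernel, and the parameter alpha are all assumptions for illustration, not the paper's formulation.

```python
# Illustrative sketch: a cellular automaton with fractional-integral-style
# memory. Each cell's effective state is a power-law-weighted average of its
# entire history, thresholded back to {0, 1} before the local rule is applied.

def fractional_weights(t, alpha=0.5):
    # Power-law kernel w_k ~ (t - k + 1)**(alpha - 1): recent states weigh
    # more, but old states never fully vanish (long-memory behaviour).
    return [(t - k + 1) ** (alpha - 1) for k in range(t + 1)]

def memory_state(history, cell, alpha=0.5):
    # Weighted average of one cell's past states, thresholded to a bit.
    t = len(history) - 1
    w = fractional_weights(t, alpha)
    s = sum(wk * row[cell] for wk, row in zip(w, history))
    return 1 if s / sum(w) >= 0.5 else 0

def step(history, alpha=0.5):
    # Rule 90 (XOR of the two neighbours) applied to the memory states,
    # with periodic boundary conditions.
    n = len(history[-1])
    eff = [memory_state(history, i, alpha) for i in range(n)]
    return [eff[(i - 1) % n] ^ eff[(i + 1) % n] for i in range(n)]

history = [[0, 0, 0, 1, 0, 0, 0]]  # single seed cell
for _ in range(3):
    history.append(step(history))
```

With alpha = 1 the weights become uniform; smaller alpha makes the memory decay more slowly, which is the qualitative effect a fractional integral contributes over an ordinary (memoryless) automaton state.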

Keywords: fractional integral, cellular automata, memory, learning

Procedia PDF Downloads 409
7485 Preparation and Characterization of Biosorbent from Cactus (Opuntia ficus-indica) Cladodes and Its Application for Dye Removal from Aqueous Solution

Authors: Manisha Choudhary, Sudarsan Neogi

Abstract:

Malachite green (MG), an organic basic dye, has been widely used for dyeing purposes, as well as a fungicide and antiseptic in the aquaculture industry to control fish parasites and disease. However, MG has now turned out to be an extremely controversial compound due to its adverse impact on living beings. Owing to its high toxicity, proper treatment of wastewater containing MG is of utmost importance. Among the available technologies, adsorption is one of the most efficient and cost-effective treatment methods due to its simplicity of design, ease of operation, and the possibility of regenerating used materials. Nonetheless, commercial activated carbon is expensive, leading researchers to focus on utilizing natural resources. In the present work, a species of cactus, Opuntia ficus-indica (OFI), was used to develop a highly efficient, low-cost powdered activated carbon by chemical activation with NaOH. The biosorbent was characterized by Fourier-transform infrared spectroscopy, field emission scanning electron microscopy, energy-dispersive X-ray spectroscopy, Brunauer-Emmett-Teller (BET) analysis, and X-ray diffraction. Batch adsorption studies were performed to remove MG from aqueous solution as a function of contact time, initial solution pH, initial dye concentration, biosorbent dosage, presence of salt, and temperature. On increasing the initial dye concentration from 100 to 500 mg/l, the adsorption capacity increased from 165.45 to 831.58 mg/g. The adsorption kinetics followed the pseudo-second-order model, and chemisorption mechanisms were revealed. Electrostatic attractions and chemical interactions were observed between the amino and hydroxyl groups of the biosorbent and the amine groups of the dye. The adsorption was solely controlled by film diffusion. Different isotherm models were used to fit the adsorption data. 
The excellent recovery of adsorption efficiency after regeneration indicated the high potential of this adsorbent to remove MG from aqueous solution and its suitability as a cost-effective biosorbent for wide application in wastewater treatment.
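The pseudo-second-order kinetic model mentioned above is commonly written in linearized form as t/q_t = 1/(k2·qe²) + t/qe, where qe is the equilibrium capacity and k2 the rate constant. The sketch below simulates the model and recovers its parameters by a linear fit; the parameter values are illustrative, not the study's fitted constants.

```python
# Pseudo-second-order adsorption kinetics: q_t = (k2*qe**2*t) / (1 + k2*qe*t).

def pseudo_second_order(t, qe, k2):
    """Adsorbed amount q_t (mg/g) at time t (min), given equilibrium
    capacity qe (mg/g) and rate constant k2 (g/(mg*min))."""
    return (k2 * qe ** 2 * t) / (1.0 + k2 * qe * t)

def fit_pseudo_second_order(times, qts):
    """Recover (qe, k2) from the linearized form t/q_t = 1/(k2*qe**2) + t/qe
    by ordinary least squares on the points (t, t/q_t)."""
    ys = [t / q for t, q in zip(times, qts)]
    n = len(times)
    mx = sum(times) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(times, ys)) / \
            sum((x - mx) ** 2 for x in times)
    intercept = my - slope * mx
    qe = 1.0 / slope                 # slope = 1/qe
    k2 = slope ** 2 / intercept      # intercept = 1/(k2*qe**2)
    return qe, k2

# Synthetic check: generate data from known parameters and recover them.
times = [5.0, 10.0, 20.0, 40.0, 80.0]
qts = [pseudo_second_order(t, qe=800.0, k2=1e-4) for t in times]
qe_fit, k2_fit = fit_pseudo_second_order(times, qts)
```

In practice the fit would be run on measured (t, q_t) pairs; a near-straight line of t/q_t against t is the usual diagnostic that the kinetics are pseudo-second-order.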

Keywords: adsorption, biosorbent, cactus, malachite green

Procedia PDF Downloads 366
7484 The Amount of Conformity of Persian Subject Headlines with Users' Social Tagging

Authors: Amir Reza Asnafi, Masoumeh Kazemizadeh, Najmeh Salemi

Abstract:

Due to the diversity of information resources in the Web 2.0 environment, which keeps increasing over time, social tagging systems can be used to describe Internet resources. Studying the conformity of social tags with subject headings can help enrich resources and make them more accessible to users. The present research is applied-theoretical in type, with content analysis as the research method. In this study, using the listing method and content analysis, the levels of exact, approximate, relative, and non-conformity between the social tags of books in the field of information science and bibliography on the Kitabrah website and the Persian subject headings were determined. Exact matches between subject headings and social tags averaged 22 items, approximate matches averaged 36 items, relative matches averaged 36 items, and non-matches averaged 116 items. According to the findings, exact matching of subject headings with social tags is the lowest category and non-conformity the highest; the average non-conformity is even higher than the sum of the three types of exact, relative, and approximate matching. As a result, the conformity of subject headings with social tags is low. This is because subject headings are static text and users are not allowed to interact and insert newly chosen words and topics, whereas websites based on Web 2.0 and the social classification system do offer users this possibility. An important point of the present study, and of studies that have compared the syntactic and semantic matching of social tags with subject headings, is that the degree of conformity is low. 
Therefore, these two methods can complement each other and create a hybrid cataloging that includes both subject headings and social tags. The low level of conformity confirms the results of earlier work comparing the social tags of books with the Library of Congress subject headings. Matching social tags with subject headings alone is not enough; it can be said that the two methods are complementary.
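The exact/approximate/non-conformity comparison can be illustrated with a small sketch. The sample tags and headings below are invented, and difflib's similarity cutoff stands in for whatever matching criterion the study actually used.

```python
# Illustrative sketch: classifying user tags against controlled subject
# headings as exact, approximate, or non-matching.
import difflib

def classify(tag, headings, cutoff=0.8):
    """Return 'exact' for a literal match, 'approximate' for a near match
    (by SequenceMatcher ratio >= cutoff), else 'none'."""
    if tag in headings:
        return "exact"
    if difflib.get_close_matches(tag, headings, n=1, cutoff=cutoff):
        return "approximate"
    return "none"

# Invented sample data, standing in for the tags and headings of the study.
headings = ["information science", "cataloging", "social tagging"]
tags = ["information science", "cataloguing", "metadata"]
counts = {"exact": 0, "approximate": 0, "none": 0}
for t in tags:
    counts[classify(t, headings)] += 1
```

Tallying the three categories over a real tag corpus would reproduce the kind of averages reported above (exact, approximate, relative, and non-matching counts per book).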

Keywords: Web 2/0, social tags, subject headings, hybrid cataloging

Procedia PDF Downloads 156
7483 Comparative Evaluation of a Dynamic Navigation System Versus a Three-Dimensional Microscope in Retrieving Separated Endodontic Files: An in Vitro Study

Authors: Mohammed H. Karim, Bestoon M. Faraj

Abstract:

Introduction: Instrument separation is a common challenge in the endodontic field. Various techniques and technologies have been developed to improve the retrieval success rate. This study aimed to compare the effectiveness of a Dynamic Navigation System (DNS) and a three-dimensional microscope in retrieving broken rotary NiTi files when using trepan burs and the extractor system. Materials and Methods: Thirty maxillary first bicuspids with sixty separate roots were split into two comparable groups based on a comprehensive Cone-Beam Computed Tomography (CBCT) analysis of root length and curvature. After standardised access opening, glide paths, and patency attainment with the K file (sizes 10 and 15), the teeth were arranged on 3D models (three per quadrant, six per model). Subsequently, controlled-memory heat-treated NiTi rotary files (#25/0.04) were notched 4 mm from the tips and fractured at the apical third of the roots. The C-FR1 Endo file removal system was employed under both guidance modalities to retrieve the fragments, and the success rate, canal aberration, treatment time, and volumetric changes were measured. The statistical analysis was performed using IBM SPSS software at a significance level of 0.05. Results: The microscope-guided group had a higher success rate than the DNS-guided group, but the difference was not statistically significant (p > 0.05). In addition, the microscope-guided drills resulted in a substantially lower proportion of canal aberration, required less time to retrieve the fragments, and caused a smaller change in root canal volume (p < 0.05). Conclusion: Although dynamically guided trephining with the extractor can retrieve separated instruments, it is inferior to three-dimensional microscope guidance regarding treatment time, procedural errors, and volume change.

Keywords: dynamic navigation system, separated instruments retrieval, trephine burs and extractor system, three-dimensional video microscope

Procedia PDF Downloads 92
7482 Subjectivity in Miracle Aesthetic Clinic Ambient Media Advertisement

Authors: Wegig Muwonugroho

Abstract:

Subjectivity in advertisement is a 'power' possessed by advertisements to construct trends, concepts, truth, and ideology through the subconscious mind. Advertisements, in performing their function as message conveyors, use visual representation to suggest what is ideal to people. Ambient media is an advertising medium that makes the best use of the environment where the advertisement is located. Miracle Aesthetic Clinic (Miracle) popularizes the visual representation of its ambient media advertisement through the omission of the face-image of the two female mannequins that serve as its ambient media models. Usually, the face of a model in an advertisement is an image commodity with selling value; however, the faces of the ambient media models in the Miracle advertisement campaign are suppressed against the table and wall. This face-concealing aspect creates not only a paradox of subjectivity but also a plurality of meaning. This research applies the critical discourse analysis method to analyze subjectivity and gain insight into the ambient media's meaning. First, in the stage of textual analysis, the attributes placed on the female mannequins imply that the models are denoted as representations of modern women, identical with the identities of their social milieus. The communication signs constructed are of women who lose their subjectivity and 'feel embarrassed' to show their faces in public because of the pimples on them. Second, in the stage of analysis of discourse practice, it emerges that ambient media as a communication medium has been comprehensively responded to by the targeted audiences. Ambient media plays the role of an actor because of its eye-catching setting, taking up space in the areas where the public wander. Indeed, when the public realize that the ambient media models are motionless, unlike humans, a stronger relation appears, marked by several responses from the targeted audiences. 
Third, in the stage of analysis of social practice, soap operas and celebrity gossip shows on television constitute a dominant discourse influencing the advertisement's meaning. The subjectivity of the Miracle advertisement corners women through the absence of women's participation in public space, the representation of women in isolation, and the portrayal of women as anxious about their social rank when their faces suffer from pimples. The ambient media campaign of Miracle is quite successful in constructing a new trend discourse of facial beauty that is not limited to the benchmarks of common beauty virtues; the idea of beauty can also be presented by visualizing 'when a woman doesn't look good'.

Keywords: ambient media, advertisement, subjectivity, power

Procedia PDF Downloads 315
7481 Practice and Understanding of Fracturing Renovation for Risk Exploration Wells in Xujiahe Formation Tight Sandstone Gas Reservoir

Authors: Fengxia Li, Lufeng Zhang, Haibo Wang

Abstract:

The tight sandstone gas reservoir in the Xujiahe Formation of the Sichuan Basin has huge reserves, but its utilization rate is low. Fracturing and stimulation are indispensable technologies for unlocking its potential and achieving commercial exploitation. Slickwater is the most widely used fracturing fluid system in the fracturing and renovation of tight reservoirs. However, its viscosity is low, its sand-carrying performance is poor, and the risk of sand blockage is high. Increasing the sand-carrying capacity by increasing the displacement increases the frictional resistance of the pipe string, affecting the drag reduction performance. Variable viscosity slickwater can switch flexibly between different viscosities in real time online, effectively overcoming problems with sand carrying and drag reduction. Based on a self-developed indoor loop friction testing system, a visualization device for proppant transport, and a HAAKE MARS III rheometer, a comprehensive evaluation was conducted of the drag reduction, rheological, and sand-carrying performance of variable viscosity slickwater. The indoor experimental results show that: (1) by changing the concentration of drag-reducing agents, the viscosity of the slickwater can be varied between 2 and 30 mPa·s; (2) the drag reduction rate of the variable viscosity slickwater is above 80%, and shear does not reduce the drag reduction rate of the liquid; and (3) under indoor experimental conditions, 15 mPa·s variable viscosity slickwater can achieve effective carrying and uniform placement of proppant. The layered fracturing of the JiangX well in the tight sandstone of the Xujiahe Formation shows that the drag reduction rate of the variable viscosity slickwater is 80.42%, and the daily production of the single layer after fracturing is over 50,000 cubic meters. 
This study provides theoretical support and field experience for promoting the application of variable viscosity slickwater in tight sandstone gas reservoirs.
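The drag reduction rate quoted above (80.42%) is conventionally defined as the percentage reduction in friction pressure drop relative to clean water under the same flow conditions. A minimal sketch of that definition; the pressure-drop readings below are invented for illustration, not the study's measurements.

```python
# Standard drag reduction rate: DR% = (dP_water - dP_fluid) / dP_water * 100,
# where dP is the friction pressure drop over the same test section and flow
# rate for water and for the drag-reducing fluid, respectively.

def drag_reduction_rate(dp_water, dp_fluid):
    """Percent reduction in friction pressure drop relative to water."""
    return 100.0 * (dp_water - dp_fluid) / dp_water

# Made-up readings chosen so the result lands near the 80.42% cited above.
dr = drag_reduction_rate(10.0, 1.958)
```

In a loop friction test the two pressure drops are measured over the same pipe section at matched flow rates, so the ratio isolates the effect of the drag-reducing agent.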

Keywords: slickwater, hydraulic fracturing, dynamic sand laying, drag reduction rate, rheological properties

Procedia PDF Downloads 71
7480 Career Guidance System Using Machine Learning

Authors: Mane Darbinyan, Lusine Hayrapetyan, Elen Matevosyan

Abstract:

Artificial Intelligence in Education (AIED) was created to help students get ready for the workforce, and over the past 25 years it has grown significantly, offering a variety of technologies to support academic, institutional, and administrative services. However, career guidance remains challenging, especially considering the labor market's rapid change. While choosing a career, people face various obstacles because they do not take their own preferences into consideration, which can lead to many other problems, such as shifting jobs, work stress, occupational infirmity, reduced productivity, and manual error. Besides preferences, people should properly evaluate their technical and non-technical skills, as well as their personalities. Professional counseling has become a difficult undertaking for counselors due to the wide range of career choices brought on by changing technological trends. It is necessary to close this gap by utilizing technology that makes sophisticated predictions about a person's career goals based on their personality. Hence, there is a need to create an automated model that helps in decision making based on user inputs. Improving career guidance can be achieved by embedding machine learning into the career consulting ecosystem. Various career guidance systems work on the same logic: classifying applicants, matching applications with appropriate departments or jobs, making predictions, and providing suitable recommendations. Methodologies such as KNN, neural networks, K-means clustering, decision trees, and many other advanced algorithms are applied to the data to help predict the right careers. Besides helping users with their career choice, these systems provide numerous opportunities that are very useful while making this hard decision. 
They help candidates recognize where they specifically lack sufficient skills so that they can improve them. They can also offer an e-learning platform that takes into account the user's gaps in knowledge. Furthermore, users can be provided with details on a particular job, such as the abilities required to excel in that industry.
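Of the methodologies listed, KNN is the simplest to sketch. The following is a minimal k-nearest-neighbours classifier in Python; the skill features, their 0-10 scale, and the career labels are invented toy data, not a real guidance dataset.

```python
# Minimal k-nearest-neighbours sketch of the classification idea described
# above: predict a career label from a user's skill vector by majority vote
# among the k closest training profiles.
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label) pairs; query: feature vector.
    Returns the majority label among the k nearest training examples."""
    nearest = sorted(train, key=lambda fx: math.dist(fx[0], query))
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]

# Toy profiles: (math skill, verbal skill, people skill) on a 0-10 scale.
train = [
    ((9, 4, 3), "engineering"), ((8, 5, 2), "engineering"),
    ((3, 9, 8), "counseling"),  ((2, 8, 9), "counseling"),
    ((7, 8, 4), "data science"),
]
prediction = knn_predict(train, (8, 4, 3))  # close to the engineering profiles
```

A production system would use many more features (including personality and non-technical skills, as the abstract notes), normalize them, and choose k by cross-validation rather than fixing it.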

Keywords: career guidance system, machine learning, career prediction, predictive decision, data mining, technical and non-technical skills

Procedia PDF Downloads 76
7479 Examining French Teachers’ Teaching and Learning Approaches in Some Selected Junior High Schools in Ghana

Authors: Paul Koffitse Agobia

Abstract:

In 2020, the Ministry of Education in Ghana and the National Council for Curriculum and Assessment (NaCCA) rolled out a new curriculum, the Common Core Programme (CCP) for Basic 7 to 10, that lays emphasis on character building and the values important to Ghanaian society by providing education that will produce character-minded learners with problem-solving skills who can play active roles in dealing with the increasing challenges facing Ghana and the global society. Therefore, learning and teaching approaches that prioritize the use of digital learning resources and active learning are recommended. The new challenge facing Ghanaian teachers is the ability to use new technologies, together with the appropriate pedagogical content knowledge, to help learners develop, besides communication skills in French, the essential 21st-century skills recommended in the new curriculum. This article focuses on the pedagogical approaches recommended by NaCCA. The study seeks to examine French language teachers' understanding of the recommended pedagogical approaches and how they use digital learning resources in class to foster the development of these essential skills and values. 54 respondents, comprising 30 teachers and 24 head teachers, were selected from 6 Junior High Schools in rural districts (both private and public) and 6 Junior High Schools in an urban setting. The schools were selected in three regions: Volta, Central, and Western. A class observation checklist and an interview guide were used to collect data for the study. The study reveals that some teachers adopt teaching techniques that do not promote active learning. They demonstrate little understanding of the core competences and values and therefore fail to integrate them into their lessons. Other teachers, however, despite their lack of understanding of learning and teaching philosophies, adopted techniques that can help learners develop some of the core competences and values. 
In most schools, digital learning resources are not utilized, even though teachers have smartphones or laptops.

Keywords: active learning, core competences, digital learning resources, pedagogical approach, values

Procedia PDF Downloads 68
7478 Child Labour and Contemporary Slavery: A Nigerian Perspective

Authors: Obiageli Eze

Abstract:

Millions of Nigerian children are subjected daily to all forms of abuse, ranging from trafficking to slavery and forced labor. These underage children are taken from different parts of the country to be used as sex slaves and laborers in the big cities, killed for rituals or organ transplantation, used for money laundering, made to beg on the streets, or put to work in the fields. These children do inhuman jobs under degrading conditions and face all kinds of abuse at the hands of their owners, with no hope of escape. While many people blame poverty or culture as a basis for human trafficking in Nigeria, the National Agency for the Prohibition of Trafficking in Persons and Other Related Matters (NAPTIP) identifies other causes of the outrageous rate of human trafficking in the country: ignorance, desperation, and the promotion and commercialization of sex by the European Union (EU), as dozens of young Nigerian children and women are forced to work as prostitutes in European countries, including the Netherlands, France, Italy, and Spain. In the course of searching for greener pastures, they are coerced into work they have not chosen and subjected to perpetual life in bondage. The Universal Declaration of Human Rights 1948 prohibits slave trade and slavery. Despite the fact that Nigeria is a sovereign member of the United Nations and a signatory to this international instrument, child trafficking and slavery are still on the increase. This may be because the punishment for this crime in Nigeria is a maximum term of 10 years' imprisonment, with some of the worst offenders getting off with as little as 2 years' imprisonment or the option of a fine. It goes without saying that this punishment is not sufficient to deter these modern slave traders. Another major factor oiling the wheel of trafficking in the country is voodoo: victims are taken to the shrines of voodoo priests for oath-taking. There, underage girls and boys are made to swear that they will never reveal the identities of their traffickers to anyone if arrested, whether in the course of the journey or in the destination countries, and that they will pay off their debt. Nigeria needs tougher laws to combat human trafficking and the slave trade. There also need to be aggressive sensitization and awareness programs designed to educate and enlighten the public about the dangers these victims face and the need to report any suspicious activity to the authorities. This paper attempts to give an insight into the plight of underage Nigerian children trafficked and sold as slaves and to offer a more effective stand in the fight against it.

Keywords: child labor, slavery, slave trade, trafficking

Procedia PDF Downloads 497
7477 The Biosphere as a Supercomputer Directing and Controlling Evolutionary Processes

Authors: Igor A. Krichtafovitch

Abstract:

The evolutionary processes are not linear. Long periods of quiet and slow development turn to rather rapid emergences of new species and even phyla. During Cambrian explosion, 22 new phyla were added to the previously existed 3 phyla. Contrary to the common credence the natural selection or a survival of the fittest cannot be accounted for the dominant evolution vector which is steady and accelerated advent of more complex and more intelligent living organisms. Neither Darwinism nor alternative concepts including panspermia and intelligent design propose a satisfactory solution for these phenomena. The proposed hypothesis offers a logical and plausible explanation of the evolutionary processes in general. It is based on two postulates: a) the Biosphere is a single living organism, all parts of which are interconnected, and b) the Biosphere acts as a giant biological supercomputer, storing and processing the information in digital and analog forms. Such supercomputer surpasses all human-made computers by many orders of magnitude. Living organisms are the product of intelligent creative action of the biosphere supercomputer. The biological evolution is driven by growing amount of information stored in the living organisms and increasing complexity of the biosphere as a single organism. Main evolutionary vector is not a survival of the fittest but an accelerated growth of the computational complexity of the living organisms. The following postulates may summarize the proposed hypothesis: biological evolution as a natural life origin and development is a reality. Evolution is a coordinated and controlled process. One of evolution’s main development vectors is a growing computational complexity of the living organisms and the biosphere’s intelligence. The intelligent matter which conducts and controls global evolution is a gigantic bio-computer combining all living organisms on Earth. The information is acting like a software stored in and controlled by the biosphere. 
Random mutations trigger this software, as stipulated by Darwinian evolutionary theories, and it is further stimulated by the growing demand for the Biosphere's global memory storage and computational complexity. Greater memory volume requires more numerous and more intellectually advanced organisms to store and handle it; more intricate organisms in turn require greater computational complexity of the biosphere in order to keep control over the living world. This is an endless recursive endeavor with an accelerating evolutionary dynamic. New species emerge when two conditions are met: a) crucial environmental changes occur and/or the global memory storage volume reaches its limit, and b) the biosphere's computational complexity reaches a critical mass capable of producing more advanced creatures. The hypothesis presented here is a naturalistic concept of the creation and evolution of life. It logically resolves many puzzling problems in the current state of evolutionary theory: speciation, as a result of the GM's purposeful design; the vector of evolutionary development, as a need for growing global intelligence; punctuated equilibrium and the Cambrian explosion, occurring when conditions a) and b) above are met; and mass extinctions, occurring when more intelligent species must replace outdated creatures.

Keywords: supercomputer, biological evolution, Darwinism, speciation

Procedia PDF Downloads 161
7476 A Multistep Broyden’s-Type Method for Solving Systems of Nonlinear Equations

Authors: M. Y. Waziri, M. A. Aliyu

Abstract:

The paper proposes an approach to improving the performance of Broyden's method for solving systems of nonlinear equations. In this work, we use information from the two preceding iterates, rather than a single preceding iterate, to update the Broyden matrix, producing a better approximation of the Jacobian matrix at each iteration. Numerical results verify that the proposed method clearly improves the numerical performance of Broyden's method.
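The abstract does not reproduce the authors' two-step update formula, so the sketch below shows only the classical single-step Broyden baseline that the multistep variant builds on: a quasi-Newton iteration whose approximate Jacobian B is corrected by a rank-one secant update from the most recent iterate pair. The function names and the test problem are illustrative, not taken from the paper.

```python
import numpy as np

def fd_jacobian(F, x, h=1e-7):
    """Forward-difference approximation of the Jacobian of F at x,
    used only to initialize B."""
    Fx = F(x)
    n = len(x)
    J = np.empty((len(Fx), n))
    for j in range(n):
        xp = x.copy()
        xp[j] += h
        J[:, j] = (F(xp) - Fx) / h
    return J

def broyden(F, x0, tol=1e-10, max_iter=100):
    """Classical Broyden's method: a rank-one secant update of the
    approximate Jacobian B from a single preceding iterate."""
    x = np.asarray(x0, dtype=float)
    B = fd_jacobian(F, x)              # initial Jacobian approximation
    Fx = F(x)
    for _ in range(max_iter):
        dx = np.linalg.solve(B, -Fx)   # quasi-Newton step
        x_new = x + dx
        F_new = F(x_new)
        if np.linalg.norm(F_new) < tol:
            return x_new
        dF = F_new - Fx
        # "Good" Broyden update: B_{k+1} satisfies the secant
        # condition B_{k+1} dx = dF while perturbing B minimally.
        B += np.outer(dF - B @ dx, dx) / (dx @ dx)
        x, Fx = x_new, F_new
    return x

# Illustrative test problem: x^2 + y^2 = 1 and x = y,
# whose positive root is (1/sqrt(2), 1/sqrt(2)).
F = lambda v: np.array([v[0]**2 + v[1]**2 - 1.0, v[0] - v[1]])
root = broyden(F, [0.8, 0.6])
```

A multistep variant would replace the single `(dx, dF)` pair in the update with information from the two most recent pairs; the specific formula is the paper's contribution and is not reproduced here.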

Keywords: multi-step Broyden, nonlinear systems of equations, computational efficiency, iterate

Procedia PDF Downloads 633