Search results for: greedy randomized adaptive search procedure
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5868

1518 Estimation of Particle Size Distribution Using Magnetization Data

Authors: Navneet Kaur, S. D. Tiwari

Abstract:

Magnetic nanoparticles possess fascinating properties which make their behavior unique in comparison to the corresponding bulk materials. Superparamagnetism is one such interesting phenomenon exhibited only by small particles of magnetic materials. In this state, the thermal energy of the particles becomes greater than their magnetic anisotropy energy, so the particle magnetic moment vectors fluctuate between states of minimum energy. This situation is similar to the paramagnetism of non-interacting ions and is termed superparamagnetism. The magnetization of such systems has been described by the Langevin function, but the estimated fit parameters are then often unphysical because the particle size distribution is not taken into account. In this work, an analysis of magnetization data on NiO nanoparticles is presented that considers the effect of particle size distribution. NiO nanoparticles of two different sizes are prepared by heating freshly synthesized Ni(OH)₂ at different temperatures. Room-temperature X-ray diffraction patterns confirm the formation of a single phase of NiO. The diffraction lines are quite broad, indicating the nanocrystalline nature of the samples; the average crystallite sizes are estimated to be about 6 and 8 nm. The samples are also characterized by transmission electron microscopy. The magnetization of both samples is measured as a function of temperature and applied magnetic field. Zero-field-cooled and field-cooled magnetization are measured as a function of temperature to determine the bifurcation temperature. The magnetization is also measured at several temperatures in the superparamagnetic region. The data are fitted, by a least-squares procedure, to an expression that incorporates a distribution in particle size. The computer codes are written in Python. The presented analysis is found to be very useful for estimating the particle size distribution present in the samples. The estimated distributions are compared with those determined from transmission electron micrographs.
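
The fitting step described above can be sketched as follows. This is a minimal illustration, not the authors' code: it fits field-dependent magnetization to a Langevin function weighted by an assumed log-normal distribution of particle moments using SciPy; the field and magnetization arrays, parameter names, and starting values are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

KB = 1.380649e-16   # Boltzmann constant (erg/K, CGS units)
MU_B = 9.274e-21    # Bohr magneton (emu)

def langevin(x):
    """L(x) = coth(x) - 1/x, with a small-argument safeguard."""
    x = np.where(np.abs(x) < 1e-8, 1e-8, np.asarray(x, dtype=float))
    return 1.0 / np.tanh(x) - 1.0 / x

def model(H, Ms, mu_med, sigma, T=300.0):
    """Superparamagnetic magnetization with an assumed log-normal moment distribution.

    H: applied field (Oe); Ms: saturation magnetization (emu/g);
    mu_med: median particle moment (Bohr magnetons); sigma: log-normal width.
    """
    mu = np.logspace(np.log10(mu_med) - 2, np.log10(mu_med) + 2, 200)   # moment grid
    f = np.exp(-np.log(mu / mu_med) ** 2 / (2 * sigma ** 2)) / (mu * sigma * np.sqrt(2 * np.pi))
    w = f * np.gradient(mu)                              # discrete log-normal weights
    arg = np.outer(H, mu) * MU_B / (KB * T)
    return Ms * (w * mu * langevin(arg)).sum(axis=1) / (w * mu).sum()

# hypothetical measurement arrays at one temperature in the superparamagnetic region
H_data = np.linspace(100.0, 5.0e4, 50)                  # applied field (Oe)
M_data = model(H_data, 18.0, 2000.0, 0.4) + np.random.normal(0, 0.05, 50)
popt, pcov = curve_fit(model, H_data, M_data, p0=[20.0, 1500.0, 0.5])
print(popt)   # fitted Ms, median moment, and distribution width
```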

Keywords: anisotropy, magnetization, nanoparticles, superparamagnetism

Procedia PDF Downloads 135
1517 Spatial Data Mining: Unsupervised Classification of Geographic Data

Authors: Chahrazed Zouaoui

Abstract:

In recent years, the volume of geospatial information has been increasing with the evolution of information and communication technologies. This information is often handled through geographic information systems (GIS) and stored in spatial databases. Classical data mining has shown a weakness in extracting knowledge from such enormous amounts of data because of the particularity of spatial entities, which are characterized by the interdependence between them (the first law of geography). This gave rise to spatial data mining. Spatial data mining is a process of analyzing geographic data that allows the extraction of knowledge and spatial relationships from geospatial data; among its methods, we distinguish monothematic and thematic approaches. Geo-clustering, one of the main tasks of spatial data mining, belongs to the monothematic methods. It groups similar geospatial entities into the same class and assigns more dissimilar entities to different classes; in other words, it maximizes intra-class similarity and minimizes inter-class similarity while taking into account the particularity of geospatial data. Two approaches to geo-clustering exist: dynamic processing of the data, which applies algorithms designed to treat spatial data directly, and an approach based on pre-processing of the spatial data, which applies classical clustering algorithms to data pre-processed to integrate the spatial relationships. Since this pre-processing approach is quite complex in several cases, the search for approximate solutions involves approximation algorithms; here we are interested in dedicated approaches (partitioning and density-based clustering methods) and in the bees algorithm (a biomimetic approach). Our study proposes a significant design for this problem, using different algorithms to automatically detect geospatial neighborhoods in order to implement geo-clustering by pre-processing, and applying the bees algorithm to this problem for the first time in the geospatial field.
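
As a minimal illustration of the pre-processing approach described above, the sketch below applies a classical clustering algorithm (k-means, used here as a stand-in for the dedicated partitioning methods) to attribute vectors augmented with spatial coordinates; the data, attribute names, and number of clusters are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# hypothetical geospatial entities: coordinates plus one thematic attribute
coords = np.random.rand(300, 2) * 100.0          # x, y positions
attribute = np.random.rand(300, 1)               # e.g. population density

# pre-treatment: integrate spatial relationships by scaling and concatenating
# coordinates with the thematic attribute before running classical clustering
features = StandardScaler().fit_transform(np.hstack([coords, attribute]))

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)
print(np.bincount(labels))                       # size of each geo-cluster
```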

Keywords: mining, GIS, geo-clustering, neighborhood

Procedia PDF Downloads 368
1516 The Effect of Transparent Oil Wood Stain on the Colour Stability of Spruce Wood during Weathering

Authors: Eliska Oberhofnerova, Milos Panek, Stepan Hysek, Martin Lexa

Abstract:

Nowadays the use of wood, both indoors and outdoors, is constantly increasing. However, wood is a natural organic material and, in exterior use, is subjected to a degradation process caused by abiotic factors (solar radiation, rain, moisture, wind, dust, etc.). This process affects only the surface layers of the wood, but neglecting the basic rules of wood protection increases the possibility of attack by biological agents and thereby compromises the function of the wood element. The degradation process can be slowed by proper surface treatment, especially in the case of less naturally durable wood species such as spruce. Modern coating systems are subject to many requirements, such as colour stability, hydrophobicity, low volatile organic compound (VOC) content, long service life and easy maintenance. The aim of this study is to evaluate the colour stability, as a basic parameter indicating coating durability, of spruce wood (Picea abies) treated with two layers of a transparent natural oil wood stain and exposed to outdoor conditions. The test specimens were exposed to natural weathering for 2 years and to artificial weathering in a UV chamber for 2000 hours. The colour parameters were measured before and during exposure with a spectrophotometer in the CIELab colour space. A comparison between untreated and treated wood and between the two testing procedures was carried out. The results showed a significant effect of the coating on the colour stability of the wood, as expected. Nevertheless, the increasing colour changes observed during exposure differed according to the applied testing procedure, natural or artificial.
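
The colour change tracked in such studies is usually summarised as the total colour difference in CIELab space; the sketch below computes the standard CIE76 ΔE*ab between two measurements (the L*, a*, b* values are hypothetical readings, not data from this study).

```python
import numpy as np

def delta_e_ab(lab_ref, lab_t):
    """CIE76 total colour difference between a reference and a weathered measurement."""
    lab_ref, lab_t = np.asarray(lab_ref, float), np.asarray(lab_t, float)
    return np.sqrt(np.sum((lab_t - lab_ref) ** 2, axis=-1))

# hypothetical spectrophotometer readings (L*, a*, b*) before and after exposure
before = [82.1, 5.3, 21.0]
after_exposure = [74.6, 8.9, 27.4]
print(round(delta_e_ab(before, after_exposure), 2))   # total colour change dE*ab
```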

Keywords: colour stability, natural and artificial weathering, spruce wood, transparent coating

Procedia PDF Downloads 214
1515 The Impact of Cognitive Load on Deceit Detection and Memory Recall in Children’s Interviews: A Meta-Analysis

Authors: Sevilay Çankaya

Abstract:

The detection of deception in children's interviews is essential for establishing the veracity of statements. A widely used method for deception detection is building cognitive load, which is the logic of the cognitive interview (CI), and its effectiveness for adults is well established. This meta-analysis delves into the effectiveness of inducing cognitive load as a means of enhancing veracity detection during interviews with children. Additionally, the effectiveness of cognitive load on the total number of events children recall is assessed as a second part of the analysis. The current meta-analysis includes ten effect sizes identified through database searches. Effect sizes were calculated as Hedges' g under a random-effects model using CMA version 2. A heterogeneity analysis was conducted to detect potential moderators. The overall result indicated that cognitive load had no significant effect on veracity outcomes (g = 0.052, 95% CI [-.006, 1.25]). However, a high level of heterogeneity was found (I² = 92%). Age, participants' characteristics, interview setting, and characteristics of the interviewer were coded as possible moderators to explain the variance. Age was a significant moderator (β = .021, p = .03, R² = 75%), but the analysis did not reveal statistically significant effects for the other potential moderators: participants' characteristics (Q = 0.106, df = 1, p = .744), interview setting (Q = 2.04, df = 1, p = .154), and characteristics of the interviewer (Q = 2.96, df = 1, p = .086). For the second outcome, the total number of events recalled, the overall effect was significant (g = 4.121, 95% CI [2.256, 5.985]): cognitive load increased the total number of events recalled when interviewing children. All in all, while age plays a crucial role in determining the impact of cognitive load on veracity, the surrounding context, interviewer attributes, and inherent participant traits may not significantly alter the relationship. These findings shed light on the need for more focused, age-specific methods when using cognitive load measures. More studies in this field may improve the precision and dependability of deceit detection in children's interviews.
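
For readers unfamiliar with the pooling step, the sketch below shows a DerSimonian-Laird random-effects calculation of a pooled Hedges' g, its 95% confidence interval, and I², analogous to what CMA computes; the ten effect sizes and variances are hypothetical, not the data of this meta-analysis.

```python
import numpy as np

def random_effects_meta(g, v):
    """DerSimonian-Laird random-effects pooling of effect sizes.

    g: array of Hedges' g values; v: their within-study variances.
    Returns the pooled effect, its 95% CI, and the I^2 heterogeneity statistic.
    """
    g, v = np.asarray(g, float), np.asarray(v, float)
    w = 1.0 / v
    g_fixed = np.sum(w * g) / np.sum(w)
    q = np.sum(w * (g - g_fixed) ** 2)                   # Cochran's Q
    df = len(g) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_re = 1.0 / (v + tau2)
    g_re = np.sum(w_re * g) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return g_re, (g_re - 1.96 * se, g_re + 1.96 * se), i2

# hypothetical effect sizes and variances for ten studies
g = [0.10, -0.05, 0.30, 0.02, 0.15, -0.20, 0.40, 0.05, -0.10, 0.08]
v = [0.02, 0.03, 0.05, 0.02, 0.04, 0.03, 0.06, 0.02, 0.03, 0.04]
print(random_effects_meta(g, v))
```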

Keywords: deceit detection, cognitive load, memory recall, children interviews, meta-analysis

Procedia PDF Downloads 50
1514 Application Difference between Cox and Logistic Regression Models

Authors: Idrissa Kayijuka

Abstract:

Logistic regression and the Cox regression model (proportional hazards model) are currently employed in the analysis of prospective epidemiologic research into risk factors for chronic diseases, and a theoretical relationship between the two models has been studied. By definition, the Cox regression model, also called the Cox proportional hazards model, is a procedure used for modeling the time leading up to an event when censored cases exist. The logistic regression model, in contrast, is applicable when the independent variables consist of numerical as well as nominal values and the outcome variable is binary (dichotomous). Many researchers have reviewed the Cox and logistic regression models and their applications in different areas. In this work, the analysis is performed on secondary data (SPSS exercise data on breast cancer) with a sample size of 1121 women; the main objective is to show the difference in application between the Cox and logistic regression models based on factors that cause women to die of breast cancer. Some of the analysis was done manually (e.g., on lymph node status), and SPSS software was used to analyze the data. This study found that there is a difference in application between the Cox and logistic regression models: the Cox regression model is used when one wishes to analyze data that also include the follow-up time, whereas the logistic regression model analyzes data without follow-up time. They also have different measures of association: the hazard ratio for the Cox model and the odds ratio for the logistic model. A similarity between the two models is that both are applicable to predicting the outcome of a categorical variable, i.e., a variable that can take only a restricted number of categories. In conclusion, the Cox regression model differs from logistic regression by assessing a rate instead of a proportion. Both models are suitable for analyzing data in many other studies, but the Cox regression model is the more recommended when follow-up time is available.
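
The distinction can be illustrated with standard Python libraries. The sketch below is not the authors' SPSS analysis of the breast cancer data; it fits a Cox model (which uses follow-up time and censoring, reported as hazard ratios) and a logistic model (which ignores follow-up time, reported as odds ratios) to the publicly available Rossi recidivism dataset shipped with the lifelines package.

```python
import numpy as np
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi
from sklearn.linear_model import LogisticRegression

df = load_rossi()                       # example survival data: week, arrest + covariates

# Cox proportional hazards: uses follow-up time and censoring; exp(coefficients) are hazard ratios
cph = CoxPHFitter().fit(df, duration_col="week", event_col="arrest")
print(np.exp(cph.params_))              # hazard ratios

# Logistic regression: drops follow-up time and models the binary outcome only
X = df.drop(columns=["week", "arrest"])
y = df["arrest"]
logit = LogisticRegression(max_iter=1000).fit(X, y)
print(np.exp(logit.coef_))              # odds ratios
```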

Keywords: logistic regression model, Cox regression model, survival analysis, hazard ratio

Procedia PDF Downloads 442
1513 Comparison of Wake Oscillator Models to Predict Vortex-Induced Vibration of Tall Chimneys

Authors: Saba Rahman, Arvind K. Jain, S. D. Bharti, T. K. Datta

Abstract:

The present study compares the semi-empirical wake-oscillator models that are used to predict vortex-induced vibration of structures. These include the models proposed by Facchinetti, by Farshidian and Dolatabadi, and by Skop and Griffin. These models combine a wake oscillator resembling the Van der Pol oscillator with a single-degree-of-freedom structural oscillator. In order to use these models for estimating the top displacement of chimneys, only the first mode of vibration of the chimneys is considered. The modal equation of the chimney constitutes the single-degree-of-freedom (SDOF) model. The equations of the wake oscillator model and the SDOF model are solved simultaneously using an iterative procedure. The empirical parameters used in the wake-oscillator models are estimated using a newly developed approach, and the predicted response is compared with experimental data, with which it shows good agreement. For the iterative solution, the ODE solver of MATLAB is used. To carry out the comparative study, a tall concrete chimney of height 210 m has been chosen, with a base diameter of 28 m, a top diameter of 20 m, and a wall thickness of 0.3 m. The responses of the chimney are also determined using the linear model proposed by E. Simiu and the deterministic model given in the Eurocode. It is observed from the comparative study that the responses predicted by the Facchinetti model and the Skop and Griffin model are nearly the same, while the model proposed by Farshidian and Dolatabadi predicts a higher response. The linear model, which does not consider the aero-elastic phenomenon, gives a lower response than the non-linear models. Further, for large damping, the response predicted by the Eurocode compares relatively well with those of the non-linear models.
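
The coupled system the abstract refers to can be sketched in a few lines. The code below is a reduced, dimensionless Facchinetti-type pairing of a first-mode SDOF structure with a Van der Pol wake variable, integrated with SciPy's solve_ivp in place of the MATLAB ODE solver mentioned; all parameter values are illustrative placeholders, not the chimney data of the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# illustrative placeholders: structural damping/frequency, wake frequency, coupling terms
zeta, omega_s = 0.01, 1.0        # structural damping ratio and natural frequency (rad/s)
omega_f = 1.0                    # vortex-shedding (wake) frequency, tuned here to lock-in
eps, A, M = 0.3, 12.0, 0.05      # Van der Pol parameter and coupling coefficients

def rhs(t, s):
    y, ydot, q, qdot = s
    # structure (first modal equation) forced by the wake variable q
    yddot = -2 * zeta * omega_s * ydot - omega_s**2 * y + M * omega_f**2 * q
    # Van der Pol wake oscillator forced by the structural acceleration
    qddot = -eps * omega_f * (q**2 - 1) * qdot - omega_f**2 * q + A * yddot
    return [ydot, yddot, qdot, qddot]

sol = solve_ivp(rhs, (0, 600), [0.0, 0.0, 0.1, 0.0], max_step=0.05)
print(np.max(np.abs(sol.y[0][sol.t > 300])))   # steady-state tip amplitude (dimensionless)
```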

Keywords: chimney, deterministic model, van der pol, vortex-induced vibration

Procedia PDF Downloads 213
1512 Geophysical and Laboratory Evaluation of Aquifer Position, Aquifer Protective Capacity and Groundwater Quality in Selected Dumpsites in Calabar Municipal Local Government Area, South Eastern Nigeria

Authors: Egor Atan Obeten, Abong Augustine Agwul, Bissong A. Samson

Abstract:

The position of the aquifer, its protective capacity, and the quality of the groundwater beneath the dumpsites were investigated. The techniques employed were laboratory analysis, tritium tagging, electrical resistivity tomography (ERT), and vertical electrical sounding (VES). Fifteen VES stations were used, with a maximum electrode spacing of 500 meters, and the IPI2win software was used to analyze the data collected. The resistivity map of the dumpsite was determined by deploying six ERT stations for the 2-D survey. To ascertain the degree of soil infiltration beneath the dumpsite, the tritium tagging method was used. Groundwater samples were taken from neighboring boreholes and examined using conventional laboratory procedures. The findings showed three to five geoelectric layers, with the aquifer inferred to lie between 24.2 and 75.1 meters deep in the third, fourth, and fifth layers. The aquifer protective capacity, with longitudinal conductance values in the range of 0.0235 to 0.1908 siemens, was rated as at most weakly to poorly protected. The obtained porosity values ranged from 44.45 to 89.75. The high calculated values of transmissivity and porosity indicate a permeable aquifer system with considerable storativity. According to the results of the tritium tagging technique, used to evaluate the level of infiltration from the dumpsite, the area has an infiltration value between 8 and 22 percent. The analyzed groundwater samples reveal levels of NO2, DO, Pb2+, magnesium, and cadmium higher than the limits approved by the NSDWQ. The overall analysis of the results from the above methodologies shows that the study area's aquifer system is porous and that contaminants, if introduced, will circulate through it quickly.
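
The protective-capacity figures quoted above are longitudinal conductances of the Dar-Zarrouk type, S = Σ hᵢ/ρᵢ, summed over the layers above the aquifer. The sketch below computes S from hypothetical layer thicknesses and resistivities and applies a commonly used rating scale; the thresholds are an assumption here, not taken from the paper.

```python
def longitudinal_conductance(thicknesses, resistivities):
    """Dar-Zarrouk longitudinal conductance S = sum(h_i / rho_i), in siemens."""
    return sum(h / rho for h, rho in zip(thicknesses, resistivities))

def protective_rating(s):
    # commonly used rating scale for aquifer protective capacity (assumed here)
    if s < 0.1:
        return "poor"
    if s < 0.2:
        return "weak"
    if s < 0.7:
        return "moderate"
    if s < 5.0:
        return "good"
    return "very good to excellent"

# hypothetical overburden layers above the aquifer: thickness (m), resistivity (ohm-m)
h = [2.5, 10.0, 15.0]
rho = [120.0, 350.0, 800.0]
s = longitudinal_conductance(h, rho)
print(round(s, 4), protective_rating(s))
```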

Keywords: aquifer, transmissivity, dumpsite, groundwater

Procedia PDF Downloads 36
1511 The Impact of Rising Architectural Façades on Improving the Physical Urban Ambience inside the Free Space of the Urban Fabric - the Street: Case Study of the City of Biskra

Authors: Rami Qaoud, Alkama Djamal

Abstract:

This study asks how rising architectural façades improve the physical urban ambience inside the free space of the urban fabric, the street, which is seen as a way of bringing life, cultural values and civilization back to these cities; this is the theme of this research. We studied the relationship between the empty and the built-up parts of the urban fabric in terms of construction density and the architectural elevation of the façades facing the street. The methodology adopted is a technical field experiment carried out for three types of street geometry (H ≥ 2W, H = W, H ≤ 0.5W). Field measurements of the physical ambience were taken along three main axes. The first axis is the thermal ambience, for which air temperature, relative humidity, wind speed and surface temperatures (outer walls and ground) were collected. The second axis is the visual ambience, for which natural lighting levels were recorded during the daytime. The third axis is the acoustic ambience, for which sound levels were recorded throughout the day. The experiment lasted three consecutive days and used six measuring stations, one for each type of street geometry in two differently oriented streets. Comparing the obtained values shows clear differences between the three types of street geometry: the air temperature differed by up to 4 °C, the duration of direct natural lighting differed by up to six hours, and the sound levels differed by up to 15 dB. These differences indicate the impact of rising architectural façades on improving the physical urban ambience within the free space of the urban fabric, the street.

Keywords: street, physical urban ambience, rising architectural façade, urban fabric

Procedia PDF Downloads 284
1510 Phytochemical Composition and Characterization of Bioactive Compounds of the Green Seaweed Ulva lactuca: A Phytotherapeutic Approach

Authors: Mariame Taibi, Marouane Aouiji, Rachid Bengueddour

Abstract:

The Moroccan coastline is particularly rich in algae and constitutes a reserve of species with considerable economic, social and ecological potential. This work focuses on the search for and characterization of bioactive compounds from algae that can be used in pharmacology or phytopathology. The biochemical composition of the green alga Ulva lactuca (Ulvophyceae) was studied by determining the content of moisture, ash, phenols, flavonoids, total tannins, and chlorophyll. Seven solvents, namely distilled water, methanol, ethyl acetate, chloroform, benzene, petroleum ether, and hexane, were tested for their effectiveness in recovering chemical compounds. The functional groups and the bioactive chemical compounds were identified by FT-IR and GC-MS. The moisture content of the alga was 77%, while the ash content was 15%. The phenol content differed from one solvent to another, while chlorophyll a, chlorophyll b, and total chlorophyll were determined at 14%, 9.52%, and 25%, respectively. Carotenoids were present in a considerable amount (8.17%). The experimental results show that methanol is the most effective solvent for recovering bioactive compounds, followed by water. Moreover, the green alga Ulva lactuca is characterized by a high level of total polyphenols (45 ± 3.24 mg GAE/g DM) and average levels of total tannins and flavonoids (22.52 ± 8.23 mg CE/g DM and 15.49 ± 0.064 mg QE/g DM, respectively). The Fourier transform infrared (FT-IR) spectroscopy results confirmed the presence of alcohol/phenol and amide functions in Ulva lactuca. The GC-MS analysis identified the compounds contained in the various extracts, such as phenolic compounds, fatty acids, terpenoids, alcohols, alkanes, hydrocarbons, and steroids. All these results represent only a first step in the search for biologically active natural substances from seaweed. Additional tests are envisaged to confirm the bioactivity of the seaweed.

Keywords: algae, Ulva lactuca, phenolic compounds, FTIR, GC-MS

Procedia PDF Downloads 98
1509 Design of a Surveillance Drone with Computer Aided Durability

Authors: Maram Shahad Dana Anfal

Abstract:

This research paper presents the design of a surveillance drone with computer-aided durability and model analyses, providing a cost-effective and efficient solution for various applications. The quadcopter's design is based on a lightweight and strong structure made of materials such as aluminum and titanium, which give the quadcopter a durable frame. The structure of the product and the computer-aided durability system are both designed to reduce the need for frequent repairs or replacements, which will save time and money in the long run. Moreover, the study discusses the drone's ability to track, investigate, and deliver objects more quickly than traditional methods, which makes it a highly efficient and cost-effective technology. In this paper, a comprehensive analysis of the quadcopter's operational dynamics and limitations is presented. In both simulation and experimental data, the computer-aided durability system and the drone's design demonstrate their effectiveness, highlighting the potential for a variety of applications, such as search and rescue missions, infrastructure monitoring, and agricultural operations. The findings also provide insights into possible areas for improvement in the design and operation of the drone. Ultimately, this paper presents a reliable and cost-effective solution for surveillance applications by designing a drone with computer-aided durability and modeling. With its potential to save time and money, increase reliability, and enhance safety, it is a promising technology for the future of surveillance drones. The operational dynamic equations have been evaluated successfully for different flight conditions of the quadcopter. CAE modeling techniques have been applied for modal risk assessment at operating conditions, and stress analysis has been performed under the loads of the worst-case combined-motion flight conditions.

Keywords: drone, material, SolidWorks, HyperMesh

Procedia PDF Downloads 126
1508 Proposing Smart Clothing for Addressing Criminal Acts Against Women in South Africa

Authors: Anne Mastamet-Mason

Abstract:

Crimes against women are a global concern, and South Africa, in particular, faces the dilemma of dealing with the constant criminal acts confronting the country. Debates on violence against women in South Africa can no longer be overemphasised as crimes continue to rise year by year. The recent death of a university student at the University of Cape Town, as well as many other cases, continues to strengthen the need to find solutions from all spheres of South African society. The advanced textiles market contains a high number and variety of technologies, many of which have protected status and constitute a relatively small portion of the textiles used for the consumer market. Examples of advanced textiles include nanomaterials, such as silver, titanium dioxide and zinc oxide, designed to create an anti-microbial and self-cleaning layer on top of the fibers, thereby reducing body odour and soiling. Smart textiles offer materials and fabrics that are versatile and adaptive to different situations and functions. Integrating textiles and computing technologies offers an opportunity to create differentiated characteristics and functionality. This paper presents a proposal to design a smart camisole/yoga sports brassiere and smart yoga sports pants to be worn by women while alone and while in purported danger zones. The smart garments are to be worn under normal clothing and cannot be detected, seen, or suspected by perpetrators. The garments are imbued with devices that sense any physical aggression and any abnormal or accelerated heartbeat that may be exhibited by the victim of violence. The signals created during an attack can be transmitted to the police and to family members who have a mobile application that receives the emitted signals. The signals direct the receiver to the exact location of the offence, so the victim can be rescued before major violations are committed. The design of the yoga sports garments will be done by Professor Mason, who is a fashion designer by profession, while the mobile phone application will be developed by Mr. Amos Yegon, who is an independent software developer.

Keywords: smart clothing, wearable technology, south africa, 4th industrial revolution

Procedia PDF Downloads 194
1507 Computer-Aided Diagnosis System Based on Multiple Quantitative Magnetic Resonance Imaging Features in the Classification of Brain Tumor

Authors: Chih Jou Hsiao, Chung Ming Lo, Li Chun Hsieh

Abstract:

Brain tumor is not a cancer with a high incidence rate, but its high mortality rate and poor prognosis still make it a major concern. On clinical examination, the grading of brain tumors depends on pathological features. However, histopathological analysis has some weak points that can cause misgrading; for example, interpretations can vary in the absence of a well-established definition. Furthermore, the heterogeneity of malignant tumors makes it challenging to extract meaningful tissue during surgical biopsy. With the development of magnetic resonance imaging (MRI), tumor grading can be accomplished by a noninvasive procedure. To improve diagnostic accuracy further, this study proposed a computer-aided diagnosis (CAD) system based on MRI features to provide suggestions for tumor grading. Gliomas are the most common type of malignant brain tumor (about 70%). This study collected 34 glioblastomas (GBMs) and 73 lower-grade gliomas (LGGs) from The Cancer Imaging Archive. After defining the regions of interest in the MRI images, multiple quantitative morphological features, such as region perimeter, region area, compactness, the mean and standard deviation of the normalized radial length, and moment features, were extracted from the tumors for classification. As a result, two of the five morphological features and three of the four image moment features achieved p values < 0.001, and the remaining moment feature had a p value < 0.05. The CAD system using the combination of all features achieved an accuracy of 83.18% in classifying the gliomas into LGG and GBM, with a sensitivity of 70.59% and a specificity of 89.04%. The proposed system can become a second reader for radiologists in clinical examinations.
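
As an illustration of the kind of morphological features listed above, the sketch below computes area, perimeter, compactness, and the mean and standard deviation of the normalized radial length from a binary region mask with scikit-image; the circular mask stands in for a segmented tumor ROI, and the exact feature definitions used in the paper may differ.

```python
import numpy as np
from skimage import measure

def shape_features(mask):
    """Morphological features of a binary ROI of the kind listed in the abstract."""
    props = measure.regionprops(mask.astype(int))[0]
    area, perimeter = props.area, props.perimeter
    compactness = perimeter ** 2 / (4 * np.pi * area)

    # normalized radial length: centroid-to-boundary distances scaled by their maximum
    contour = measure.find_contours(mask.astype(float), 0.5)[0]
    cy, cx = props.centroid
    radial = np.hypot(contour[:, 0] - cy, contour[:, 1] - cx)
    nrl = radial / radial.max()
    return {"area": area, "perimeter": perimeter, "compactness": compactness,
            "nrl_mean": nrl.mean(), "nrl_std": nrl.std()}

# hypothetical circular ROI standing in for a segmented tumor region
yy, xx = np.mgrid[:128, :128]
mask = (yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2
print(shape_features(mask))
```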

Keywords: brain tumor, computer-aided diagnosis, gliomas, magnetic resonance imaging

Procedia PDF Downloads 251
1506 Current Methods for Drug Property Prediction in the Real World

Authors: Jacob Green, Cecilia Cabrera, Maximilian Jakobs, Andrea Dimitracopoulos, Mark van der Wilk, Ryan Greenhalgh

Abstract:

Predicting drug properties is key in drug discovery to enable de-risking of assets before expensive clinical trials and to find highly active compounds faster. Interest from the machine learning community has led to the release of a variety of benchmark datasets and proposed methods. However, it remains unclear for practitioners which method or approach is most suitable, as different papers benchmark on different datasets and methods, leading to varying conclusions that are not easily compared. Our large-scale empirical study links together numerous earlier works on different datasets and methods, thus offering a comprehensive overview of the existing property classes, datasets, and their interactions with different methods. We emphasise the importance of uncertainty quantification and of the time, and therefore cost, of applying these methods in the drug development decision-making cycle. To the best of the authors' knowledge, the optimal approach varies depending on the dataset, and engineered features with classical machine learning methods often outperform deep learning. Specifically, QSAR datasets are typically best analysed with classical methods such as Gaussian processes, while ADMET datasets are sometimes better described by tree-based methods or deep learning approaches such as graph neural networks or language models. Our work highlights that practitioners do not yet have a straightforward, black-box procedure to rely on and sets a precedent for creating practitioner-relevant benchmarks. Deep learning approaches must be proven on these benchmarks to become the practical method of choice in drug property prediction.
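
To make the "classical methods with engineered features" route concrete, the sketch below fits a scikit-learn Gaussian process regressor to hypothetical pre-computed molecular fingerprints and reports the predictive standard deviation as the uncertainty estimate emphasised above; the fingerprint matrix and activity values are synthetic stand-ins, not any of the benchmark datasets studied.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import train_test_split

# hypothetical pre-computed molecular fingerprints (rows = compounds) and a QSAR-style activity
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 128)).astype(float)    # stand-in for 128-bit fingerprints
y = X[:, :5].sum(axis=1) + rng.normal(0, 0.3, 200)        # synthetic activity values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True).fit(X_tr, y_tr)

# the predictive standard deviation is the per-compound uncertainty estimate
mean, std = gp.predict(X_te, return_std=True)
print(mean[:3], std[:3])
```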

Keywords: activity (QSAR), ADMET, classical methods, drug property prediction, empirical study, machine learning

Procedia PDF Downloads 70
1505 Research on Health Emergency Management Based on the Bibliometrics

Authors: Meng-Na Dai, Bao-Fang Wen, Gao-Pei Zhu, Chen-Xi Zhang, Jing Sun, Chang-Hai Tang, Zhi-Qiang Feng, Wen-Qiang Yin

Abstract:

Based on an analysis of the literature on health emergency management in China over the past 10 years, this paper discusses the current research hotspots, development trends and shortcomings in this field in China and provides references for scholars conducting follow-up research. CNKI (China National Knowledge Infrastructure), Weipu, and Wanfang were the databases searched. The keywords used in the database search were health, emergency, and management, covering the period from 2009 to 2018. Duplicate, non-academic, and unrelated documents were excluded, and 901 articles were included in the literature review database. The main indicators extracted were the number of articles published each year, authors, institutions, periodicals, etc. Several findings emerged from the analysis of the literature. Overall, the number of publications on health emergency management in China has shown a fluctuating downward trend over the past 10 years. Specifically, there is a lack of close cooperation between authors, and no core research team has yet formed among them. Meanwhile, the number of high-level periodicals and quality publications in this field is scarce. In addition, there are many research hotspots, such as emergency management systems, mechanism research, capacity evaluation index system research, and plan and capacity-building research. In the future, scientific research funding for health emergency management should be increased, collaborative innovation among authors in multi-disciplinary fields should be encouraged, and high-quality, high-impact journals in this field should be created. The state should encourage scholars in this field to carry out more academic cooperation and communication worldwide and to improve the research in breadth and depth. Generally speaking, research on health emergency management in China is still insufficient and needs to be improved.

Keywords: health emergency management, research situation, bibliometrics, literature

Procedia PDF Downloads 128
1504 Response of Local Cowpea to Intra Row Spacing and Weeding Regimes in Yobe State, Nigeria

Authors: A. G. Gashua, T. T. Bello, I. Alhassan, K. K. Gwiokura

Abstract:

Weeds are known to interfere seriously with crop growth, thereby affecting the productivity and quality of crops. Crops are also known to compete for natural growth resources if they are not adequately spaced, which also affects their performance. Farmers grow cowpea in mixtures with cereals, and this is known to affect its yield. For this reason, a field experiment was conducted at the Yobe State College of Agriculture Gujba, Damaturu station, in the 2014 and 2015 rainy seasons to determine the appropriate intra-row spacing and weeding regime for optimum growth and yield of cowpea (Vigna unguiculata L.) in a pure stand in the Sudan Savanna ecology. The treatments consisted of three levels of spacing within rows (20 cm, 30 cm and 40 cm) and four weeding regimes (none; once at 3 weeks after sowing (WAS); twice at 3 and 6 WAS; thrice at 3, 6 and 9 WAS), arranged in a randomized complete block design (RCBD) and replicated three times. The variety used was the local cowpea variety (white, early and spreading) commonly grown by farmers. The growth and yield data were collected and subjected to analysis of variance using SAS software, and the significant means were ranked by the Student-Newman-Keuls (SNK) test. The findings of this study revealed better crop performance in 2015 than in 2014 despite poor soil conditions. Intra-row spacing significantly influenced vegetative growth, especially the number of main branches, number of leaves and canopy spread at 6 and 9 WAS, with the highest values obtained at the wider spacing (40 cm). The values obtained in 2015 doubled those obtained in 2014 in most cases. Spacing also significantly affected the number of pods in 2015, seed weight in both years and grain yield in 2014, with the highest values obtained when the crop was spaced at 30-40 cm. Similarly, the weeding regime significantly influenced almost all the growth attributes of cowpea, with higher values obtained where cowpea was weeded three times at three-week intervals, though statistically similar results were obtained where cowpea was weeded twice. Weeding also affected the yield and all yield components in 2015, with the highest values obtained with increased weeding. Based on these findings, it is recommended that spreading cowpea varieties be grown at 40 cm (or wider) spacing within rows and be weeded twice at three-week intervals for better crop performance in related ecologies.
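
For readers who want to reproduce this kind of factorial RCBD analysis outside SAS, the sketch below fits a two-factor ANOVA with a block term in Python's statsmodels on made-up yield data; the treatment layout mirrors the design described above, but all numbers are hypothetical and the SNK mean-separation step is not shown.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# hypothetical RCBD layout: 3 spacings x 4 weeding regimes x 3 blocks, with made-up yields
rng = np.random.default_rng(1)
spacing = [20, 30, 40]
weeding = ["none", "once", "twice", "thrice"]
rows = [{"spacing": s, "weeding": w, "block": b,
         "yield_kg_ha": 600 + 5 * s + 80 * weeding.index(w) + rng.normal(0, 40)}
        for s in spacing for w in weeding for b in range(1, 4)]
df = pd.DataFrame(rows)

# two-factor ANOVA with a block term, analogous to the RCBD analysis described
model = ols("yield_kg_ha ~ C(spacing) * C(weeding) + C(block)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```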

Keywords: intra-row spacing, local cowpea, Nigeria, weeding

Procedia PDF Downloads 209
1503 Drippers Scaling Inhibition of the Localized Irrigation System by Green Inhibitors Based on Plant Extracts

Authors: Driouiche Ali, Karmal Ilham

Abstract:

The Agadir region is characterized by a dry climate, ranging from arid, attenuated by oceanic influences, to hyper-arid. The water mobilized in the agricultural sector of greater Agadir is 95% of underground origin and comes from the Chtouka water table; the rest comes from the surface waters of the Youssef Ben Tachfine dam. These waters are intended for the irrigation of 26880 hectares of modern agriculture. More than 120 boreholes and wells are currently exploited; their depth varies between 10 m and 200 m, and the unit flow rates of the boreholes are 5 to 50 l/s. A drop in the level of the water table of about 1.5 m/year, on average, has been observed during the last five years. Farmers are thus called upon to improve irrigation methods, and localized, or drip, irrigation is adopted to allow rational use of water. The importance of this irrigation system lies in the fact that water is applied directly to the root zone and in its compatibility with fertilization. However, this irrigation system faces a thorny problem, the clogging of pipes and drippers, which leads to a lack of uniformity of irrigation over time. This so-called scaling phenomenon, the consequences of which are harmful (cleaning or replacement of pipes), leads to considerable unproductive expenditure. The objective of this work is the search for green inhibitors likely to prevent this scaling phenomenon. This study requires a better knowledge of these waters, their physico-chemical characteristics and their scaling power. Thus, using the LCGE controlled degassing technique, we initially evaluated, on pure calco-carbonic water at 30°F, the scale-inhibiting power of some plant extracts available in our region of Souss-Massa. We then carried out a comparative study of the efficacy of these green inhibitors and finally studied the action of the most effective green inhibitor on real agricultural waters.

Keywords: green inhibitors, localized irrigation, plant extracts, scaling inhibition

Procedia PDF Downloads 76
1502 Passive Vibration Isolation Analysis and Optimization for Mechanical Systems

Authors: Ozan Yavuz Baytemir, Ender Cigeroglu, Gokhan Osman Ozgen

Abstract:

Vibration is an important issue in the design of various components for aerospace, marine and vehicular applications. So that components do not lose their function and operational performance, vibration isolation design, involving the selection of optimum isolator properties and isolator positioning, is a critical task. Given the growing need for vibration isolation system design, this paper presents two software tools capable of performing modal analysis, response analysis for both random and harmonic excitations, static deflection analysis, and Monte Carlo simulations, in addition to parameter and location optimization for different types of isolation problem scenarios. A review of the literature shows no study that develops a software-based tool capable of implementing all of these analysis, simulation and optimization studies in one platform simultaneously. In this paper, the theoretical system model is generated for a 6-DOF rigid body. The vibration isolation system of any mechanical structure can then be optimized using a hybrid method involving both global-search and gradient-based methods. After defining the optimization design variables, different types of optimization scenarios are listed in detail. Recognizing the need for a user-friendly vibration isolation problem solver, two graphical user interfaces (GUIs) are prepared and verified using a commercial finite element analysis program, Ansys Workbench 14.0. Using the analysis and optimization capabilities of these GUIs, a real application used in an air platform is also presented as a case study at the end of the paper.
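
The hybrid optimization strategy mentioned above, a global search followed by gradient-based refinement, can be sketched on a deliberately reduced problem. The code below tunes the stiffness and damping of a single-DOF isolator to minimize peak force transmissibility over a frequency band using SciPy's differential_evolution and L-BFGS-B; the mass, bounds and frequency band are illustrative placeholders, not the 6-DOF model or GUIs of the paper.

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

M = 50.0                                            # isolated mass (kg), illustrative
freqs = np.linspace(5.0, 200.0, 400) * 2 * np.pi    # excitation band (rad/s)

def peak_transmissibility(x):
    """Worst-case force transmissibility of a 1-DOF isolator with stiffness k and damping c."""
    k, c = x
    wn, zeta = np.sqrt(k / M), c / (2 * np.sqrt(k * M))
    r = freqs / wn
    t = np.sqrt((1 + (2 * zeta * r) ** 2) / ((1 - r ** 2) ** 2 + (2 * zeta * r) ** 2))
    return t.max()

bounds = [(1e4, 1e7), (10.0, 5e3)]                   # stiffness (N/m) and damping (N s/m) ranges

# hybrid strategy: global search first, then gradient-based refinement from its result
coarse = differential_evolution(peak_transmissibility, bounds, seed=0, tol=1e-6)
fine = minimize(peak_transmissibility, coarse.x, bounds=bounds, method="L-BFGS-B")
print(fine.x, fine.fun)
```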

Keywords: hybrid optimization, Monte Carlo simulation, multi-degree-of-freedom system, parameter optimization, location optimization, passive vibration isolation analysis

Procedia PDF Downloads 556
1501 External Vacuum Dressing: Optimising Non-Operative Management of Flail Sternum Post CPR

Authors: Nicholas Bayfield, Mark Newman

Abstract:

Case Presentation: A 48-year-old male was brought in by ambulance after an out-of-hospital cardiac arrest, with 20 minutes of good-quality cardiopulmonary resuscitation in the community. Return of spontaneous circulation was achieved with defibrillation, revealing an inferior ST-elevation myocardial infarction. He was revascularized emergently in the cath lab and stabilised. Following the procedure, he was noted to have paradoxical respiratory movements of the sternum and high oxygen requirements. CT imaging demonstrated a flail chest with bilateral anterior rib 1-7 fractures, as well as a large left-sided extra-pleural haematoma and a small haemopneumothorax, secondary to CPR. The patient’s ventilation was stabilised with oxygen via a high-flow humidifier, and pain relief was provided. The anatomy of his rib fractures was not easily amenable to operative fixation, and he was considered a high-risk operative candidate due to his recent arrest. He was therefore managed non-operatively, with an external vacuum dressing applied to the anterior chest wall to minimise respiratory compromise and pain from motion around the rib fracture sites. Non-operative management was successful, and when the patient was reviewed one month later the paradoxical sternal movement had abated. Discussion: External vacuum dressing has been trialled for non-operative management of rib fractures with varying success. It provides an external brace that minimises fracture-site movement during respiration and coughing, thus minimising pain. This modality should be considered a low-cost, high-reward adjunct to non-operative management of bony thoracic trauma.

Keywords: thoracic surgery, thoracic trauma, rib fractures, negative pressure dressing

Procedia PDF Downloads 151
1500 Nectariferous Plant Genetic Resources for Apicultural Entrepreneurship in Nigeria: Prerequisite for Conservation, Sustainable Management and Policy

Authors: C. V. Nnamani, O. L. Adedeji

Abstract:

The contemporary global economic meltdown has had a devastating effect on the Nigerian economy, and the frantic search for alternative sources of national revenue aside from oil and gas has become imperative for the economic emancipation of Nigerians. Apicultural entrepreneurship could provide a source of livelihood if basic knowledge of the plant genetic resources needed by bees is made available. A palynological evaluation of the palynotaxa which honey bees forage for pollen and nectar was carried out using the standard acetolysis method. The results showed that the honey samples were highly diversified and rich in honey plants. A total of 9544.3 honey pollen, representing 39 honey plants belonging to 21 plant families and distributed within 38 genera, were identified, excluding 238 unidentified pollen grains. The data from the analysis equally revealed that Elaeis guineensis Jacq, Anacardium occidentale L, Diospyros mespiliformis Hochist xe ADC, Alchornea cordifolia Muell, Arg, Daniella oliveri (Rolfe) Hutch & Dalz, Irvingia wombolu Okafor ex Baill, Treculia africana Decne, Nauclea latifolia Smith and Crossopteryx febrifuga Afzil ex Benth were the predominant honey plants. This provides a guide to the optimal utilization of floral resources by honeybees in these regions and shows the opportunity and amazing potential of these palynotaxa for apiculture entrepreneurship. Most of these plants are rare, threatened or endangered, which calls for urgent conservation techniques and steps by all players. Critical awareness creation is needed to ensure farmers' knowledge of these palynotaxa, a proper understanding, and the attendant boost in economic empowerment.

Keywords: palynotaxa, acetolysis, enterprise, livelihood, Nigeria

Procedia PDF Downloads 286
1499 Decision Making in Medicine and Treatment Strategies

Authors: Kamran Yazdanbakhsh, Somayeh Mahmoudi

Abstract:

Three reasons justify the use of decision theory in medicine: 1. The increase in medical knowledge and its complexity make it difficult to process treatment information effectively without resorting to sophisticated analytical methods, especially when it comes to detecting errors and identifying treatment opportunities in large databases. 2. There is wide geographic variability in medical practice. In a context where medical costs are borne, at least in part, by the patient, this variability raises doubts about the relevance of the choices made by physicians. These differences are generally attributed to differences in the estimated probabilities of success of the treatments involved and to differing assessments of the outcomes of success or failure. Without explicit decision criteria, it is difficult to identify precisely the sources of these variations in treatment. 3. Beyond the principle of informed consent, patients need to be involved in decision-making; for this, the decision process should be explained and broken down. A decision problem is to select the best option among a set of choices. The question is what is meant by "best option", or what criteria should guide the choice. The purpose of decision theory is to answer this question. The systematic use of decision models allows us to better understand the differences in medical practices and facilitates the search for consensus. In this respect, there are three types of situations: certain, risky, and uncertain: 1. In certain situations, the consequences of each decision are certain. 2. In risky situations, every decision can have several consequences, and the probability of each of these consequences is known. 3. In uncertain situations, each decision can have several consequences, but their probabilities are not known. Our aim in this article is to show how decision theory can usefully be mobilized to meet the needs of physicians. Decision theory can make decisions more transparent: first, by clarifying the data systematically considered in the problem, and secondly, by stating the few basic principles that should guide the choice. Once the problem is clarified, decision theory provides operational tools to represent the available information and determine patient preferences, and thus to assist the patient and doctor in their choices.
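
For the risky-situation case described above, the standard operational tool is expected-utility maximization: each option's expected utility is the probability-weighted sum of the utilities of its possible consequences. The sketch below works through a toy example; the two treatment options, probabilities and utility values are entirely hypothetical.

```python
# hypothetical decision under risk: two treatment options, each with (probability, utility) outcomes
options = {
    "surgery":    [(0.70, 0.95), (0.20, 0.60), (0.10, 0.00)],
    "medication": [(0.50, 0.85), (0.40, 0.70), (0.10, 0.30)],
}

def expected_utility(outcomes):
    """Expected utility of one option: sum of probability x utility over its outcomes."""
    return sum(p * u for p, u in outcomes)

best = max(options, key=lambda name: expected_utility(options[name]))
for name, outcomes in options.items():
    print(name, round(expected_utility(outcomes), 3))
print("choose:", best)
```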

Keywords: decision making, medicine, treatment strategies, patient

Procedia PDF Downloads 573
1498 Optimization of Artisanal Fishing Waste Fermentation for Volatile Fatty Acids Production

Authors: Luz Stella Cadavid-Rodriguez, Viviana E. Castro-Lopez

Abstract:

Fish waste (FW) has a high content of potentially biodegradable components, so it is amenable to anaerobic digestion. In this line, anaerobic digestion (AD) of FW has been studied for biogas production. Nevertheless, intermediate products such as volatile fatty acids (VFA), generated during the acidogenic stage, have been scarcely investigated, even though they have high potential as a renewable source of carbon. In the literature, there are few studies on the effect of the inoculum-to-substrate (I/S) ratio on acidogenesis. On the other hand, it is well known that pH is a critical factor in the production of VFA. The optimum pH for the production of VFA seems to change depending on the substrate and can vary in a range between 5.25 and 11. Nonetheless, the literature on VFA production from protein-rich waste, such as FW, is scarce. In this context, it is necessary to investigate further the optimal operating conditions of acidogenic fermentation for VFA production from protein-rich waste. Therefore, the aim of this research was to optimize volatile fatty acid production from artisanal fishing waste by studying the effect of pH and the I/S ratio on the acidogenic process. For this research, the inoculum used was a methanogenic sludge (MS) obtained from a UASB reactor treating wastewater from a slaughterhouse plant, and the FW was collected in the port of Tumaco (Colombia) from local artisanal fishers. The acidogenic fermentation experiments were conducted in batch mode in 500 mL glass bottles used as anaerobic reactors, equipped with rubber stoppers fitted with a valve to release biogas. The effective volume used was 300 mL. The experiments were carried out for 15 days at a mesophilic temperature of 37 ± 2 °C and constant agitation of 200 rpm. The effect of three pH levels (5, 7 and 9) coupled with five I/S ratios (0.20, 0.15, 0.10, 0.05 and 0.00) was evaluated, taking VFA production as the response variable. A completely randomized block design was selected for the experiments in a 5x3 factorial arrangement, with two repetitions per treatment. At the beginning of and during the process, the pH in the experimental reactors was adjusted to the corresponding values of 5, 7, and 9 using 1M NaOH or 1M H2SO4, as appropriate. In addition, once the optimum I/S ratio was determined, the process was evaluated at this condition without pH control. The results indicated that pH is the main factor in the production of VFA, with the highest concentration obtained at neutral pH. By reducing the I/S ratio to as low as 0.05, it was possible to maximize VFA production. Thus, the optimum conditions found were natural pH (6.6-7.7) and an I/S ratio of 0.05, with which it was possible to reach a maximum total VFA concentration of 70.3 g Ac/L, whose major components were acetic acid (35%) and butyric acid (32%). The findings showed that acidogenic fermentation of FW is an efficient way of producing VFA and that the operating conditions can be simple and economical.

Keywords: acidogenesis, artisanal fishing waste, inoculum to substrate ratio, volatile fatty acids

Procedia PDF Downloads 114
1497 Alphabet Recognition Using Pixel Probability Distribution

Authors: Vaidehi Murarka, Sneha Mehta, Dishant Upadhyay

Abstract:

Our project topic is “Alphabet Recognition Using Pixel Probability Distribution”. The project uses techniques of image processing and machine learning in computer vision. Alphabet recognition is the mechanical or electronic translation of scanned images of handwritten, typewritten or printed text into machine-encoded text. It is widely used to convert books and documents into electronic files. Alphabet-recognition-based OCR applications are sometimes used in signature recognition, which is employed in banks and other high-security buildings. A popular mobile application reads a visiting card and stores it directly to the contacts. OCR is also used in radar systems for reading the license plates of speeding vehicles, among many other applications. The implementation of our project has been done using Visual Studio and OpenCV (Open Source Computer Vision). Our algorithm is based on neural networks (machine learning). The project was implemented in three modules. (1) Training: this module aims at database generation. The database was generated using two methods: (a) run-time generation, in which the database is generated using the built-in fonts of the OpenCV library; human intervention is not necessary for generating this database. (b) Contour detection, in which a JPEG template containing different fonts of a letter is converted to a weight matrix using specialized OpenCV functions (contour detection and blob detection). The main advantage of this type of database generation is that the algorithm becomes self-learning and the final database requires little memory to store (119 kB precisely). (2) Preprocessing: the input image is pre-processed using image-processing operations such as adaptive thresholding, binarization and dilation and is made ready for segmentation. Segmentation includes the extraction of lines, words, and letters from the processed text image. (3) Testing and prediction: the extracted letters are classified and predicted using the neural network algorithm. The algorithm recognizes a letter based on certain mathematical parameters calculated using the database and the weight matrix of the segmented image.
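
The preprocessing and segmentation steps described above can be sketched with standard OpenCV calls. The snippet below is an illustrative reconstruction, not the project's Visual Studio code: it synthesises a small test image, applies adaptive thresholding and dilation, and extracts letter candidates as contour bounding boxes ready for a classifier.

```python
import cv2
import numpy as np

# synthetic test image standing in for a scanned page (a real input would come from cv2.imread)
img = np.full((100, 300), 255, np.uint8)
cv2.putText(img, "ABC", (20, 70), cv2.FONT_HERSHEY_SIMPLEX, 2, 0, 3)

# (2) preprocessing: adaptive thresholding and dilation, as described in the abstract
binary = cv2.adaptiveThreshold(img, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                               cv2.THRESH_BINARY_INV, 11, 10)
binary = cv2.dilate(binary, np.ones((3, 3), np.uint8), iterations=1)

# segmentation: extract letter candidates as bounding boxes of external contours
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
boxes = sorted(cv2.boundingRect(c) for c in contours)           # left-to-right order
letters = [cv2.resize(binary[y:y + h, x:x + w], (20, 20)) for x, y, w, h in boxes]
print(len(letters), "letter candidates ready for the classifier")
```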

Keywords: contour-detection, neural networks, pre-processing, recognition coefficient, runtime-template generation, segmentation, weight matrix

Procedia PDF Downloads 379
1496 Finding a Redefinition of the Relationship between Rural and Urban Knowledge

Authors: Bianca Maria Rulli, Lenny Valentino Schiaretti

Abstract:

The considerable recent urbanization has increasingly sharpened environmental and social problems all over the world. In recent years, many answers to these alarming trends in modern cities have emerged: a drastic reduction in the rate of growth is becoming essential for future generations, and small-scale economies are considered more adaptive and sustainable. According to the concept of degrowth, cities should consider moving beyond the centralization of urban living by redefining the relationship between rural and urban knowledge; growing food in cities fundamentally contributes to increasing social and ecological resilience. Through an innovative approach, this research combines the benefits of urban agriculture (an increase in biological diversity, shorter and thus more efficient supply chains, food security) and temporary land use. These stimulate collaborative practices that satisfy the changing needs of communities and stakeholders. The concept proposes a coherent strategy for the sustainable development of urban spaces, introducing a productive green network to link specific areas in the city. By shifting the current relationship between architecture and landscape, the former process of ground consumption is deeply revised. Temporary modules can be used as concrete tools to create temporary areas of innovation, transforming vacant or marginal spaces into potential laboratories for the development of the city. The only permanent ground traces, such as foundations, are minimized in order to allow future land re-use. The aim is to describe a new mindset regarding the quality of space in the metropolis which allows, in a completely flexible way, green space and urban farming to be brought back into cities. The wide possibilities of the research are analyzed in two different case studies. The first is a regeneration/connection project designed for social housing; the second concerns the use of temporary modules to respond to the potential needs of social structures. The intention of the productive green network is to link the different vacant spaces to each other as well as to the entire urban fabric. This also generates a potential improvement in the current situation of underprivileged and disadvantaged persons.

Keywords: degrowth, green network, land use, temporary building, urban farming

Procedia PDF Downloads 495
1495 Alternate Methods to Visualize 2016 U.S. Presidential Election Result

Authors: Hong Beom Hur

Abstract:

Politics in America is polarized. The best illustration of this is the 2016 presidential election result map. States with megacities, like California, New York, Illinois, Virginia, and others, are marked blue to signify the color of the Democratic party. States located inland and in the South, like Texas, Florida, Tennessee, Kansas and others, are marked red to signify the color of the Republican party. Such a stark difference between the two colors, red and blue, combined with the geolocation of each state and its borders, conveys one central message: America is divided into two colors, between urban Democrats and rural Republicans. This paper seeks to challenge that visualization by pointing out its limitations and to search for alternative ways to visualize the 2016 election result. One such limitation is that the geolocation of each state and the state borderlines limit the visualization of population density. As a result, the election result map does not convey the fact that Clinton won the popular vote and only accentuates the voting patterns of urban and rural states. The paper examines whether an alternative narrative can be observed by factoring the population into the size of each state and manipulating the state borders according to this normalization. Yet another alternative narrative may be reached by scaling the size of each state by its number of electoral college votes and visualizing that number. Other alternatives are discussed but not implemented in the visualization; such methods include dividing the land of America into about 120 million cubes, each representing a voter, or into 300 million cubes representing the whole population. By exploring these alternative methods of visualizing the politics of the 2016 election map, the public may be able to question whether it is possible to be free from the divide-and-conquer narrative when interpreting the election map and to look at both parties as a story of the United States of America.
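
A minimal sketch of the population-scaled idea above: instead of redrawing state borders, each state is reduced to a marker whose area is proportional to its population, a crude cartogram built with matplotlib. Only a handful of states with approximate coordinates and rounded population figures are shown, purely for illustration.

```python
import matplotlib.pyplot as plt

# tiny illustrative subset: (approx. centroid lon/lat, 2016 winner color, population in millions)
states = {
    "CA": (-119.4, 36.8, "blue", 39.3), "TX": (-99.9, 31.0, "red", 27.9),
    "NY": (-75.5, 43.0, "blue", 19.8),  "FL": (-81.7, 27.8, "red", 20.6),
    "KS": (-98.4, 38.5, "red", 2.9),    "IL": (-89.4, 40.0, "blue", 12.8),
}

fig, ax = plt.subplots()
for name, (lon, lat, color, pop) in states.items():
    # marker area proportional to population instead of land area
    ax.scatter(lon, lat, s=pop * 40, c=color, alpha=0.6, edgecolors="k")
    ax.annotate(name, (lon, lat), ha="center", va="center", fontsize=8)
ax.set_xlabel("longitude")
ax.set_ylabel("latitude")
ax.set_title("State markers scaled by population (illustrative subset)")
plt.show()
```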

Keywords: 2016 U.S. presidential election, data visualization, population scale, geo-political

Procedia PDF Downloads 115
1494 The Value of Routine Terminal Ileal Biopsies for the Investigation of Diarrhea

Authors: Swati Bhasin, Ali Ahmed, Valence Xavier, Ben Liu

Abstract:

Aims: Diarrhea is a problem that frequently leads to clinic referrals from general practitioners to the gastroenterology and surgical teams. To establish a diagnosis, these patients undergo colonoscopy. The current practice at our district general hospital is to take random left and right colonic biopsies. National guidelines issued by the British Society of Gastroenterology advise that all patients presenting with chronic diarrhea should have an ileoscopy as an indicator of colonoscopy completion. Our primary aim was to check whether terminal ileum (TI) biopsy is required to establish a diagnosis of inflammatory bowel disease (IBD). Methods: Data were collected retrospectively from November 2018 to November 2019. The target population was patients who underwent colonoscopy for diarrhea. Demographic data and the endoscopic and histological findings of the TI were assessed and analyzed. Results: 140 patients with a mean age of 57 years (19-84) underwent colonoscopy (M:F, 1:2.3). 92 patients had random colonic biopsies taken and, based on the histological results of these, 15 patients (16%) were diagnosed with IBD. The TI was successfully intubated in 40 patients, of whom 32 also had colonic biopsies taken; 8 patients did not have a colonic biopsy taken. A macroscopic abnormality in the TI was detected in 5 patients, all of whom were biopsied. Based on the histological results of these biopsies, 3 patients (12%) were diagnosed with IBD. All 3 of these patients (100%) also had colonic biopsies taken simultaneously, which showed inflammation. None of the patients had a diagnosis of IBD confirmed on TI intubation alone where colonic biopsies were not done, and none had a diagnosis of IBD confirmed on TI intubation alone where colonic biopsies were negative. Conclusion: TI intubation is a highly skilled, time-consuming procedure with a higher risk of perforation which, as per our study, has little additional diagnostic value in finding IBD for symptoms of diarrhea if colonic biopsies are taken. We propose that diarrhea is a colonic symptom; therefore, colonic biopsies are positive for inflammation if the diarrhea is secondary to IBD. We conclude that all of these IBD cases could be diagnosed simply with colonic biopsies.

Keywords: biopsy, colon, IBD, terminal ileum

Procedia PDF Downloads 115
1493 Cellular Technologies in Urology

Authors: R. Zhankina, U. Zhanbyrbekuly, A. Tamadon, M. Askarov, R. Sherkhanov, D. Akhmetov, D. Saipiyeva, N. Keulimzhaev

Abstract:

Male infertility affects about 15% of couples of reproductive age. Azoospermia, the absence of spermatozoa in the ejaculate, is found in 10-15% of infertile men. Non-obstructive azoospermia is considered a cause of male infertility that is not amenable to drug therapy. Patients with non-obstructive azoospermia are unable to have their "own" children and are left only with the options of adoption or the use of donor sperm. Advances in assisted reproductive technologies, such as intracytoplasmic sperm injection and in vitro fertilization, have significantly changed the management of patients with non-obstructive azoospermia, and advances in biotechnology have increased the options for treating them. Mesenchymal stem cell therapy has been recognized as a new option for infertility treatment. Material and methods of the study: After obtaining informed consent, 5 patients diagnosed with non-obstructive azoospermia were included in an open, non-randomized study. The age of the patients ranged from 24 to 35 years. The examination was carried out before the start of treatment and included biochemical blood tests, hormonal profile levels (luteinizing hormone, follicle-stimulating hormone, testosterone, prolactin, inhibin B), tests for tumor markers, and genetic testing. All studies were carried out in compliance with the requirements of Protocol No. 8 dated 06/09/20, approved by the Local Ethical Commission of NJSC "Astana Medical University". The control examination of patients was carried out after 6 months by repeating the spermogram and hormonal profile (testosterone, luteinizing hormone, follicle-stimulating hormone, prolactin, inhibin B). Before micro-TESE, all 5 patients underwent myeloexfusion in the operating room. During micro-TESE, autologous mesenchymal stem cells, previously cultured in a cell technology laboratory for 2 weeks, were transplanted into the testicular network (rete testis). Results of the study: in all patients, total testosterone levels increased, follicle-stimulating hormone decreased, luteinizing hormone returned to normal, and inhibin B increased. One patient (20%) proceeded to IVF with a positive result; another patient (20%) had spermatogenesis cells. Conclusions: The positive results of this work serve as the basis for the application of a new cellular therapeutic approach to the treatment of non-obstructive azoospermia using mesenchymal stem cells.

Keywords: cell therapy, regenerative medicine, male infertility, mesenchymal stem cells

Procedia PDF Downloads 110
1492 Rethinking the Use of Online Dispute Resolution in Resolving Cross-Border Small E-Disputes in EU

Authors: Sajedeh Salehi, Marco Giacalone

Abstract:

This paper examines the role of existing online dispute resolution (ODR) mechanisms and their effect on ameliorating access to justice (a right protected by Art. 47 of the EU Charter of Fundamental Rights) for consumers in the EU. The major focus of this study is on evaluating ODR as a means of resolving Business-to-Consumer (B2C) cross-border small claims arising from e-commerce transactions. The authors elaborate on the consequences of implementing ODR methods in the context of recent developments in EU regulatory safeguards promoting consumer protection. In this analysis, both non-judicial and judicial ODR redress mechanisms are considered; however, particular attention is given to obligatory and non-obligatory judicial ODR methods. For that purpose, this paper investigates the impact of the EU ODR platform as well as the European Small Claims Procedure (ESCP) Regulation 861/2007 and their role in accelerating access to justice for consumers in B2C e-disputes. Although a considerable volume of research has been carried out on ODR for consumer claims, rather less (or no) attention has been paid to providing a combined doctrinal and empirical evaluation of ODR's potential for resolving cross-border small e-disputes in the EU. Hence, the methodological approach taken in this study is a mixed methodology based on qualitative (interviews) and quantitative (surveys) research methods, drawing mainly on data acquired through the findings of the Small Claims Analysis Net (SCAN) project. This project examines the implementation and efficiency of the ESCP Regulation in providing consumers with a legal avenue for their transnational small claims through the use of ODR. The outcomes of this research may benefit both academia and policymakers at the national and international levels.

Keywords: access to justice, consumers, e-commerce, small e-Disputes

Procedia PDF Downloads 125
1491 The Effect of Probiotic and Vitamin B Complex Supplementation on Interferon-γ and Interleukin-10 Levels in Patients with TB Infection during Intensive Phase Therapy

Authors: Yulistiani Yulistiani, Wenny Nilamsari, Laurin Winarso, Rizkiya Rizkiya, Zamrotul Izzah, Budi Suprapti, Arif Bachtiar

Abstract:

Approximately a million new cases of TB are found each year in Indonesia, making it the country with the second largest TB burden after India. Nevertheless, many patients still fail conventional therapy with oral anti-tuberculosis drugs, so supplementary therapies are urgently needed. Many studies have shown that probiotics have a positive impact in lung disease, diarrhea, and pneumonia, attributed to their capability to balance pro-inflammatory and anti-inflammatory cytokine levels. It has been demonstrated that in active disease the production of IFN-γ is strongly depressed while the IL-10 level increases. This study aimed to investigate the effect of probiotic (multi-strain) and vitamin B complex supplementation on IFN-γ and IL-10 levels in patients with TB infection during intensive phase therapy. An open-label, randomized controlled trial was conducted in TB patients meeting the following criteria: 1) age 18-55 years; 2) receiving oral anti-tuberculosis drugs during intensive phase therapy; 3) no use of probiotics or vitamins B1, B6, or B12 in the 2 weeks before enrollment; 4) willing to participate in this study and having signed an informed consent. Patients with HIV, pregnancy, a history of diabetes mellitus, or use of corticosteroids or other immunosuppressants were excluded. IFN-γ and IL-10 levels were measured before observation and after one month of observation. The assay was performed by ELISA. Seven patients in the treated group and five patients in the control group were included in this study. Between groups, there was no statistical difference in comorbidities, age, or disease duration. The mean IFN-γ level increased after one month of observation in both the treated and control groups (31.47 ± 105.46 pg/ml and 15.09 ± 24.23 pg/ml, respectively; p > 0.05). Although the difference was not statistically significant, the treated group showed a greater increase in IFN-γ level than the control group. IFN-γ plays an important role in the immune response to Mycobacterium tuberculosis by activating macrophages and monocytes that kill Mycobacterium tuberculosis; thus, its level was expected to increase after supplementation with probiotic and vitamin B complex. The mean IL-10 level also increased after one month of observation in the treated and control groups (4.28 ± 12.29 pg/ml and 5.77 ± 6.21 pg/ml, respectively; p > 0.05). By comparison, the increase in IL-10 in the treated group was smaller than in the control group, although the difference was not statistically significant. IL-10 is an anti-inflammatory cytokine; thus, its level after the observation was expected to decrease. In this study, one month of probiotic and vitamin B complex therapy did not demonstrate a decrease in the IL-10 level. It is suggested that observation be prolonged to 2 months because, in the intensive phase, anti-inflammatory cytokine levels are very high, so longer therapy is needed. These findings indicate that probiotic and vitamin B complex supplementation added to oral anti-tuberculosis therapy may have a positive effect on increasing IFN-γ levels and slowing the rise of IL-10.

Keywords: TB Infection, IFN-γ, IL-10, probiotic, vitamin B complex

Procedia PDF Downloads 369
1490 Role of Imaging in Predicting the Receptor Positivity Status in Lung Adenocarcinoma: A Chapter in Radiogenomics

Authors: Sonal Sethi, Mukesh Yadav, Abhimanyu Gupta

Abstract:

The emerging field of radiogenomics has the potential to upgrade the role of imaging in lung cancer management through noninvasive characterization of tumor histology and the genetic microenvironment. Receptor positivity status, such as epidermal growth factor receptor (EGFR) and anaplastic lymphoma kinase (ALK) genotyping, is critical for treatment in lung adenocarcinoma. As conventional identification of receptor positivity is an invasive procedure, we analyzed features on non-invasive computed tomography (CT) that predict receptor positivity in lung adenocarcinoma. We carried out a comprehensive retrospective study of 77 patients with proven lung adenocarcinoma for whom CT images, EGFR and ALK receptor genotyping, and clinical information were available. In total, 22/77 patients were receptor-positive (15 had only an EGFR mutation, 6 had an ALK mutation, and 1 had both EGFR and ALK mutations). Various morphological characteristics and the metastatic distribution on CT were analyzed along with the clinical information. Univariate and multivariable logistic regression analyses were used. On multivariable logistic regression analysis, we found that spiculated margin, lymphangitic spread, air bronchogram, pleural effusion, and distant metastasis had significant predictive value for receptor mutation status. On univariate analysis, air bronchogram and pleural effusion had significant individual predictive value. Conclusions: Receptor-positive lung adenocarcinoma has characteristic imaging features compared with receptor-negative disease. Since CT is routinely used in lung cancer diagnosis, receptor positivity can be predicted by a noninvasive technique, allowing a more aggressive algorithm for the evaluation of distant metastases as well as for treatment.
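As a rough illustration of the analysis described above, the sketch below (Python, statsmodels) fits univariate and multivariable logistic regression models of binary CT features against receptor mutation status. The feature names follow the abstract, but the data are synthetic placeholders and the coefficients are arbitrary assumptions, not the study's results.

```python
# Illustrative sketch only: logistic regression of binary CT features on
# receptor mutation status, mirroring the abstract's analysis on synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 77  # cohort size reported in the abstract

# Hypothetical binary CT features (1 = present, 0 = absent)
features = ["spiculated_margin", "lymphangitic_spread", "air_bronchogram",
            "pleural_effusion", "distant_metastasis"]
X = rng.integers(0, 2, size=(n, len(features))).astype(float)

# Synthetic outcome: receptor-positive (EGFR/ALK) = 1, negative = 0
logit = -1.5 + X @ np.array([0.8, 0.6, 0.9, 0.7, 0.5])  # assumed effect sizes
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Multivariable model: all features entered together
multi = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
print(multi.summary(xname=["intercept"] + features))

# Univariate models: one feature at a time
for i, name in enumerate(features):
    uni = sm.Logit(y, sm.add_constant(X[:, [i]])).fit(disp=False)
    print(f"{name}: OR = {np.exp(uni.params[1]):.2f}, p = {uni.pvalues[1]:.3f}")
```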

Keywords: lung cancer, multidisciplinary cancer care, oncologic imaging, radiobiology

Procedia PDF Downloads 118
1489 Application of a Universal Distortion Correction Method in Stereo-Based Digital Image Correlation Measurement

Authors: Hu Zhenxing, Gao Jianxin

Abstract:

Stereo-based digital image correlation (also referred to as three-dimensional (3D) digital image correlation (DIC)) is a technique for measuring both the 3D shape and the surface deformation of a component, and it has found increasing application in academia and industry. The accuracy of the reconstructed coordinates depends on many factors, such as the configuration of the setup, stereo-matching, and distortion. Most of these factors have been investigated in the literature. For instance, the configuration of a binocular vision system determines the systematic errors, while the stereo-matching errors depend on the speckle quality and the matching algorithm and can only be controlled within a limited range. Distortion, however, is non-linear, particularly in a complex image acquisition system, so distortion correction must be carefully considered. Moreover, the distortion function is difficult to formulate with conventional models in complex image acquisition systems in which microscopes and other complex lenses are involved, and the errors of the distortion correction propagate to the reconstructed 3D coordinates. To address this problem, an accurate mapping method based on 2D B-spline functions is proposed in this study. The mapping functions are used to convert the distorted coordinates onto an ideal plane without distortion. This approach is suitable for any image acquisition distortion model. It is applied as a prior step that converts distorted coordinates to their ideal positions, which enables the camera to conform to the pinhole model. A procedure for applying this approach to stereo-based DIC is presented. Using 3D speckle image generation, numerical simulations were carried out to compare the accuracy of the conventional method and the proposed approach.
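A minimal sketch of the idea, assuming a calibration-target-style set of corresponding distorted and ideal points: two bivariate B-spline surfaces are fitted (here with SciPy's SmoothBivariateSpline) to map distorted coordinates back onto the ideal plane. The synthetic radial distortion model and grid parameters below are placeholder assumptions, not the authors' experimental setup.

```python
# Illustrative sketch only: fit 2D B-spline mapping functions that send
# distorted image coordinates back to an ideal (distortion-free) plane.
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

# Ideal calibration-target points on a regular grid (normalized coordinates)
xi, yi = np.meshgrid(np.linspace(-1, 1, 15), np.linspace(-1, 1, 15))
xi, yi = xi.ravel(), yi.ravel()

# Simulated observed (distorted) coordinates: simple radial distortion model
k1 = 0.05
r2 = xi**2 + yi**2
xd = xi * (1 + k1 * r2)
yd = yi * (1 + k1 * r2)

# Two B-spline mapping functions: (xd, yd) -> x_ideal and (xd, yd) -> y_ideal
# (smoothing factor left at its default; tune s for tighter fits)
map_x = SmoothBivariateSpline(xd, yd, xi, kx=3, ky=3)
map_y = SmoothBivariateSpline(xd, yd, yi, kx=3, ky=3)

# Correct the distorted points back to the ideal plane and check the residual
x_corr = map_x(xd, yd, grid=False)
y_corr = map_y(xd, yd, grid=False)
residual = np.hypot(x_corr - xi, y_corr - yi)
print(f"max residual after correction: {residual.max():.2e}")
```

In a real stereo-DIC pipeline, the fitted maps would be applied to every matched image point of each camera before triangulation, so that the corrected coordinates conform to the pinhole model.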

Keywords: distortion, stereo-based digital image correlation, b-spline, 3D, 2D

Procedia PDF Downloads 490