Search results for: concentrate level
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12923

923 Exploring Identity of Female British Pakistani Student with Shifting and Re-shifting of Cultures

Authors: Haleema Sadia

Abstract:

The study aims to explore the identity construction of a female British-born Pakistani postgraduate student who moved to Pakistan at the age of 12, stayed there for 8 years, and returned to the UK for higher education. The research questions are: 1. What is the academic and socio-cultural background of the participant prior to joining the UoM as a postgraduate student? 2. How does the participant talk about herself, see herself, and act in relation to cultural and social norms and practices? The participant's identity is explored through the positioning theory of Holland et al. (1998), referring to the ways people understand and enact their social positions in the figured world. The research is a case study based on a narrative interview with Shabana, a British-born Pakistani female postgraduate student who has recently joined the University of Manchester. Shabana received her primary education in the UK during the first twelve years of her life. She is the youngest of three sisters, with one brother younger than her. Her father, although not well educated, is a successful entrepreneur maintaining offices in the UK and Pakistan. Her mother is a housewife with no formal education. Shabana's elder sister became involved in a relationship with a Pakistani boy against the cultural norms of arranged marriage. As a result, the three sisters were moved to Pakistan to be brought in line with socio-religious norms. Shabana described her first year in Pakistan as disgusting and said she hated her father for the decision. However, after a year and a move from an orthodox city to the provincial capital Lahore, she developed a liking for Pakistani culture. She gradually developed a new socio-religious identity during her stay, which she described as a turning point in her life. After completing her O levels, Shabana returned to the UK and joined the University of Hull as an undergraduate student. At Hull she remained isolated, missed the religious environment, and relished her memories of Lahore.
She would visit Pakistan almost three times a year. After obtaining her BSc degree from Hull, she went back to Pakistan. Soon after, she decided to improve her academic qualifications. She came to the UK to join her parents and gained admission to the MSc chemistry program at the UoM. Presently, Shabana talks about the dominant role of male family members in decision-making. She feels strongly that she must struggle hard to attain equal status with males in education, employment, earning, authority, and freedom. She sees herself in a position to share authority with her future husband in important family and other matters. Shabana has developed a new identity that mixes both Pakistani and UK cultures. She appreciates the socio-cultural values of the UK while still holding the cultural and religious values of Pakistan in high esteem.

Keywords: postgraduate students, identity construction, cultural shifts, female British Pakistani student

Procedia PDF Downloads 626
922 Implementation Research on the Singapore Physical Activity and Nutrition Program: A Mixed-Method Evaluation

Authors: Elaine Wong

Abstract:

Introduction: The Singapore Physical Activity and Nutrition Study (SPANS) aimed to assess the effects of a community-based intervention on physical activity (PA) and nutrition behaviours, as well as chronic disease risk factors, for Singaporean women aged above 50 years. This article examines the participation, dose, fidelity, reach, satisfaction, and reasons for completion and non-completion of the SPANS. Methods: The SPANS program integrated constructs of Social Cognitive Theory (SCT) and comprised PA activities; nutrition workshops; dietary counselling coupled with motivational interviewing (MI) through phone calls; and text messages promoting healthy behaviours. Printed educational resources and health incentives were provided to participants. Data were collected via a mixed-method design from a sample of 295 intervention participants. Quantitative data were collected using a self-completed survey (n = 209); qualitative data were collected via research assistants' notes, post-program feedback sessions, and exit interviews with program completers (n = 13) and non-completers (n = 12). Results: The majority of participants reported high 'satisfactory to excellent' ratings for the program pace, suitability of interest, and overall program (96.2-99.5%). Similar ratings were achieved for clarity of presentation; presentation skills, approachability, and knowledge; and overall rating of trainers and program ambassadors (98.6-100%). Phone dietary counselling had the highest level of participation (72%) at an attendance rate of 75% or less, followed by nutrition workshops (65%) and PA classes (60%). The attrition rate of the program was 19%; the major reasons for withdrawal were personal commitments, relocation, and health issues. All participants found the program resources colourful, informative, and practical for their own reference.
Reasons for program completion and maintenance were: desired health benefits; social bonding opportunities; and the wish to learn more about PA and nutrition. Conclusions: Process evaluation serves as an appropriate tool to identify recruitment challenges and effective intervention strategies and to ensure program fidelity. Program participants were satisfied with the educational resources, program components, and delivery strategies implemented by the trainers and program ambassadors. The combination of printed materials and intervention components, when guided by SCT and MI, was supportive in encouraging and reinforcing lifestyle behavioural changes. Mixed-method evaluation approaches are integral for pinpointing barriers, motivators, improvements, and effective program components in optimising the health status of Singaporean women.

Keywords: process evaluation, Singapore, older adults, lifestyle changes, program challenges

Procedia PDF Downloads 122
921 Measuring Biobased Content of Building Materials Using Carbon-14 Testing

Authors: Haley Gershon

Abstract:

The transition from fossil fuel-based building materials to eco-friendly, biobased building materials plays a key role in sustainable building. The growing global demand for biobased materials in the building and construction industries heightens the importance of carbon-14 testing, an analytical method used to determine the percentage of a material's ingredients that is biobased. This presentation will focus on the use of carbon-14 analysis within the building materials sector. Carbon-14, also known as radiocarbon, is a weakly radioactive isotope present in all living organisms. Any fossil material older than 50,000 years contains no measurable carbon-14. The radiocarbon method is thus used to determine the amount of carbon-14 present in a given sample. Carbon-14 testing is performed according to ASTM D6866, a standard test method based on radiocarbon dating and developed specifically for biobased content determination of materials in solid, liquid, or gaseous form. Samples are combusted, converted into solid graphite, pressed onto a metal disc, and mounted onto the wheel of an accelerator mass spectrometer (AMS) for analysis. The AMS instrument counts the amount of carbon-14 present. By submitting samples for carbon-14 analysis, manufacturers of building materials can confirm the biobased content of the ingredients used. Biobased testing through carbon-14 analysis reports results as percent biobased content, indicating the percentage of ingredients coming from biomass-sourced carbon versus fossil carbon. The analysis is performed according to standardized methods such as ASTM D6866, ISO 16620, and EN 16640. Products sourced 100% from plants, animals, or microbiological material are therefore 100% biobased, while products sourced only from fossil fuel material are 0% biobased.
Any result between 0% and 100% biobased indicates a mixture of both biomass-derived and fossil fuel-derived sources. Furthermore, biobased testing for building materials allows manufacturers to submit eligible materials for certification and eco-label programs such as the United States Department of Agriculture (USDA) BioPreferred Program. This program includes a voluntary labeling initiative for biobased products, in which companies may apply to receive and display the USDA Certified Biobased Product label, attesting to third-party verification and displaying a product's percentage of biobased content. The USDA program includes a specific category for building materials. To qualify for biobased certification under this product category, the product criteria that must be met include a minimum of 62% biobased content for wall coverings, a minimum of 25% biobased content for lumber, and a minimum of 91% biobased content for non-carpet floor coverings. As a result, consumers can easily identify plant-based products in the marketplace.
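The percent-biobased result follows from a simple ratio: AMS reports the sample's percent modern carbon (pMC), and the biobased fraction is that value relative to the pMC of present-day atmospheric CO2. A minimal Python sketch of this logic follows; note that the 100.0 reference value is an illustrative assumption, since ASTM D6866 prescribes the exact atmospheric correction factor to use for a given year.

```python
def percent_biobased(pmc_sample, pmc_atmosphere=100.0):
    """Approximate percent biobased content from an AMS pMC measurement.

    pmc_sample: percent modern carbon measured for the material.
    pmc_atmosphere: reference pMC of contemporary atmospheric CO2
        (assumed 100.0 here; the standard specifies the actual factor).

    Fully fossil material has ~0 pMC and reads 0% biobased; fully
    contemporary biomass reads 100% biobased; blends fall in between.
    """
    return max(0.0, min(100.0, pmc_sample / pmc_atmosphere * 100.0))
```

For example, a sample measuring 50 pMC against a 100 pMC reference would report as roughly 50% biobased, i.e., an even blend of biomass-derived and fossil-derived carbon.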

Keywords: carbon-14 testing, biobased, biobased content, radiocarbon dating, accelerator mass spectrometry, AMS, materials

Procedia PDF Downloads 158
920 Benjaminian Translatability and Elias Canetti's Life Component: The Other German Speaking Modernity

Authors: Noury Bakrim

Abstract:

Translatability is one of Walter Benjamin's most influential notions; it represents the philosophy of language and history of what we have coined 'the other German-speaking modernity', which could be shaped as a parallel thought form to the Marxian-Hegelian philosophy of history represented by the Frankfurt School. On the other hand, we should consider the influence of the plural German-speaking identity and the Nietzschean and Goethean heritage, the latter being focused on a positive will to power: the humanised human being. With the Benjaminian notion of translatability (Übersetzbarkeit) in perspective, defined as a permanent internal hermeneutical possibility as well as the phenomenological potential of a translation relation, we are in fact touching the very double limit of both historical and linguistic reason. By life component, we mean the changing conditions of genetic and neurolinguistic post-partum functions, to be grasped as an individuation beyond the historical determinism and teleology of an event. It is, so to speak, the retrospective/introspective Canettian auto-fiction, the Benjaminian crystallization of the language experience in the now-time of writing/transmission. Furthermore, it raises various questions when it comes to translatability. They are basically related to separate psycholinguistic poles, the fatherly Ladino Spanish and the motherly Vienna German, but relate more particularly to the permanent ontological quest of a world loss/belonging. Another level of this quest would be the status of Veza Canetti-Taubner Calderón, German-speaking author, Canetti's 'literary wife', the writer's love, his inverted logos, protective and yet controversial 'official private life partner', and the permanence of the Jewish experience in the exiled German language.
It sheds light on the traumatic relation of an inadequate/possible language facing the reconstruction of an oral life, the unconscious split of the signifier and, above all, the frustrating status of writing in Canetti's work: using a suffering/suffered written German to save the remembered acquisition of his tongue/mother tongue by saving the vanishing spoken multilingual experience. Canetti's only novel, 'Die Blendung', designates that fictional referential dynamics focusing on the Nazi worldless horizon: the figure of Kien is an onomastic signifier, the anti-Canetti figure, the misunderstood legacy of Kant, the system without thought. Our postulate would be the double translatability of his auto-fiction inventing the bios oral signifier, based on the new praxemes created by Canetti's German as observed in the English and French translations of his memory corpus. We aim at conceptualizing life component and translatability as two major features of a German-speaking modernity.

Keywords: translatability, language biography, presentification, bioeme, life order

Procedia PDF Downloads 426
919 Hybrid Solutions in Physicochemical Processes for the Removal of Turbidity in Andean Reservoirs

Authors: María Cárdenas Gaudry, Gonzalo Ramces Fano Miranda

Abstract:

Sediment removal is very important in the purification of water, not only for reasons of visual perception but also because of its association with odor and taste problems. The Cuchoquesera reservoir, located in the Andean region of Ayacucho (Peru) at an altitude of 3,740 meters above sea level, visibly contains suspended particles and organic impurities, indicating water of dubious quality that is unsuitable for direct human consumption. In order to quantify the degree of impurity, water quality monitoring was carried out from February to August 2018, with four sampling stations established in the reservoir. The parameters measured were electrical conductivity, total dissolved solids, pH, color, turbidity, and sludge volume. The studied parameters exceed the permissible limits except for electrical conductivity (190 μS/cm) and total dissolved solids (255 mg/L). In this investigation, the best combination and optimal doses of reagents were determined to allow the removal of sediments from the waters of the Cuchoquesera reservoir through the physicochemical process of coagulation-flocculation. In order to improve this process during the rainy season, six combinations of reagents were evaluated, made up of three coagulants (ferric chloride, ferrous sulfate, and aluminum sulfate) and two natural flocculants: prickly pear powder (Opuntia ficus-indica) and tara gum (Caesalpinia spinoza). For each combination of reagents, jar tests were developed following the central composite experimental design (CCED), where the design factors were the doses of coagulant and flocculant and the initial turbidity.
The results of the jar tests were fitted to mathematical models, showing that treating water from the Cuchoquesera reservoir with a turbidity of 150 NTU and a color of 137 U Pt-Co requires 27.9 mg/L of the coagulant aluminum sulfate with 3 mg/L of the natural tara gum flocculant to produce purified water with a turbidity of 1.7 NTU and an apparent color of 3.2 U Pt-Co. The estimated cost of the doses of coagulant and flocculant found was 0.22 USD/m³. This shows how 'grey-green' technologies can be combined in nature-based solutions for water treatment, in this case to achieve potability, making treatment more sustainable, especially economically, when green technology is available at the application site of the nature-based hybrid solution. This research demonstrates the compatibility of natural coagulants/flocculants with other treatment technologies in an integrated/hybrid treatment process, such as the possibility of hybridizing natural coagulants with other types of coagulants.
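The reported figures rest on two pieces of unit arithmetic: turbidity removal efficiency from the initial and final readings, and reagent cost per cubic meter from the doses (1 mg/L equals 1 g/m³). A short sketch, where the unit prices in the usage line are hypothetical placeholders since the abstract does not state them:

```python
def removal_efficiency_pct(initial_turbidity, final_turbidity):
    """Percentage of turbidity removed in a jar test."""
    return 100.0 * (initial_turbidity - final_turbidity) / initial_turbidity


def reagent_cost_usd_per_m3(doses_mg_per_l, prices_usd_per_kg):
    """Reagent cost per m3 of treated water.

    1 mg/L == 1 g/m3, so each dose in mg/L is divided by 1000 to get
    kg/m3 before multiplying by the price in USD/kg.
    """
    return sum(dose / 1000.0 * price
               for dose, price in zip(doses_mg_per_l, prices_usd_per_kg))
```

For the study's optimum (150 NTU in, 1.7 NTU out), `removal_efficiency_pct(150.0, 1.7)` gives about 98.9% removal; the cost helper would reproduce the reported 0.22 USD/m³ only with the actual local reagent prices, which are not given here.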

Keywords: prickly pear powder, tara gum, nature-based solutions, aluminum sulfate, jar test, turbidity, coagulation, flocculation

Procedia PDF Downloads 108
918 Evaluation of Batch Splitting in the Context of Load Scattering

Authors: S. Wesebaum, S. Willeke

Abstract:

Production companies face an increasingly turbulent business environment, which demands very high production volume and delivery date flexibility. If decoupling by storage stages is not possible (e.g. at a contract manufacturer) or is undesirable from a logistical point of view, load scattering affects the production processes. 'Load' characterizes the timing and quantity incidence of production orders (e.g. in hours of work content) on workstations in production, which results in specific capacity requirements. Insufficient coordination between load (capacity demand) and capacity supply results in heavy load scattering, which can be described by deviations and uncertainties in the input behavior of a capacity unit. In order to respond to fluctuating loads, companies try to implement consistent and realizable input behavior using the available capacity supply. For example, a uniformly high level of equipment capacity utilization keeps production costs down. In contrast, strong load scattering at workstations leads to performance loss or disproportionately fluctuating WIP, whereby the logistics objectives are affected negatively. Options for reducing load scattering include shifting the start and end dates of orders, batch splitting, outsourcing of operations, or shifting operations to other workstations. This leads to an adjustment of load to capacity supply, and thus to a reduction of load scattering. If the adaptation of load to capacity cannot be achieved completely, flexible capacity may have to be used to ensure that the performance of a workstation does not decrease for a given load. Whereas the use of flexible capacities normally raises costs, an adjustment of load to capacity supply reduces load scattering and, in consequence, costs. The literature mostly offers qualitative statements for describing load scattering; quantitative evaluation methods that describe load mathematically are rare.
In this article, the authors discuss existing approaches for calculating load scattering and their various disadvantages, such as the lack of an opportunity for normalization. These approaches form the basis for the development of our mathematical quantification approach for describing load scattering, which compensates for the disadvantages of the current quantification approaches. After presenting our mathematical quantification approach, the method of batch splitting is described. Batch splitting allows the adaptation of load to capacity in order to reduce load scattering. After describing the method, it is explicitly analyzed in the context of the logistic curve theory by Nyhuis, using the stretch factor α1, in order to evaluate the impact of batch splitting on load scattering and on the logistic curves. The conclusion of this article shows how the methods and approaches presented can help companies in a turbulent environment to quantify the occurring work load scattering accurately and to apply an efficient method for adjusting work load to capacity supply. In this way, the achievement of the logistical objectives is improved without causing additional costs.
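To illustrate what a normalizable, mathematical description of load scattering looks like (the abstract does not state the authors' actual formula, so this is only one simple candidate, not their measure), the coefficient of variation of the period-wise load input at a workstation is dimensionless and therefore comparable across stations:

```python
from statistics import mean, pstdev


def load_scatter_cv(period_loads):
    """Coefficient of variation of the work content (e.g. hours)
    arriving at a workstation per period.

    A perfectly uniform input gives 0; larger values indicate stronger
    scattering. Being dimensionless, the measure can be compared
    between workstations with very different absolute load levels.
    """
    m = mean(period_loads)
    return pstdev(period_loads) / m if m else 0.0
```

For instance, a station receiving 10 hours of work content in every period scores 0, while one alternating between 5 and 15 hours scores 0.5, even though both receive the same total load.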

Keywords: batch splitting, production logistics, production planning and control, quantification, load scattering

Procedia PDF Downloads 399
917 Furniko Flour: An Emblematic Traditional Food of Greek Pontic Cuisine

Authors: A. Keramaris, T. Sawidis, E. Kasapidou, P. Mitlianga

Abstract:

Although the gastronomy of the Greeks of Pontus is highly prominent, it has not received the same level of scientific analysis as another local cuisine of Greece, that of Crete. As a result, we focused our research on Greek Pontic cuisine to shed light on its unique recipes, food products, and, ultimately, its features. The Greeks of Pontus, who lived for a long time in the northern part (Black Sea region) of contemporary Turkey and now widely inhabit northern Greece, have one of Greece's most distinguished local cuisines. Although their gastronomy is simple, it features several inspiring delicacies. It has been a century since they immigrated to Greece, yet their gastronomic culture remains a critical component of their collective identity. As a first step toward comprehending Greek Pontic cuisine, we investigated the production of one of its most renowned traditional products, furniko flour. In this project, we targeted residents of Western Macedonia, a province in northern Greece with a large population of descendants of the Greeks of Pontus who are primarily engaged in agricultural activities. In this quest, we approached a descendant of the Greeks of Pontus who is involved in the production of furniko flour and who consented to show us the entire process of its production as we participated in it. Furniko flour is made from non-hybrid heirloom corn. It is harvested by hand when the moisture content of the seeds is low enough to make them suitable for roasting. Manual harvesting entails removing the cob from the plant and detaching the husks. The harvested cobs are then roasted for 24 hours in a traditional wood oven. The roasted cobs are then collected and stored in sacks. The next step is to extract the seeds, which is accomplished by rubbing the cobs. The seeds should ideally be ground in a traditional stone hand mill. The result is an aromatic, dark golden furniko flour, which is used to cook havitz.
Alongside the preparation of the furniko flour, we also recorded the cooking process of havitz (a porridge-like cornflour dish), a savory delicacy that is simple to prepare and one of the most delightful dishes in Greek Pontic cuisine. According to the research participant, havitz is a highly nutritious dish due to the ingredients of furniko flour. In addition, he argues that preparing havitz is a great way to bring families together, share stories, and revisit fond memories. In conclusion, this study illustrates the traditional preparation of furniko flour and its use in various traditional recipes as an initial effort to highlight the elements of Pontic Greek cuisine. A continuation of the current study could be an analysis of the chemical components of furniko flour to evaluate its nutritional content.

Keywords: furniko flour, Greek Pontic cuisine, havitz, traditional foods

Procedia PDF Downloads 136
916 Residual Plastic Deformation Capacity in Reinforced Concrete Beams Subjected to Drop Weight Impact Test

Authors: Morgan Johansson, Joosef Leppanen, Mathias Flansbjer, Fabio Lozano, Josef Makdesi

Abstract:

Concrete is commonly used for protective structures, and how impact loading affects different types of concrete structures is an important issue. Often, the knowledge gained from static loading is also used in the design of impulse-loaded structures. A large plastic deformation capacity is essential to obtain large energy absorption in an impulse-loaded structure. However, the structural response of an impact-loaded concrete beam may be very different compared to a statically loaded beam. Consequently, the plastic deformation capacity and failure modes of the concrete structure can be different when subjected to dynamic loads, and hence it is not certain that observations obtained from static loading are also valid for dynamic loading. The aim of this paper is to investigate the residual plastic deformation capacity in reinforced concrete beams subjected to drop weight impact tests. A test series consisting of 18 simply supported beams (0.1 x 0.1 x 1.18 m, ρs = 0.7%) with a span length of 1.0 m, subjected to a point load at the beam mid-point, was carried out. Twelve beams (2 x 6) were first subjected to drop weight impact tests and thereafter statically tested until failure. The drop weight had a mass of 10 kg and was dropped from 2.5 m or 5.0 m. During the impact tests, a high-speed camera was used at 5,000 fps; for the static tests, a camera was used at 0.5 fps. Digital image correlation (DIC) analyses were conducted, and from these the velocities of the beam and the drop weight, as well as the deformations and crack propagation of the beam, were effectively measured. Additionally, for the static tests, the applied load and midspan deformation were measured. The load-deformation relations for the beams subjected to an impact load were compared with those of 6 reference beams subjected to static loading only. The crack patterns obtained were compared using DIC, and it was concluded that the resulting crack formation depended strongly on the test method used.
For the static tests, only bending cracks occurred. For the impact-loaded beams, though, distinctive diagonal shear cracks also formed below the zone of impact, and narrower shear cracks were observed in the region halfway to the support. Furthermore, due to wave propagation effects, bending cracks developed in the upper part of the beam during initial loading. The results showed that the plastic deformation capacity increased for beams subjected to drop weight impact tests from the high drop height of 5.0 m. For beams subjected to an impact from the low drop height of 2.5 m, though, the plastic deformation capacity was of the same order of magnitude as for the statically loaded reference beams. The beams tested were designed to fail in bending when subjected to a static load. However, among the impact-tested beams, one beam exhibited a shear failure at a significantly reduced load level when tested statically, indicating that there may be a risk of reduced residual load capacity for impact-loaded structures.
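The velocity measurement described above reduces to differentiating a DIC displacement trace with respect to the camera's frame interval. A minimal sketch of that step, using a central difference (the function name and inputs are illustrative, not taken from the authors' processing chain):

```python
def velocities_from_displacements(displacements_m, fps=5000.0):
    """Central-difference velocity estimate (m/s) from a DIC
    displacement trace sampled at the high-speed camera frame rate.

    displacements_m: per-frame displacement of a tracked point, in m.
    fps: frames per second (5,000 fps in the impact tests described).
    Returns one velocity per interior frame.
    """
    dt = 1.0 / fps
    return [(displacements_m[i + 1] - displacements_m[i - 1]) / (2.0 * dt)
            for i in range(1, len(displacements_m) - 1)]
```

A point moving a steady 1 mm per frame at 5,000 fps, for example, comes out at 5 m/s; in practice the DIC trace would be smoothed before differencing to suppress correlation noise.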

Keywords: digital image correlation (DIC), drop weight impact, experiments, plastic deformation capacity, reinforced concrete

Procedia PDF Downloads 147
915 Assessing Prescribed Burn Severity in the Wetlands of the Paraná River - Argentina

Authors: Virginia Venturini, Elisabet Walker, Aylen Carrasco-Millan

Abstract:

Latin America stands at the forefront of climate change impacts, with forecasts projecting accelerated temperature and sea level rises compared to the global average. These changes are set to trigger a cascade of effects, including coastal retreat, intensified droughts in some nations, and heightened flood risks in others. In Argentina, wildfires have historically affected forests, but since 2004, wetland fires have emerged as a pressing concern. By 2021, a high-risk scenario had formed naturally in the wetlands of the Paraná River: very low water levels in the rivers and excessive standing dead plant material (fuel) triggered most of the fires recorded in this vast wetland region during 2020-2021. During 2008, fire events devastated nearly 15% of the Paraná Delta, and by late 2021 new fires had burned more than 300,000 ha of these same wetlands. Therefore, the goal of this work is to explore remote sensing tools to monitor environmental conditions and the severity of prescribed burns in the Paraná River wetlands. Two prescribed burning experiments were carried out in the study area (31°40'05'' S, 60°34'40'' W) during September 2023. The first experiment was carried out on September 13th in a plot of 0.5 ha whose dominant vegetation was Echinochloa sp. and Thalia, while the second trial was done on September 29th in a plot of 0.7 ha next to the first burned parcel, where the dominant vegetation species were Echinochloa sp. and Solanum glaucophyllum. Field campaigns were conducted between September 8th and November 8th to assess the severity of the prescribed burns. Flight surveys were conducted using a DJI® Inspire II drone equipped with a Sentera® NDVI camera. Burn severity was then quantified by analyzing images captured by the Sentera camera along with data from the Sentinel-2 satellite mission.
This involved subtracting the NDVI images obtained before and after the burn experiments. The results from both data sources demonstrate a highly heterogeneous impact of fire within the patch. For the first experiment, mean severity values were about 0.16 from drone NDVI images and 0.18 from Sentinel images. For the second experiment, mean values were approximately 0.17 from the drone and 0.16 from Sentinel images. Thus, most of the pixels showed low fire severity and only a few pixels presented moderate burn severity, based on the wildfire scale. The undisturbed plots maintained consistent mean NDVI values throughout the experiments. Moreover, the severity assessment of each experiment revealed that the vegetation was not completely dry, despite experiencing extreme drought conditions.
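The pre-minus-post NDVI differencing described above can be sketched in a few lines; the class thresholds in the second helper are illustrative assumptions only, not the wildfire severity scale the authors applied:

```python
import numpy as np


def dndvi(ndvi_pre, ndvi_post):
    """Per-pixel burn severity proxy: NDVI before minus NDVI after.

    Higher values mean a larger vegetation loss at that pixel.
    """
    return np.asarray(ndvi_pre, dtype=float) - np.asarray(ndvi_post, dtype=float)


def classify_severity(delta, low=0.1, moderate=0.27):
    """Map dNDVI to coarse classes (thresholds are hypothetical)."""
    return np.select(
        [delta < low, delta < moderate],
        ["unburned/low", "low-moderate"],
        default="moderate+",
    )
```

On a toy pair of pixels where NDVI drops from 0.60 to 0.44 and from 0.50 to 0.48, the mean dNDVI is 0.09, consistent with the kind of low, heterogeneous severity values reported for the plots.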

Keywords: prescribed-burn, severity, NDVI, wetlands

Procedia PDF Downloads 68
914 Epidemiological Data of Schistosoma haematobium Bilharzia in Rural and Urban Localities in the Republic of Congo

Authors: Jean Akiana, Digne Merveille Nganga Bouanga, Nardiouf Sjelin Nsana, Wilfrid Sapromet Ngoubili, Chyvanelle Ndous Akiridzo, Vishnou Reize Ampiri, Henri-Joseph Parra, Florence Fenollar, Didier Raoult, Oleg Mediannikov, Cheikh Sadhibou Sokhna

Abstract:

Schistosoma haematobium schistosomiasis is an endemic disease whose levels of human exposure, incidence, and attributed fatality remain, unfortunately, high worldwide. The erection of hydroelectric infrastructure constitutes a major factor in the emergence of this disease. In the context of the Republic of the Congo, which considers industrialization and modernization two essential pillars of development, building the hydroelectric dam of Liouesso (19 MW) and the feasibility studies of the dams of Chollet (600 MW) in the Sangha, Sounda (1,000 MW) in Kouilou, and Kouembali (150 MW) on the Lefini are necessary to increase the country's energy capacities. Likewise, the urbanization of formerly endemic localities should take into account the persistence of contamination points. However, health impact studies on schistosomiasis epidemiology in general, and urinary bilharzia in particular, have never been carried out in these areas, neither before nor after the erection of those dams. Participants completed an investigative questionnaire and underwent urinalysis both by dipstick and by urine filtrate examined under a microscope. Assessment of the genetic diversity of Schistosoma species populations was considered, as well as PCR analysis to confirm the dipstick and microscopy tests. A total of 405 participants were registered in five localities. The sample was a balanced population in terms of male/female ratio, which was around 1. The prevalence rate was 45% (55/123) in Nkayi and 10.40% (11/106) in Loudima, with one case in Mbomo (Western Cuvette), probably imported, and none in Liouesso and Kabo. The highest oviuria (number of eggs per volume of urine) was 150 S. haematobium eggs/10 ml in Nkayi, apart from the Mbomo case imported from Gabon, which had 160 S. haematobium eggs/10 ml. The lowest oviuria was 2 S. haematobium eggs/10 ml. Prevalence rates are still high in semi-urban areas (Nkayi).
As praziquantel treatments are available and effective, it is important to step up the mass treatment campaigns in high-risk areas already largely initiated by the National Schistosomiasis Control Program.
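The site prevalence figures above are simple ratios of positives to participants examined; a one-line helper makes the arithmetic explicit (55 positives out of 123 examined in Nkayi rounds to the reported 45%):

```python
def prevalence_pct(positives, examined):
    """Site prevalence as a percentage of examined participants."""
    return 100.0 * positives / examined
```

The same helper applied to Loudima's 11/106 gives roughly 10.4%, matching the reported rate.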

Keywords: Bilharzia, Schistosoma haematobium, oviuria, urbanization, Congo

Procedia PDF Downloads 149
913 Pharmacological Mechanisms of an Indolic Compound in Chemoprevention of Colonic ACF Formation in Azoxymethane-Induced Colon Cancer Rat Model and Cell Lines

Authors: Nima Samie, Sekaran Muniandy, Zahurin Mohamed, M. S. Kanthimathi

Abstract:

Although number of indole containing compounds have been reported to have anticancer properties in vitro but only a few of them show potential as anticancer compounds in vivo. The current study was to evaluate the mechanism of cytotoxicity of selected indolic compound in vivo and in vitro. In this context, we determined the potency of the compound in the induction of apoptosis, cell cycle arrest, and cytoskeleton rearrangement. HT-29, WiDr, CCD-18Co, human monocyte/macrophage CRL-9855, and B lymphocyte CCL-156 cell lines were used to determine the IC50 of the compound using the MTT assay. Analysis of apoptosis was carried out using immunofluorescence, acridine orange/ propidium iodide double staining, Annexin-V-FITC assay, evaluation of the translocation of NF-kB, oxygen radical antioxidant capacity, quenching of reactive oxygen species content, measurement of LDH release, caspase-3/-7, -8 and -9 assays and western blotting. The cell cycle arrest was examined using flowcytometry and gene expression was assessed using qPCR array. Results displayed a potent suppressive effect on HT-29 and WiDr after 24 h of treatment with IC50 value of 2.52±0.34 µg/ml and 2.13±0.65 µg/ml respectively. This cytotoxic effect on normal, monocyte/macrophage and B-cells was insignificant. Dipping in the mitochondrial membrane potential and increased release of cytochrome c from the mitochondria indicated induction of the intrinsic apoptosis pathway by the compound. Activation of this pathway was further evidenced by significant activation of caspase-9 and 3/7. The compound was also shown to activate the extrinsic pathways of apoptosis via activation of caspase-8 which is linked to the suppression of NF-kB translocation to the nucleus. Cell cycle arrest in the G1 phase and up-regulation of glutathione reductase, based on excessive ROS production were also observed. 
These findings were further investigated by testing the inhibitory efficiency of the compound on colonic aberrant crypt foci (ACF) in male rats. Rats were divided into five groups: vehicle, cancer control, and positive control groups, and two groups treated with 25 and 50 mg/kg of the compound for 10 weeks. Administration of the compound suppressed total colonic ACF formation by up to 73.4%. The results also showed that treatment with the compound significantly reduced the level of malondialdehyde while increasing superoxide dismutase and catalase activities. Furthermore, the down-regulation of PCNA and Bcl2 and the up-regulation of Bax were confirmed by immunohistochemical staining. The outcome of this study suggests that the indolic compound is a potent anti-cancer agent against colon cancer and can be further evaluated in animal trials.

Keywords: indolic compound, chemoprevention, crypt, azoxymethane, colon cancer

Procedia PDF Downloads 348
912 Towards an Environmental Knowledge System in Water Management

Authors: Mareike Dornhoefer, Madjid Fathi

Abstract:

Water supply and water quality are key problems of mankind at present and, due to the increasing population, in the future. Management disciplines like water, environment, and quality management therefore need to interact closely to establish a high level of water quality and to guarantee water supply in all parts of the world. Groundwater remediation is one aspect of this process. From a knowledge management perspective, it is only possible to solve complex ecological or environmental problems if different factors, the expert knowledge of various stakeholders, and formal regulations regarding water, waste, or chemical management are interconnected in the form of a knowledge base. In general, knowledge management focuses on the processes of gathering and representing existing and new knowledge in a way that allows for the inference or deduction of knowledge, e.g., in a situation where a problem solution or decision support is required. A knowledge base is not a mere data repository but a key element of a knowledge-based system, providing inference mechanisms to deduce further knowledge from existing facts. In consequence, this knowledge provides decision support. The given paper introduces an environmental knowledge system in water management. The proposed environmental knowledge system is part of a research concept called Green Knowledge Management. It applies semantic technologies and concepts such as ontologies or linked open data to interconnect different data and information sources about environmental aspects, in this case water quality, as well as background material enriching an established knowledge base. Examples of the aforementioned ecological or environmental factors threatening water quality are, among others, industrial pollution (e.g., leakage of chemicals), environmental changes (e.g., rise in temperature), or floods, where all kinds of waste are merged and transferred into natural water environments.
Water quality is usually determined by measuring different indicators (e.g., chemical or biological), which are gathered through laboratory testing, continuous monitoring equipment, or other measuring processes. During all of these processes, data are gathered and stored in different databases. The knowledge base then needs to be established by interconnecting the data of these different sources and enriching their semantics. Experts may add their knowledge or experiences of previous incidents or influencing factors. In consequence, querying or inference mechanisms are applied for the deduction of coherence between indicators, predictive developments, or environmental threats. Relevant processes or steps of action may be modeled in the form of a rule-based approach. Overall, the environmental knowledge system supports the interconnection of information and the addition of semantics to create environmental knowledge about the water environment, supply chain, and quality. The proposed concept itself is a holistic approach, which links to associated disciplines like environmental and quality management. Quality indicators and quality management steps need to be considered, e.g., for the process and inference layers of the environmental knowledge system, thus integrating the aforementioned management disciplines in one water management application.
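As a toy illustration of the rule-based approach described above, the following Python sketch stores water-quality measurements as simple subject-predicate-object facts and applies a single rule (a measurement above its regulatory limit implies a water-quality threat). The fact layout, indicator names, and limit values are hypothetical placeholders, not part of the proposed system:

```python
# Minimal sketch of a fact store with one rule-based inference step.
# Indicator names and limits are illustrative placeholders.

class WaterKB:
    def __init__(self):
        self.facts = []  # (site, indicator, value) triples

    def add_measurement(self, site, indicator, value):
        self.facts.append((site, indicator, value))

    def infer_threats(self, limits):
        # Rule: a measurement above its regulatory limit implies a threat.
        threats = set()
        for site, indicator, value in self.facts:
            if indicator in limits and value > limits[indicator]:
                threats.add((site, "threatened_by", indicator))
        return threats

kb = WaterKB()
kb.add_measurement("well_A", "nitrate_mg_per_l", 62.0)
kb.add_measurement("well_A", "temperature_c", 11.5)
kb.add_measurement("well_B", "nitrate_mg_per_l", 18.0)
threats = kb.infer_threats({"nitrate_mg_per_l": 50.0})
```

A full implementation would use semantic-web tooling (RDF triples, SPARQL queries) rather than in-memory tuples, but the rule pattern is the same.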

Keywords: water quality, environmental knowledge system, green knowledge management, semantic technologies, quality management

Procedia PDF Downloads 221
911 Understanding Project Failures in Construction: The Critical Impact of Financial Capacity

Authors: Nnadi Ezekiel Oluwaseun Ejiofor

Abstract:

This research investigates the effects of poor cost estimation, material cost variations, and payment punctuality on the financial health and execution of construction projects in Nigeria. To achieve the objectives of the study, a quantitative research approach was employed, and data was gathered through an online survey of 74 construction industry professionals consisting of quantity surveyors, contractors, and other professionals. The study surveyed input on cost estimation errors, price fluctuations, and payment delays, among other factors. The responses were analyzed using a five-point Likert scale and the Relative Importance Index (RII). The findings demonstrated that errors in cost estimation in the Bill of Quantities (BOQ) have a strongly negative impact on the reputation and image of project participants. The greatest effect was on the likelihood of contractors obtaining future work (mean value = 3.42), followed by the likelihood of quantity surveyors obtaining new commissions (mean value = 3.40). Cost underestimation also exposed participants to serious risks, the most severe being a shortage of funds that could lead to bankruptcy (mean value = 3.78), and caused considerable financial damage, with contractors suffering the worst loss in profit (mean value = 3.88). Every expense carries its own risk and uncertainty, and pressure on the cost of materials and every other expense attributed to the construction and completion of a structure adds risk to a project's performance figures. The greatest weight (mean importance score = 4.92) was attributed to market inflation in building materials, while the second greatest (mean importance score = 4.76) was due to increased transportation charges.
On the other hand, payment delays arising from client-side issues such as poor availability of funds (RII = 0.71) and contractual issues such as disagreements on the valuation of work done (RII = 0.72) were also found to lead to project delays and additional costs. The results affirm the importance of proper cost estimation for the financial health of organizations, the management of project risks, and completion within set time limits. As recommendations, it is proposed to improve costing methods, foster better communication with stakeholders, and manage delays through contractual and financial controls. This study enhances the existing literature on construction project management by suggesting ways to deal with cost inaccuracies and with material availability problems caused by payment delays, which, if addressed, would greatly improve the economic performance of the construction business.
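The Relative Importance Index used in this study is conventionally computed as RII = ΣW / (A × N), where W are the Likert weights assigned by respondents, A is the highest weight (5 on a five-point scale), and N is the number of respondents. A minimal sketch with made-up response data (not the survey's actual responses):

```python
def rii(responses, max_scale=5):
    """Relative Importance Index: sum of Likert weights / (A * N)."""
    return sum(responses) / (max_scale * len(responses))

# Hypothetical ratings by five professionals of one payment-delay factor.
scores = [5, 4, 3, 5, 4]
index = rii(scores)  # 21 / 25 = 0.84
```

An RII near 1.0 marks a factor rated uniformly important; factors are then ranked by their index, as in the results above.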

Keywords: cost estimation, construction project management, material price fluctuations, payment delays, financial impact

Procedia PDF Downloads 8
910 Investigation of the Effects of Visually Disabled and Typical Development Students on Their Multiple Intelligence by Applying Abacus and Right Brain Training

Authors: Sidika Di̇lşad Kaya, Ahmet Seli̇m Kaya, Ibrahi̇m Eri̇k, Havva Yaldiz, Yalçin Kaya

Abstract:

The aim of this study was to reveal the effects of right-brain training on the reading, comprehension, learning, and concentration levels and rapid processing skills of students with low vision and students with typical development, and to explore the effects of right- and left-brain integration on students' academic success and the permanence of learned knowledge. A total of 68 students with a mean age of 10.01±0.12 were included in the study: 58 with typical development, 9 partially visually impaired, and 1 totally visually impaired student. The student with total visual impairment could not participate in the reading speed test. The following data were measured in the participating students before the project: reading speed in 1 minute, reading comprehension questions, the Burdon attention test, and a 50-question math quiz timed with a stopwatch. Participants were trained for 3 weeks, 5 days a week, for a total of two hours a day. In this study, right-brain exercises were carried out using an abacus, and questions prepared with numerical data taken from fairy-tale activities aimed to develop both the mathematical skills and the attention of the students. The study was supported with multiple-choice, 5W (what, where, who, why, when?) and 1H (how?) questions, along with true-false and fill-in-the-blank activities. Memory cards were used to strengthen students' short-term memory, photographic memory exercises were conducted, and the students' visual intelligence was supported. Auditory intelligence was supported by having students mentally calculate, on an imagined abacus, numbers given aurally, while calculating numbers by touching a real abacus enhanced their tactile intelligence. Research findings were analyzed in SPSS, and the Kolmogorov-Smirnov test was used for the normality analysis.
Since the variables did not show a normal distribution, the Wilcoxon test, one of the non-parametric tests, was used to compare the dependent groups. The statistical significance level was accepted as 0.05. The participants' reading speed was 83.54±33.03 in the pre-test and 116.25±38.49 in the post-test; narration was 69.71±25.04 in the pre-test and 97.06±6.70 in the post-test; the Burdon test was 84.46±14.35 in the pre-test and 95.75±5.67 in the post-test; and rapid math processing skills were 90.65±10.93 in the pre-test and 98.18±2.63 in the post-test (p<0.05). The differences between the pre-test and post-test averages of both students with typical development and students with low vision were also significant for all four values (p<0.05). The data obtained from the participants show that the study was effective in terms of the measured parameters, and the findings were statistically significant. Therefore, wider use of the method is recommended.
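The Wilcoxon signed-rank statistic behind these pre/post comparisons can be sketched in pure Python as below; in practice one would use SPSS, as the authors did, or `scipy.stats.wilcoxon`, which also returns the p-value. The example scores are illustrative, not the study's raw data:

```python
def signed_rank_statistic(pre, post):
    """Wilcoxon signed-rank statistic W = min(W+, W-) for paired samples."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]  # drop zero differences
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        # group tied absolute differences and assign them their average rank
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg_rank = (i + j + 2) / 2  # average of the 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return min(w_plus, w_minus)

# Illustrative pre/post reading-speed scores for three students:
w = signed_rank_statistic([83, 90, 100], [116, 97, 98])  # W- = 1, so W = 1
```

A small W relative to its null distribution (or, for larger samples, its normal approximation) indicates a significant pre/post difference.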

Keywords: Abacus, reading speed, multiple intelligences, right brain training, visually impaired

Procedia PDF Downloads 182
909 Monitoring and Evaluation of Web-Services Quality and Medium-Term Impact on E-Government Agencies' Efficiency

Authors: A. F. Huseynov, N. T. Mardanov, J. Y. Nakhchivanski

Abstract:

This practical research is aimed at improving the management quality and efficiency of public administration agencies providing e-services. The monitoring system developed will provide a continuous review of websites' compliance with the selected indicators, their evaluation based on those indicators, and a ranking of services according to the quality criteria. The responsible departments in the government agencies were surveyed; the questionnaire covers issues of management and feedback, the e-services provided, and the application of information systems. By analyzing the main affecting factors and barriers, recommendations will be given that lead to the relevant decisions to strengthen the state agencies' competencies in the management and provision of their services. Component 1: E-services monitoring system. Three separate monitoring activities are proposed to be executed in parallel: (1) Continuous tracing of e-government sites using a built-in web-monitoring program; this program generates several quantitative values mainly related to the technical characteristics and performance of the websites. (2) Expert assessment of e-government sites in accordance with two general criteria. Criterion 1: technical quality of the site. Criterion 2: usability/accessibility (load, see, use). Each high-level criterion is in turn subdivided into several sub-criteria, such as the fonts and the color of the background (is it readable?), W3C coding standards, availability of robots.txt and the site map, the search engine, the feedback/contact mechanisms, and the security mechanisms. (3) An online survey of the users/citizens: a small group of questions embedded in the e-service websites. The questionnaires comprise information concerning navigation, users' experience with the website (whether it was positive or negative), etc.
Automated monitoring of websites on its own cannot capture the whole evaluation process and should therefore be seen as a complement to experts' manual web evaluations. All of the separate results were integrated to provide the complete evaluation picture. Component 2: Assessment of the efficiency of agencies/departments in providing e-government services. The relevant indicators to evaluate the efficiency and effectiveness of e-services were identified; the survey was conducted in all the governmental organizations (ministries, committees, and agencies) that provide electronic services for citizens or businesses; and the quantitative and qualitative measures cover the following sections of activities: e-governance, e-services, feedback from users, and the information systems at the agencies' disposal. Main results: 1. The software program and the set of indicators for website evaluation have been developed, and the results of pilot monitoring have been presented. 2. The (internal) efficiency of the e-government agencies has been evaluated based on the survey results, with practical recommendations related to human potential, the information systems used, and the e-services provided.
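One common way to turn per-site indicator checks like those above into a ranking is a weighted score. The indicator names and weights in this sketch are hypothetical placeholders, not the system's actual criteria:

```python
# Hypothetical indicator weights; the real criteria belong to the system described above.
INDICATOR_WEIGHTS = {
    "robots_txt": 1.0,      # robots.txt present
    "sitemap": 1.0,         # site map available
    "w3c_valid": 2.0,       # passes W3C coding standards
    "feedback_form": 1.5,   # feedback/contact mechanism present
    "search_engine": 1.5,   # on-site search available
}

def site_score(checks):
    """Percentage of the total weight earned by passed checks."""
    total = sum(INDICATOR_WEIGHTS.values())
    earned = sum(w for name, w in INDICATOR_WEIGHTS.items() if checks.get(name))
    return round(100.0 * earned / total, 1)

score = site_score({"robots_txt": True, "sitemap": True,
                    "w3c_valid": False, "feedback_form": True,
                    "search_engine": True})
```

Sites are then ranked by score, and the automated result is combined with the expert and user-survey components.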

Keywords: e-government, web-sites monitoring, survey, internal efficiency

Procedia PDF Downloads 304
908 Polarimetric Study of System Gelatin / Carboxymethylcellulose in the Food Field

Authors: Sihem Bazid, Meriem El Kolli, Aicha Medjahed

Abstract:

Proteins and polysaccharides are the two types of biopolymers most frequently used in the food industry to control the mechanical properties, structural stability, and organoleptic properties of products. The textural and structural properties of blends of these two types of polymers depend on their interaction and their ability to form organized structures. From an industrial point of view, a better understanding of protein/polysaccharide mixtures is an important issue, since they are already heavily involved in processed food. It is in this context that we have chosen to work on a model system composed of a fibrous protein (gelatin) mixed with an anionic polysaccharide (sodium carboxymethylcellulose). Gelatin, one of the most popular biopolymers, is widely used in food, pharmaceutical, cosmetic, and photographic applications because of its unique functional and technological properties. Sodium carboxymethylcellulose (NaCMC) is an anionic linear polysaccharide derived from cellulose. It is an important industrial polymer with a wide range of applications, and its functional properties can be modified by the presence of proteins with which it might interact. Another factor that may govern the interactions in protein-polysaccharide mixtures is the triple helix of gelatin. Its complex synthesis results in an extracellular assembly with several levels of organization: collagen can be in a soluble state or associate into fibrils, which can in turn associate into fibers, and each level corresponds to an organization recognized by the cellular and metabolic system. The formation of a gelatin gel involves the triple-helical folding of denatured collagen chains; this gel has been the subject of numerous studies, and it is now known that its properties depend only on the fraction of triple helices forming the network. Chemical modification of this system is fairly well controlled.
Observing the dynamics of the triple helix may therefore be relevant to understanding the interactions involved in protein-polysaccharide mixtures. Gelatin is central to many industrial processes, and understanding and analyzing the molecular dynamics induced by the triple helix during gelatin transitions can have great economic importance in all fields, especially food. The goal is to understand the possible mechanisms involved depending on the nature of the mixtures obtained. From a fundamental point of view, it is clear that the protective effect of NaCMC on gelatin and the conformational changes of the α-helix are strongly influenced by the nature of the medium. Our goal is to minimize α-helix structural changes as much as possible, in order to keep the gelatin more stable and protect it against the denaturation that occurs during conversion processes in the food industry. In order to study the nature of the interactions and assess the properties of the mixtures, polarimetry was used to monitor the optical parameters and to assess the helicity rate of the gelatin.
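The helicity rate obtained from polarimetry is typically derived from the measured specific optical rotation by linear interpolation between the rotations of the fully coiled and fully helical reference states, χ = ([α]obs − [α]coil) / ([α]helix − [α]coil). A sketch with placeholder rotation values (the actual reference rotations depend on wavelength and temperature and are not given in the abstract):

```python
def helicity_fraction(alpha_obs, alpha_coil, alpha_helix):
    """Fraction of helices from specific optical rotations (linear mixing rule)."""
    return (alpha_obs - alpha_coil) / (alpha_helix - alpha_coil)

# Placeholder reference rotations for the coil and triple-helix states:
chi = helicity_fraction(alpha_obs=-235.0, alpha_coil=-110.0, alpha_helix=-360.0)  # 0.5
```

χ = 0 corresponds to a fully denatured coil and χ = 1 to a fully helical network.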

Keywords: gelatin, sodium carboxymethylcellulose, gelatin-NaCMC interaction, helicity rate, polarimetry

Procedia PDF Downloads 312
907 Modeling the Relation between Discretionary Accrual Earnings Management, International Financial Reporting Standards and Corporate Governance

Authors: Ikechukwu Ndu

Abstract:

This study examines the econometric modeling of the relation between discretionary accrual earnings management, International Financial Reporting Standards (IFRS), and certain corporate governance factors with regard to listed Nigerian non-financial firms. Although discretionary accrual earnings management is a well-known global problem that has an adverse impact on users of financial statements, its relationship with IFRS and corporate governance has been neither adequately researched nor systematically investigated in Nigeria. This dearth of research has made it difficult for academics, practitioners, standard-setting bodies, regulators, and international bodies to achieve a clearer understanding of how discretionary accrual earnings management relates to IFRS and certain corporate governance characteristics. To the author's best knowledge, this is the first study to date that makes research contributions that significantly add to the literature on discretionary accrual earnings management and its relation with corporate governance and IFRS in the Nigerian context. A comprehensive review is undertaken of the literature on discretionary total accrual earnings management, IFRS, and certain corporate governance characteristics, as well as the data, models, methodologies, and different estimators used in the study. Secondary financial statement, IFRS, and corporate governance data are sourced from the Bloomberg database and the published financial statements of Nigerian non-financial firms for the period 2004 to 2016. The methodology uses both the total and working capital accrual bases. The study has a number of interesting preliminary findings. First, there is a negative relationship between the level of discretionary accrual earnings management and the adoption of IFRS.
However, this relationship does not appear to be statistically significant. Second, there is a significant negative relationship between the size of the board of directors and discretionary accrual earnings management. Third, separation of the CEO and chairman roles does not constrain earnings management, possibly reflecting a tendency to preserve relationships, personal connections, and bonded friendships between the CEO, chairman, and executive directors. Fourth, there is a significant negative relationship between discretionary accrual earnings management and the use of a Big Four firm as auditor. Fifth, including shareholders in the audit committee leads to a reduction in discretionary accrual earnings management. Sixth, the debt and return on assets (ROA) variables are significant and positively related to discretionary accrual earnings management. Finally, the company size variable, measured by the log of assets, is surprisingly not found to be statistically significant, suggesting that Nigerian companies of all sizes engage in discretionary accrual management. In conclusion, this study provides key insights that enable a better understanding of the relationship between discretionary accrual earnings management, IFRS, and corporate governance in the Nigerian context. It is expected that the results of this study will be of interest to academics, practitioners, regulators, governments, international bodies, and other parties involved in policy setting and economic development in the areas of financial reporting, securities regulation, accounting harmonization, and corporate governance.
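Discretionary accruals in studies of this kind are usually estimated as the residuals of a cross-sectional (modified) Jones regression: total accruals, scaled by lagged assets, are regressed on an inverse-assets term, the receivables-adjusted revenue change, and gross PPE, and whatever the model cannot explain is treated as discretionary. A minimal numpy sketch with synthetic firm-year numbers, not the study's Bloomberg data:

```python
import numpy as np

def discretionary_accruals(total_accruals, lagged_assets, d_revenue, d_receivables, ppe):
    """Residuals of the modified Jones model; all regressors scaled by lagged assets."""
    y = total_accruals / lagged_assets
    X = np.column_stack([
        1.0 / lagged_assets,                          # 1 / A_{t-1}
        (d_revenue - d_receivables) / lagged_assets,  # (dREV - dREC) / A_{t-1}
        ppe / lagged_assets,                          # PPE / A_{t-1}
    ])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ coef  # residual = discretionary component

# Synthetic firm-year data (illustrative only):
assets = np.array([100.0, 200.0, 400.0, 500.0, 250.0])
d_rev = np.array([10.0, 20.0, 30.0, 40.0, 25.0])
d_rec = np.array([1.0, 2.0, 3.0, 4.0, 2.0])
ppe = np.array([50.0, 80.0, 120.0, 200.0, 90.0])
# Accruals built to fit the model exactly, so residuals should be ~0:
ta = 2.0 + 0.5 * (d_rev - d_rec) + 0.3 * ppe
da = discretionary_accruals(ta, assets, d_rev, d_rec, ppe)
```

The estimated discretionary accruals then serve as the dependent variable in regressions on IFRS adoption and governance characteristics such as board size and auditor type.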

Keywords: discretionary accrual earnings management, earnings manipulation, IFRS, corporate governance

Procedia PDF Downloads 144
906 Prevalence of Positive Serology for Celiac Disease in Children With Autism Spectrum Disorder

Authors: A. Venkatakrishnan, M. Juneja, S. Kapoor

Abstract:

Background: Gastrointestinal dysfunction is an emerging comorbidity in autism and may further strengthen the association between autism and celiac disease. This is supported by increased rates (22-70%) of gastrointestinal symptoms such as diarrhea, constipation, abdominal discomfort/pain, and gastrointestinal inflammation in children with autism. The etiology of autism is still elusive; in addition to genetic factors, environmental factors such as toxin exposure and intrauterine exposure to certain teratogenic drugs have been proposed as possible contributors to the etiology of Autism Spectrum Disorder (ASD). In view of reports of increased gut permeability and high rates of gastrointestinal symptoms in children with ASD, celiac disease has also been proposed as a possible etiological factor. Despite insufficient evidence regarding the benefit of restricted diets in autism, a gluten-free diet (GFD) has been promoted as an alternative treatment for ASD. This study attempts to discern any correlation between ASD and celiac disease. Objective: This cross-sectional study aims to determine the proportion of celiac disease in children with ASD. Methods: The study included 155 participants aged 2-12 years, diagnosed with ASD as per DSM-5, attending the child development center at a tertiary care hospital in Northern India. Those on a gluten-free diet or with other autoimmune conditions were excluded. A detailed proforma was filled in, covering sociodemographic details, history of gastrointestinal symptoms, anthropometry, and systemic examination. Psychological testing was done using the Developmental Profile-3 (DP-3) for developmental quotient, the Childhood Autism Rating Scale-2 (CARS-2) for severity of ASD, the Vineland Adaptive Behavior Scales (VABS) for adaptive behavior, the Child Behavior Checklist (CBCL) for behavioral problems, and the Brief Autism Mealtime Behavior Scales (BAMBI) for feeding problems.
Screening for celiac disease was done by measuring TTG-IgA levels, and total serum IgA levels were measured to exclude IgA deficiency. Those with a positive screen were further planned for HLA typing and endoscopic biopsy. Results: Of the 155 cases included, 5 had low IgA levels and were excluded from the study. The remaining 150 children had TTG levels below the upper limit of normal and normal total serum IgA levels. A history of gastrointestinal symptoms was present in 51 (34%) cases; abdominal pain was the most frequent complaint (16.6%), followed by constipation (12.6%), while diarrhea was seen in 8%. Gastrointestinal symptoms were significantly more common in children with ASD above 5 years of age (p = 0.006) and in those who were verbal (p < 0.001). There was no significant association between socio-demographic factors, anthropometric data, or severity of autism and gastrointestinal symptoms. Conclusion: None of the 150 patients with ASD had raised TTG levels; hence, no association was found between ASD and celiac disease, and there is no justification for routine screening for celiac disease in children with ASD. Further studies are warranted to evaluate the association of non-celiac gluten sensitivity with ASD and any role of a gluten-free diet in such patients.

Keywords: autism, celiac, gastrointestinal, gluten

Procedia PDF Downloads 120
905 Nursing Preceptors' Perspectives of Assessment Competency

Authors: Watin Alkhelaiwi, Iseult Wilson, Marian Traynor, Katherine Rogers

Abstract:

Clinical nursing education allows nursing students to gain essential knowledge from practice experience and develop nursing skills in a variety of clinical environments. Providing opportunities for practice in a clinical environment makes it easier for nursing students to integrate theoretical knowledge and practical skills. Nursing competency is an essential capability required to fulfill nursing responsibilities, and effective mentoring in clinical settings helps nursing students develop the necessary competence while promoting the integration of theory and practice. Preceptors play a considerable role in clinical nursing education, including the supervision of nursing students undergoing a rigorous clinical practicum, and are also involved in the clinical assessment of nursing students' competency. Assessment of nursing students' competence by professional practitioners is essential to establish whether nurses have developed an adequate level of competence to deliver safe nursing care. Competency assessment remains challenging for nursing educators and preceptors, particularly owing to the complexity of the process: consistent assessment methods and tools, and valid and reliable instruments for measuring competence in clinical practice, are lacking. Nurse preceptors must assess students' competencies to prepare them for future professional responsibilities, yet they encounter difficulties owing to the nature of the assessment process, the lack of standardised assessment tools, and a demanding clinical environment. The purpose of the study is to examine nursing preceptors' experiences of assessing nursing interns' competency in Saudi Arabia. The study has three objectives; the first is to examine the preceptors' view of the Saudi assessment tool in relation to preceptorship, assessment, the assessment tool, the nursing curriculum, and the grading system.
The second and third objectives are to examine preceptors' views of 'competency' in nursing and their interpretations of the concept, and to assess the implications of the research in relation to the Saudi Vision 2030. The study uses an exploratory sequential mixed-methods design involving a two-phase project: a qualitative focus group study in phase 1, and a quantitative study, a descriptive cross-sectional design (online survey), in phase 2. The results will inform the preceptors' view of the Saudi assessment tool in specific areas, including preceptorship and how preceptors are prepared to be assessors, as well as assessment and assessment tools, by identifying the appropriateness of the instrument for clinical practice. The results will also reveal the challenges and difficulties that the preceptors face. The focus group interview data will be analysed thematically, and SPSS software will be used for the analysis of the online survey data.

Keywords: clinical assessment tools, clinical competence, competency assessment, mentor, nursing, nurses, preceptor

Procedia PDF Downloads 66
904 Determination of 1-Deoxynojirimycin and Phytochemical Profile from Mulberry Leaves Cultivated in Indonesia

Authors: Yasinta Ratna Esti Wulandari, Vivitri Dewi Prasasty, Adrianus Rio, Cindy Geniola

Abstract:

Mulberry is a plant widely cultivated around the world, mostly for the silk industry. In recent years, studies have shown that mulberry leaves have an anti-diabetic effect, which mostly comes from the compound known as 1-deoxynojirimycin (DNJ). DNJ is a very potent α-glucosidase inhibitor: it decreases the degradation rate of carbohydrates in the digestive tract, leading to slower glucose absorption and significantly reducing the post-prandial glucose level. Mulberry leaves are also known as the best source of DNJ, which has therefore received considerable attention owing to the increasing number of diabetic patients and growing public interest in more natural treatments for diabetes. The DNJ content of mulberry leaves varies depending on the mulberry species, the leaf's age, and the plant's growth environment. Among the mulberry varieties cultivated in Indonesia are Morus alba var. kanva-2, M. alba var. multicaulis, M. bombycis var. lembang, and M. cathayana. The lack of data on the phytochemicals contained in Indonesian mulberry leaves restrains their use in the medicinal field. The aim of this study is to enable full use of mulberry leaves cultivated in Indonesia as a medicinal herb in the local, national, or global community by determining their DNJ and other phytochemical contents. This study used eight leaf samples: the young and mature leaves of Morus alba var. kanva-2, M. alba var. multicaulis, M. bombycis var. lembang, and M. cathayana. The DNJ content was analyzed using reverse-phase high-performance liquid chromatography (HPLC). The stationary phase was a silica C18 column, and the mobile phase was acetonitrile:acetic acid 0.1% (1:1) at an elution rate of 1 mL/min. Prior to HPLC analysis, the samples were derivatized with FMOC to render the DNJ detectable by a VWD detector at 254 nm.
Results showed that the DNJ content in the samples ranged from 0.07 to 2.90 mg DNJ/g leaves, with the highest content found in M. cathayana mature leaves (2.90 ± 0.57 mg DNJ/g leaves). All of the mature leaf samples were also found to contain more DNJ than their respective young leaf samples. The phytochemicals in the leaf samples were examined using qualitative tests, which showed that all eight leaf samples contain alkaloids, phenolics, flavonoids, tannins, and terpenes; the presence of these phytochemicals contributes to the therapeutic effect of mulberry leaves. Pyrolysis-gas chromatography-mass spectrometry (Py-GC-MS) analysis was also performed on the eight samples to determine their phytochemical contents quantitatively. The pyrolysis temperature was set at 400 °C, with a Phase Rtx-5MS 60 × 0.25 mm ID capillary column as the stationary phase and helium gas as the mobile phase. Several of the terpenes found are known to have anticancer and antimicrobial properties. Overall, all four mulberry varieties cultivated in Indonesia contain DNJ and various phytochemicals, such as alkaloids, phenolics, flavonoids, tannins, and terpenes, which are beneficial to health.
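Quantification from an HPLC run of this kind generally relies on an external calibration curve: peak areas of DNJ standards are fitted linearly against concentration, and sample concentrations are read back from the fit. A sketch with invented calibration points, not the study's measurements:

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Invented DNJ standards: concentration (ug/mL) vs. peak area (arbitrary units).
conc = [0.5, 1.0, 2.0, 4.0]
area = [1.1, 2.1, 4.1, 8.1]  # lies exactly on area = 2 * conc + 0.1
slope, intercept = linear_fit(conc, area)
sample_conc = (5.3 - intercept) / slope  # back-calculate an unknown from its peak area
```

The concentration per injection is then converted to mg DNJ per gram of leaf using the extraction volume and sample mass.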

Keywords: Morus, 1-deoxynojirimycin, HPLC, Py-GC-MS

Procedia PDF Downloads 330
903 Enhanced Recoverable Oil in Northern Afghanistan Kashkari Oil Field by Low-Salinity Water Flooding

Authors: Zabihullah Mahdi, Khwaja Naweed Seddiqi

Abstract:

Afghanistan is located in a tectonically complex and dynamic area, surrounded by rocks that originated on the mother continent of Gondwanaland. The northern Afghanistan basin, which runs along the country's northern border, has potential for petroleum generation and accumulation, with the Amu Darya basin holding the largest petroleum potential in the region. Sedimentation occurred in the Amu Darya basin from the Jurassic to the Eocene epochs. The Kashkari oil field is located in northern Afghanistan's Amu Darya basin. The field structure consists of a narrow northeast-southwest (NE-SW) anticline with two structural highs, the northwest limb being mild and the southeast limb steep. The first oil production well in the Kashkari oil field was drilled in 1976, and a total of ten wells were drilled in the area between 1976 and 1979. The original oil in place (OOIP) in the Kashkari oil field, based on the results of surveys and calculations conducted by research institutions, is estimated at around 140 MMbbls. The objective of this study is to increase the recoverable oil reserves of the Kashkari oil field through the low-salinity water flooding (LSWF) enhanced oil recovery (EOR) technique. The LSWF work involved a core flooding laboratory test consisting of four sequential steps with varying salinities: the test commenced with formation water (FW), whose salinity was subsequently reduced to a level of 0.1%. Afterwards, a numerical simulation model of core-scale oil recovery by LSWF was designed with Computer Modelling Group's General Equation Modeler (CMG-GEM) software to evaluate the applicability of the technology at field scale. Next, the Kashkari oil field simulation model was designed, and the LSWF method was applied to it.
To obtain reasonable results, the laboratory settings (temperature, pressure, rock, and oil characteristics) were matched as closely as possible to the conditions of the Kashkari oil field, and several injection and production patterns were investigated. The relative permeability of oil and water in this study was obtained using Corey's equation. In the Kashkari oil field simulation model, three cases were considered to evaluate the effect of LSWF on oil recovery: 1. a base model with no water injection, 2. an FW injection model, and 3. an LSW injection model. Based on the results of the LSWF laboratory experiment and the computer simulation analysis, oil recovery increased rapidly after FW was injected into the core. Subsequently, injecting 1% salinity water yielded a further gradual increase of about 4% in oil recovery. At field scale, applying the LSWF technique produced about 6.4% additional oil. The results of LSWF (salinity 0.1%) on the Kashkari oil field suggest that this technology can be a successful method for developing Kashkari oil production.
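The relative permeability treatment mentioned above can be sketched with Corey-type power-law curves. A minimal sketch follows; the endpoint saturations, endpoint permeabilities and Corey exponents are illustrative placeholders, not values fitted to the Kashkari cores.

```python
# Corey-type relative permeability curves (illustrative sketch).
# All endpoint and exponent values below are assumed placeholders,
# not fitted Kashkari core data.

def corey_krw(sw, swc=0.2, sor=0.25, krw_max=0.4, nw=2.0):
    """Water relative permeability from Corey's equation."""
    swn = (sw - swc) / (1.0 - swc - sor)   # normalized water saturation
    swn = min(max(swn, 0.0), 1.0)          # clamp to [0, 1]
    return krw_max * swn ** nw

def corey_kro(sw, swc=0.2, sor=0.25, kro_max=0.8, no=2.0):
    """Oil relative permeability from Corey's equation."""
    son = (1.0 - sw - sor) / (1.0 - swc - sor)  # normalized oil saturation
    son = min(max(son, 0.0), 1.0)
    return kro_max * son ** no

# At connate water saturation, oil flows at its endpoint and water not at all:
print(corey_krw(0.2))  # 0.0
print(corey_kro(0.2))  # 0.8
```

Curves of this form are what a simulator such as CMG-GEM interpolates between the connate water and residual oil saturations.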

Keywords: low salinity water flooding, immiscible displacement, Kashkari oil field, two-phase flow, numerical reservoir simulation model

Procedia PDF Downloads 42
902 Tailoring Quantum Oscillations of Excitonic Schrodinger’s Cats as Qubits

Authors: Amit Bhunia, Mohit Kumar Singh, Maryam Al Huwayz, Mohamed Henini, Shouvik Datta

Abstract:

We report [https://arxiv.org/abs/2107.13518] experimental detection and control of a Schrodinger's-Cat-like, macroscopically large, quantum-coherent state of a two-component Bose-Einstein condensate of spatially indirect electron-hole pairs, or excitons, using a resonant tunneling diode of III-V semiconductors. This provides access to millions of excitons as qubits, allowing efficient, fault-tolerant quantum computation. In this work, we measure phase-coherent periodic oscillations in photo-generated capacitance as a function of applied voltage bias and light intensity over a macroscopically large area. The periodic presence and absence of splitting of excitonic peaks in the optical spectra measured by photocapacitance point towards tunneling-induced variations in the capacitive coupling between the quantum well and quantum dots. The observation of negative 'quantum capacitance' due to screening of charge carriers by the quantum well indicates Coulomb correlations of interacting excitons in the plane of the sample. We also establish that coherent resonant tunneling in this well-dot heterostructure restricts the available momentum space of the charge carriers within the quantum well. Consequently, the electric polarization vector of the associated indirect excitons collectively orients along the direction of the applied bias, and these excitons undergo Bose-Einstein condensation below ~100 K. The generation of interference beats in the photocapacitance oscillations even with incoherent white light further confirms the presence of stable, long-range spatial correlation among these indirect excitons. We finally demonstrate collective Rabi oscillations of these macroscopically large, 'multipartite', two-level, coupled and uncoupled quantum states of the excitonic condensate as qubits.
Therefore, our study not only brings the physics and technology of Bose-Einstein condensation within the reach of semiconductor chips but also opens up experimental investigations of the fundamentals of quantum physics using similar techniques. The operational temperature of such a two-component excitonic BEC can be raised further with a more densely packed, ordered array of quantum dots and/or by using materials with larger excitonic binding energies. However, the fabrication of single crystals of 0D-2D heterostructures using 2D materials (e.g., transition metal dichalcogenides, oxides, perovskites, etc.) having higher excitonic binding energies is still an open challenge for semiconductor optoelectronics. As of now, these 0D-2D heterostructures can already be scaled up for mass production of miniaturized, portable quantum optoelectronic devices using existing III-V and/or nitride-based semiconductor fabrication technologies.

Keywords: exciton, Bose-Einstein condensation, quantum computation, heterostructures, semiconductor physics, quantum fluids, Schrodinger's Cat

Procedia PDF Downloads 180
901 Influence of Glass Plates Different Boundary Conditions on Human Impact Resistance

Authors: Alberto Sanchidrián, José A. Parra, Jesús Alonso, Julián Pecharromán, Antonia Pacios, Consuelo Huerta

Abstract:

Glass is a commonly used building material, and there is no unique design solution: plates with different numbers of layers and interlayers may be used. In most façades, safety glazing has to be used according to its performance in the pendulum impact test. The European Standard EN 12600 establishes an impact test procedure, for classification from the point of view of human safety, of flat plates of different thicknesses, using a pendulum of two tires and 50 kg mass that impacts the plate from different heights. However, this test does not replicate the actual dimensions and boundary conditions used in building configurations, so the real stress distribution is not determined by this test. The influence of different boundary conditions, such as those employed on construction sites, is not well taken into account when testing the behaviour of safety glazing, and there is no detailed procedure or criterion to determine the glass resistance against human impact. To reproduce the actual boundary conditions on site, when needed, the pendulum test is arranged 'in situ', with no control of load or stiffness and without a standard procedure. The fracture stress of small and large glass plates fits a Weibull distribution with considerable dispersion, so conservative values are adopted for the admissible fracture stress under static loads. In fact, tests performed for human impact give a fracture strength two or three times higher, and often without total fracture of the glass plate. Newer standards, for example DIN 18008-4, allow an admissible fracture stress 2.5 times higher than the one used for static and wind loads. Two working areas are now open: a) to define a standard for the 'in situ' test; b) to prepare a laboratory procedure that allows testing with a more realistic stress distribution.
To work on both research lines, a laboratory setup that allows testing medium-size specimens with different boundary conditions has been developed. A special steel frame allows reproducing the stiffness of the glass support substructure, including a rigid condition used as reference. The dynamic behaviour of the glass plate and its support substructure has been characterized with finite element models updated with modal test results. In addition, a new portable impact machine is being used to provide sufficient force and direction control during the impact test. An impact energy of 100 J is used. To avoid problems with broken glass plates, the tests have been done using an aluminium plate of 1000 mm x 700 mm size and 10 mm thickness supported on four sides; three different substructure stiffness conditions are used. A detailed control of the dynamic stiffness and the behaviour of the plate is done with modal tests. The repeatability of the test and the reproducibility of the results prove that a procedure to control both the stiffness of the plate and the impact level is necessary.
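The Weibull scatter of fracture stress mentioned above can be illustrated with the two-parameter Weibull failure-probability model; the characteristic strength and Weibull modulus below are assumed illustrative values for annealed glass, not data measured in this study.

```python
import math

def weibull_failure_probability(stress, sigma0=45.0, m=7.0):
    """Two-parameter Weibull CDF: probability that a glass specimen
    fails at or below the given stress (MPa). sigma0 is the
    characteristic strength (63.2% failure probability) and m the
    Weibull modulus; both are illustrative placeholders, not values
    fitted to test data."""
    return 1.0 - math.exp(-((stress / sigma0) ** m))

# At the characteristic strength the failure probability is ~63.2%:
print(round(weibull_failure_probability(45.0), 3))  # 0.632
```

A low Weibull modulus m expresses exactly the "considerable dispersion" noted above, which is why conservative admissible stresses are chosen for static design.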

Keywords: glass plates, human impact test, modal test, plate boundary conditions

Procedia PDF Downloads 307
900 Integration of Gravity and Seismic Methods in the Geometric Characterization of a Dune Reservoir: Case of the Zouaraa Basin, NW Tunisia

Authors: Marwa Djebbi, Hakim Gabtni

Abstract:

Gravity is a continuously advancing method that has become a mature technology for geological studies. It is increasingly used to complement and constrain traditional seismic data, and even as the only tool to obtain information on the subsurface. In fact, in some regions the seismic data, if available, are of poor quality and hard to interpret. Such is the case for the current study area. The Nefza zone is part of the Tellian fold-and-thrust belt domain in the northwest of Tunisia. It is essentially made of a pile of allochthonous units resulting from a major Neogene tectonic event. Its tectonic and stratigraphic development has always been a subject of controversy. Considering the geological and hydrogeological importance of this area, a detailed interdisciplinary study has been conducted integrating geology, seismic and gravity techniques. The interpretation of gravity data allowed the delimitation of the dune reservoir and the identification of the regional lineaments contouring the area. It revealed the presence of three gravity lows that correspond to the dunes of Zouara and Ouchtata, separated by a positive gravity axis following the Ain Allega-Aroub Er Roumane trend. The Bouguer gravity map illustrated the compartmentalization of the Zouara dune into two depressions separated by a NW-SE anomaly trend. This configuration was confirmed by the vertical derivative map, which showed the individualization of two depressions with slightly different anomaly values. The horizontal gravity gradient magnitude was computed in order to determine the different geological features present in the studied area. The latter indicated the presence of NE-SW parallel folds following the major Atlasic direction. NW-SE and E-W trends were also identified. The maxima tracing confirmed this direction by the presence of NE-SW faults, mainly the Ghardimaou-Cap Serrat accident.
The poor quality of the available seismic sections and the absence of borehole data in the region, except for a few water wells that have been drilled and that show the heterogeneity of the substratum of the dune, required gravity modeling of this challenging area in order to characterize the geometry of the dune reservoir and determine the different stratigraphic series underneath these deposits. For more detailed and accurate results, the scale of study will be reduced in coming research, and a more precise method will be employed: the 4D microgravity survey. This approach is an extension of the gravity method whose fourth dimension is time. It will allow continuous, repeated monitoring of fluid movement in the subsurface at the microgal (µGal) scale. The gravity effect results from the monthly variation of the dynamic groundwater level, which correlates with rainfall during different periods.
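The horizontal gravity gradient magnitude used above is simply the length of the horizontal gradient vector of the anomaly field. A minimal sketch on a synthetic grid follows; the grid and the toy Gaussian low are illustrative assumptions, not the Zouaraa Bouguer data.

```python
import numpy as np

# Horizontal gravity gradient magnitude (HGM) on a synthetic Bouguer
# anomaly grid. The grid and anomaly below are toy assumptions, not
# the Zouaraa survey data.
x = np.linspace(0, 10_000, 101)   # easting (m)
y = np.linspace(0, 10_000, 101)   # northing (m)
X, Y = np.meshgrid(x, y)

# Toy anomaly: a single gravity low mimicking a dune depression (mGal).
g = -5.0 * np.exp(-((X - 5000) ** 2 + (Y - 5000) ** 2) / (2 * 1500 ** 2))

dgdy, dgdx = np.gradient(g, y, x)       # mGal per metre along each axis
hgm = np.sqrt(dgdx ** 2 + dgdy ** 2)    # HGM = |horizontal gradient|

print(hgm.shape)  # (101, 101)
```

The HGM is near zero over the centre of the low and peaks along its flanks, which is why its maxima trace the edges of bodies and faults.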

Keywords: 3D gravity modeling, dune reservoir, heterogeneous substratum, seismic interpretation

Procedia PDF Downloads 298
899 Large-Scale Production of High-Performance Fiber-Metal-Laminates by Prepreg-Press-Technology

Authors: Christian Lauter, Corin Reuter, Shuang Wu, Thomas Troester

Abstract:

Lightweight construction has become more and more important over the last decades in several applications, e.g., in the automotive and aircraft sectors. This is the result of economic and ecological constraints on the one hand and increasing safety and comfort requirements on the other. In the field of lightweight design, different approaches are used depending on the specific requirements of the technical system. The use of continuous carbon fiber reinforced plastics (CFRP) offers the largest weight-saving potential, sometimes more than 50% compared to conventional metal constructions. However, industrial applications are very limited because of the cost-intensive manufacturing of the fibers and the production technologies. Other disadvantages of pure CFRP structures concern quality control and damage resistance. One approach to meeting these challenges is hybrid materials, in which CFRP and sheet metal are combined on the material level. This makes new and innovative process routes realizable. Hybrid lightweight design results in lower costs due to optimized material utilization and the possibility of integrating the structures into the existing production processes of automobile manufacturers. Recent and current research has pointed out the advantages of two-layered hybrid materials, i.e., the possibility of realizing structures with tailored mechanical properties or of dividing the curing cycle of the epoxy resin into two steps. Current research work at the Chair for Automotive Lightweight Design (LiA) at Paderborn University focuses on production processes for fiber-metal-laminates. The aim of this work is the development and qualification of a large-scale production process for high-performance fiber-metal-laminates (FML) for industrial applications in the automotive or aircraft sector.
Therefore, the prepreg-press technology is used, in which pre-impregnated carbon fibers and sheet metals are formed and cured in a closed, heated mold. The investigations focus, e.g., on the realization of short process chains and cycle times, the reduction of time-consuming manual process steps, and the reduction of material costs. This paper first gives an overview of the principal steps of the production process. Afterwards, experimental results are discussed, concentrating on the influence of different process parameters on the mechanical properties and laminate quality, and on the identification of process limits. Finally, the advantages of this technology compared to conventional FML production processes and other lightweight design approaches are set out.

Keywords: composite material, fiber-metal-laminate, lightweight construction, prepreg-press-technology, large-series production

Procedia PDF Downloads 240
898 The Impact of Glass Additives on the Functional and Microstructural Properties of Sand-Lime Bricks

Authors: Anna Stepien

Abstract:

The paper presents the results of research on modifications of sand-lime bricks, especially using glass additives (glass fiber and glass sand) as well as other additives (e.g., basalt and barite aggregate, lithium silicate and microsilica). The main goal of this paper is to answer the question 'How can glass additives be used in the sand-lime mass to get better bricks?' The article contains information on the modification of sand-lime bricks using glass fiber, glass sand, and microsilica (a different structure of silica). It also presents the results of the conducted tests, which were focused on compressive strength, water absorption, bulk density, and microstructure. Scanning electron microscopy, EDS spectra, X-ray diffractometry and DTA analysis helped to define the microstructural changes of the modified products. The interpretation of the products' structure revealed the existence of diversified phases, i.e., C-S-H and tobermorite. The CaO-SiO2-H2O system is the object of intensive research due to its significance in the chemistry and technology of mineral binding materials. Because the blocks are autoclaved materials, the temperature of the hydrothermal treatment of the products is around 200°C, the pressure 1.6-1.8 MPa and the time up to 8 hours (i.e., 1 h heating + 6 h autoclaving + 1 h cooling). The microstructure of the products consists mostly of hydrated calcium silicates with different levels of structural arrangement. X-ray diffraction indicated that the type of sand used is an important factor in the manufacturing of sand-lime elements. Quartz sand of high hardness is also a substrate that reacts poorly with other possible modifiers, which may cause deterioration of certain physical and mechanical properties. TG and DTA curves show the changes in the weight loss of the sand-lime brick specimens against time, as well as the endo- and exothermic reactions that took place.
The endothermic effect with the maximum at T=573°C is related to the isomorphic transformation of quartz. This effect is not accompanied by a change in specimen weight. The next endothermic effect, with the maximum at T=730-760°C, is related to the decomposition of the calcium carbonates. The bulk density of the brick is 1.73 kg/dm³; the presence of xonotlite in the microstructure and a significant weight loss during the DTA and TG tests (around 0.6% after 70 minutes) were noticed. The silicate elements were assessed on the basis of their compressive properties. An orthogonal compositional plan of type 3^k (with k=2), i.e., a full two-factor experiment, was applied in order to carry out the experiments in both the compressive strength test and the bulk density test. Some modifications (e.g., products with barite and basalt aggregate) improved the compressive strength to around 41.3 MPa, and water absorption due to capillary rise was limited to 12%. The next modification was adding glass fiber to the sand-lime mass, then glass sand. The results show that the compressive strength was higher than in the case of traditional bricks, while the modified bricks were lighter.
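The orthogonal compositional plan of type 3^k with k=2 mentioned above is a full two-factor, three-level factorial design with nine runs. A minimal sketch of generating such a plan follows; the factor names and coded levels are illustrative assumptions, not the study's actual settings.

```python
from itertools import product

# Full 3^k factorial design with k = 2 (nine runs). The factor names
# and coded levels are illustrative placeholders, not the actual
# variables of the sand-lime brick study.
factors = {
    "additive_content":  [-1, 0, 1],   # coded low / centre / high
    "pressing_pressure": [-1, 0, 1],
}

runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))  # 9
```

Each run is one combination of coded factor levels; responses such as compressive strength and bulk density are then measured for every run and fitted against the plan.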

Keywords: bricks, fiber, glass, microstructure

Procedia PDF Downloads 347
897 The Lighthouse Project: Recent Initiatives to Navigate Australian Families Safely Through Parental Separation

Authors: Kathryn McMillan

Abstract:

A recent study of 8,500 adult Australians aged 16 and over revealed that 62% had experienced childhood maltreatment. In response to multiple recommendations by bodies such as the Australian Law Reform Commission, parliamentary reports and stakeholder input, a number of key initiatives have been developed to grapple with the difficulties of a federal-state system and to screen and triage high-risk families navigating their way through the court system. The Lighthouse Project (LHP) is a world-first initiative of the Federal Circuit and Family Court of Australia (FCFCOA) to screen family law litigants for major risk factors, including family violence, child abuse, alcohol or substance abuse and mental ill-health, at the point of filing in all applications that seek parenting orders. It commenced on 7 December 2020 on a pilot basis but has now been expanded to 15 registries across the country. A specialist risk screen, Family DOORS Triage, has been developed, focused on improving the safety and wellbeing of families involved in the family law system through safety planning, service referral and differentiated case management based on risk level, with the Evatt List specifically designed to manage the highest-risk cases. Early signs are that this approach is meeting the needs of families with multiple risks moving through the court system. Before the LHP, there were no data available about the prevalence of risk factors experienced by litigants entering the family courts, and it was often assumed that it was the litigation process that was fuelling family violence and other risks such as suicidality.
Data from the 2022 FCFCOA annual report indicated that in parenting proceedings, 70% of matters alleged a child had been abused or was at risk of abuse, 80% alleged a party had experienced family violence, 74% alleged children had been exposed to family violence, 53% alleged that substance misuse by a party had caused, or risked causing, harm to children, and 58% alleged that mental health issues of a party had caused or placed a child at risk of harm. Those figures reveal the significant overlap between child protection and family violence, both of which are the responsibility of state and territory governments. Since 2020, a further key initiative has been the co-location of child protection and police officials in a number of registries of the FCFCOA. The ability to access, in a time-effective way, details of family violence or child protection orders, weapons licences, criminal convictions or proceedings is key to managing issues across the state and federal divide. It ensures a more cohesive and effective response across the family law, family violence and child protection systems.

Keywords: child protection, family violence, parenting, risk screening, triage

Procedia PDF Downloads 77
896 Effect of 12 Weeks Pedometer-Based Workplace Program on Inflammation and Arterial Stiffness in Young Men with Cardiovascular Risks

Authors: Norsuhana Omar, Amilia Aminuddina Zaiton Zakaria, Raifana Rosa Mohamad Sattar, Kalaivani Chellappan, Mohd Alauddin Mohd Ali, Norizam Salamt, Zanariyah Asmawi, Norliza Saari, Aini Farzana Zulkefli, Nor Anita Megat Mohd. Nordin

Abstract:

Inflammation plays an important role in the pathogenesis of vascular dysfunction leading to arterial stiffness. Pulse wave velocity (PWV) and augmentation index (AI), as tools for the assessment of vascular damage, are widely used and have been shown to predict cardiovascular disease (CVD). C-reactive protein (CRP) is a marker of inflammation. Several studies have noted that regular exercise is associated with reduced arterial stiffness. The lack of exercise among Malaysians and the increasing CVD morbidity and mortality among young men are of concern. In Malaysia, data on workplace exercise interventions are scarce. A programme was designed to enable subjects to increase their level of walking as part of their daily work routine, self-monitored using pedometers. The aim of this study was to evaluate the reduction of inflammation, measured by CRP, and the improvement in arterial stiffness, measured by carotid-femoral PWV (PWVcf) and AI. A total of 70 young men (20-40 years) who were sedentary, achieving fewer than 5,000 steps/day in casual walking, with 2 or more cardiovascular risk factors, were recruited at the Institute of Vocational Skills for Youth (IKBN Hulu Langat). Subjects were randomly assigned to a control group (CG) (n=34; no change in walking) and a pedometer group (PG) (n=36; minimum target: 8,000 steps/day). CRP was measured using an immunological method, while PWVcf and AI were measured using a Vicorder. All parameters were measured at baseline and after 12 weeks. Data were analysed using the Statistical Package for the Social Sciences, Version 22 (SPSS Inc., Chicago, IL, USA). At post-intervention, the CG step counts were similar (4,983 ± 366 vs 5,697 ± 407 steps/day). The PG increased step count from 4,996 ± 805 to 10,128 ± 511 steps/day (P<0.001). The PG showed significant improvement in anthropometric variables and lipids (time and group effect, p<0.001).
For the vascular assessment, the PG showed significant decreases (time and group effect, p<0.001) in PWV (7.21 ± 0.83 to 6.42 ± 0.89 m/s), AI (11.88 ± 6.25 to 8.83 ± 3.7%) and CRP (2.28 ± 3.09 to 1.08 ± 1.37 mg/L). However, no changes were seen in the CG. In conclusion, a pedometer-based walking programme may be an effective strategy for promoting increased daily physical activity, which reduces cardiovascular risk markers and thus improves cardiovascular health in terms of inflammation and arterial stiffness. Community interventions for health maintenance could adopt walking as an exercise and vascular indices as performance-measuring tools.
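Carotid-femoral PWV reduces to the arterial path length divided by the pulse transit time. A minimal sketch under that definition follows; the path length and the derived transit times are illustrative assumptions, chosen only to reproduce the group-mean PWV values reported above, not Vicorder readings from this study.

```python
def pulse_wave_velocity(path_length_m, transit_time_s):
    """Pulse wave velocity = arterial path length / pulse transit time."""
    return path_length_m / transit_time_s

# Illustrative values only (assumed 0.60 m carotid-femoral path; transit
# times back-calculated to match the reported group means):
pwv_pre = pulse_wave_velocity(0.60, 0.60 / 7.21)    # ~7.21 m/s
pwv_post = pulse_wave_velocity(0.60, 0.60 / 6.42)   # ~6.42 m/s

print(round((pwv_pre - pwv_post) / pwv_pre * 100, 1))  # ~11.0 (% reduction)
```

This shows that the reported PWV change corresponds to roughly an 11% reduction in wave speed over the 12-week programme.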

Keywords: arterial stiffness, exercise, inflammation, pedometer

Procedia PDF Downloads 353
895 Notes on Matter: Ibn Arabi, Bernard Silvestris, and Other Ghosts

Authors: Brad Fox

Abstract:

Between something and nothing, a bit of both, neither/nor, a figment of the imagination, the womb of the universe - questions of what matter is, where it exists and what it means continue to surge up from the bottom of our concepts and theories. This paper looks at divergences and convergences, intimations and mistranslations, in a lineage of thought that begins with Plato’s Timaeus, travels through Arabic Spain and Syria, and finally ends up in the language of science. Up to the 13th century, philosophers in Christian France based such inquiries on a questionable and fragmented translation of the Timaeus by Calcidius, with a commentary that conflated the Platonic concept of khora (‘space’ or ‘void’) with Aristotle’s hyle (‘primal matter’, derived from ‘wood’ as a building material). Both terms were translated by Calcidius as silva. For 700 years, this was the only source for philosophers of matter in the Latin-speaking world. Bernard Silvestris, in his Cosmographia, exemplifies the concepts developed before new translations from Arabic began to pour into the Latin world from such centers as the court of Toledo. Unlike their counterparts across the Pyrenees, 13th-century philosophers in Muslim Spain had access to a broad vocabulary for notions of primal matter. The prolific and visionary theologian, philosopher, and poet Muhyiddin Ibn Arabi could draw on the Ikhwan Al-Safa’s 10th-century renderings of Aristotle, which translated the Greek hyle as the everyday Arabic word maddah, still used for building materials today. He also often used the simple transliteration of hyle as hayula, probably taken from Ibn Sina. The Prophet’s son-in-law Ali talked of dust in the air, invisible until it is struck by sunlight. Ibn Arabi adopted this dust - haba - as an expression for an original metaphysical substance, nonexistent but susceptible to manifesting forms.
Ibn Arabi compares the dust to a phoenix, because we have heard about it and can conceive of it, but it has no existence unto itself and can be described only in similes. Elsewhere he refers to it as quwwa wa salahiyya - pure potentiality and readiness. The final portion of the paper will compare Bernard's and Ibn Arabi's notions of matter to the recent ontology developed by theoretical physicist and philosopher Karen Barad. Reading Barad's work alongside that of Niels Bohr, it will argue that there is a rich resonance between Ibn Arabi's paradoxical conceptions of matter and the quantum vacuum fluctuations verified by recent lab experiments. The inseparability of matter and meaning in Barad recalls Ibn Arabi's original response to Ibn Rushd's question: Does revelation offer the same knowledge as rationality? 'Yes and No,' Ibn Arabi said, 'and between the yes and no spirit is divided from matter and heads are separated from bodies.' Ibn Arabi's double affirmation continues to offer insight into our relationship to momentary experience at its most fundamental level.

Keywords: Karen Barad, Muhyiddin Ibn Arabi, primal matter, Bernard Silvestris

Procedia PDF Downloads 427
894 DeepNIC a Method to Transform Each Tabular Variable into an Independant Image Analyzable by Basic CNNs

Authors: Nguyen J. M., Lucas G., Ruan S., Digonnet H., Antonioli D.

Abstract:

Introduction: Deep learning (DL) is a very powerful tool for analyzing image data, but for tabular data it cannot compete with machine learning methods like XGBoost. The research question becomes: can tabular data be transformed into images that can be analyzed by simple CNNs (Convolutional Neural Networks)? Will DL become the universal tool for data classification? All current solutions consist in repositioning the variables in a 2D matrix using their correlation proximity, thereby obtaining an image whose pixels are the variables. We implement a technology, DeepNIC, that offers the possibility of obtaining an image for each variable, which can be analyzed by simple CNNs. Material and method: The 'ROP' (Regression OPtimized) model is a binary and atypical decision tree whose nodes are managed by a new artificial neuron, the Neurop. By positioning an artificial neuron in each node of the decision tree, it is possible to make an adjustment on a theoretically infinite number of variables at each node. From this new decision tree whose nodes are artificial neurons, we created the concept of a 'Random Forest of Perfect Trees' (RFPT), which departs from Breiman's concepts by assembling very large numbers of small trees with no classification errors. From the results of the RFPT, we developed a family of 10 statistical information criteria, the Nguyen Information Criteria (NICs), which evaluate the predictive quality of a variable in three dimensions: performance, complexity and multiplicity of solutions. A NIC is a probability that can be transformed into a grey level. The value of a NIC depends essentially on 2 hyperparameters used in the Neurops. By varying these 2 hyperparameters, we obtain a 2x2 matrix of probabilities for each NIC. We can combine these 10 NICs with the functions AND, OR, and XOR. The total number of combinations is greater than 100,000. In total, we obtain for each variable an image of at least 1166x1167 pixels.
The intensity of the pixels is proportional to the probability of the associated NIC, and the color depends on the associated NIC. This image actually contains considerable information about the ability of the variable to predict Y, depending on the presence or absence of other variables. A basic CNN model was trained for supervised classification. Results: The first results are impressive. Using the public GSE22513 data (an omics dataset of markers of taxane sensitivity in breast cancer), DeepNIC outperformed other statistical methods, including XGBoost. We still need to generalize the comparison on several databases. Conclusion: The ability to transform any tabular variable into an image offers the possibility of merging image and tabular information in the same format. This opens up great perspectives in the analysis of metadata.
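One plausible reading of the NIC-to-pixel mapping above can be sketched with fuzzy-logic interpretations of AND, OR and XOR applied to probabilities. Everything here is an assumption for illustration: the abstract does not specify its exact operators, the NIC values are random placeholders, and the tiling into a full 1166x1167 image is omitted.

```python
import numpy as np

# Hypothetical sketch of turning NIC probabilities into grey levels and
# combining NICs. The fuzzy AND/OR/XOR operators and the random NIC
# values are assumptions; the abstract does not specify the mapping.

rng = np.random.default_rng(0)
nics = rng.random((10, 2, 2))   # 10 NICs, each a 2x2 matrix of probabilities

def fuzzy_and(a, b): return np.minimum(a, b)
def fuzzy_or(a, b):  return np.maximum(a, b)
def fuzzy_xor(a, b): return np.abs(a - b)   # one common fuzzy-XOR choice

# One of the >100,000 possible combinations of the 10 NICs:
combined = fuzzy_xor(fuzzy_and(nics[0], nics[1]), fuzzy_or(nics[2], nics[3]))

# Probability -> 8-bit grey level, as one tile of the per-variable image:
grey = np.uint8(np.round(combined * 255))
print(grey.shape, grey.dtype)  # (2, 2) uint8
```

Stacking many such tiles, one per operator combination, would yield a large per-variable image of the kind a basic CNN can then classify.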

Keywords: tabular data, CNNs, NICs, DeepNICs, random forest of perfect trees, classification

Procedia PDF Downloads 125