Search results for: Standard.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1677

147 Primary School Teachers’ Conceptual and Procedural Knowledge of Rational Number and Its Effects on Pupils’ Achievement in Rational Numbers

Authors: R. M. Kashim

Abstract:

The study investigated primary school teachers’ conceptual and procedural knowledge of rational numbers and its effects on pupils’ achievement in rational numbers. Specifically, it examined primary school teachers’ level of conceptual knowledge of rational numbers, their level of procedural knowledge of rational numbers, and the effects of teachers’ conceptual and procedural knowledge on their pupils’ understanding of rational numbers in primary schools. The study was carried out in the Bauchi metropolis of Bauchi State, Nigeria, using a multi-stage design: the first stage was descriptive, and the second stage involved a pre-test, post-test only quasi-experimental design. Two instruments were used for data collection: a Conceptual and Procedural Knowledge Test (CPKT) and a Rational Number Achievement Test (RAT). The sample comprised three mathematics teachers holding the Nigerian Certificate in Education (NCE) and teaching Primary Six, together with the 210 pupils in their intact classes. The data collected were analyzed using means, standard deviations, analysis of variance, analysis of covariance and t-tests. The findings indicated that pupils taught rational numbers by a teacher with high conceptual and procedural knowledge understood and performed better than pupils taught by a teacher with low conceptual and procedural knowledge of rational numbers. It is therefore recommended that primary school teachers be encouraged to enrich their conceptual knowledge of rational numbers. In addition, teachers’ stronger procedural knowledge of rational numbers should not become an obstacle to understanding: conceptual and procedural knowledge should be balanced so that primary school pupils experience better teaching and learning of rational numbers in contemporary schools.

Keywords: Achievement, conceptual knowledge, procedural knowledge, rational numbers.

146 Polyphenolic Profile and Antioxidant Activities of Nigella Sativa Seed Extracts In Vitro and In Vivo

Authors: Asma Meziti, Hicham Meziti, Kaouthar Boudiaf, Benboubetra Mustapha, Hemama Bouriche

Abstract:

Nigella sativa L. is an aromatic plant belonging to the family Ranunculaceae. It has been used traditionally, especially in the Middle East and India, for the treatment of asthma, cough, bronchitis, headache, rheumatism, fever, influenza and eczema. Several biological activities, including antioxidant activity, have been reported for Nigella sativa seeds. In this context, we estimated the antioxidant activity of various extracts prepared from Nigella sativa seeds: a methanolic extract (ME), a chloroformic extract (CE), a hexanic extract (HE: fixed oil), an ethyl acetate extract (EAE) and a water extract (WE). The Folin-Ciocalteu assay showed that CE and EAE contained high levels of phenolic compounds, 81.31 and 72.43 μg GAE/mg of extract, respectively. Similarly, CE and EAE exhibited the highest DPPH radical scavenging activity, with IC50 values of 106.56 μg/ml and 121.62 μg/ml, respectively. In addition, CE and HE showed the strongest scavenging activity against the superoxide radical generated in the PMS-NADH-NBT system, with respective IC50 values of 361.86 μg/ml and 371.80 μg/ml, comparable to the activity of the standard antioxidant BHT (344.59 μg/ml). The ferrous ion chelating capacity assay showed that WE, EAE and ME were the most active, with 40.57, 39.70 and 22.02 mg EDTA-E/g of extract. The inhibition of linoleic acid/β-carotene coupled oxidation was estimated by the β-carotene bleaching assay, in which CE and EAE showed the highest relative antioxidant activity (69.82% inhibition). The antioxidant activities of the methanolic extract and the fixed oil were confirmed by an in vivo assay in mice: daily oral administration of the methanolic extract (500 and 800 mg/kg/day) and the fixed oil (2 and 4 ml/kg/day) for 21 days resulted in a significant enhancement of the blood total antioxidant capacity (measured by the KRL test) and of the plasma antioxidant capacity towards the DPPH radical.
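The DPPH and superoxide results above are reported as IC50 values. As a point of reference, the radical-scavenging percentage is commonly computed from absorbances as below, and the IC50 is the extract concentration at which this percentage reaches 50%; this is the standard convention, not a formula quoted from the study, and the wavelengths and controls used are not specified here.

\[
\text{Scavenging activity (\%)} = \frac{A_{\mathrm{control}} - A_{\mathrm{sample}}}{A_{\mathrm{control}}} \times 100,
\qquad
\mathrm{IC_{50}} : \text{concentration at which scavenging} = 50\%.
\]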

Keywords: Antioxidant Capacity, Chelating, Phenolic Compounds, Nigella Sativa, Scavenger

145 Increase Success by Decreasing Admission for Maths – Fairytale or Reality?

Authors: L. A. du Plessis

Abstract:

South Africa is facing a crisis: it is not able to produce enough graduates in the scarce-skills areas to sustain economic growth. The crisis is fuelled by a school system that does not produce enough potential students with Mathematics, Accounting and Science. Since the introduction of the new school curriculum in 2008, there is no longer an option to take pure maths at standard grade level. Instead, only two mathematical subjects are offered: pure maths (which is on par with higher grade maths) and mathematical literacy, and it is compulsory to take one or the other. As a result, fewer students finish Grade 12 with pure mathematics every year. This national problem needs urgent attention if South Africa is to make any headway in critical skills development, as mathematics is a gateway to scarce-skills professions. Higher education institutions have launched several initiatives in an attempt to address the above, including preparatory courses, bridging programmes and extended curricula with foundation provision. In view of the above, and of government policy directives to broaden access in the scarce-skills areas and increase student throughput, foundation provision was introduced for Commerce and Information Technology programmes at the Vaal Triangle Campus (VTC) of North-West University (NWU) in 2010. Students enrolling for extended programmes do not comply with the minimum prerequisites for the normal programmes. The question then arises as to whether these programmes have the intended impact. This paper reports the results of a two-year longitudinal study tracking the first-year academic achievement of the two cohorts enrolled since 2010. The results provide valuable insight into the structuring of an extended programme and its potential impact.

Keywords: Access, extended programmes, foundation provision, mathematics.

144 Hydrogeological Factors of the Ore Genesis in the Sedimentary Basins

Authors: O. Abramova, L. Abukova, A. Goreva, G. Isaeva

Abstract:

The present work evaluates the role of interstitial water in mobilizing metal elements from clay deposits and occurrences in the sedimentary formations of hydro-geological basins. The experiments were performed using a special facility that allows the pressure, temperature and frequency of acoustic vibrations to be adjusted. The samples studied were oil shales (Baltic quarry, O2kk) and clay rocks of mainly montmorillonite composition (borehole SG-12000, sampling depth 1000–3600 m, Azov-Kuban trough, N1). After interstitial water was squeezed from the rock samples, a decrease in the original content of the rock-forming components, including the trace metals V, Cr, Co, Ni, Cu, Zn, Zr, Mo, Pb, W, Ti and others, was recorded. The experiments made it possible to evaluate the output of ore elements and organic matter with the interstitial waters. Calculations showed that, under standard conditions, each ton of oil shale can release 5-6 kg of ore elements and 9-10 kg of organic matter. The quantity of matter migrating from clays in the process of solidification changes with the stage of lithogenesis: more recent, less consolidated deposits lose more ore and organic material than the clay rocks sampled from depths over 3000 m. Each ton of clay in the depth interval 1000-1500 m is able to generate 3-5 kg of ore elements and 6-8 kg of organic matter. The interstitial waters act as a carrier, transferring these substances into the reservoir beds. It was concluded that the interstitial waters squeezed from the studied samples are solutions with anomalously high concentrations of metals and organic matter. In the discharge zones of sedimentary basins, such fluids can create paragenetic associations of sedimentary-catagenetic ore and hydrocarbon mineral-resource accumulations.

Keywords: Hydrocarbons, ore genesis, paragenesis, interstitial waters.

143 Structure and Power Struggle in Contemporary Nollywood: An Ethnographic Evaluation

Authors: Ezinne M. Igwe

Abstract:

Statements of fact have been made about Nollywood, a segment of the Nigerian film industry that has in recent times become phenomenal, due largely to its volume of production and its specific production style. In the face of the recent transformations reshaping the industry, issues have been arising that have not been given due academic attention from an industry-player perspective. While re-addressing issues such as structure, policy and informality, this study benefits from a new perspective – that of a community member adopting participant observation to research a familiar culture. Drawing on data from an extensive ethnographic study of the industry, this paper examines these matters with an emphasis on structure and the industry’s overall political economy. Drawing from discourses on the new and old Nollywood labels and other current matters arising within the industry, such as the MOPICON bill redraft, corporate financing and possibilities of regeneration, this paper examines structure and power struggle within Nollywood. These forces are championing regenerative processes that bring about formalization, professionalism and the quest for a transnational presence, processes which have so far been only superficially evaluated. Focused essentially on Nollywood’s political economy, this study critically analyses the transforming face of an informal industry, the consistent quest for structure, quality and standards, and issues of corporate sponsorship as possible trends of regeneration. It evaluates them as indicators of regeneration, questioning the possibility of their being sustained in an industry experiencing increased interaction with the formal economy and an influx of young professionals. With findings that make sustained regeneration both certain (due to increased interaction with the formal economy) and uncertain (due to the dysfunctionality of the society and its political system), it concludes that the transforming face of the industry suggests its impending gentrification.

Keywords: Formalization, MOPICON, Nollywood, Structure.

142 Analysis of Noise Level Effects on Signal-Averaged Electrocardiograms

Authors: Chun-Cheng Lin

Abstract:

Noise level has a critical effect on the diagnostic performance of the signal-averaged electrocardiogram (SAECG), because the true starting and end points of the QRS complex are masked by residual noise and are sensitive to the noise level. Several studies and commercial machines perform SAECG analysis using either a fixed number of heart beats (typically between 200 and 600 beats) or a predefined noise level (typically between 0.3 and 1.0 μV) in each of the X, Y and Z leads. However, different criteria or methods used to perform SAECG cause discrepancies in noise levels among study subjects. According to the recommendations of the 1991 ESC, AHA and ACC Task Force Consensus Document on the use of SAECG, the determination of QRS onset and offset is closely related to the mean and standard deviation of a noise sample. Hence, this study performs SAECG using consistent root-mean-square (RMS) noise levels among study subjects and analyzes the effects of noise level on SAECG. The study also evaluates the differences between normal subjects and chronic renal failure (CRF) patients in the time-domain SAECG parameters. The study subjects comprised 50 normal Taiwanese and 20 CRF patients. During signal-averaged processing, different RMS noise levels were applied to evaluate their effects on three time-domain parameters: (1) the filtered total QRS duration (fQRSD), (2) the RMS voltage of the last 40 ms of the QRS (RMS40), and (3) the duration of the low-amplitude signals below 40 μV (LAS40). The results demonstrated that reducing the RMS noise level increases fQRSD and LAS40, decreases RMS40, and further increases the differences in fQRSD and RMS40 between normal subjects and CRF patients. The SAECG may also become abnormal as a result of the reduction in RMS noise level. In conclusion, it is essential to establish diagnostic criteria for SAECG using consistent RMS noise levels in order to reduce noise level effects.
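For orientation, a minimal sketch of how the three time-domain parameters can be computed from a filtered, signal-averaged QRS vector magnitude is given below. The signal, sampling rate and the already-detected onset/offset indices are hypothetical inputs; in practice the onset and offset are derived from the mean and standard deviation of a noise sample, as the Task Force document referenced above describes.

```python
import numpy as np

def saecg_time_domain(vm_uV, fs_hz, onset_idx, offset_idx):
    """Compute fQRSD, RMS40 and LAS40 from a filtered QRS vector magnitude (microvolts)."""
    # (1) filtered total QRS duration, in ms
    fqrsd_ms = (offset_idx - onset_idx) / fs_hz * 1000.0

    # (2) RMS voltage of the terminal 40 ms of the QRS
    n40 = int(round(0.040 * fs_hz))
    terminal = vm_uV[offset_idx - n40:offset_idx]
    rms40_uV = float(np.sqrt(np.mean(terminal ** 2)))

    # (3) duration of the terminal low-amplitude signal below 40 uV, in ms
    above = np.nonzero(vm_uV[onset_idx:offset_idx] >= 40.0)[0]
    last_above = onset_idx + above[-1] if above.size else onset_idx
    las40_ms = (offset_idx - last_above) / fs_hz * 1000.0

    return fqrsd_ms, rms40_uV, las40_ms

# toy example: 1 kHz sampling, synthetic 120 ms QRS tapering below 40 uV near its end
fs = 1000
qrs = np.concatenate([np.linspace(0, 800, 40), np.full(40, 800.0), np.linspace(800, 5, 40)])
signal = np.concatenate([np.full(50, 5.0), qrs, np.full(50, 5.0)])
print(saecg_time_domain(signal, fs, onset_idx=50, offset_idx=170))
```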

Keywords: Signal-averaged electrocardiogram, Ventricular late potentials, Chronic renal failure, Noise level effects.

141 Probe-Assisted Axillary Lymph Node Biopsy Compared with Axillary Dissection in Breast Cancer: A Retrospective Study from the West of Iran

Authors: Morteza Alizadeh Foroutan, Hassan Moayeri, Keivan Sabooni, Motahareh Rouhi Ardeshiri

Abstract:

Breast cancer (BC) incidence is increasing annually in various parts of the world, and sentinel lymph node biopsy (SLNB) has become a new standard of care as a staging procedure. In the present study, the gamma probe technique was used for SLNB as a safe method with greater accuracy and fewer complications. The study sought to compare the results of two surgical techniques, axillary lymph node dissection (ALND) and SLNB, including the epidemiological results and clinicopathological features of BC patients from the western provinces of Iran. In total, 420 women with BC who were referred to the breast clinic in Sanandaj, Kurdistan province, during 2017-2021 were identified. Of these, 318 patients underwent breast surgery, and 277 of them participated in the current study. Patients were divided into those undergoing ALND and those undergoing SLNB. The criteria for complete dissection or axillary biopsy using the gamma probe were based on the results of clinical examination and the presence of palpable lymph nodes. Post-operative complications occurred in 58 (18.9%) cases overall, comprising 15 (25.9%) and 43 (74.1%) patients in the SLNB and ALND groups, respectively (P = 0.74). Seroma (60.3%) was the most frequently reported complication in each group. Most patients had tumors in the upper-outer quadrant of the left breast. The mean tumor dimension in the SLNB and ALND groups was 2.1 ± 1.3 cm and 3.2 ± 1.8 cm, respectively (P = 0.003). The benefits of breast-conserving surgery (BCS) with the SLNB technique are clearly undeniable, and it can be considered a method with fewer complications and a better prognosis. Accordingly, SLNB and BCS are favorable methods that can be performed along with the gamma probe technique, which is safe and accurate.

Keywords: Breast cancer, Sentinel lymph node biopsy, Axillary lymph node dissection, Gamma probe.

140 Bacteriological Quality of Commercially Prepared Fermented Ogi (Akamu) Sold in Some Parts of South Eastern Nigeria

Authors: Alloysius C. Ogodo, Ositadinma C. Ugbogu, Uzochukwu G. Ekeleme

Abstract:

Food poisoning and infection by bacteria are of public health significance in both developing and developed countries. Samples of ogi (akamu) prepared from white and yellow varieties of maize sold in Uturu and Okigwe were analyzed, together with laboratory-prepared ogi, for bacterial quality using standard microbiological methods. The analyses showed total bacterial counts (cfu/g) of 4.0 × 10⁷ and 3.9 × 10⁷ for the laboratory-prepared ogi (white and yellow varieties, respectively), while the commercial ogi had counts of 5.2 × 10⁷ and 4.9 × 10⁷, 4.9 × 10⁷ and 4.5 × 10⁷, and 5.4 × 10⁷ and 5.0 × 10⁷ for the Eke-Okigwe, Up-gate and Nkwo-Achara markets, respectively. Staphylococcal counts ranged from 2.0 × 10² to 5.0 × 10² and from 1.0 × 10² to 4.0 × 10² for the white and yellow varieties from the different markets, while no staphylococcal growth was recorded in the laboratory-prepared ogi. The laboratory-prepared ogi showed no coliform growth, while the commercially prepared ogi had counts of 0.5 × 10³ to 1.6 × 10³ for the white variety and 0.3 × 10³ to 1.1 × 10³ for the yellow variety. Lactic acid bacterial counts of 3.5 × 10⁶ and 3.0 × 10⁶ were recorded for the laboratory ogi, while counts for the commercially prepared ogi ranged from 3.2 × 10⁶ to 4.2 × 10⁶ (white variety) and from 3.0 × 10⁶ to 3.9 × 10⁶ (yellow variety). Among the bacterial isolates from the commercial and laboratory-fermented ogi, Lactobacillus sp., Leuconostoc sp. and Citrobacter sp. were present in all samples; Micrococcus sp. and Klebsiella sp. were isolated from the Eke-Okigwe and ABSU-up-gate market varieties, respectively; E. coli and Staphylococcus sp. were present in the Eke-Okigwe and Nkwo-Achara markets; and Salmonella sp. was isolated from all three markets. Hence, there is a risk of contracting food-borne disease from commercially prepared ogi, and sanitary measures are needed in the production of fermented cereals to minimize food-borne pathogens during processing and storage.
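The counts above follow the usual plate-count convention, in which colonies on a countable plate are scaled by the dilution and the volume plated. A minimal sketch of that arithmetic is shown below; the colony number, dilution and plated volume are hypothetical illustrations, not values reported in the study.

```python
# Plate-count arithmetic (illustrative values, not from the study)
colonies_counted = 52        # colonies counted on the plate
dilution = 1e-5              # plate was made from the 10^-5 dilution
volume_plated_ml = 0.1       # 0.1 mL spread on the plate
sample_mass_g = 1.0          # 1 g of ogi homogenized in the first diluent tube

cfu_per_g = colonies_counted / (dilution * volume_plated_ml * sample_mass_g)
print(f"{cfu_per_g:.1e} cfu/g")   # 5.2e+07, the same order of magnitude as the market samples
```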

Keywords: Bacterial quality, fermentation, maize, Ogi.

139 A Secure Auditing Framework for Load Balancing in Cloud Environment

Authors: R. Geetha, T. Padmavathy

Abstract:

Security auditing is an important feature to be considered by cloud service customers. It is essentially a certification process that audits the controls delivering the security requirements. Security audits are conducted by trained and qualified staff belonging to an independent auditing organization, and must be carried out against a standard set of security controls. Proper checks must be made that the cloud user has adequate reporting and logging facilities within the customer's system, ensuring an appropriate business and operational flow of data through the cloud service. We propose a cloud-based secure auditing framework which enables a trusted authority to store secret data safely on semi-trusted cloud service providers and to share it selectively with a wide range of data receivers, thereby reducing the key management complexity for data owners and data receivers. Unlike previous cloud-based data systems, data owners upload their secret data to the cloud using static and dynamic auditing schemes. A further feature is that, if a data receiver needs to download an individual file, the receiver sends the request to the authority, which holds the access control. If the data owner must share the original file with the data receiver, it approves the receiver's request. Once the request for the file has been approved, the receiver downloads the original file, and both the file transfer time and date and the download time and date are monitored by the auditor. In addition to deduplication, reduced cloud storage space through dynamic file distribution is proposed.
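As a purely illustrative aside (this is not the framework proposed in the paper), the "static auditing" idea of verifying that stored data have not been altered can be sketched with a simple checksum register kept by an auditor: the digest recorded at upload time is later compared against the digest of the file currently held by the storage provider.

```python
import hashlib
from datetime import datetime, timezone

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

audit_register = {}   # file id -> digest recorded at upload time
audit_log = []        # auditor's record of every check

def register_upload(file_id: str, path: str) -> None:
    audit_register[file_id] = sha256_of(path)

def audit(file_id: str, path_on_provider: str) -> bool:
    ok = sha256_of(path_on_provider) == audit_register[file_id]
    audit_log.append((file_id, datetime.now(timezone.utc).isoformat(),
                      "intact" if ok else "MODIFIED"))
    return ok
```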

Keywords: Cloud computing, cloud storage auditing, data integrity, key exposure.

138 Comparative Efficacy of Pomegranate Juice, Peel and Seed Extract in the Stabilization of Corn Oil under Accelerated Conditions

Authors: Zoi Konsoula

Abstract:

Antioxidant-rich extracts were prepared from pomegranate peels, seeds and juice using methanol and ethanol, and their antioxidant activity was evaluated by the 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging and Ferric Reducing Antioxidant Power (FRAP) methods. Both analytical methods indicated higher antioxidant activity in the extracts prepared from peels, which was comparable to that of butylated hydroxytoluene (BHT). Furthermore, the antioxidant activity was correlated with the phenolic and flavonoid content of the various extracts. The antioxidant effectiveness of the extracts was also assessed using corn oil as the oxidation substrate. More specifically, preheated corn oil samples stabilized with extracts at a concentration of 250 ppm, 500 ppm or 1,000 ppm were subjected to accelerated aging (100 °C, 10 days), and the extent of oxidative alteration was followed by measuring the peroxide value, conjugated dienes and trienes, and the p-anisidine value. BHT at its legal limit (200 ppm) served as a standard besides the control sample. Results from the different parameters were in agreement with each other, suggesting that pomegranate extracts can stabilize corn oil effectively under accelerated conditions at all concentrations tested. However, the magnitude of oil stabilization depended strongly on the amount of extract added, and this was positively correlated with their phenolic content. Pomegranate peel extracts, which exhibited not only the highest phenolic and flavonoid content but also the highest antioxidant activity, were more potent in inhibiting oxidative deterioration. Both methanolic and ethanolic peel extracts at a concentration of 500 ppm exerted a stabilizing effect comparable to that of BHT, while at a concentration of 1,000 ppm they exhibited higher stabilization efficiency than BHT. Finally, heating the oil samples resulted in a time-dependent decrease in their antioxidant capacity. Samples containing peel extracts appeared to retain their antioxidant capacity for a longer period, indicating that these extracts contained active compounds that offered superior antioxidant protection to corn oil.

Keywords: Antioxidant activity, corn oil, oxidative deterioration, pomegranate.

137 Geostatistical Analysis and Mapping of Ground-Level Ozone in a Medium-Sized Urban Area

Authors: F. J. Moral García, P. Valiente González, F. López Rodríguez

Abstract:

Ground-level tropospheric ozone is one of the air pollutants of most concern. It is mainly produced by photochemical processes involving nitrogen oxides and volatile organic compounds in the lower parts of the atmosphere. Ozone levels become particularly high in regions close to strong ozone-precursor emissions and during summer, when stagnant meteorological conditions with high insolation and high temperatures are common. This work presents some results of a study of urban ozone distribution patterns in the city of Badajoz, the largest and most industrialized city in the Extremadura region (southwest Spain). Fourteen sampling campaigns, at least one per month, were carried out with an automatic portable analyzer to measure ambient air ozone concentrations during periods selected for conditions favourable to ozone production. The measured ozone data were then analyzed using geostatistical techniques to evaluate the ozone distribution over the city. First, the exploratory analysis revealed that the data were normally distributed, which is a desirable property for the subsequent stages of the geostatistical study. Secondly, in the structural analysis, theoretical spherical models provided the best fit for all monthly experimental variograms. The parameters of these variograms (sill, range and nugget) revealed that the maximum distance of spatial dependence is between 302 and 790 m and that air ozone concentration is not evenly distributed over short distances. Finally, predictive ozone maps were derived for all points of the experimental study area using geostatistical algorithms (kriging). High prediction accuracy was obtained in all cases, as cross-validation showed. Useful information for hazard assessment was also provided by probability maps based on kriging interpolation and the kriging standard deviation.
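For reference, the spherical variogram model mentioned above is usually written as below, where c0 is the nugget, c the partial sill, a the range (here between roughly 302 and 790 m, depending on the month) and h the separation distance; this is the standard textbook form rather than a formula quoted from the paper.

\[
\gamma(h) =
\begin{cases}
c_0 + c\left(\dfrac{3h}{2a} - \dfrac{1}{2}\left(\dfrac{h}{a}\right)^{3}\right), & 0 < h \le a,\\[2mm]
c_0 + c, & h > a.
\end{cases}
\]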

Keywords: Kriging, map, tropospheric ozone, variogram.

136 High Securing Cover-File of Hidden Data Using Statistical Technique and AES Encryption Algorithm

Authors: A. A. Zaidan, Anas Majeed, B. B. Zaidan

Abstract:

Nowadays, the rapid development of multimedia and the internet allows for wide distribution of digital media data. It has become much easier to edit, modify and duplicate digital information. Besides that, digital documents are easy to copy and distribute, so they face many threats. This is a major security and privacy issue: with the large flood of information and the development of digital formats, it has become necessary to find appropriate protection because of the significance, accuracy and sensitivity of the information. Protection systems can be classified into information hiding, information encryption, and combinations of hiding and encryption that increase information security. The strength of information hiding lies in the non-existence of standard algorithms for hiding secret messages and in the randomness of hiding methods, such as combining several media (covers) with different methods to pass a secret message. In addition, there are no formal methods for discovering hidden data, which makes the task of this research difficult. In this paper, a new information-hiding system is presented. The proposed system aims to hide information (a data file) in an executable file (EXE) and to detect the hidden file; we present the implementation of a steganography system which embeds information in an executable file. (EXE) files have been investigated, and the system tries to address the size of the cover file while making the result undetectable by anti-virus software. The system includes two main functions: the first is hiding information in a Portable Executable file (EXE) through four processes (specifying the cover file, specifying the information file, encrypting the information, and hiding the information); the second is extracting the hidden information through three processes (specifying the stego file, extracting the information, and decrypting the information). The system has achieved its main goals: the size of the cover file and the size of the information are made independent of each other, and the resulting file does not cause any conflict with anti-virus software.
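The "encryption of the information" step can be illustrated with a short, generic sketch. The paper does not specify the AES mode or key handling, so the use of AES-GCM via the Python cryptography package below is an assumption made purely for illustration; no embedding into an executable is shown.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_information(plaintext: bytes):
    """Encrypt the information file before it is hidden in the cover file (illustrative AES-GCM)."""
    key = AESGCM.generate_key(bit_length=256)   # 256-bit AES key
    nonce = os.urandom(12)                      # unique nonce per message
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return key, nonce, ciphertext

def decrypt_information(key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    """Recover the information file after it has been extracted from the stego file."""
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key, nonce, ct = encrypt_information(b"secret data file contents")
assert decrypt_information(key, nonce, ct) == b"secret data file contents"
```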

Keywords: Cryptography, Steganography, Portable Executable File.

135 Evaluation of Pragmatic Information in an English Textbook: Focus on Requests

Authors: Israa A. Qari

Abstract:

Learning to make requests in a foreign language is a key ability within pragmatic language teaching. This paper examines how requests are taught in English Unlimited Book 3 (Cambridge University Press), an EFL textbook series employed by King Abdulaziz University in Jeddah, Saudi Arabia, to teach advanced foundation-year students English. The analysis focuses on the request linguistic strategies present in the textbook, the frequency of use of these strategies, and the contextual information provided on the use of these linguistic forms. The researcher collected all the linguistic forms realizing the request speech act and classified them into levels using the CCSARP request coding manual. Findings demonstrated that simple and commonly employed request strategies are introduced. Looking closely at the exercises throughout the chapters, it was noticeable that the book exclusively employed the most direct form of requesting (the imperative) when giving learners instructions, e.g. listen, write, ask, answer, read, look, complete, choose, talk, think, etc. The book also made use of some other request strategies, such as ‘hedged performatives’ and ‘query preparatory’. However, many strategies were not dealt with in the book, specifically strategies with combined functions (e.g. possibility, ability). On the sociopragmatic level, a strong focus was found on standard situations in which the relations between requester and requestee are clear. In general, contextual information was communicated only implicitly. The textbook did not seem to differentiate between formal and informal request contexts (register), which might consequently lead students to overgeneralize. The paper closes with some recommendations for textbook and curriculum designers. Findings are also contrasted with previous results from a similar body of research on EFL requests.

Keywords: EFL, Requests, Saudi, speech acts, textbook evaluation.

134 Validation on 3D Surface Roughness Algorithm for Measuring Roughness of Psoriasis Lesion

Authors: M.H. Ahmad Fadzil, Esa Prakasa, Hurriyatul Fitriyah, Hermawan Nugroho, Azura Mohd Affandi, S.H. Hussein

Abstract:

Psoriasis is a widespread skin disease affecting up to 2% of the population, with plaque psoriasis accounting for about 80% of cases. It appears as a red lesion which, at higher severities, is usually covered with rough scale. Psoriasis Area and Severity Index (PASI) scoring is the gold-standard method for measuring psoriasis severity, and scaliness is one of the PASI parameters that must be quantified. The surface roughness of a lesion can be used as a scaliness feature, since scale on the lesion surface makes the lesion rougher. Dermatologists usually assess severity through their tactile sense, so direct contact between doctor and patient is required, and the assessment may not be objective. In this paper, a digital image analysis technique is developed to determine the scaliness of psoriasis lesions objectively and to provide the PASI scaliness score. A psoriasis lesion is modelled as a rough surface created by superimposing a smooth average (curved) surface with a triangular waveform. For roughness determination, a polynomial surface fit is used to estimate the average surface, which is then subtracted from the rough surface to give the elevation surface (the surface deviations). The roughness index is calculated by applying the average roughness equation to the height-map matrix. The roughness algorithm was tested on 444 lesion models. In the validation, only 6 models were rejected (percentage error greater than 10%); these errors are due to the quality of the scanned images. The roughness algorithm was also validated for roughness measurement on abrasive papers on a flat surface. The Pearson correlation coefficient between the grade value (G) of the abrasive paper and Ra is -0.9488, which shows a strong relation between G and Ra. The algorithm still needs to be improved by surface filtering, especially to overcome problems with noisy data.
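A minimal sketch of the fit-subtract-average pipeline described above is given below, using numpy on a synthetic height map; the grid size, polynomial order and test surface are assumptions made for illustration, not the settings used in the paper.

```python
import numpy as np

# Synthetic height map: a smooth trend plus fine-scale "scale" texture (stand-in for a scanned lesion)
x, y = np.meshgrid(np.linspace(0.0, 1.0, 64), np.linspace(0.0, 1.0, 64))
z = 0.5 * x + 0.3 * y**2 + 0.05 * np.sin(40.0 * x)

# Least-squares fit of a second-order polynomial surface (the estimated "average" surface)
A = np.column_stack([np.ones(x.size), x.ravel(), y.ravel(),
                     (x * y).ravel(), (x**2).ravel(), (y**2).ravel()])
coef, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)
z_avg = (A @ coef).reshape(z.shape)

# Elevation surface (deviations) and arithmetic average roughness Ra
elevation = z - z_avg
Ra = np.mean(np.abs(elevation))
print(f"Ra = {Ra:.4f} (arbitrary height units)")
```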

Keywords: Psoriasis, roughness algorithm, polynomial surface fitting.

133 Nebulized Magnesium Sulfate in Acute Moderate to Severe Asthma in Pediatric Patients

Authors: Lubna M. Zakaryia Mahmoud, Mohammed A. Dawood, Doaa A. Heiba

Abstract:

A prospective double-blind placebo-controlled trial was carried out on 60 children known to be asthmatic who presented with acute asthma exacerbations to the emergency department of Alexandria University Children’s Hospital at El-Shatby, to assess the efficacy of adding inhaled magnesium sulfate to a β-agonist, compared with the β-agonist in saline, in the management of acute asthma exacerbations in children. Participants were divided into two groups. Group A (study group) received inhaled salbutamol solution (0.15 ml/kg) plus 2 ml of isotonic magnesium sulfate in a nebulizer chamber. Group B (control group) received nebulized salbutamol solution (0.15 ml/kg) diluted with placebo (2 ml normal saline). Both groups received the inhaled solution every 20 minutes, repeated for three doses. Patients were evaluated using the Pediatric Asthma Severity Score (PASS), oxygen saturation measured by portable pulse oximetry, and peak expiratory flow rate measured with a portable peak expiratory flow meter, initially (recorded as the zero-minute assessment) and every 20 minutes from the end of each nebulization (each lasting 5-10 minutes), recorded as the 20-, 40- and 60-minute assessments. For PASS, the comparison showed no significant difference, with p-values of 0.463, 0.472 and 0.0766 at 20, 40 and 60 minutes. For oxygen saturation, improvement was more pronounced in group A from 40 minutes onward, with significant p-values of 0.000 at both 40 and 60 minutes. Although the mean PEFR improved significantly from the zero-minute assessment in both groups, the improvement was greater in group A, with significant p-values of 0.015, 0.001 and 0.001 at 20, 40 and 60 minutes, respectively. This study suggests that inhaled magnesium sulfate is an effective add-on to the standard β-agonist inhalation used in the treatment of moderate to severe asthma exacerbations.

Keywords: Nebulized, magnesium sulfate, acute asthma, pediatric.

132 Knowledge Management Strategies within a Corporate Environment of Papers

Authors: Daniel J. Glauber

Abstract:

Knowledge transfer between personnel can improve an organization’s competitive advantage in the marketplace when pursued through a strategic approach to knowledge management, whereas a lack of information sharing between personnel can create knowledge transfer gaps and restrict decision-making processes. Knowledge transfer between personnel can potentially improve information sharing when a knowledge management strategy is implemented, and an organization’s capacity to gain more knowledge is aligned with its prior or existing captured knowledge. This case study attempted to understand the overall influence of a knowledge management system (KMS) within the corporate environment and the knowledge exchange between personnel. The significance of the study was to help understand how organizations can improve the return on investment (ROI) of a knowledge management strategy within a knowledge-centric organization. A qualitative descriptive case study was the research design selected. Developing a knowledge management strategy acceptable at all levels of the organization requires cooperation in support of a common organizational goal, working with management and executive members to develop a protocol in which knowledge transfer becomes standard practice across multiple tiers of the organization. The knowledge transfer process can be measured by focusing on specific elements of the organizational process, including personnel transitions, to help reduce the time required to understand a job. The organization studied in this research acknowledged the need for improved knowledge management activities to help organize, retain, and distribute information throughout the workforce. Data produced from the study indicate three main themes: information management, organizational culture, and knowledge sharing within the workforce. These themes indicate a possible connection between an organization’s KMS, its culture, knowledge sharing, and knowledge transfer.

Keywords: Knowledge management strategies, knowledge transfer, knowledge management, knowledge capacity.

131 Hydro-Geochemistry of Qare-Sou Catchment and Gorgan Gulf, Iran: Examining Spatial and Temporal Distribution of Major Ions and Determining the River’s Hydro-Chemical Type

Authors: Milad Kurdi, Hadi Farhadian, Teymour Eslamkish

Abstract:

This study examined the hydro-geochemistry of the Qare-Sou catchment and Gorgan Gulf in order to determine the spatial distribution of major ions. Six hydrometric stations in the catchment and four stations in Gorgan Gulf were chosen and sampled. The spatial and temporal distributions of the major ions showed similar variation trends for calcium, magnesium, and bicarbonate. The spatial trends of chloride, sulfate, sodium and potassium were the same as those of electrical conductivity (EC) and total dissolved solids (TDS). At the Nahar Khoran station, ion concentrations were higher than at the other stations, which may be related to human activities and the role of geology. The Siah Ab station also showed high ion concentrations, which may be related to the station’s close proximity to Gorgan Gulf and the return of water to the Qare-Sou River. The Gibbs diagram was used to examine the water-rock interaction; the results showed that the river water falls in the rock-dominance field and is affected more by weathering and water-rock reaction than by evaporation and crystallization. Assessment of the river water quality using graphical methods indicated that the water type in this area is Ca-HCO3-Mg. Major ion concentrations in the Qare-Sou were higher than the world average but did not exceed the limits allowed by the World Health Organization and the China Standard Organization. A comparison of ion concentrations in Gorgan Gulf with those of other seas and oceans showed that the pH in Gorgan Gulf was higher than in the other seas, whereas the anion and cation concentrations were lower.

Keywords: Hydro-geochemistry, Qare-Sou River, Gorgan Gulf, major ions, Gibbs diagram, water quality, graphical methods.

130 Simulation and Analysis of Passive Parameters of Building in eQuest: A Case Study in Istanbul, Turkey

Authors: Mahdiyeh Zafaranchi

Abstract:

With the rapid development of urbanization and the improvement of living standards worldwide, the energy consumption and carbon emissions of the building sector are expected to increase in the near future; because of this, energy-saving issues have become more important to engineers. The building sector is a major contributor to energy consumption and carbon emissions, and the concept of the efficient building appeared as a response to the need to reduce energy demand in this sector, with the main purpose of shifting from standard buildings to low-energy buildings. Although energy saving should happen at all stages of a building’s life cycle (material production, construction, demolition), the main concept of the energy-efficient building is to save energy during the building’s service life by using passive and active systems, without sacrificing comfort and quality. The main aim of this study is to investigate passive strategies (which require no energy consumption or use renewable energy) for achieving energy-efficient buildings. Energy retrofit measures were explored in the eQuest software using a case study as a base model. The study investigates the influence of major factors such as the thermal transmittance (U-value) of the materials, the windows, shading devices, thermal insulation, the proportion of exposed envelope, the window-to-wall ratio, and the lighting system on the energy consumption of the building. The base model was located in Istanbul, Turkey, and the impact of eight passive parameters on energy consumption was determined. After analyzing the base model in eQuest, a final scenario with good energy performance was proposed. The results showed that decreases in the U-values of the materials, in the proportion of exposed envelope, and in the windows had a significant effect on energy consumption. Finally, annual savings of about 10.5% in electricity consumption and about 8.37% in gas consumption were achieved in the suggested model.
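For readers unfamiliar with the term, the thermal transmittance varied in the simulations is conventionally obtained from the layer build-up of an envelope element as below, where d_i and λ_i are the thickness and thermal conductivity of layer i and R_si, R_se are the internal and external surface resistances; this is the standard definition rather than a value or formula taken from the paper.

\[
U = \frac{1}{R_{si} + \sum_i \dfrac{d_i}{\lambda_i} + R_{se}}
\qquad \left[\mathrm{W\,m^{-2}\,K^{-1}}\right]
\]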

Keywords: Efficient building, electric and gas consumption, eQuest, passive parameters.

129 Hydrogen Production at the Forecourt from Off-Peak Electricity and Its Role in Balancing the Grid

Authors: Abdulla Rahil, Rupert Gammon, Neil Brown

Abstract:

The rapid growth of renewable energy sources and their integration into the grid have been motivated by the depletion of fossil fuels and by environmental issues. Unfortunately, the grid may be unable to cope with the predicted growth of renewable energy, which would lead to instability. To solve this problem, energy storage devices can be used. Electrolytic hydrogen production is considered a promising option, since it is a clean energy carrier (zero emissions). Flexible operation of an electrolyser (producing hydrogen during off-peak electricity periods and stopping at other times) could bring many benefits, such as reducing the cost of hydrogen and helping to balance the electricity system. This paper investigates the price of hydrogen under flexible operation compared with continuous operation, while serving the customer (a hydrogen filling station) without interruption. An optimization algorithm is applied to the hydrogen station in both cases (flexible and continuous operation). Three scenarios are tested to see whether the off-peak electricity price enhances the reduction of the hydrogen cost: a standard (single-tier) tariff throughout the day (assumed 12 p/kWh) while still satisfying the hydrogen demand; using off-peak electricity at a lower price (assumed 5 p/kWh) and shutting down the electrolyser at other times; and using lower-price electricity at off-peak times and higher-price electricity at other times. The study considers Derna, a city located on the coast of the Mediterranean Sea (32° 46′ 0 N, 22° 38′ 0 E) with high wind-resource potential. Hourly wind speed data collected over 24½ years (1990 to 2014) were used, together with hourly radiation and hourly electricity demand data collected over a one-year period and data from the petrol station.
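To make the tariff comparison concrete, the electricity component of the hydrogen cost for the first two scenarios can be sketched as below. The specific energy of 52 kWh per kg of hydrogen is an assumed, illustrative figure (typical of electrolysis plus compression) and is not taken from the paper; capital, water and maintenance costs are ignored.

```python
SPECIFIC_ENERGY_KWH_PER_KG = 52.0   # assumed electrolyser + compression demand (illustrative, not from the study)

tariffs_p_per_kwh = {
    "continuous operation, standard tariff": 12.0,   # scenario 1
    "flexible operation, off-peak only": 5.0,        # scenario 2
}

for scenario, price in tariffs_p_per_kwh.items():
    cost_gbp_per_kg = price * SPECIFIC_ENERGY_KWH_PER_KG / 100.0   # pence -> pounds
    print(f"{scenario}: electricity cost of hydrogen ~ £{cost_gbp_per_kg:.2f} per kg")
```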

Keywords: Hydrogen filling station, off-peak electricity, renewable energy, electrolytic hydrogen.

128 The Application of Line Balancing Technique and Simulation Program to Increase Productivity in Hard Disk Drive Components

Authors: Alonggot Limcharoen, Jintana Wannarat, Vorawat Panich

Abstract:

This study investigates balancing the number of operators (the line balancing technique) in a production line for hard disk drive components in order to increase efficiency. At present, the use of hard disk drives has continuously declined, limiting the company’s revenue potential, so it is important to improve and develop the production process in order to create market share and to be able to compete with competitors on value and quality; an effective tool is needed to support this. In this research, the Arena simulation program was applied to analyze the results both before and after the improvement, and the validated model was used before proceeding with the real process. The RA production process in which the study was conducted comprised 14 work stations with 35 operators altogether. In the actual process, the average production time was 84.03 seconds per piece (timed 30 times at each work station), together with a performance rating assessment following the Westinghouse principles; the rating was 123%, and with a 5% allowance the standard time was 108.53 seconds per piece. The takt time, calculated as the available working time in one day divided by the customer demand, was 3.66 seconds per piece, so the appropriate number of operators was 30, meaning that five operators should be removed to improve the efficiency of the production process. A simulation model of the actual process was then created in the Arena program, and model reliability was confirmed by comparing the simulated outputs with the actual process; the outputs matched, so the model was considered reliable. Worker numbers and their job responsibilities were then remodelled in the Arena program. Finally, the efficiency of the production process increased from 70.82% to 82.63%, in line with the target.
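The operator calculation in the abstract can be reproduced with a few lines, using the figures reported above (observed time 84.03 s, rating 123%, allowance 5%, takt time 3.66 s per piece); the code below is a minimal illustration of that arithmetic, not part of the Arena model.

```python
import math

observed_time_s = 84.03      # average observed cycle time per piece
rating = 1.23                # Westinghouse performance rating (123%)
allowance = 0.05             # 5% allowance

standard_time_s = observed_time_s * rating * (1 + allowance)   # ~108.5 s per piece (the study reports 108.53)
takt_time_s = 3.66                                             # available working time / daily demand

operators_required = math.ceil(standard_time_s / takt_time_s)  # 30 operators, i.e. five fewer than the current 35
print(f"standard time ~ {standard_time_s:.2f} s, operators required = {operators_required}")
```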

Keywords: Hard disk drive, line balancing, simulation, Arena program.

127 Field Study on Thermal Performance of a Green Office in Bangkok, Thailand: A Possibility of Increasing Temperature Set-Points

Authors: T. Sikram, M. Ichinose, R. Sasaki

Abstract:

In the tropics, the indoor thermal environment is usually maintained in cooling mode to provide comfort all year. Indoor thermal performance sometimes differs from the standard or from the original design because of operation, maintenance, and utilization. Field studies of the thermal environment in green buildings are still limited in this region, even as the number of green buildings continues to increase. This study aims to clarify thermal performance and subjective perception in a green building by testing different temperature set-points. A Thai green office was investigated twice, in October 2018 and in May 2019. Indoor environment variables (temperature, relative humidity, and air velocity) were collected continuously. The temperature set-point was normally set at 23 °C and was changed to 24 °C and 25 °C. The study found that this range of set-points produced average room temperatures of 22.7 to 24.6 °C and average relative humidities of 55% to 62%. The thermal environment shifted slightly out of the ASHRAE comfort zone when the set-point was increased. Based on the thermal sensation votes, the feeling-colder votes decreased by 30% and 18% for the +1 °C and +2 °C changes, respectively. The predicted mean vote (PMV) results show that most of the calculated median values were negative, approaching the optimal neutral value (0) when the set-point was 25 °C. The neutral temperature decreased slightly at the warmer set-points. Reports of building-related symptoms decreased continuously as the temperature became warmer, and symptoms associated with the cooler condition received more votes than those associated with the warmer condition. In sum, for this green office there is scope to raise the temperature set-point by 1 °C (to 24 °C) to reduce cold sensitivity, discomfort, and symptoms. These results could support a policy of operating this office at a warmer temperature so that it becomes “a better green building”.

Keywords: Thermal environment, green office, temperature set-point, comfort.

126 Vancomycin and Rifaximin Combination Therapy for Diarrhoea Predominant Irritable Bowel Syndrome: An Observational Study

Authors: P. Murphy, D. Vasic, A. W. Gunaratne, T. Tugonon, M. Ison, C. Pagonis, E. T. Sitchon, A. Le Busque, T. J. Borody

Abstract:

Irritable bowel syndrome (IBS) is a gastrointestinal disorder characterized by an alteration in bowel movements. There are three types of IBS: diarrhea-predominant IBS (IBS-D), constipation-predominant IBS (IBS-C), and IBS with mixed bowel habit (IBS-M). Antimicrobials are increasingly being used as treatment for all types of IBS, and with this increased use and subsequent success, the gut microbiome is becoming more apparent as a factor in the etiology of IBS. Accepted standard treatment has focused on IBS-C and involves either vancomycin or rifaximin. Here, we report on a cohort of 18 patients treated with both vancomycin and rifaximin for IBS-D, whose records were reviewed retrospectively. In this cohort, patients were aged between 24 and 74 years (mean 44 years) and nine were female. At baseline all patients had diarrhea, four with mucus and one with blood. Other reported symptoms included abdominal pain (n = 11), bloating (n = 9), flatulence (n = 7), fatigue (n = 4) and nausea (n = 3). Treatment was personalized according to symptom severity and tolerability, with a combination of rifaximin (500-3000 mg/d) and vancomycin (500-1500 mg/d) given for an ongoing period. Follow-ups were conducted between 2 and 32 weeks. Of all patients, 89% reported improvement of at least one symptom, one reported no change, and one patient’s symptoms became worse. The success of this combination treatment could be due to the different mechanisms of action of the two medications: vancomycin works by inhibiting bacterial cell wall synthesis and rifaximin by inhibiting protein synthesis. This success in treatment supports the idea that IBS-D may be driven by a bacterial infection of the gastrointestinal microbiome. As IBS-D presents similarly to Clostridium difficile infection and symptom improvement can occur with the same treatment (rifaximin and vancomycin), there is reason to suggest that the infectious agent could be an unidentified strain of Clostridium. Although these results offer some support for this theory, more research is required.

Keywords: Clostridium difficile infection, diarrhea predominant irritable bowel syndrome, microbiome, vancomycin/rifaximin combination.

125 A Cumulative Learning Approach to Data Mining Employing Censored Production Rules (CPRs)

Authors: Rekha Kandwal, Kamal K. Bharadwaj

Abstract:

Knowledge is indispensable, but voluminous knowledge becomes a bottleneck for efficient processing. A great challenge for data mining is the generation of a large number of potential rules as a result of the mining process; in fact, the result is sometimes comparable in size to the original data. Traditional pruning measures such as support do not sufficiently reduce the huge rule space. Moreover, many practical applications are characterized by continual change of data and knowledge, which makes the knowledge more voluminous with each change. The most predominant representation of discovered knowledge is the standard production rule (PR) of the form If P Then D. Michalski and Winston proposed Censored Production Rules (CPRs) as an extension of production rules that exhibit variable precision and support an efficient mechanism for handling exceptions. A CPR is an augmented production rule of the form If P Then D Unless C, where C (the censor) is an exception to the rule. Such rules are employed in situations in which the conditional statement ‘If P Then D’ holds frequently and the assertion C holds rarely. When using a rule of this type, we are free to ignore the exception condition when the resources needed to establish its presence are tight, or when there is simply no information available as to whether it holds or not. Thus the ‘If P Then D’ part of the CPR expresses the important information, while the Unless C part acts only as a switch that changes the polarity of D to ~D. In this paper, a scheme based on the Dempster-Shafer Theory (DST) interpretation of a CPR is suggested for discovering CPRs from the discovered flat PRs. The discovery of CPRs from flat rules results in a considerable reduction of the already discovered rules. The proposed scheme incrementally incorporates new knowledge and also reduces the size of the knowledge base considerably with each episode. Examples are given to demonstrate the behaviour of the proposed scheme. The suggested cumulative learning scheme would be useful in mining data streams.
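To make the “Unless” semantics concrete, a minimal sketch of how a single CPR can be represented and fired is shown below. The bird/penguin rule and the dictionary-of-facts representation are illustrative assumptions; the DST-based discovery scheme the paper proposes is not reproduced here.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class CensoredProductionRule:
    premise: Callable[[dict], bool]           # P
    decision: str                             # D
    censor: Callable[[dict], Optional[bool]]  # C; returns None when unknown or too costly to check

    def fire(self, facts: dict, check_censor: bool = True):
        if not self.premise(facts):
            return None                        # rule does not apply
        if check_censor:
            c = self.censor(facts)
            if c is True:
                return f"not {self.decision}"  # Unless-part holds: polarity of D is switched to ~D
        return self.decision                   # censor unknown, unchecked, or false -> conclude D

# "If bird(x) Then flies(x) Unless penguin(x)" (illustrative rule, not from the paper)
rule = CensoredProductionRule(
    premise=lambda f: f.get("bird", False),
    decision="flies",
    censor=lambda f: f.get("penguin"),         # None when we have no information about the censor
)
print(rule.fire({"bird": True}))                   # flies (censor unknown, ignored)
print(rule.fire({"bird": True, "penguin": True}))  # not flies
```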

Keywords: Censored production rules, cumulative learning, data mining, machine learning.

124 Temperature-Based Detection of Initial Yielding Point in Loading of Tensile Specimens Made of Structural Steel

Authors: Aqsa Jamil, Hiroshi Tamura, Hiroshi Katsuchi, Jiaqi Wang

Abstract:

The yield point represents the upper limit of the forces that can be applied to a specimen without causing any permanent deformation. After yielding, the behavior of the specimen changes suddenly, including the possibility of cracking or buckling, so the accumulation of damage and the type of fracture change depending on this condition. As it is difficult to accurately detect the yield points at the several stress concentration points in structural steel specimens, an effort is made in this work to develop a convenient technique using thermography (temperature-based detection) during tensile tests for the precise detection of yield point initiation. To verify the applicability of the thermography camera, tests were conducted under different loading conditions, with deformation measured by various strain gauges and the surface temperature monitored by the thermography camera. The yield point of the specimens was estimated from the temperature dip which occurs, due to the thermoelastic effect, at the onset of plastic deformation. The scatter of the data was checked by a repeatability analysis. The effects of ambient temperature imperfections and of the light source were checked by carrying out tests during the daytime as well as at midnight and by calculating the signal-to-noise ratio (SNR) of the noisy data from the infrared thermography camera; it can be concluded that the camera is independent of the testing time and of the presence of a visible light source. Furthermore, a fully coupled thermal-stress analysis was performed using the Abaqus/Standard exact implementation technique to validate the temperature profiles obtained from the thermography camera and to check the feasibility of numerical simulation for predicting the results extracted with the thermographic technique.
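As a reminder of the quantity used for the day/night comparison, the signal-to-noise ratio in decibels is commonly defined as below (amplitude form on the left, power form on the right); the abstract does not state which convention was used, so this is given only as the standard definition.

\[
\mathrm{SNR_{dB}} = 20 \log_{10}\!\left(\frac{A_{\mathrm{signal}}}{A_{\mathrm{noise}}}\right)
= 10 \log_{10}\!\left(\frac{P_{\mathrm{signal}}}{P_{\mathrm{noise}}}\right)
\]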

Keywords: Signal to noise ratio, thermoelastic effect, thermography, yield point.

123 Buckling Optimization of Radially-Graded, Thin-Walled, Long Cylinders under External Pressure

Authors: Karam Y. Maalawi

Abstract:

This paper presents a generalized formulation of the buckling optimization problem for anisotropic, radially graded, thin-walled, long cylinders subject to external hydrostatic pressure. The structure analyzed is built of multi-angle fibrous laminated composite lay-ups having different volume fractions of the constituent materials within the individual plies. This yields a piecewise grading of the material in the radial direction; that is, the physical and mechanical properties of the composite material are allowed to vary radially. The objective is to maximize the critical buckling pressure while keeping the total structural mass constant and equal to that of a baseline reference design. In the selection of the significant optimization variables, the fiber volume fractions are added to the standard design variables of fiber orientation angles and ply thicknesses. The mathematical formulation employs classical lamination theory, and an analytical solution is presented that accounts for the effective axial and flexural stiffness separately and includes the coupling stiffness terms. The proposed model uses dimensionless quantities so as to be valid for thin shells with arbitrary thickness-to-radius ratios. Critical buckling pressure level curves, augmented with the mass equality constraint, are given for several types of cylinders, showing the functional dependence of the constrained objective function on the selected design variables. It is shown that material grading can contribute significantly to the optimization process in achieving structural designs with enhanced stability limits.
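For orientation, the classical result for a long, thin, isotropic cylinder of wall thickness t and mean radius R under external pressure is given below; the paper’s laminated, radially graded case generalizes this through classical lamination theory, so the formula is quoted only as the familiar isotropic baseline, not as the model used in the paper.

\[
p_{cr} = \frac{E}{4\left(1-\nu^{2}\right)}\left(\frac{t}{R}\right)^{3}
\]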

Keywords: Buckling instability, structural optimization, functionally graded material, laminated cylindrical shells, external hydrostatic pressure.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2359
122 Named Entity Recognition using Support Vector Machine: A Language Independent Approach

Authors: Asif Ekbal, Sivaji Bandyopadhyay

Abstract:

Named Entity Recognition (NER) aims to classify each word of a document into predefined target named entity classes and is nowadays considered fundamental for many Natural Language Processing (NLP) tasks such as information retrieval, machine translation, information extraction, question answering systems and others. This paper reports the development of a NER system for Bengali and Hindi using Support Vector Machines (SVM). Although this state-of-the-art machine learning technique has been widely applied to NER in several well-studied languages, its application to Indian languages (ILs) is very new. The system makes use of contextual information about the words along with a variety of features that are helpful in predicting the four named entity (NE) classes: Person name, Location name, Organization name and Miscellaneous name. We have used annotated corpora of 122,467 tokens of Bengali and 502,974 tokens of Hindi tagged with the twelve NE classes defined as part of the IJCNLP-08 NER Shared Task for South and South East Asian Languages (SSEAL). In addition, we have manually annotated 150K wordforms of the Bengali news corpus, developed from the web archive of a leading Bengali newspaper. We have also developed an unsupervised algorithm to generate lexical context patterns from a part of the unlabeled Bengali news corpus; these lexical patterns have been used as SVM features to improve system performance. The NER system has been tested with gold standard test sets of 35K and 60K tokens for Bengali and Hindi, respectively. Evaluation demonstrates recall, precision, and f-score values of 88.61%, 80.12%, and 84.15%, respectively, for Bengali and 80.23%, 74.34%, and 77.17%, respectively, for Hindi. The results show an improvement in f-score of 5.13% with the use of context patterns. A statistical analysis (ANOVA) was also performed to compare the performance of the proposed NER system with that of the existing HMM-based system for both languages.
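
The Python sketch below illustrates the general idea of SVM-based NER with contextual window features. The toy sentences, feature templates and class labels are illustrative assumptions, not the paper's corpus, feature set or tag set.

```python
# Minimal sketch: classify each word with an SVM using the word itself
# plus a +/-1 context window as features.
from sklearn.feature_extraction import DictVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Each sentence is a list of (word, NE-tag) pairs (toy data).
train_sents = [
    [("Asif", "PER"), ("lives", "O"), ("in", "O"), ("Kolkata", "LOC")],
    [("Google", "ORG"), ("opened", "O"), ("an", "O"), ("office", "O")],
]

def word_features(sent, i):
    """Features for the i-th word: the word itself and its immediate neighbours."""
    word = sent[i][0]
    return {
        "word": word.lower(),
        "is_title": word.istitle(),
        "prev": sent[i - 1][0].lower() if i > 0 else "<BOS>",
        "next": sent[i + 1][0].lower() if i < len(sent) - 1 else "<EOS>",
    }

X = [word_features(s, i) for s in train_sents for i in range(len(s))]
y = [tag for s in train_sents for _, tag in s]

model = make_pipeline(DictVectorizer(), LinearSVC())
model.fit(X, y)

test_sent = [("Sivaji", None), ("visited", None), ("Delhi", None)]
print(model.predict([word_features(test_sent, i) for i in range(len(test_sent))]))
```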

Keywords: Named Entity (NE), Named Entity Recognition (NER), Support Vector Machine (SVM), Bengali, Hindi.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 3404
121 Text Mining Technique for Data Mining Application

Authors: M. Govindarajan

Abstract:

Applying knowledge discovery techniques to unstructured text is termed knowledge discovery in text (KDT), text data mining, or text mining. The decision tree approach is most useful for classification problems; with this technique, a tree is constructed to model the classification process, and there are two basic steps: building the tree and applying the tree to the database. This paper describes a proposed C5.0 classifier that applies rulesets, cross-validation and boosting to the original C5.0 in order to reduce the error ratio. The feasibility and benefits of the proposed approach are demonstrated on a medical data set, the hypothyroid data set. It is shown that the performance of a classifier on the training cases from which it was constructed gives a poor estimate of its accuracy; by sampling, or by using a separate test file, the classifier is instead evaluated on cases that were not used to build it. If the cases in hypothyroid.data and hypothyroid.test were shuffled and divided into a new 2772-case training set and a 1000-case test set, C5.0 might construct a different classifier with a lower or higher error rate on the test cases. An important feature of See5 is its ability to generate classifiers called rulesets; the ruleset has an error rate of 0.5% on the test cases. The standard errors of the means provide an estimate of the variability of results. One way to get a more reliable estimate of predictive accuracy is f-fold cross-validation: the error rate of a classifier produced from all the cases is estimated as the ratio of the total number of errors on the hold-out cases to the total number of cases. The Boost option with x trials instructs See5 to construct up to x classifiers in this manner. Trials over numerous datasets, large and small, show that on average 10-classifier boosting reduces the error rate for test cases by about 25%.
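
The sketch below reproduces the evaluation workflow described above in Python, using scikit-learn's decision tree and AdaBoost as stand-ins for C5.0/See5, which is a commercial tool; the synthetic data, depth limit and fold count are assumptions for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Stand-in for the hypothyroid data: 3772 cases (2772 + 1000), binary outcome.
X, y = make_classification(n_samples=3772, n_features=20, random_state=0)

# 10-fold cross-validation gives a more reliable error estimate than
# re-testing on the training cases themselves.
tree = DecisionTreeClassifier(random_state=0)
cv_error = 1.0 - cross_val_score(tree, X, y, cv=10).mean()
print(f"10-fold CV error, single tree: {cv_error:.3f}")

# Boosting: construct up to 10 classifiers, each concentrating on the
# cases that the previous ones misclassified.
boosted = AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=3),
                             n_estimators=10, random_state=0)
boost_error = 1.0 - cross_val_score(boosted, X, y, cv=10).mean()
print(f"10-fold CV error, 10-classifier boosting: {boost_error:.3f}")
```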

Keywords: C5.0, Error Ratio, text mining, training data, test data.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2489
120 TNFRSF11B Gene Polymorphisms A163G and G1181C in Prediction of Osteoporosis Risk

Authors: Boroňová I., Bernasovská J., Kľoc J., Tomková Z., Petrejčíková E., Gabriková D., Mačeková S.

Abstract:

Osteoporosis is a complex disease characterized by low bone mineral density, which is determined by an interaction of genetics with metabolic and environmental factors. Current research in the genetics of osteoporosis is focused on the identification of responsible genes and polymorphisms. The TNFRSF11B gene plays a key role in bone remodeling. The aim of this study was to investigate the genotype and allele distribution of the A163G (rs3102735) polymorphism in the osteoprotegerin gene promoter and the G1181C (rs2073618) polymorphism in its first exon in a group of 180 unrelated postmenopausal women with diagnosed osteoporosis and 180 normal controls. Genomic DNA was isolated from peripheral blood leukocytes using standard methodology. Genotyping was performed using Custom TaqMan® SNP Genotyping Assays. Hardy-Weinberg equilibrium was tested for each SNP in both groups of participants using the chi-square (χ2) test. The distribution of the investigated genotypes in the group of patients with osteoporosis was as follows: AA (66.7%), AG (32.2%), GG (1.1%) for the A163G polymorphism; GG (19.4%), CG (44.4%), CC (36.1%) for the G1181C polymorphism. The distribution of genotypes in the normal controls was as follows: AA (71.1%), AG (26.1%), GG (2.8%) for the A163G polymorphism; GG (22.2%), CG (48.9%), CC (28.9%) for the G1181C polymorphism. For the A163G polymorphism, the variant G allele was more common among patients with osteoporosis: 17.2% versus 15.8% in normal controls. Similarly, for the G1181C polymorphism the C allele occurred more frequently in the patient group (58.3% versus 53.3%). Genotype and allele distributions showed no significant differences (A163G: χ2=0.270, p=0.605; χ2=0.250, p=0.616; G1181C: χ2=1.730, p=0.188; χ2=1.820, p=0.177). Our results represent an initial study; further studies with larger samples and association analyses will be carried out. Knowing the distribution of genotypes is important for assessing the impact of these polymorphisms on the various parameters associated with osteoporosis. Screening to identify "at-risk" women likely to develop osteoporosis, followed by early intervention, appears to be the most effective strategy to substantially reduce the risks of osteoporosis.
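
As a minimal sketch of the Hardy-Weinberg chi-square test mentioned above, the Python snippet below compares genotype counts against Hardy-Weinberg proportions. The counts are reconstructed from the reported A163G patient percentages (AA 66.7%, AG 32.2%, GG 1.1% of n = 180), and the plain one-degree-of-freedom goodness-of-fit test is an assumption about the exact procedure used.

```python
from scipy.stats import chi2

def hwe_chi_square(n_aa, n_ag, n_gg):
    """Chi-square goodness-of-fit test against Hardy-Weinberg proportions."""
    n = n_aa + n_ag + n_gg
    p = (2 * n_aa + n_ag) / (2 * n)        # frequency of the A allele
    q = 1.0 - p                            # frequency of the G allele
    expected = [n * p * p, 2 * n * p * q, n * q * q]
    observed = [n_aa, n_ag, n_gg]
    stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    p_value = chi2.sf(stat, df=1)          # 3 classes - 1 - 1 estimated allele frequency
    return stat, p_value

stat, p_value = hwe_chi_square(120, 58, 2)  # approximate counts for the patient group
print(f"chi-square = {stat:.3f}, p = {p_value:.3f}")
```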

Keywords: Osteoporosis, Real-time PCR method, SNP polymorphisms.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2244
119 The Implications of Technological Advancements on the Constitutional Principles of Contract Law

Authors: Laura Çami (Vorpsi), Xhon Skënderi

Abstract:

In today's rapidly evolving technological landscape, the traditional principles of contract law are facing significant challenges. The emergence of new technologies, such as electronic signatures, smart contracts, and online dispute resolution mechanisms, is transforming the way contracts are formed, interpreted, and enforced. This paper examines the implications of these technological advancements for the constitutional principles of contract law. One of the fundamental principles of contract law is freedom of contract, which ensures that parties have the autonomy to negotiate and enter into contracts as they see fit. However, the use of technology in the contracting process has the potential to disrupt this principle. For example, online platforms and marketplaces often offer standard-form contracts, which may not reflect the specific needs or interests of individual parties. This raises questions about the equality of bargaining power between parties and the extent to which parties are truly free to negotiate the terms of their contracts. Another important principle of contract law is the requirement of consideration, which requires that each party receives something of value in exchange for their promise. The use of digital assets, such as cryptocurrencies, has created new challenges in determining what constitutes valuable consideration in a contract. Due to the ambiguity in this area, disagreements about the legality and enforceability of such contracts may arise. Furthermore, the use of technology in dispute resolution mechanisms, such as online arbitration and mediation, may raise concerns about due process and access to justice. The use of algorithms and artificial intelligence to determine the outcome of disputes may also raise questions about the impartiality and fairness of the process. Finally, the effects of technological advancement on the constitutional foundations of contract law are numerous and complex. As technology continues to evolve, it will be important for policymakers and legal practitioners to consider the potential impacts on contract law and to ensure that the principles of fairness, equality, and access to justice are preserved in the contracting process.

Keywords: Technological advancements, constitutional principles, contract law, smart contracts, online dispute resolution, freedom of contract.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 251
118 Legal Theories Underpinning Access to Justice for Victims of Sexual Violence in Refugee Camps in Africa

Authors: O. E. Eberechi, G. P. Stevens

Abstract:

Legal theory has been referred to as the explanation of why things do or do not happen. It also describes situations and why they ensue. It provides a normative framework by which things are regulated and a foundation for the establishment of legal mechanisms/institutions that can bring about a desired change in a society. Furthermore, it offers recommendations for resolving practical problems, describes what the law is and what the law ought to be, and defines the legal landscape generally. Some legal theories provide a universal standard, e.g. human rights, while others are capable of organizing and streamlining collective use and, by extension, bringing order to society. Legal theory is used to explain how the world works and how it does not work. This paper will argue for the application of the principles of legal theory to the achievement of access to justice for female victims of sexual violence in refugee camps in Africa, through an analysis of the legal theories underpinning access to justice for these women. It is a known fact that female refugees in camps in Africa often experience some form of sexual violation. The perpetrators of these incidents may never be apprehended, prosecuted, convicted or sentenced. Where prosecution does occur, the perpetrators may be acquitted as a result of poor investigation, inept prosecution or a lack of evidence, or the case may be dismissed owing to tardiness on the part of the prosecutor, which accounts for the culture of impunity in refugee camps. In other words, victims do not have access to the justice that could ameliorate their plight. There is, thus, a need for a legal framework that will facilitate access to justice for these victims. This paper will start with an introduction, followed by a definition of legal theory, its functions and its application in law. Secondly, it will provide a brief explanation of the problems faced by female refugees who are victims of sexual violence in refugee camps in Africa. Thirdly, it will analyze the theories that help in understanding the precarious situation of female refugees, why they are violated, the need for access to justice for these victims, and the usefulness of the principles of legal theory in resolving access to justice for them.

Keywords: Access to justice, underpinning legal theory, refugee, sexual violence.

Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1814