Search results for: Hattie Hope Makumbe
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2

2 Incidence of Listeria monocytogenes in Ready-To-Eat Food Sold in Johannesburg, South Africa

Authors: Hattie Hope Makumbe, Bhekisisa Dlamini, Frederick Tabit

Abstract:

Listeria monocytogenes is one of the most important foodborne pathogens associated with ready-to-eat (RTE) food. This study investigated the incidence of Listeria monocytogenes in 80 RTE food samples sold in the formal (dairy and processed meat) and informal (vegetable salads, beef stew, and rice) markets of Johannesburg, South Africa. High Enterobacteriaceae, S. aureus, and E. coli counts were obtained, ranging from 1.9 to 7.5 log CFU/g. Listeria monocytogenes counts in the food samples ranged from 3.5 to 6.0 log CFU/g, except in cooked rice. The Listeria monocytogenes isolates were identified using biochemical tests and confirmed with the Biolog identification system and PCR analyses. The incidence of Listeria monocytogenes in the RTE food samples was 12.5%. Based on minimum inhibitory concentrations, all disinfectants tested were effective against the Listeria monocytogenes strains. In the antimicrobial susceptibility tests, resistance rates to the antibiotics tested ranged from 17% to 100%. Therefore, more effective preventive control strategies are needed to reduce the prevalence of Listeria monocytogenes in RTE food sold in Johannesburg.
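
As a rough consistency check (an illustrative calculation only, not part of the study), the reported 12.5% incidence across the 80 RTE samples corresponds to about 10 positive samples:

```python
# Illustrative arithmetic only: incidence (%) = positive samples / total samples * 100.
# The count of 10 positive samples is inferred from the abstract, not stated in it.
total_samples = 80
incidence_pct = 12.5
positive_samples = round(total_samples * incidence_pct / 100)  # 10
print(f"{positive_samples} of {total_samples} RTE samples positive ({incidence_pct}% incidence)")
```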

Keywords: Listeria monocytogenes, Listeria species, ready to eat food, sanitiser efficacy

Procedia PDF Downloads 125
1 High School Gain Analytics From National Assessment Program – Literacy and Numeracy and Australian Tertiary Admission Rank Linkage

Authors: Andrew Laming, John Hattie, Mark Wilson

Abstract:

Nine Queensland Independent high schools provided de-identified, student-matched ATAR and NAPLAN data for all 1217 ATAR graduates since 2020 who also sat NAPLAN at the school. Graduating cohorts from the nine schools contained a mean of 100 ATAR graduates with prior NAPLAN data from their school. Vocational students (mean = 27) and ATAR graduates without NAPLAN data (mean = 20) were excluded. Based on Index of Community Socio-Educational Advantage (ICSEA) predictions, all schools had larger than predicted proportions of their students graduating with ATARs. A further 173 students (14%) did not release their ATARs to their school, requiring these results to be inferred by schools. Gain was established by first converting each student’s strongest NAPLAN domain to a statewide percentile, then subtracting this percentile from the final ATAR. The resulting ‘percentile shift’ was corrected for plausible ATAR participation at each NAPLAN level. The strongest NAPLAN domain had the highest correlation with ATAR (R² = 0.58).

RESULTS: School mean NAPLAN scores fitted ICSEA closely (R² = 0.97). Schools achieved a mean cohort gain of two ATAR ranks, but only 66% of students gained. The proportion gaining ranged from 46% of students in the top NAPLAN decile to 75% of students outside the top decile. The 54% of top-decile students whose ATAR fell short of prediction lost a mean of 4.0 percentiles (6.2 percentiles before correction for regression to the mean). In smaller schools, 71% of students gained, compared with 63% in larger schools. NAPLAN variability in each of the 13 ICSEA 1100 cohorts was 17%, and both intra-school and inter-school variation in these values was extremely low (0.3% to 1.8%). Mean ATAR change between years within each school was just 1.1 ATAR ranks. This suggests that consecutive school cohorts and ICSEA-similar schools share very similar distributions and outcomes over time. Quantile analysis of the NAPLAN/ATAR relationship revealed heteroscedasticity, but splines offered little additional benefit over simple linear regression. The NAPLAN/ATAR R² was 0.33.

DISCUSSION: Standardised data such as NAPLAN and ATAR offer educators a simple, no-cost progression metric to analyse performance alongside their internal test results. Change is expressed in percentiles, or ATAR shift per student, which is intuitive to laypeople. The findings may also reduce ATAR/vocational stream mismatch, reveal the proportions of cohorts meeting or falling short of expectation, and show by how much. Finally, ‘crashed’ ATARs well below expectation are revealed, which schools can reasonably work to minimise. The percentile shift method is neither a value-added measure nor a growth percentile. In the absence of exit NAPLAN testing, the metric cannot discriminate academic gain from legitimate ATAR-maximising strategies. However, by controlling for ICSEA, variation in ATAR proportions, and student mobility, it uncovers progression-to-ATAR metrics that are not currently publicly available. However achieved, ATAR maximisation is a sought-after private good. So long as standardised nationwide data are available, this analysis offers useful analytics for educators and reasonable predictive power when counselling subsequent cohorts about their ATAR prospects.
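
A minimal sketch of the percentile-shift calculation described above, in Python. It assumes each student's strongest NAPLAN domain has already been converted to a statewide percentile and that the participation correction is supplied from an external reference; the function and variable names are illustrative, not the authors' implementation.

```python
# Sketch of the percentile-shift gain metric: final ATAR minus the statewide
# percentile of the student's strongest NAPLAN domain, adjusted for plausible
# ATAR participation at that NAPLAN level. The percentile conversion and the
# correction term are assumed inputs here, not derived.

def percentile_shift(best_naplan_percentile: float, atar: float,
                     participation_correction: float = 0.0) -> float:
    return (atar - best_naplan_percentile) - participation_correction

def cohort_summary(shifts: list[float]) -> tuple[float, float]:
    """Return the mean gain and the proportion of students who gained."""
    mean_gain = sum(shifts) / len(shifts)
    proportion_gained = sum(s > 0 for s in shifts) / len(shifts)
    return mean_gain, proportion_gained

# Example: a student at the 90th statewide NAPLAN percentile graduating with an
# ATAR of 94 shows a raw gain of +4 percentiles before correction.
print(percentile_shift(90.0, 94.0))
```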

Keywords: NAPLAN, ATAR, analytics, measurement, gain, performance, data, percentile, value-added, high school, numeracy, reading comprehension, variability, regression to the mean

Procedia PDF Downloads 34