Search results for: capability evaluating
102 Social Network Roles in Organizations: Influencers, Bridges, and Soloists
Authors: Sofia Dokuka, Liz Lockhart, Alex Furman
Abstract:
Organizational hierarchy, traditionally composed of individual contributors, middle management, and executives, is enhanced by the understanding of informal social roles. These roles, identified with organizational network analysis (ONA), might have an important effect on organizational functioning. In this paper, we identify three social roles – influencers, bridges, and soloists – and provide empirical analysis based on real-world organizational networks. Influencers are employees with broad networks whose contacts also have rich networks. Influence is calculated using PageRank, initially proposed for measuring website importance but now applied in various network settings, including social networks. Influencers, having high PageRank, become key players in shaping opinions and behaviors within an organization. Bridges serve as links between loosely connected groups within the organization. Bridges are identified using betweenness and Burt's constraint. Betweenness quantifies a node's control over information flows by evaluating how often it lies on the shortest paths within the network. Burt's constraint measures the extent of interconnection among an individual's contacts. A high constraint value suggests fewer structural holes and less control over information flows, whereas a low value suggests the contrary. Soloists are individuals with fewer than 5 stable social contacts, potentially facing challenges due to reduced social interaction and a hypothetical lack of feedback and communication. We considered social roles in the analysis of real-world organizations (N=1,060). Based on data from digital traces (Slack, corporate email and calendar), we reconstructed an organizational communication network and identified influencers, bridges and soloists. We also collected employee engagement data through an online survey. Among the top 5% of influencers, 10% are members of the Executive Team, and 56% of the Executive Team members are part of the top influencers group. The same proportion of top influencers (10%) are individual contributors, accounting for just 0.6% of all individual contributors in the company. The majority of influencers (80%) are at the middle management level; of all middle managers, 19% hold the role of influencers. Although individual contributors represent a small proportion of influencers, information about these individuals who hold influential roles can be crucial for management in identifying high-potential talent. Among the bridges, 4% are members of the Executive Team, 16% are individual contributors, and 80% are middle management. Predominantly, middle management acts as a bridge. Bridge positions held by some members of the executive team might indicate potential micromanagement on the leader's part. Recognizing the individuals serving as bridges in an organization uncovers potential communication problems. The majority of soloists are individual contributors (96%), and 4% of soloists are from middle management; these managers might face communication difficulties. We found an association between being an influencer and attitude toward the company's direction: influencers show a statistically significant 20% higher perception that the company is headed in the right direction compared to non-influencers (p < 0.05, Mann-Whitney test).
Taken together, we demonstrate that examining social roles in a company can reveal both positive and negative aspects of organizational functioning that should inform data-driven decision-making.
Keywords: organizational network analysis, social roles, influencer, bridge, soloist
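As an illustration of how these roles could be flagged in a reconstructed communication graph, the sketch below uses networkx to compute PageRank, betweenness, and Burt's constraint and applies the abstract's thresholds (top 5% PageRank for influencers, fewer than 5 contacts for soloists); the example graph and the betweenness/constraint cut-offs for bridges are assumptions for demonstration, not the authors' pipeline.

```python
# Minimal sketch of role detection on a communication network (not the authors' code).
import networkx as nx
import numpy as np

G = nx.les_miserables_graph()  # stand-in for the Slack/email/calendar network

pagerank = nx.pagerank(G)                    # influence
betweenness = nx.betweenness_centrality(G)   # brokerage over shortest paths
constraint = nx.constraint(G)                # Burt's constraint (low value => bridge-like)

pr_cut = np.percentile(list(pagerank.values()), 95)      # top 5% by PageRank
bt_cut = np.percentile(list(betweenness.values()), 75)   # assumed cut-off
cn_cut = np.percentile(list(constraint.values()), 25)    # assumed cut-off

influencers = {n for n, v in pagerank.items() if v >= pr_cut}
bridges = {n for n in G if betweenness[n] >= bt_cut and constraint[n] <= cn_cut}
soloists = {n for n in G if G.degree(n) < 5}             # fewer than 5 stable contacts

print(len(influencers), len(bridges), len(soloists))
```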
Procedia PDF Downloads 106
101 User-Controlled Color-Changing Textiles: From Prototype to Mass Production
Authors: Joshua Kaufman, Felix Tan, Morgan Monroe, Ayman Abouraddy
Abstract:
Textiles and clothing have been a staple of human existence for millennia, yet the basic structure and functionality of textile fibers and yarns have remained unchanged. While color and appearance are essential characteristics of a textile, an advancement in the fabrication of yarns that allows for user-controlled dynamic changes to the color or appearance of a garment has been lacking. Touch-activated and photosensitive pigments have been used in textiles, but these technologies are passive and cannot be controlled by the user. The technology described here allows the owner to control both when and in what pattern the fabric color change takes place. In addition, the manufacturing process is compatible with mass-producing the user-controlled, color-changing yarns. The yarn fabrication utilizes a fiber spinning system that can produce either monofilament or multifilament yarns. For products requiring a more robust fabric (backpacks, purses, upholstery, etc.), larger-diameter monofilament yarns with a coarser weave are suitable. Such yarns are produced using a thread-coater attachment to encapsulate a 38-40 AWG metal wire inside a polymer sheath impregnated with thermochromic pigment. Conversely, products such as shirts and pants requiring yarns that are more flexible and soft against the skin comprise multifilament yarns of much smaller-diameter individual fibers. Embedding a metal wire in a multifilament fiber spinning process has not been realized to date. This research has required collaboration with Hills, Inc., to design a liquid metal-injection system combined with fiber spinning. The new system injects molten tin into each of 19 filaments being spun simultaneously into a single yarn. The resulting yarn contains 19 filaments, each with a tin core surrounded by a polymer sheath impregnated with thermochromic pigment. The color change we demonstrate is distinct from garments containing LEDs that emit light in various colors: the pigment itself changes its optical absorption spectrum to appear a different color. The thermochromic color change is induced by a temperature change in the inner metal wire within each filament when current is applied from a small battery pack. The temperature necessary to induce the color change is near body temperature and not noticeable by touch. The prototypes already developed either use a simple push button to activate the battery pack or are wirelessly activated via a smartphone app over Wi-Fi. The app allows the user to choose from different activation patterns of stripes that appear in the fabric continuously. The power requirements are mitigated by a large hysteresis between the activation temperature of the pigment and the temperature at which there is full color return. This was made possible by a collaboration with Chameleon International to develop a new, customized pigment. This technology enables a never-before-seen capability: user-controlled, dynamic color and pattern change in large-area woven and sewn textiles and fabrics, with wide-ranging applications from clothing and accessories to furniture and fixed-installation housing and business décor. The ability to activate through Wi-Fi opens up possibilities for the textiles to be part of the 'Internet of Things.' Furthermore, this technology is scalable to mass-production levels for wide-scale market adoption.
Keywords: activation, appearance, color, manufacturing
Procedia PDF Downloads 279
100 Evaluating the Accuracy of Biologically Relevant Variables Generated by ClimateAP
Authors: Jing Jiang, Wenhuan XU, Lei Zhang, Shiyi Zhang, Tongli Wang
Abstract:
Climate data quality significantly affects the reliability of ecological modeling. In the Asia Pacific (AP) region, low-quality climate data hinders ecological modeling. ClimateAP, a software package developed in 2017, generates high-quality climate data for the AP region, benefiting researchers in forestry and agriculture. However, its adoption remains limited. This study aims to confirm the validity of biologically relevant variable data generated by ClimateAP during the normal climate period through comparison with the currently available gridded data. Climate data from 2,366 weather stations were used to evaluate the prediction accuracy of ClimateAP in comparison with the commonly used gridded data from WorldClim1.4. Univariate regressions were applied to 48 monthly biologically relevant variables, and the relationship between the observational data and the predictions made by ClimateAP and WorldClim was evaluated using Adjusted R-Squared and Root Mean Squared Error (RMSE). Locations were categorized into mountainous and flat landforms, considering elevation, slope, ruggedness, and Topographic Position Index. Univariate regressions were then applied to all biologically relevant variables for each landform category. Random Forest (RF) models were implemented for the climatic niche modeling of Cunninghamia lanceolata. A comparative analysis of the prediction accuracies of RF models constructed with distinct climate data sources was conducted to evaluate their relative effectiveness. Biologically relevant variables were obtained from three unpublished Chinese meteorological datasets. ClimateAPv3.0 and WorldClim predictions were obtained from weather station coordinates and WorldClim1.4 rasters, respectively, for the normal climate period of 1961-1990. Occurrence data for Cunninghamia lanceolata came from integrated biodiversity databases with 3,745 unique points. ClimateAP explains a minimum of 94.74%, 97.77%, 96.89%, and 94.40% of monthly maximum, minimum, average temperature, and precipitation variances, respectively. It outperforms WorldClim in 37 biologically relevant variables with lower RMSE values. ClimateAP achieves higher R-squared values for the 12 monthly minimum temperature variables and consistently higher Adjusted R-squared values across all landforms for precipitation. ClimateAP's temperature data yields lower Adjusted R-squared values than gridded data in high-elevation, rugged, and mountainous areas but achieves higher values in mid-slope drainages, plains, open slopes, and upper slopes. Using ClimateAP improves the prediction accuracy of tree occurrence from 77.90% to 82.77%. The biologically relevant climate data produced by ClimateAP is validated based on evaluations using observations from weather stations. The use of ClimateAP leads to an improvement in data quality, especially in non-mountainous regions. The results also suggest that using biologically relevant variables generated by ClimateAP can slightly enhance climatic niche modeling for tree species, offering a better understanding of tree species adaptation and resilience compared to using gridded data.
Keywords: climate data validation, data quality, Asia Pacific climate, climatic niche modeling, random forest models, tree species
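As an illustration of the evaluation design, the sketch below regresses observed station values on each source's predictions and reports adjusted R-squared and RMSE; synthetic data stand in for the real station, ClimateAP, and WorldClim values, so the numbers are purely illustrative.

```python
# Minimal sketch of the station-level comparison (synthetic data, not the study's).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score

def adjusted_r2(r2: float, n: int, p: int = 1) -> float:
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

def evaluate(observed: np.ndarray, predicted: np.ndarray):
    model = LinearRegression().fit(predicted.reshape(-1, 1), observed)
    fitted = model.predict(predicted.reshape(-1, 1))
    return (adjusted_r2(r2_score(observed, fitted), len(observed)),
            np.sqrt(mean_squared_error(observed, fitted)))

rng = np.random.default_rng(0)
observed = rng.normal(12.0, 5.0, size=2366)        # e.g. January Tmin at 2,366 stations
climateap = observed + rng.normal(0, 0.8, 2366)    # lower-error source (assumed)
worldclim = observed + rng.normal(0, 1.6, 2366)    # coarser gridded source (assumed)

for name, pred in [("ClimateAP", climateap), ("WorldClim", worldclim)]:
    adj, rmse = evaluate(observed, pred)
    print(f"{name}: adjusted R2 = {adj:.3f}, RMSE = {rmse:.2f}")
```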
Procedia PDF Downloads 68
99 Comparison of On-Site Stormwater Detention Policies in Australian and Brazilian Cities
Authors: Pedro P. Drumond, James E. Ball, Priscilla M. Moura, Márcia M. L. P. Coelho
Abstract:
In recent decades, On-site Stormwater Detention (OSD) systems have been implemented in many cities around the world. In Brazil, urban drainage source control policies were created in the 1990s and were mainly based on OSD. The concept of this technique is to promote the detention of additional stormwater runoff caused by impervious areas, in order to maintain pre-urbanization peak flow levels. In Australia, OSD was first adopted in the early 1980s by Ku-ring-gai Council in Sydney's northern suburbs and by Wollongong City Council, and many papers on the topic were published at that time. However, source control techniques related to stormwater quality have since come to the forefront and OSD has been relegated to the background. In order to evaluate the effectiveness of the current regulations regarding OSD, the existing policies were compared in Australian cities, where there is long experience with this technique, and in Brazilian cities, where OSD adoption has been increasing. The cities selected for analysis were Wollongong and Belo Horizonte, the first municipalities to adopt OSD in their respective countries, and Sydney and Porto Alegre, cities where these policies are local references. The Australian and Brazilian cities are located in the Southern Hemisphere, and similar rainfall intensities can be observed, especially in storm bursts greater than 15 minutes. Regarding technical criteria, the Brazilian cities have a site-based approach, analyzing only on-site system drainage. This approach is criticized for not evaluating impacts on urban drainage systems and in rare cases may cause an increase in peak flows downstream. The city of Wollongong and most of the Sydney councils adopted a catchment-based approach, requiring the use of Permissible Site Discharge (PSD) and Site Storage Requirement (SSR) values based on analysis of entire catchments via hydrograph-producing computer models. Based on the premise that OSD should be designed to dampen storms of 100-year Average Recurrence Interval (ARI), the values of PSD and SSR in these four municipalities were compared. In general, the Brazilian cities presented low values of PSD and high values of SSR. This can be explained by the site-based approach and the low runoff coefficient value adopted for pre-development conditions. The results clearly show the differences between the approaches and methodologies adopted in OSD designs among Brazilian and Australian municipalities, especially with regard to PSD values, which sit at opposite ends of the scale. However, the lack of research regarding the real performance of constructed OSD does not allow for determining which is best. It is necessary to investigate OSD performance in real situations, assessing the damping provided throughout its useful life, maintenance issues, debris blockage problems and the parameters related to rainfall-runoff methods. Acknowledgments: The authors wish to thank CNPq - Conselho Nacional de Desenvolvimento Científico e Tecnológico (Chamada Universal – MCTI/CNPq Nº 14/2014), FAPEMIG - Fundação de Amparo à Pesquisa do Estado de Minas Gerais, and CAPES - Coordenação de Aperfeiçoamento de Pessoal de Nível Superior for their financial support.
Keywords: on-site stormwater detention, source control, stormwater, urban drainage
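For readers unfamiliar with the two quantities being compared, the sketch below illustrates, under a deliberately simplified site-based view (rational-method peaks and a crude storage estimate, with hypothetical parameter values), how a low pre-development runoff coefficient drives the permissible site discharge (PSD) down and the site storage requirement (SSR) up; it is not the catchment-wide hydrograph modelling used by the Australian councils.

```python
# Minimal site-based sketch of PSD and SSR (hypothetical values, rational method).
def rational_peak(c, i_mm_per_h, area_ha):
    """Peak flow in L/s from the rational method: Q = C * i * A / 0.36."""
    return c * i_mm_per_h * area_ha / 0.36

i = 120.0       # design rainfall intensity (mm/h), e.g. a 100-year ARI burst
area = 0.1      # lot area (ha)
duration = 0.5  # assumed critical storm duration (h)

q_pre = rational_peak(0.35, i, area)   # pre-development peak, taken as PSD (L/s)
q_post = rational_peak(0.90, i, area)  # post-development peak (L/s)

psd = q_pre
ssr_m3 = max(q_post - psd, 0) * duration * 3600 / 1000  # detained volume (m3)
print(f"PSD = {psd:.1f} L/s, SSR = {ssr_m3:.1f} m3")
```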
Procedia PDF Downloads 180
98 A Review of Data Visualization Best Practices: Lessons for Open Government Data Portals
Authors: Bahareh Ansari
Abstract:
Background: The Open Government Data (OGD) movement in the last decade has encouraged many government organizations around the world to make their data publicly available to advance democratic processes. But current open data platforms have not yet reached their full potential in supporting all interested parties. To make the data useful and understandable for everyone, scholars have suggested that opening the data should be supplemented by visualization. However, different visualizations of the same information can dramatically change an individual's cognitive and emotional experience in working with the data. This study reviews the data visualization literature to create a list of the methods empirically tested to enhance users' performance and experience in working with a visualization tool. This list can be used to evaluate OGD visualization practices and inform future open data initiatives. Methods: Previous reviews of the visualization literature categorized visualization outcomes into four categories: recall/memorability, insight/comprehension, engagement, and enjoyment. To identify the papers, a search for these outcomes was conducted in the abstracts of publications in top-tier visualization venues, including IEEE Transactions on Visualization and Computer Graphics, Computer Graphics, and the proceedings of the CHI Conference on Human Factors in Computing Systems. The search results were complemented with a search in the references of the identified articles and a search for the keywords 'open data visualization' and 'visualization evaluation' in the IEEE Xplore and ACM digital libraries. Articles are included if they provide empirical evidence through controlled user experiments, or provide a review of these empirical studies. The qualitative synthesis of the studies focuses on identifying and classifying the methods, and the conditions under which they are found to positively affect the visualization outcomes. Findings: The keyword search yields 760 studies, of which 30 are included after the title/abstract review. The classification of the included articles shows five distinct methods: interactive design, aesthetic (artistic) style, storytelling, decorative elements that do not provide extra information (including text, images, and embellishments on the graphs), and animation. Studies on decorative elements consistently find positive effects of these elements on user engagement and recall but are less consistent in their examination of user performance. This inconsistency could be attributable to the particular data type or specific design method used in each study. The interactive design studies are consistent in their findings of a positive effect on the outcomes. Storytelling studies show some inconsistencies regarding the design effect on user engagement, enjoyment, recall, and performance, which could be indicative of the specific conditions required for the use of this method. The last two methods, aesthetics and animation, appear less frequently in the included articles and provide consistent positive results on some of the outcomes. Implications for e-government: The review of visualization best-practice methods shows that each of these methods is beneficial under specific conditions.
By applying these methods under the conditions in which they have been shown to help, OGD practices can encourage a wide range of individuals to engage with government data and, ultimately, with government policy-making procedures.
Keywords: best practices, data visualization, literature review, open government data
Procedia PDF Downloads 107
97 Navigating the Future: Evaluating the Market Potential and Drivers for High-Definition Mapping in the Autonomous Vehicle Era
Authors: Loha Hashimy, Isabella Castillo
Abstract:
In today's rapidly evolving technological landscape, the importance of precise navigation and mapping systems cannot be overstated. As various sectors undergo transformative changes, the market potential for Advanced Mapping and Management Systems (AMMS) emerges as a critical focus area. The Galileo/GNSS-Based Autonomous Mobile Mapping System (GAMMS) project, specifically targeted toward high-definition mapping (HDM), endeavours to provide insights into this market within the broader context of the geomatics and navigation fields. With the growing integration of Autonomous Vehicles (AVs) into our transportation systems, the relevance and demand for sophisticated mapping solutions like HDM have become increasingly pertinent. The research employed a meticulous, lean, stepwise, and interconnected methodology to ensure a comprehensive assessment. Beginning with the identification of pivotal project results, the study progressed into a systematic market screening. This was complemented by an exhaustive desk research phase that delved into existing literature, data, and trends. To ensure the holistic validity of the findings, extensive consultations were conducted: academia and industry experts provided invaluable insights through interviews, questionnaires, and surveys. This multi-faceted approach facilitated a layered analysis, juxtaposing secondary data with primary inputs and ensuring that the conclusions were both accurate and actionable. Our investigation unearthed a plethora of drivers steering the HD maps landscape. These ranged from technological leaps, nuanced market demands, and influential economic factors to overarching socio-political shifts. The meteoric rise of AVs and the shift towards app-based transportation solutions, such as Uber, stood out as significant market pull factors. A nuanced PESTEL analysis further enriched our understanding, shedding light on the political, economic, social, technological, environmental, and legal facets influencing the HD maps market trajectory. Simultaneously, potential roadblocks were identified, notably barriers related to high initial costs, concerns around data quality, and the challenges posed by a fragmented and evolving regulatory landscape. The GAMMS project serves as a beacon, illuminating the vast opportunities that lie ahead for the HD mapping sector. It underscores the indispensable role of HDM in enhancing navigation, ensuring safety, and providing pinpoint, accurate location services. As our world becomes more interconnected and reliant on technology, HD maps emerge as a linchpin, bridging gaps and enabling seamless experiences. The research findings accentuate the imperative for stakeholders across industries to recognize and harness the potential of HD mapping, especially as we stand on the cusp of a transportation revolution heralded by Autonomous Vehicles and advanced geomatic solutions.
Keywords: high-definition mapping (HDM), autonomous vehicles, PESTEL analysis, market drivers
Procedia PDF Downloads 85
96 Global News Coverage of the Pandemic: Towards an Ethical Framework for Media Professionalism
Authors: Anantha S. Babbili
Abstract:
This paper analyzes the media practices dominant in global journalism within the framework of the world press theories of Libertarian, Authoritarian, Communist, and Social Responsibility to evaluate their efficacy in the coverage of the coronavirus, also known as COVID-19. The global media flows, determinants of news coverage, international awareness, and the Western view of the world will be critically analyzed within the context of the prevalent news values that underpin free press and media coverage of the world. While evaluating the global discourse paramount to a sustained and dispassionate understanding of world events, this paper proposes an ethical framework that brings clarity, devoid of sensationalism, partisanship, and right-wing and left-wing interpretations, to a breaking and dangerous development of a pandemic. As the world struggles to contain the coronavirus pandemic, with deaths climbing close to 6,000 from late January to mid-March 2020, the populations of developed as well as developing nations are beset with news media renditions of the crisis that are contradictory, confusing, and evoking anxiety, fear, and hysteria. How are we to understand differing news standards and news values? What lessons do we as journalism and mass media educators, researchers, and academics learn in order to construct a better news model and structure of media practice that addresses science, health, and media literacy among media practitioners, journalists, and news consumers? As traditional media struggle to cover the pandemic for their audiences, social media, from which an increasing number of consumers get their news, have exerted their influence in both positive and negative ways. Even as the world struggles to grasp the full significance of the pandemic, the World Health Organization (WHO) has been feverishly battling an additional challenge related to the pandemic in what it termed an 'infodemic': 'an overabundance of information, some accurate and some not, that makes it hard for people to find trustworthy sources and reliable guidance when they need it.' There is, indeed, a need for journalism and news coverage in times of pandemics that reflect social responsibility and the ethos of public service journalism. Social media and high-tech information corporations, collectively termed GAMAF (Google, Apple, Microsoft, Amazon, and Facebook), can team up with reliable traditional media (newspapers, magazines, book publishers, radio and television corporations) to ease public emotions and be helpful in times of a pandemic outbreak. GAMAF can, conceivably, weed out sensational and non-credible sources of coronavirus information and exotic cures offered for sale as a quick fix, and demonetize videos that exploit people's vulnerabilities at their lowest ebb. Credible news of utility delivered in a sustained, calm, and reliable manner serves people in a meaningful and helpful way. The world's consumers of news and information, indeed, deserve a healthy and trustworthy news media, at least in the time of the COVID-19 pandemic. Towards this end, the paper proposes a practical model for news media and journalistic coverage during times of a pandemic.
Keywords: COVID-19, international news flow, social media, social responsibility
Procedia PDF Downloads 113
95 Surveillance of Artemisinin Resistance Markers and Their Impact on Treatment Outcomes in Malaria Patients in an Endemic Area of South-Western Nigeria
Authors: Abiodun Amusan, Olugbenga Akinola, Kazeem Akano, María Hernández-Castañeda, Jenna Dick, Akintunde Sowunmi, Geoffrey Hart, Grace Gbotosho
Abstract:
Introduction: Artemisinin-based Combination Therapies (ACTs) are the cornerstone malaria treatment option in most malaria-endemic countries. Unfortunately, the malaria control effort is constantly being threatened by resistance of Plasmodium falciparum to ACTs. The recent evidence of artemisinin resistance in East Africa and the possibility of its spread to other African regions portend an imminent health catastrophe. This study aimed at evaluating the occurrence, prevalence, and influence of artemisinin-resistance markers on treatment outcomes in Ibadan before and after the adoption of ACTs in Nigeria in 2005. Method: The study involved day-zero dried blood spots (DBS) obtained from malaria patients during retrospective (2000-2005) and prospective (2021) studies. A cohort in the prospective study received oral dihydroartemisinin-piperaquine and underwent a 42-day follow-up to observe treatment outcomes. Genomic DNA was extracted from the DBS samples using a QIAamp blood extraction kit. Fragments of the P. falciparum kelch13 (Pfkelch13), P. falciparum coronin (Pfcoronin), P. falciparum multidrug resistance 2 (PfMDR2), and P. falciparum chloroquine resistance transporter (PfCRT) genes were amplified and sequenced on a Sanger sequencing platform to identify artemisinin resistance-associated mutations. Mutations were identified by aligning sequenced data with reference sequences obtained from the National Center for Biotechnology Information. Data were analyzed using descriptive statistics and Student's t-tests. Results: Mean parasite clearance time (PCT) and fever clearance time (FCT) were 2.1 ± 0.6 days (95% CI: 1.97-2.24) and 1.3 ± 0.7 days (95% CI: 1.1-1.6), respectively. Four mutations, K189T [34/53 (64.2%)], R255K [2/53 (3.8%)], K189N [1/53 (1.9%)] and N217H [1/53 (1.9%)], were identified within the N-terminal (coiled-coil-containing) domain of Pfkelch13. None of the artemisinin resistance-associated mutations usually found within the β-propeller domain of the Pfkelch13 gene were detected in the analyzed samples. However, the K189T and R255K mutations showed a significant correlation with longer parasite clearance time in the patients (P<0.002). The observed Pfkelch13 gene changes did not influence the baseline mean parasitemia (P = 0.44). P76S [17/100 (17%)] and V62M [1/100 (1%)] changes were identified in the Pfcoronin gene fragment, without any influence on the parasitological parameters. No change was observed in the PfMDR2 gene, and no artemisinin resistance-associated mutation was found in the PfCRT gene. Furthermore, one sample each in the retrospective study contained the Pfkelch13 K189T and Pfcoronin P76S mutations. Conclusion: The study revealed an absence of genetic evidence of artemisinin resistance in the study population at the time of the study. The high frequency of the K189T Pfkelch13 mutation and its correlation with increased parasite clearance time may depict geographical variation of resistance mediators and imminent artemisinin resistance, respectively. The study also revealed an inherent potential of parasites to harbour drug-resistant genotypes before the introduction of ACTs in Nigeria.
Keywords: artemisinin resistance, Plasmodium falciparum, Pfkelch13 mutations, Pfcoronin
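To illustrate the kind of comparison behind the reported association between Pfkelch13 changes and clearance time, the sketch below runs a Welch t-test and a Mann-Whitney test on hypothetical parasite clearance times for K189T carriers versus wild-type patients; the data are invented for demonstration and do not reproduce the study's results.

```python
# Minimal sketch of a clearance-time comparison (hypothetical data, not study data).
import numpy as np
from scipy import stats

pct_k189t = np.array([2.5, 3.0, 2.0, 3.0, 2.5, 3.5])     # days, K189T carriers (assumed)
pct_wildtype = np.array([1.5, 2.0, 2.0, 1.5, 2.5, 2.0])  # days, wild-type (assumed)

t_stat, p_t = stats.ttest_ind(pct_k189t, pct_wildtype, equal_var=False)
u_stat, p_u = stats.mannwhitneyu(pct_k189t, pct_wildtype, alternative="two-sided")
print(f"Welch t-test p = {p_t:.4f}; Mann-Whitney p = {p_u:.4f}")
```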
Procedia PDF Downloads 51
94 Seismic History and Liquefaction Resistance: A Comparative Study of Sites in California
Authors: Tarek Abdoun, Waleed Elsekelly
Abstract:
Introduction: Liquefaction of soils during earthquakes can have significant consequences for the stability of structures and infrastructure. This study focuses on comparing two liquefaction case histories in California, namely the response of the Wildlife site in the Imperial Valley to the 2010 El Mayor-Cucapah earthquake (Mw = 7.2, amax = 0.15g) and the response of the Treasure Island Fire Station (F.S.) site in the San Francisco Bay area to the 1989 Loma Prieta earthquake (Mw = 6.9, amax = 0.16g). Both case histories involve liquefiable layers of silty sand with non-plastic fines, similar shear wave velocities, low CPT cone penetration resistances, and groundwater tables at similar depths. Liquefaction charts based on field shear wave velocity predict liquefaction at both sites. However, a significant difference arises in their pore pressure responses during the earthquakes: the Wildlife site did not experience liquefaction, as evidenced by piezometer data, while the Treasure Island F.S. site did liquefy during the shaking. Objective: The primary objective of this study is to investigate and understand the reason for the contrasting pore pressure responses observed at the Wildlife site and the Treasure Island F.S. site despite their similar geological characteristics and predicted liquefaction potential. By conducting a detailed analysis of the similarities and differences between the two case histories, the objective is to identify the factors that contributed to the higher liquefaction resistance exhibited by the Wildlife site. Methodology: To achieve this objective, the geological and seismic data available for both sites were gathered and analyzed. Their soil profiles, seismic characteristics, and liquefaction potential, as predicted by shear wave velocity-based liquefaction charts, were then compared. Furthermore, the seismic histories of both regions were examined. The number of previous earthquakes capable of generating significant excess pore pressures in each critical layer was assessed. This analysis involved estimating the total seismic activity that the Wildlife and Treasure Island F.S. critical layers experienced over time. In addition to historical data, centrifuge and large-scale experiments were conducted to explore the impact of prior seismic activity on liquefaction resistance, and these findings served as supporting evidence for the investigation. Conclusions: The higher liquefaction resistance observed at the Wildlife site and other sites in the Imperial Valley can be attributed to preshaking by previous earthquakes. The Wildlife critical layer was subjected to a substantially greater number of seismic events capable of generating significant excess pore pressures over time compared to the Treasure Island F.S. layer. This crucial disparity arises from the difference in seismic activity between the two regions over the past century. In conclusion, this research sheds light on the complex interplay between geological characteristics, seismic history, and liquefaction behavior. It emphasizes the significant impact of past seismic activity on liquefaction resistance and can provide valuable insights for evaluating the stability of sandy sites in other seismic regions.
Keywords: liquefaction, case histories, centrifuge, preshaking
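As background on how shear-wave-velocity charts can flag a layer as liquefiable, the sketch below pairs the simplified Seed-Idriss cyclic stress ratio with an Andrus-Stokoe-style CRR curve; the layer properties are hypothetical, and the calculation deliberately ignores the preshaking effect that the study identifies as decisive.

```python
# Minimal Vs-based liquefaction screening sketch (hypothetical layer, not the study's analysis).
def csr(a_max_g, sigma_v, sigma_v_eff, rd=0.95):
    """Cyclic stress ratio, simplified method: 0.65 * amax/g * (sv/sv') * rd."""
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

def crr_vs1(vs1, vs1_limit=200.0):
    """CRR (Mw 7.5) from overburden-corrected Vs1, Andrus & Stokoe-type curve."""
    return 0.022 * (vs1 / 100.0) ** 2 + 2.8 * (1.0 / (vs1_limit - vs1) - 1.0 / vs1_limit)

# Hypothetical critical layer: silty sand at shallow depth, water table near the surface.
demand = csr(a_max_g=0.16, sigma_v=72.0, sigma_v_eff=45.0)  # total/effective stress in kPa
capacity = crr_vs1(vs1=160.0)                               # Vs1 in m/s
print(f"CSR = {demand:.2f}, CRR = {capacity:.2f}, FS = {capacity / demand:.2f}")
```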
Procedia PDF Downloads 75
93 Microplastics in Urban Environment – Coimbra City Case Study
Authors: Inês Amorim Leitão, Loes van Shaick, António Dinis Ferreira, Violette Geissen
Abstract:
Plastic pollution is a growing concern worldwide: plastics are commercialized in large quantities, and they take a long time to degrade. Once in the environment, plastic is fragmented into microplastics (<5 mm), which have been found in all environmental compartments at different locations. Microplastics contribute to environmental pollution in water, air and soil and are linked to human health problems. The progressive increase of the population living in cities has aggravated the pollution problem worldwide, especially in urban environments. Urban areas represent a strong source of pollution, through roads, industrial production, wastewater, landfills, etc. Pollutants such as microplastics are expected to be transported diffusely from these sources through different pathways such as wind and rain. It is therefore very complex to quantify, control and treat these pollutants, which the European Commission has designated as current problematic issues. Green areas are pointed out by experts as natural filters for contaminants in cities, through their capacity for retention by vegetation. These spaces thus have the capacity to control the load of pollutants transported. This study investigates the spatial distribution of microplastics in urban soils of different land uses, their transport through atmospheric deposition, wind erosion, runoff and streams, as well as their deposition on vegetation such as grass and tree leaves in the urban environment. Coimbra, a medium-large city located in central Portugal, is the case study. All the soil, sediment, water and vegetation samples were collected in Coimbra and were later analyzed in the Wageningen University & Research laboratory. Microplastics were extracted through density separation using a sodium phosphate solution (~1.4 g cm−3) and filtration, visualized under a stereo microscope, and identified using the µ-FTIR method. Microplastic particles were found in all the different samples. In terms of soils, higher concentrations of microplastics were found in green parks, followed by landfills and industrial sites, and the lowest concentrations in forest and pasture land uses. Atmospheric deposition and streams after rainfall events seem to represent the strongest pathways for microplastics. Tree leaves can retain microplastics on their surfaces. Small leaves, such as needle leaves, seem to present higher amounts of microplastics per leaf area than bigger leaves. Rainfall episodes seem to reduce the concentration of microplastics on leaf surfaces, which suggests that microplastics are washed down to lower levels of the tree or to the soil. Once in the soil, different types of microplastics can be transported to the atmosphere through wind erosion. Grass seems to present high concentrations of microplastics, and enlarging the grass cover leads to a reduction in the amount of microplastics in the soil, but also in the microplastics moved from the ground to the atmosphere by wind erosion. This study shows that vegetation can help to control the transport and dispersion of microplastics. In order to control the entry and concentration of microplastics in the environment, especially in cities, it is essential to define and evaluate nature-based land-use scenarios, considering the role of green urban areas in filtering small particles.
Keywords: microplastics, cities, sources, pathways, vegetation
Procedia PDF Downloads 60
92 Concept Mapping to Reach Consensus on an Antibiotic Smart Use Strategy Model to Promote and Support Appropriate Antibiotic Prescribing in a Hospital, Thailand
Authors: Phenphak Horadee, Rodchares Hanrinth, Saithip Suttiruksa
Abstract:
Inappropriate use of antibiotics has occurred in several hospitals in Thailand. Drug use evaluation (DUE) is one strategy to overcome this difficulty. However, most community hospitals still carry out incomplete evaluations, resulting in overuse of antibiotics at high cost. Consequently, drug-resistant bacteria have been rising due to inappropriate antibiotic use. The aim of this study was to involve stakeholders in conceptualizing, developing, and prioritizing a feasible intervention strategy to promote and support appropriate antibiotic prescribing in a community hospital in Thailand. The study covered four antibiotics: meropenem, piperacillin/tazobactam, amoxicillin/clavulanic acid, and vancomycin. The study was conducted over the 1-year period between March 1, 2018, and March 31, 2019, in a community hospital in the northeastern part of Thailand. Concept mapping was used with a purposive sample including doctors (one of whom was an administrator), pharmacists, and nurses involved in the drug use evaluation of antibiotics. In-depth interviews with each participant and a survey were conducted to identify the problems behind inappropriate antibiotic use in the DUE system. Seventy-seven percent of DUE records reported appropriate antibiotic prescribing, which still did not reach the goal of 80 percent appropriateness. Meropenem led the other antibiotics in inappropriate prescribing. The causes of the unsuccessful DUE program were classified into three themes: personnel, lack of public relations and communication, and unsupportive policy and impractical regulations. During the first meeting, stakeholders (n = 21) generated candidate interventions. During the second meeting, participants, who were almost the same group of people as in the first meeting (n = 21), were asked to independently rate the feasibility and importance of each idea and to categorize the ideas into relevant clusters to facilitate multidimensional scaling and hierarchical cluster analysis. The outputs of the analysis included the idea list, cluster list, point map, point rating map, cluster map, and cluster rating map. All of these were distributed to participants (n = 21) during the third meeting to reach consensus on an intervention model. The final proposed intervention strategy included 29 feasible and crucial interventions in seven clusters: development of an information technology system; establishing policy and translating it into an action plan; proactive public relations for the policy, action plan and workflow; cooperation of multidisciplinary teams in drug use evaluation; work review and evaluation with performance reporting; promoting and developing professional and clinical skills for staff through training programs; and developing a practical drug use evaluation guideline for antibiotics. These interventions are consistent with the antibiotic stewardship strategies of many international organizations, such as participation of multidisciplinary teams, developing information technology to support antibiotic smart use, and communication. These interventions were prioritized for implementation over a 1-year period. Once the feasibility of each activity or plan is established, the proposed program could be applied and integrated into hospital policy after the plans are evaluated. Effective interventions could then be promoted to other community hospitals to promote and support antibiotic smart use.
Keywords: antibiotic, concept mapping, drug use evaluation, multidisciplinary teams
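For readers unfamiliar with the concept-mapping analysis step, the sketch below shows how a statement-by-sorter matrix could be turned into a point map (multidimensional scaling) and a cluster map (hierarchical clustering); the sorting data are randomly generated stand-ins, not the participants' actual ratings.

```python
# Minimal concept-mapping analysis sketch (random stand-in data, not study data).
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
n_statements, n_sorters = 29, 21
# sortings[i, j] = pile label that sorter j gave statement i (hypothetical)
sortings = rng.integers(0, 7, size=(n_statements, n_sorters))

# similarity = fraction of sorters who placed two statements in the same pile
similarity = (sortings[:, None, :] == sortings[None, :, :]).mean(axis=2)
distance = 1.0 - similarity

points = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(distance)          # point map
clusters = fcluster(linkage(points, method="ward"), t=7,
                    criterion="maxclust")                      # cluster map labels
print(clusters)
```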
Procedia PDF Downloads 121
91 Understanding Beginning Writers' Narrative Writing with a Multidimensional Assessment Approach
Authors: Huijing Wen, Daibao Guo
Abstract:
Writing is thought to be the most complex facet of language arts. Assessing writing is difficult and subjective, and few scientifically validated assessments exist. Research has proposed evaluating writing using a multidimensional approach, including both qualitative and quantitative measures of handwriting, spelling, and prose. Given that narrative writing has historically been a staple of literacy instruction in the primary grades and is one of the three major genres the Common Core State Standards require students to acquire starting in kindergarten, it is essential for teachers to understand how to measure beginning writers' writing development and sources of writing difficulties through narrative writing. Guided by theoretical models of early written expression and using empirical data, this study examines ways teachers can enact a comprehensive approach to understanding beginning writers' narrative writing through three writing rubrics developed for a Curriculum-Based Measurement (CBM). Participants in this study included 380 first-grade students from 50 classrooms in 13 schools in three school districts in a Mid-Atlantic state. Three writing tests were used to assess first graders' writing skills in relation to both transcription (i.e., handwriting fluency and spelling tests) and translational skills (i.e., a narrative prompt). First graders were asked to respond to a narrative prompt in 20 minutes. Grounded in theoretical models of early written expression and empirical evidence of key contributors to early writing, all written responses to the narrative prompt were coded in three ways for different dimensions of writing: length, quality, and genre elements. To measure the quality of the narrative writing, a traditional holistic rating rubric was developed by the researchers based on the CCSS and the general traits of good writing. Students' genre knowledge was measured using a separate analytic rubric for narrative writing. Findings showed that first graders had emerging and limited transcriptional and translational skills, with nascent knowledge of genre conventions. The findings of the study provided support for the Not-So-Simple View of Writing in that fluent written expression, measured by length, and other important linguistic resources, measured by the overall quality and genre knowledge rubrics, are fundamental in early writing development. Our study echoed previous research findings on children's narrative development. The study has practical classroom application as it informs writing instruction and assessment. It offered practical guidelines for classroom instruction by providing teachers with a better understanding of first graders' narrative writing skills and knowledge of genre conventions. Understanding students' narrative writing provides teachers with more insight into the specific strategies students might use during writing and their understanding of good narrative writing. Additionally, it is important for teachers to differentiate writing instruction given the individual differences shown by our multiple writing measures. Overall, the study sheds light on beginning writers' narrative writing, indicating the complexity of early writing development.
Keywords: writing assessment, early writing, beginning writers, transcriptional skills, translational skills, primary grades, simple view of writing, writing rubrics, curriculum-based measurement
Procedia PDF Downloads 76
90 Development of Cost Effective Ultra High Performance Concrete by Using Locally Available Materials
Authors: Mohamed Sifan, Brabha Nagaratnam, Julian Thamboo, Keerthan Poologanathan
Abstract:
Ultra high performance concrete (UHPC) is a type of cementitious material known for its exceptional strength, ductility, and durability. However, its production is often associated with high costs due to the significant amount of cementitious materials required and the use of fine powders to achieve the desired strength. The aim of this research is to explore the feasibility of developing cost-effective UHPC mixes using locally available materials. Specifically, the study investigates the use of coarse limestone sand along with other sand types, namely basalt sand, dolomite sand, and river sand, for developing UHPC mixes and evaluating their performance. The study utilises the particle packing model to develop various UHPC mixes. The particle packing model involves optimising the combination of coarse limestone sand, basalt sand, dolomite sand, and river sand to achieve the desired properties of UHPC. The developed UHPC mixes are then evaluated based on their workability (measured through slump flow and mini slump value), compressive strength (at 7, 28, and 90 days), splitting tensile strength, and microstructural characteristics analysed through scanning electron microscope (SEM) analysis. The results of this study demonstrate that cost-effective UHPC mixes can be developed using locally available materials without the need for silica fume or fly ash. The UHPC mixes achieved impressive compressive strengths of up to 149 MPa at 28 days with a cement content of approximately 750 kg/m³. The mixes also exhibited varying levels of workability, with slump flow values ranging from 550 to 850 mm. Additionally, the inclusion of coarse limestone sand in the mixes effectively reduced the demand for superplasticizer and served as a filler material. By exploring the use of coarse limestone sand and other sand types, this study provides valuable insights into optimising the particle packing model for UHPC production. The findings highlight the potential to reduce the costs associated with UHPC production without compromising its strength and durability. The study collected data on the workability, compressive strength, splitting tensile strength, and microstructural characteristics of the developed UHPC mixes. Workability was measured using slump flow and mini slump tests, while compressive strength and splitting tensile strength were assessed at different curing periods. Microstructural characteristics were analysed through SEM and energy dispersive X-ray spectroscopy (EDS) analysis. The collected data were then analysed and interpreted to evaluate the performance and properties of the UHPC mixes. The research successfully demonstrates the feasibility of developing cost-effective UHPC mixes using locally available materials. The inclusion of coarse limestone sand, in combination with other sand types, shows promising results in achieving high compressive strengths and satisfactory workability. The findings suggest that use of the particle packing model can optimise the combination of materials and reduce the reliance on expensive additives such as silica fume and fly ash. This research provides valuable guidance for researchers and construction practitioners aiming to develop cost-effective UHPC mixes using readily available materials and an optimised particle packing approach.
Keywords: cost-effective, limestone powder, particle packing model, ultra high performance concrete
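As an illustration of the particle packing approach mentioned above, the sketch below fits a blend of constituent gradings to a modified Andreasen and Andersen target curve by non-negative least squares; the sieve data, distribution modulus, and resulting fractions are hypothetical and are not the authors' mix design.

```python
# Minimal particle-packing optimisation sketch (hypothetical gradings, not the study's).
import numpy as np
from scipy.optimize import nnls

def target_curve(d, d_min, d_max, q=0.23):
    """Modified Andreasen & Andersen target: P(D) = (D^q - Dmin^q) / (Dmax^q - Dmin^q)."""
    return (d**q - d_min**q) / (d_max**q - d_min**q)

sieves_um = np.array([1, 8, 63, 125, 250, 500, 1000, 2000, 4000], dtype=float)
# Cumulative fraction passing for each constituent at the sieve sizes (assumed).
cement    = np.array([10, 45, 95, 100, 100, 100, 100, 100, 100]) / 100.0
limestone = np.array([ 2, 10, 40,  70,  95, 100, 100, 100, 100]) / 100.0
sand      = np.array([ 0,  0,  2,   8,  25,  55,  85,  98, 100]) / 100.0

constituents = np.vstack([cement, limestone, sand]).T
target = target_curve(sieves_um, d_min=1.0, d_max=4000.0)

# Non-negative least squares gives the volume fractions that best match the curve.
fractions, _ = nnls(constituents, target)
fractions /= fractions.sum()
print({n: round(f, 3) for n, f in zip(["cement", "limestone", "sand"], fractions)})
```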
Procedia PDF Downloads 111
89 Speech and Swallowing Function after Tonsillo-Lingual Sulcus Resection with PMMC Flap Reconstruction: A Case Study
Authors: K. Rhea Devaiah, B. S. Premalatha
Abstract:
Background: The tonsillo-lingual sulcus is the area between the tonsils and the base of the tongue. Surgical resection of lesions in the head and neck results in changes in speech and swallowing functions. The severity of the speech and swallowing problem depends upon the site and extent of the lesion, the type and extent of surgery, and also the flexibility of the remaining structures. Need for the study: This paper focuses on the importance of speech and swallowing rehabilitation in an individual with a lesion in the tonsillo-lingual sulcus and on post-operative function. Aim: To evaluate speech and swallowing functions after intensive speech and swallowing rehabilitation. The objectives are to evaluate speech intelligibility and swallowing functions after intensive therapy and to assess quality of life. Method: The present study reports on a 47-year-old male diagnosed with basaloid squamous cell carcinoma of the left tonsillo-lingual sulcus (pT2N2M0) who underwent wide local excision with left radical neck dissection and PMMC flap reconstruction. Post-surgery, the patient presented with complaints of reduced speech intelligibility and difficulty in opening the mouth and swallowing. A detailed evaluation of speech and swallowing functions was carried out, including OPME, an articulation test, speech intelligibility, the different phases of swallowing, and trismus evaluation. Self-reported questionnaires such as the SHI-E (Speech Handicap Index - Indian English), DHI (Dysphagia Handicap Index) and SESEQ-K (Self Evaluation of Swallowing Efficiency in Kannada) were also administered to capture how the patient perceived his problem. Based on the evaluation, the patient was diagnosed with pharyngeal-phase dysphagia associated with trismus and reduced speech intelligibility. Intensive speech and swallowing therapy was advised twice weekly, with sessions of one hour. Results: In total, the patient attended 10 intensive speech and swallowing therapy sessions. Results indicated misarticulation of speech sounds such as linguapalatal sounds. Mouth opening was restricted to one finger width, with difficulty chewing, masticating, and swallowing the bolus. Intervention strategies included oromotor exercises, indirect swallowing therapy, usage of a trismus device to facilitate mouth opening, and a change in food consistency to help swallowing. Practice sessions with articulation drills were held to improve the production of speech sounds and speech intelligibility. Significant changes in articulatory production, speech intelligibility and swallowing abilities were observed. The self-rated quality of life measures, DHI, SHI-E and SESEQ-K, revealed no speech handicap and near-normal swallowing ability, indicating improved QOL after the intensive speech and swallowing therapy. Conclusion: Speech and swallowing therapy after carcinoma in the tonsillo-lingual sulcus is crucial, as the tongue plays an important role in both speech and swallowing. The role of speech-language and swallowing therapists in oral cancer should be highlighted in treating these patients and improving their overall quality of life. With intensive speech-language and swallowing therapy after surgery for oral cancer, there can be a significant change in speech outcomes and swallowing functions, depending on the site and extent of the lesion, thereby improving the individual's QOL.
Keywords: oral cancer, speech and swallowing therapy, speech intelligibility, trismus, quality of life
Procedia PDF Downloads 112
88 Implementation of Smart Card Automatic Fare Collection Technology in Small Transit Agencies for Standards Development
Authors: Walter E. Allen, Robert D. Murray
Abstract:
Many large transit agencies have adopted RFID technology and electronic automatic fare collection (AFC) or smart card systems, but small and rural agencies remain tied to obsolete manual, cash-based fare collection. Small countries or transit agencies can benefit from the implementation of smart card AFC technology with the promise of increased passenger convenience, added passenger satisfaction and improved agency efficiency. For transit agencies, it reduces revenue loss and improves passenger flow and bus stop data. For countries, further implementation into security, distribution of social services or currency transactions can provide greater benefits. However, small countries or transit agencies cannot afford the expensive proprietary smart card solutions typically offered by the major system suppliers. Deployment of the Contactless Fare Media System (CFMS) Standard eliminates the proprietary solution, ultimately lowering the cost of implementation. Acumen Building Enterprise, Inc. chose the Yuma County Intergovernmental Public Transportation Authority's (YCIPTA) existing proprietary YCAT smart card system to implement CFMS. The revised system enables the purchase of fare product online with prepaid debit or credit cards using the Payment Gateway Processor. Open and interoperable smart card standards for transit have been developed. During the 90-day pilot operation, the transit agency gathered the data from the bus AcuFare 200 Card Reader, loaded (copied) the data to a USB thumb drive, and uploaded the data to the Acumen Host Processing Center for consolidation into the transit agency master data file. The transition from the existing proprietary smart card data format to the new CFMS smart card data format was transparent to the transit agency cardholders. It was proven that open standards and interoperable design can work and reduce both implementation and operational costs for small transit agencies or countries looking to expand smart card technology. Acumen was able to avoid implementing the Payment Card Industry (PCI) Data Security Standard (DSS), which is expensive to develop and costly to operate on a continuing basis. Due to the substantial additional complexities of implementation and the variety of options presented to the transit agency cardholder, Acumen chose to implement only the Directed Autoload. To improve the implementation efficiency and the results of a similar undertaking, it should be considered that some passengers lack credit cards and are averse to technology. There are more than 1,300 small and rural agencies in the United States, and this number grows tenfold when considering small countries or rural locations throughout Latin America and the world. Acumen is evaluating additional countries, sites or transit agencies that can benefit from smart card systems. Frequently, payment card systems require extensive security procedures for implementation. The project demonstrated the ability to purchase fare value, rides and passes with credit cards on the internet at a reasonable cost without highly complex security requirements.
Keywords: automatic fare collection, near field communication, small transit agencies, smart cards
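The pilot's data flow (per-bus card-reader exports copied to a thumb drive and consolidated into a single agency master file at the host processing center) can be pictured with the sketch below; the file names and record fields are assumptions for illustration, not the actual CFMS record layout.

```python
# Minimal sketch of consolidating per-reader fare data into a master file
# (hypothetical file names and fields, not the CFMS format).
import csv
import glob

def consolidate(reader_exports_glob: str, master_path: str) -> int:
    """Merge per-reader CSV exports into one master file, de-duplicated by transaction id."""
    seen, rows = set(), []
    for path in sorted(glob.glob(reader_exports_glob)):
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                if row["txn_id"] not in seen:   # skip duplicate uploads
                    seen.add(row["txn_id"])
                    rows.append(row)
    fields = ["txn_id", "card_id", "route", "timestamp", "fare_product", "value"]
    with open(master_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)

print(consolidate("usb_drive/acufare_*.csv", "master_fare_data.csv"))
```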
Procedia PDF Downloads 284
87 Bio-Hub Ecosystems: Profitability through Circularity for Sustainable Forestry, Energy, Agriculture and Aquaculture
Authors: Kimberly Samaha
Abstract:
The Bio-Hub Ecosystem model was developed to address a critical area of concern within the global energy market regarding biomass as a feedstock for power plants. Yet the lack of an economically viable business model for bioenergy facilities has resulted in the continuation of idled and decommissioned plants. This study analyzed data and submittals to the Born Global Maine Innovation Challenge, a global innovation challenge to identify process innovations that could deliver a 'whole-tree' approach, maximizing the products, byproducts, energy value and process slip-streams in a circular zero-waste design. Participating companies were at various stages of developing bioproducts, including biofuels, lignin-based products, carbon capture platforms and biochar used both as a filtration medium and as a soil amendment product. This case study shows the QCA (Qualitative Comparative Analysis) methodology of the prequalification process and the resulting techno-economic model developed for maximizing the profitability of the Bio-Hub Ecosystem through continuous expansion of system waste streams into valuable process inputs for co-hosts. A full site plan was developed for the integration of co-hosts (a biorefinery, land-based shrimp and salmon aquaculture farms, a tomato greenhouse and a hops farm) at an operating forestry-based biomass-to-energy plant in West Enfield, Maine, USA. This model and process for evaluating profitability not only propose models for the integration of forestry, aquaculture and agriculture in cradle-to-cradle linkages of what have typically been linear systems, but also allow for the early measurement of circularity, the impact of resource use, and investment risk mitigation for these systems. In this particular study, profitability is assessed at two levels: CAPEX (capital expenditures) and OPEX (operating expenditures). Given that these projects start by repurposing facilities where the industrial-level infrastructure is already built, permitted and interconnected to the grid, the addition of co-hosts first realizes a dramatic reduction in permitting, development times and costs. In addition, using the biomass energy plant's waste streams, such as heat, hot water, CO₂ and fly ash, as valuable inputs to their operations yields a significant decrease in OPEX, increasing overall profitability for each co-host's bottom line. This case study utilizes a proprietary techno-economic model to demonstrate how utilizing the waste streams of a biomass energy plant and/or biorefinery results in a significant reduction in OPEX for both the biomass plants and the agriculture and aquaculture co-hosts. Economically viable Bio-Hubs with favorable environmental and community impacts may prove critical in garnering local and federal government support for pilot programs and more wide-scale adoption, especially for those living in severely economically depressed rural areas where aging industrial sites have been shuttered and local economies devastated.
Keywords: bio-economy, biomass energy, financing, zero-waste
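The CAPEX/OPEX logic described above can be pictured as follows: a co-host that replaces purchased heat, CO₂, or water with the host plant's waste streams sees lower operating costs, which can be expressed as a simple payback on its capital cost. All figures in the sketch are hypothetical; this is not the proprietary techno-economic model.

```python
# Minimal co-host profitability sketch (hypothetical figures, not the proprietary model).
from dataclasses import dataclass

@dataclass
class CoHost:
    name: str
    capex: float                  # $ to build at the repurposed site
    baseline_opex: float          # $/yr if all inputs were purchased
    waste_stream_savings: float   # $/yr of inputs replaced by the plant's waste streams

    def opex(self) -> float:
        return self.baseline_opex - self.waste_stream_savings

    def simple_payback_years(self, annual_revenue: float) -> float:
        margin = annual_revenue - self.opex()
        return float("inf") if margin <= 0 else self.capex / margin

greenhouse = CoHost("tomato greenhouse", capex=4.0e6,
                    baseline_opex=1.8e6, waste_stream_savings=0.6e6)
print(f"OPEX: ${greenhouse.opex():,.0f}/yr; "
      f"payback: {greenhouse.simple_payback_years(annual_revenue=2.1e6):.1f} yr")
```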
Procedia PDF Downloads 134
86 Developing the Collaboration Model of Physical Education and Sport Sciences Faculties with Service Section of Sport Industrial
Authors: Vahid Saatchian, Seyyed Farideh Hadavi
Abstract:
The main aim of this study was to develop a collaboration model between physical education and sport sciences faculties and the service section of the sport industry. The research method of this study was qualitative. After identifying a priority list of collaboration areas between the colleges and the service section of the sport industry, and following purposive and snowball sampling, the researcher conducted in-depth interviews with 22 experts working in the field of the research topic. The interviews were analyzed through qualitative coding (open, axial and selective) across five categories: causal conditions, basic conditions, intervening conditions, action/interaction and strategy. Findings showed that ten labels appeared under causal conditions; because of the heterogeneity of these labels, they were grouped under an overall theme. Under basic conditions, 59 labels were identified in open coding and categorized into 14 general concepts. Furthermore, by combining the declared categories and the relationships between them, five final internal categories (culture, intelligence, marketing, environment and ultra-powers) emerged. The intervening conditions in the study comprise five overall scopes (social, economic, cultural, legal and political factors), collectively named the macro environment. For the strategies, eight areas covering the internal and external challenges of managing the relationship between the two sides were identified: knowledge and awareness, external view, human resources, building organizational culture, the parties’ thoughts, a responsible unit/integrated management, laws and regulations, and marketing. Finally, the consequences, identified as a result of the developmental strategies and largely consistent with them, include cultural, governmental, educational, scientific, infrastructure, international, social, economic, technological and political development. The research findings could help sport managers apply scientific collaboration management and its consequences in sport institutions. With regard to the above results, an enduring and systematic relationship with long-term cooperation between the two sides requires strategic planning based on the cooperation of all stakeholders. Through this, competitive advantage for university and industry can be obtained in today's turbulent, constantly changing environment. Undoubtedly, a lack of vision and strategic thinking for cooperation in the planning of university and industry turns opportunities into problems. Keywords: university and industry collaboration, sport industry, physical education and sport science college, service section of sport industry
Procedia PDF Downloads 382
85 Partnering With Faith-Based Entities to Improve Mental Health Awareness and Decrease Stigma in African American Communities
Authors: Bryana Woodard, Monica Mitchell, Kasey Harry, Ebony Washington, Megan Harris, Marcia Boyd, Regina Lynch, Daphene Baines, Surbi Bankar
Abstract:
Introduction: African Americans experience mental health illnesses (e.g., depression, anxiety) at higher rates than their white counterparts. Despite this, they utilize mental health resources less and have lower mental health literacy, perhaps due to cultural barriers, including but not limited to mistrust. Research acknowledges African Americans’ close ties to community networks, identifying these linkages as key to establishing comfort and trust. Similarly, the church has historically been a space that creates unity and community among African Americans. Studies show that longstanding academic-community partnerships with organizations, such as churches and faith-based entities, have the capability to effectively address health and mental health barriers and needs in African Americans. The importance of implementing faith-based approaches is supported in the literature; however, few empirical studies exist. This project describes the First Ladies for Health and Cincinnati Children's Hospital Medical Center (CCHMC) Partnership (FLFH-CCHMC Partnership) and the implementation and assessment of an annual Mental Health Symposium, the overall aim of which was to increase mental health awareness and decrease stigma in African American communities. Methods: The specific goals of the FLFH Mental Health Symposium were to (1) Collaborate with trusted partners to build trust with community participants; (2) Increase mental health literacy and decrease mental health stigma; (3) Understand the barriers to improving mental health and improving trust; (4) Assess the short-term outcomes two months following the symposium. Data were collected through post-event and follow-up surveys using a mixed methods approach. Results: More than 100 participants attended each year with over 350 total participants over three years. 98.7% of participants were African American, 86.67% female, 11.6% male, and 11.6% LGBTQ+/non-binary; 10.5% of participants were teens, with the remainder aged 20 to 80 plus. The event was successful in achieving its goals: (1a) Eleven different speakers from 8 community and church organizations presented; (1b) 93% of participants rated the overall symposium as very good or excellent; (2a) Mental health literacy significantly increased each year, with over 90% of participants reporting improvement in their “understanding” and “awareness” of mental health; (2b) Participants’ personal stigma surrounding mental health illness decreased each year, with 92.3% of participants reporting changes in their “willingness to talk about and share” mental health challenges; (3) Barriers to mental health care were identified and included social stigma, lack of trust, and the cost of care. Data were used to develop priorities and an action plan for the FLFH-CCHMC Mental Health Partnership; (4) Follow-up data showed that participants sustained benefits of the FLFH Symposium and took actionable steps (e.g., meditation, referrals, etc.). Additional quantitative and qualitative data will be shared. Conclusions: Lower rates of mental health literacy and higher rates of stigma among participants in this initiative demonstrate the importance of mental health providers building trust and partnerships in communities. Working with faith-based entities provides an opportunity to address mental health equity in African American communities. Keywords: community psychology, faith-based, african-american, culturally competent care, mental health equity
Procedia PDF Downloads 36
84 Polysaccharide Polyelectrolyte Complexation: An Engineering Strategy for the Development of Commercially Viable Sustainable Materials
Authors: Jeffrey M. Catchmark, Parisa Nazema, Caini Chen, Wei-Shu Lin
Abstract:
Sustainable and environmentally compatible materials are needed for a wide variety of volume commercial applications. Current synthetic materials such as plastics, fluorochemicals (such as PFAS), adhesives and resins in the form of sheets, laminates, coatings, foams, fibers, molded parts and composites are used for countless products in packaging, food handling, textiles, biomedicine, construction, automotive and general consumer devices. Synthetic materials offer distinct performance advantages including stability, durability and low cost. These attributes are associated with the physical and chemical properties of these materials that, once formed, can be resistant to water, oils, solvents, harsh chemicals, salt, temperature, impact, wear and microbial degradation. These advantages become disadvantages when considering the end of life of these products, which generate significant land and water pollution when disposed of; few are recycled. Agriculturally and biologically derived polymers offer the potential of remediating these environmental and life-cycle difficulties, but face numerous challenges including feedstock supply, scalability, performance and cost. Such polymers include microbial biopolymers like polyhydroxyalkanoates and polyhydroxybutyrate; polymers produced using biomonomer chemical synthesis like polylactic acid; proteins like soy, collagen and casein; lipids like waxes; and polysaccharides like cellulose and starch. Although these materials, and combinations thereof, exhibit the potential for meeting some of the performance needs of various commercial applications, only cellulose and starch have both the production feedstock volume and cost to compete with petroleum-derived materials. Over 430 million tons of plastic are produced each year, and plastics like low-density polyethylene cost ~$1500 to $1800 per ton. Over 400 million tons of cellulose and over 100 million tons of starch are produced each year at a volume cost as low as ~$500 to $1000 per ton, with the capability of increased production. Celluloses and starches, however, are hygroscopic materials that do not exhibit the needed performance in most applications. Celluloses and starches can be chemically modified to contain positive and negative surface charges, and such modified versions are used in papermaking, foods and cosmetics. Although these modified polysaccharides exhibit the same performance limitations, recent research has shown that composite materials composed of cationic and anionic polysaccharides in polyelectrolyte complexation exhibit significantly improved performance, including stability in diverse environments. Moreover, starches with added plasticizers can exhibit thermoplasticity, presenting the possibility of improved thermoplastic starches when composed of starches in polyelectrolyte complexation. In this work, the potential for numerous volume commercial products based on polysaccharide polyelectrolyte complexes (PPCs) will be discussed, including the engineering design strategy used to develop them. Research results will be detailed, including the development and demonstration of starch PPC compositions for paper coatings to replace PFAS; adhesives; foams for packaging, insulation and biomedical applications; and thermoplastic starches. In addition, efforts to demonstrate the potential for volume manufacturing with industrial partners will be discussed. Keywords: biomaterials engineering, commercial materials, polysaccharides, sustainable materials
Procedia PDF Downloads 18
83 Absenteeism in Polytechnical University Studies: Quantification and Identification of the Causes at Universitat Politècnica de Catalunya
Authors: E. Mas de les Valls, M. Castells-Sanabra, R. Capdevila, N. Pla, Rosa M. Fernandez-Canti, V. de Medina, A. Mujal, C. Barahona, E. Velo, M. Vigo, M. A. Santos, T. Soto
Abstract:
Absenteeism in universities, including polytechnical universities, is influenced by a variety of factors. Some factors overlap with those causing absenteeism in schools, while others are specific to the university and work-related environments. Indeed, these factors may stem from various sources, including students, educators, the institution itself, or even the alignment of degree curricula with professional requirements. In Spain, there has been an increase in absenteeism in polytechnical university studies, especially after the Covid crisis, posing a significant challenge for institutions to address. This study focuses on Universitat Politècnica de Catalunya · BarcelonaTech (UPC) and aims to quantify the current level of absenteeism and identify its main causes. The study is part of the teaching innovation project ASAP-UPC, which aims to minimize absenteeism through the redesign of teaching methodologies. By understanding the factors contributing to absenteeism, the study seeks to inform the subsequent phases of the ASAP-UPC project, which involve implementing methodologies to minimize absenteeism and evaluating their effectiveness. The study utilizes surveys conducted among students and polytechnical companies. Students' perspectives are gathered through both online surveys and in-person interviews. The surveys inquire about students' interest in attending classes, skill development throughout their UPC experience, and their perception of the skills required for a career in a polytechnical field. Additionally, polytechnical companies are surveyed regarding the skills they seek in prospective employees. The collected data is then analyzed to identify patterns and trends. This analysis involves organizing and categorizing the data, identifying common themes, and drawing conclusions based on the findings. This mixed-method approach has revealed that higher levels of absenteeism are observed in large student groups at both the Bachelor's and Master's degree levels. However, the main causes of absenteeism differ between these two levels. At the Bachelor's level, many students express dissatisfaction with in-person classes, perceiving them as overly theoretical and lacking a balance between theory, experimental practice, and problem-solving components. They also perceive a lack of relevance to professional needs. Consequently, they resort to using online materials developed during the Covid crisis and attending private academies for exam preparation instead. On the other hand, at the Master's level, absenteeism primarily arises from schedule incompatibility between university and professional work. There is a discrepancy between the skills highly valued by companies and the skills emphasized during the studies, aligning partially with students' perceptions. These findings are of theoretical importance as they shed light on areas that can be improved to offer a more beneficial educational experience to students at UPC. The study also has potential applicability to other polytechnic universities, allowing them to adapt the surveys and apply the findings to their specific contexts. By addressing the identified causes of absenteeism, universities can enhance the educational experience and better prepare students for successful careers in polytechnical fields. Keywords: absenteeism, polytechnical studies, professional skills, university challenges
Procedia PDF Downloads 69
82 Surface Plasmon Resonance Imaging-Based Epigenetic Assay for Blood DNA Post-Traumatic Stress Disorder Biomarkers
Authors: Judy M. Obliosca, Olivia Vest, Sandra Poulos, Kelsi Smith, Tammy Ferguson, Abigail Powers Lott, Alicia K. Smith, Yang Xu, Christopher K. Tison
Abstract:
Post-Traumatic Stress Disorder (PTSD) is a mental health problem that people may develop after experiencing traumatic events such as combat, natural disasters, and major emotional challenges. Tragically, the number of military personnel with PTSD correlates directly with the number of veterans who attempt suicide, with the highest rate in the Army. Research has shown epigenetic risks in those who are prone to several psychiatric dysfunctions, particularly PTSD. Once initiated in response to trauma, epigenetic alterations, in particular DNA methylation in the form of 5-methylcytosine (5mC), alter chromatin structure and repress gene expression. Current methods to detect DNA methylation, such as bisulfite-based genomic sequencing techniques, are laborious, involve massive analysis workflows and still have high error rates. A faster and simpler detection method of high sensitivity and precision would be useful in a clinical setting to confirm potential PTSD etiologies, prevent other psychiatric disorders, and improve military health. A nano-enhanced Surface Plasmon Resonance imaging (SPRi)-based assay that simultaneously detects site-specific 5mC bases (termed PTSD bases) in methylated genes related to PTSD is being developed. The arrays on a sensing chip were first constructed for parallel detection of PTSD bases using synthetic and genomic DNA (gDNA) samples. For the gDNA sample extracted from the whole blood of a PTSD patient, the sample was first digested using specific restriction enzymes, and fragments were denatured to obtain single-stranded methylated target genes (ssDNA). The resulting mixture of ssDNA was then injected into the assay platform, where targets were captured by specific DNA aptamer probes previously immobilized on the surface of a sensing chip. The PTSD bases in the targets were detected by an anti-5-methylcytosine antibody (anti-5mC), and the resulting signals were then enhanced by the universal nanoenhancer. Preliminary results showed successful detection of a PTSD base in a gDNA sample: brighter spot images and higher delta values (control-subtracted reflectivity signal) relative to those of the control were observed. We also implemented the in-house surface activation system for detection and developed SPRi disposable chips. Multiplexed PTSD base detection of target methylated genes in blood DNA from PTSD patients of varying severity (asymptomatic and severe) was conducted. This diagnostic capability being developed is a platform technology, and upon successful implementation for PTSD, it could be reconfigured for the study of a wide variety of neurological disorders such as traumatic brain injury, Alzheimer’s disease, schizophrenia, and Huntington's disease, and can be extended to the analyses of other sample matrices such as urine and saliva. Keywords: epigenetic assay, DNA methylation, PTSD, whole blood, multiplexing
Procedia PDF Downloads 127
81 Simultech - Innovative Country-Wide Ultrasound Training Center
Authors: Yael Rieder, Yael Gilboa, S. O. Adva, Efrat Halevi, Ronnie Tepper
Abstract:
Background: Operation of ultrasound equipment is a core skill for many clinical specialties. As part of the training program at Simultech, a simulation center for Ob/Gyn at the Meir Medical Center, Israel, teaching how to operate ultrasound equipment requires dealing with misunderstandings of spatial and 3D orientation, failure of the operator to hold a transducer correctly, and limited ability to evaluate the data on the screen. We have developed a platform intended to endow physicians and sonographers with clinical and operational skills of obstetric ultrasound. Simultech's simulations are focused on medical knowledge, risk management, technology operations and physician-patient communication. The simulations encompass extreme work conditions. Setup: Between eight and ten of the eight hundred and fifty physicians and sonographers of the Clalit health services from seven hospitals and eight community centers across Israel participate in individual Ob/Gyn training sessions each week. These include Ob/Gyn specialists, experts, interns, and sonographers. Innovative teaching and training methodologies: The six-hour training program includes: (1) An educational computer program that challenges trainees to deal with medical questions based upon ultrasound pictures and films. (2) Sophisticated hands-on simulators that challenge the trainees to practice correct grip of the transducer, elucidate pathology, and practice daily tasks such as biometric measurements and analysis of sonographic data. (3) Participation in a video-taped simulation which focuses on physician-patient communications. In the simulation, the physician is required to diagnose the clinical condition of a hired actress based on the data she provides and by evaluating the assigned ultrasound films accordingly. Giving ‘bad news’ to the patient may put the physician in a stressful situation that must be properly managed. (4) Feedback at the end of each phase is provided by a designated trainer, not a physician, who is specially qualified by Ob/Gyn senior specialists. (5) A group exercise in which the trainer presents a medico-legal case in order to encourage the participants to use their own experience and knowledge to conduct a productive ‘brainstorming’ session. Medical cases are presented and analyzed by the participants together with the trainer's feedback. Findings: (1) The training methods and content that Simultech provides allow trainees to review their medical and communication skills. (2) Simultech training sessions expose physicians to both basic and new, up-to-date cases, refreshing and expanding the trainee's knowledge. (3) Practicing on advanced simulators enables trainees to understand the sonographic space and to implement the basic principles of ultrasound. (4) Communication simulations were found to be beneficial for trainees who were unaware of their interpersonal skills. The trainer feedback, supported by the recorded simulation, allows the trainee to draw conclusions about his performance. Conclusion: Simultech was found to contribute to physicians at all levels of clinical expertise who deal with ultrasound. A break in the daily routine, together with attendance at a neutral educational center, can vastly improve performance and outlook. Keywords: medical training, simulations, ultrasound, Simultech
Procedia PDF Downloads 280
80 Urban Heat Islands Analysis of Matera, Italy Based on the Change of Land Cover Using Satellite Landsat Images from 2000 to 2017
Authors: Giuseppina Anna Giorgio, Angela Lorusso, Maria Ragosta, Vito Telesca
Abstract:
Climate change is a major public health threat due to the effects of extreme weather events on human health and on quality of life in general. In this context, mean temperatures are increasing and, in particular, extreme temperatures, with heat waves becoming more frequent, more intense, and longer lasting. In many cities, extreme heat waves have drastically increased, giving rise to the so-called Urban Heat Island (UHI) phenomenon. In an urban centre, maximum temperatures may be up to 10 °C higher, due to different local atmospheric conditions. UHI occurs in metropolitan areas as a function of the population size and density of a city. It consists of a significant difference in temperature compared to the rural/suburban areas. Increasing industrialization and urbanization have intensified this phenomenon, and it has recently also been detected in small cities. Weather conditions and land use are among the key parameters in the formation of UHI. In particular, the surface urban heat island is directly related to temperature, land surface type and surface modifications. The present study concerns a UHI analysis of the city of Matera (Italy) based on the analysis of temperature and of changes in land use and land cover, using Corine Land Cover maps and satellite Landsat images. Matera, located in Southern Italy, has a typical Mediterranean climate with mild winters and hot and humid summers. Moreover, Matera has been awarded the international title of the 2019 European Capital of Culture. Matera represents a significant example of vernacular architecture. The structure of the city is articulated by a vertical succession of dug layers, sometimes excavated or partly excavated and partly built, according to the original shape and height of the calcarenitic slope. In this study, two meteorological stations were selected: MTA (MaTera Alsia, in the industrial zone) and MTCP (MaTera Civil Protection, a suburban area located in a green zone). In order to evaluate the increase in temperatures (in terms of UHI occurrences) over time and the effect of land use on weather conditions, the climate variability of temperatures for both stations was explored. Results show that the UHI phenomenon is growing in the city of Matera, with an increase of maximum temperature values at the local scale. Subsequently, a spatial analysis was conducted using Landsat satellite images. Four summer dates were selected (27/08/2000, 27/07/2006, 11/07/2012, 02/08/2017); in particular, Landsat 7 ETM+ was used for 2000, 2006 and 2012, and Landsat 8 OLI/TIRS for 2017. In order to estimate the land surface temperature (LST), the Mono Window Algorithm was applied. The increasing trend of LST values at the spatial scale was thus verified, in accordance with the results obtained at the local scale. Finally, the analysis of land use maps over the years, together with the LST and/or the maximum temperatures measured, shows that the development of the industrialized area produces a corresponding increase in temperatures and consequently a growth in UHI. Keywords: climate variability, land surface temperature, LANDSAT images, urban heat island
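To make the final processing step more concrete, the minimal Python sketch below (introduced by this edit, not code from the study) converts a Landsat 7 ETM+ Band 6 thermal image to at-sensor brightness temperature and applies a simple emissivity correction. It is a simplified illustration of the radiance-to-temperature conversion underlying such workflows, not the full Mono Window Algorithm, which additionally requires atmospheric transmittance and the effective mean atmospheric temperature; the gain, offset and emissivity values are placeholders that would normally come from the scene metadata (MTL file) and an NDVI-based emissivity estimate.

import numpy as np

# Published calibration constants for Landsat 7 ETM+ Band 6 (thermal)
K1 = 666.09   # W / (m^2 sr um)
K2 = 1282.71  # Kelvin

def brightness_temperature(dn, gain=0.067, offset=-0.067):
    """Convert digital numbers to at-sensor brightness temperature (K).
    gain/offset are illustrative; real values come from the scene MTL file."""
    radiance = gain * dn + offset            # DN -> top-of-atmosphere radiance
    return K2 / np.log(K1 / radiance + 1.0)  # inverse Planck relation

def surface_temperature(bt_kelvin, emissivity, wavelength_um=11.45):
    """Emissivity-corrected land surface temperature (simplified, no atmospheric correction)."""
    rho = 1.4388e4  # h*c/k_B expressed in um*K
    return bt_kelvin / (1.0 + (wavelength_um * bt_kelvin / rho) * np.log(emissivity))

# Example: a small synthetic thermal tile with a uniform urban emissivity of 0.95
dn_tile = np.array([[128.0, 140.0], [150.0, 165.0]])
bt = brightness_temperature(dn_tile)
lst_celsius = surface_temperature(bt, emissivity=0.95) - 273.15
print(np.round(lst_celsius, 1))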
Procedia PDF Downloads 126
79 Health Equity in Hard-to-Reach Rural Communities in Abia State, Nigeria: An Asset-Based Community Development Intervention to Influence Community Norms and Address the Social Determinants of Health in Hard-to-Reach Rural Communities
Authors: Chinasa U. Imo, Queen Chikwendu, Jonathan Ajuma, Mario Banuelos
Abstract:
Background: Sociocultural norms primarily influence the health-seeking behavior of populations in rural communities. In the Nkporo community, Abia State, Nigeria, the sociocultural perception of diseases runs counter to biomedical definitions, and residents rely heavily on traditional medicine and practices. In a state where birth asphyxia and sepsis account for the major causes of neonatal death, malaria is the leading cause of other mortality, followed by common preventable diseases such as diarrhea, pneumonia, acute respiratory tract infection, malnutrition, and HIV/AIDS. Most local mothers attribute their health conditions and those of their children to witchcraft attacks, the hand of God, and ancestral underlining. This influences how they see antenatal and postnatal care, their choice of where to access care and give birth, their response to children's illnesses, immunization, and nutrition. Method: To implement a community health improvement program, we adopted an asset-based community development model to address the normative and social determinants of health. The first step was to use a qualitative approach to conduct a community health needs baseline assessment, involving focus group discussions with twenty-five (25) youths aged 18-25, semi-structured interviews with ten (10) officers-in-charge of primary health centers, eight (8) ward health committee members, and nine (9) community leaders. Secondly, we designed an intervention program. Going forward, we will proceed with implementing and evaluating this program. Result: The priority needs identified by the communities were malaria, lack of clean drinking water, and the need for behavioral change information. The study also highlighted the significant influence of youths on their peers, family, and community as caregivers and information interpreters. Based on the findings, the NGO SieDi-Hub collaborated with the Abia State Ministry of Health, the State Primary Healthcare Agency, and Empower Next Generations to design a one-year "Community Health Youth Champions Pilot Program." Twenty (20) youths in the community were trained and equipped to champion a participatory approach to bridging the gap between access to and delivery of primary healthcare and to adjust sociocultural norms in order to improve health equity for people in the Nkporo community, who often have limited education and lack access to health information and quality healthcare facilities, using an innovative community-led improvement approach. Conclusion: Youths play a vital role in achieving health equity, being a vulnerable population with significant influence. To ensure effective primary healthcare, strategies must include cultural humility. The asset-based community development model offers valuable tools, and this article will share ongoing lessons from the intervention's behavioral change strategies with young people. Keywords: asset-based community development, community health, primary health systems strengthening, youth empowerment
Procedia PDF Downloads 93
78 Introducing, Testing, and Evaluating a Unified JavaScript Framework for Professional Online Studies
Authors: Caspar Goeke, Holger Finger, Dorena Diekamp, Peter König
Abstract:
Online-based research has recently gained increasing attention from various fields of research in the cognitive sciences. Technological advances in the form of online crowdsourcing (Amazon Mechanical Turk), open data repositories (Open Science Framework), and online analysis (IPython notebook) offer rich possibilities to improve, validate, and speed up research. However, to date there has been no cross-platform integration of these subsystems. Furthermore, online studies still suffer from complex implementation requirements (server infrastructure, database programming, security considerations, etc.). Here we propose and test a new JavaScript framework that enables researchers to conduct any kind of behavioral research in the browser without the need to program a single line of code. In particular, our framework offers the possibility to manipulate and combine the experimental stimuli via a graphical editor, directly in the browser. Moreover, we included an action-event system that can be used to handle user interactions, interactively change stimuli properties or store participants’ responses. Besides traditional recordings such as reaction time and mouse and keyboard presses, the tool offers webcam-based eye- and face-tracking. On top of these features, our framework also takes care of participant recruitment via crowdsourcing platforms such as Amazon Mechanical Turk. Furthermore, built-in Google Translate functionality ensures automatic text translation of the experimental content. Thereby, thousands of participants from different cultures and nationalities can be recruited literally within hours. Finally, the recorded data can be visualized and cleaned online, and then exported into the desired formats (csv, xls, sav, mat) for statistical analysis. Alternatively, the data can also be analyzed online within our framework using the integrated IPython notebook. The framework was designed such that studies can be used interchangeably between researchers. This not only supports the idea of open data repositories but also makes it possible to share and reuse experimental designs and analyses such that the validity of the paradigms will be improved. In particular, sharing and integrating experimental designs and analyses will lead to an increased consistency of experimental paradigms. To demonstrate the functionality of the framework, we present the results of a pilot study in the field of spatial navigation that was conducted using the framework. Specifically, we recruited over 2000 subjects with various cultural backgrounds and analyzed performance differences as a function of culture, gender and age. Overall, our results demonstrate a strong influence of cultural factors in spatial cognition. Such an influence has not been reported before and would not have been possible to show without the massive amount of data collected via our framework. In fact, these findings shed new light on cultural differences in spatial navigation. As a consequence, we conclude that our new framework offers a wide range of advantages for online research and constitutes a methodological innovation by which new insights can be revealed on the basis of massive data collection. Keywords: cultural differences, crowdsourcing, JavaScript framework, methodological innovation, online data collection, online study, spatial cognition
Procedia PDF Downloads 258
77 Diffusion MRI: Clinical Application in Radiotherapy Planning of Intracranial Pathology
Authors: Pomozova Kseniia, Gorlachev Gennadiy, Chernyaev Aleksandr, Golanov Andrey
Abstract:
In clinical practice, and especially in stereotactic radiosurgery planning, the significance of diffusion-weighted imaging (DWI) is growing. This calls for software capable of quickly processing and reliably visualizing diffusion data, equipped with tools for analyzing them for different tasks. We are developing the «MRDiffusionImaging» software in standard C++. The subject-domain part has been moved to separate class libraries and can be used on various platforms. The user interface is built with Windows WPF (Windows Presentation Foundation), a technology for managing Windows applications with access to all components of the .NET 5 or .NET Framework platform ecosystem. One of the important features is the use of a declarative markup language, XAML (eXtensible Application Markup Language), with which you can conveniently create, initialize and set properties of objects with hierarchical relationships. Graphics are generated using the DirectX environment. The MRDiffusionImaging software package has been implemented for processing diffusion magnetic resonance imaging (dMRI), which allows loading and viewing images sorted by series. An algorithm for "masking" dMRI series based on T2-weighted images was developed using a deformable surface model to exclude tissues that are not related to the area of interest from the analysis. An algorithm for distortion correction using deformable image registration based on autocorrelation of local structure has been developed. The maximum voxel dimension was 1.03 ± 0.12 mm. In an elementary brain volume, the diffusion tensor is geometrically interpreted using an ellipsoid, which is an isosurface of the probability density of a molecule's diffusion. For the first time, non-parametric intensity distributions, neighborhood correlations, and inhomogeneities are combined in a single segmentation algorithm for white matter (WM), grey matter (GM), and cerebrospinal fluid (CSF). A tool for calculating the mean diffusivity and fractional anisotropy has been created, on the basis of which it is possible to build quantitative maps for solving various clinical problems. Functionality has been created that allows clustering and segmenting images to individualize the clinical volume of radiation treatment and further assess the response (median Dice score = 0.963 ± 0.137). White matter tracts of the brain were visualized using two algorithms: a deterministic one (fiber assignment by continuous tracking) and a probabilistic one based on the Hough transform. The proposed algorithms test candidate curves in each voxel, assigning to each one a score computed from the diffusion data, and then select the curves with the highest scores as the potential anatomical connections. In the context of functional radiosurgery, it is possible to reduce the irradiation volume of the internal capsule receiving 12 Gy from 0.402 cc to 0.254 cc. «MRDiffusionImaging» will improve the efficiency and accuracy of diagnostics and stereotactic radiotherapy of intracranial pathology. We are developing software with integrated, intuitive support for processing, analysis, and inclusion in the process of radiotherapy planning and evaluation of its results. Keywords: diffusion-weighted imaging, medical imaging, stereotactic radiosurgery, tractography
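For readers less familiar with the quantitative maps mentioned above, the short Python sketch below shows the standard way mean diffusivity (MD) and fractional anisotropy (FA) are obtained from the eigenvalues of the per-voxel diffusion tensor. It is a minimal NumPy illustration of the published formulas only; it is not code from the «MRDiffusionImaging» package, and the example tensors are synthetic.

import numpy as np

def md_fa_maps(tensor_field):
    """Compute mean diffusivity (MD) and fractional anisotropy (FA)
    from an array of symmetric diffusion tensors of shape (..., 3, 3)."""
    eigvals = np.linalg.eigvalsh(tensor_field)   # (..., 3) eigenvalues per voxel
    md = eigvals.mean(axis=-1)                   # MD = (l1 + l2 + l3) / 3
    dev = eigvals - md[..., None]
    denom = np.sqrt((eigvals ** 2).sum(axis=-1))
    fa = np.sqrt(1.5) * np.sqrt((dev ** 2).sum(axis=-1)) / np.where(denom > 0, denom, 1.0)
    return md, fa

# Synthetic example: an isotropic voxel and a strongly anisotropic voxel (units mm^2/s)
tensors = np.array([
    np.diag([0.7e-3, 0.7e-3, 0.7e-3]),  # isotropic, FA = 0
    np.diag([1.7e-3, 0.3e-3, 0.3e-3]),  # white-matter-like, high FA (about 0.8)
])
md, fa = md_fa_maps(tensors)
print(md)  # mean diffusivity
print(fa)  # fractional anisotropy, dimensionless in [0, 1]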
Procedia PDF Downloads 85
76 Environmental Life Cycle Assessment of Circular, Bio-Based and Industrialized Building Envelope Systems
Authors: N. Cihan Kayaçetin, Stijn Verdoodt, Alexis Versele
Abstract:
The construction industry accounts for one-third of all waste generated in the European Union (EU) countries. The Circular Economy Action Plan of the EU aims to tackle this issue and aspires to enhance the sustainability of the construction industry by adopting more circular principles and bio-based material use. The Interreg Circular Bio-Based Construction Industry (CBCI) project was conceived to research how this adoption can be facilitated. For this purpose, an approach is developed that integrates technical, legal and social aspects and provides business models for circular designing and building with bio-based materials. In the scope of the project, the research outputs are to be displayed in a real-life setting by constructing a demo terraced single-family house, the living lab (LL) located in Ghent (Belgium). The realization of the LL is conducted in a step-wise approach that includes iterative processes for design, description, criteria definition and multi-criteria assessment of building components. The essence of the research lies within the exploratory approach to state-of-the-art building envelope and technical system options for achieving an optimum combination for a circular and bio-based construction. For this purpose, nine preliminary designs (PD) for the building envelope are generated, which consist of three basic construction methods: masonry, lightweight steel construction and wood framing construction, supplemented with bio-based construction methods like cross-laminated timber (CLT) and massive wood framing. A comparative analysis of the PDs was conducted by utilizing several complementary tools to assess their circularity. This paper focuses on the life cycle assessment (LCA) approach for evaluating the environmental impact of the LL Ghent. The adoption of an LCA methodology was considered critical for providing a comprehensive set of environmental indicators. The PDs were developed at the component level, in particular for the (i) inclined roof, (ii-iii) front and side façade, (iv) internal walls and (v-vi) floors. The assessment was conducted on two levels: component and building level. The options for each component were compared in the first iteration, and then the PDs, as assemblies of components, were further analyzed. The LCA was based on a functional unit of one square meter of each component, and CEN indicators were utilized for impact assessment over a reference study period of 60 years. A total of 54 building components composed of 31 distinct materials were evaluated in the study. The results indicate that wood framing construction supplemented with bio-based construction methods performs environmentally better than the masonry or steel-construction options. An analysis of the correlation between the total weight of components and environmental impact was also conducted. It was seen that masonry structures display a high environmental impact and weight, steel structures display low weight but relatively high environmental impact, and wooden framing construction displays low weight and environmental impact. The study provided valuable outputs at two levels: (i) several improvement options at the component level through substitution of materials with critical weight and/or impact per unit; and (ii) feedback on environmental performance for the decision-making process during the design phase of a circular single-family house. Keywords: circular and bio-based materials, comparative analysis, life cycle assessment (LCA), living lab
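As a rough illustration of how a component-level score of this kind is assembled, the Python sketch below sums the mass of each material layer per square metre of component, multiplied by a characterization factor (kg CO2-eq per kg), and counts replacements over the 60-year reference study period. The component build-ups, factors and service lives are invented placeholders introduced for this example; they are not data from the LL Ghent assessment.

from dataclasses import dataclass

REFERENCE_PERIOD_YEARS = 60  # reference study period used in the assessment

@dataclass
class MaterialLayer:
    name: str
    kg_per_m2: float         # mass per functional unit (1 m2 of component)
    gwp_per_kg: float        # characterization factor, kg CO2-eq per kg (placeholder)
    service_life_years: int  # expected service life before replacement

def component_gwp(layers):
    """Illustrative global warming potential per m2 of component over the
    reference period, counting the initial installation plus replacements."""
    total = 0.0
    for layer in layers:
        installs = -(-REFERENCE_PERIOD_YEARS // layer.service_life_years)  # ceiling division
        total += installs * layer.kg_per_m2 * layer.gwp_per_kg
    return total

# Hypothetical side-facade build-ups (placeholder values, not study data)
timber_frame = [
    MaterialLayer("timber studs", 12.0, 0.4, 60),
    MaterialLayer("cellulose insulation", 9.0, 0.3, 60),
    MaterialLayer("clay plaster", 15.0, 0.2, 30),
]
masonry = [
    MaterialLayer("clay brick", 180.0, 0.22, 60),
    MaterialLayer("mineral wool", 6.0, 1.2, 60),
    MaterialLayer("cement render", 25.0, 0.8, 30),
]

print(f"timber frame: {component_gwp(timber_frame):.1f} kg CO2-eq per m2")
print(f"masonry:      {component_gwp(masonry):.1f} kg CO2-eq per m2")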
Procedia PDF Downloads 184
75 Challenges and Proposals for Public Policies Aimed At Increasing Energy Efficiency in Low-Income Communities in Brazil: A Multi-Criteria Approach
Authors: Anna Carolina De Paula Sermarini, Rodrigo Flora Calili
Abstract:
Energy Efficiency (EE) needs investments, new technologies, greater awareness and management on the side of citizens and organizations, and more planning. However, this issue is usually remembered and discussed only in moments of energy crises, and opportunities are missed to take better advantage of the potential of EE in the various sectors of the economy. In addition, there is little concern about the subject among the less favored classes, especially in low-income communities. Accordingly, this article presents suggestions for public policies that aim to increase EE for low-income housing and communities based on international and national experiences. After reviewing the literature, eight policies were listed and, to evaluate them, a multicriteria decision model was developed using the AHP (Analytical Hierarchy Process) and TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution) methods, combined with fuzzy logic. Nine experts analyzed the policies according to 9 criteria: economic impact, social impact, environmental impact, previous experience, the difficulty of implementation, possibility/ease of monitoring and evaluating the policies, expected impact, political risks, and public governance and sustainability of the sector. The results found, in order of preference, are (i) Incentive program for equipment replacement; (ii) Community awareness program; (iii) EE Program with a greater focus on low income; (iv) Staggered and compulsory certification of social interest buildings; (v) Programs for the expansion of smart metering, energy monitoring and digitalization; (vi) Financing program for construction and retrofitting of houses with an emphasis on EE; (vii) Income tax deduction for investment in EE projects in low-income households made by companies; (viii) White certificates of energy for low-income households. First, the policy of equipment substitution has been employed in Brazil and around the world and has proven effective in promoting EE. For implementation, efforts are needed from the federal and state governments, which can encourage companies to reduce prices and provide some type of aid for the purchase of such equipment. In second place is the community awareness program, promoting socio-educational actions on EE concepts with energy conservation tips. This policy is simple to implement and has already been used by many distribution utilities in Brazil. It can be carried out through bids defined by the government in specific areas, executed by third-sector companies with public and private resources. Third on the list is the proposal to continue the Energy Efficiency Program (which obliges electric energy companies to allocate resources for research in the area) by suggesting the return of the mandatory investment of 60% of the resources in projects for low-income households. It is also relatively simple to implement, requiring efforts by the federal government to make it mandatory and compliance on the part of the distributors. The success of the suggestions depends on changes in the established rules and efforts from the interested parties. For future work, we suggest the development of pilot projects in low-income communities in Brazil and the application of other multicriteria decision support methods to compare the results obtained in this study. Keywords: energy efficiency, low-income community, public policy, multicriteria decision making
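The ranking step can be made concrete with the small Python sketch below: the matrix of expert scores is normalized, weighted (in the study the weights would come from the AHP pairwise comparisons), and each policy is ranked by its closeness to the ideal solution. The scores, weights and criterion directions shown are illustrative placeholders rather than the study's data, and the fuzzy extension used in the paper is omitted for brevity.

import numpy as np

def topsis(scores, weights, benefit):
    """Rank alternatives with classical TOPSIS.
    scores:  (n_alternatives, n_criteria) decision matrix
    weights: criteria weights summing to 1 (e.g., derived from AHP)
    benefit: True where higher is better, False for cost-type criteria"""
    norm = scores / np.linalg.norm(scores, axis=0)          # vector normalization
    weighted = norm * weights
    ideal = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
    anti_ideal = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))
    d_plus = np.linalg.norm(weighted - ideal, axis=1)
    d_minus = np.linalg.norm(weighted - anti_ideal, axis=1)
    return d_minus / (d_plus + d_minus)                     # closeness coefficient

# Illustrative example: three policies scored on four of the nine criteria
scores = np.array([
    [8.0, 7.0, 3.0, 6.0],  # incentive program for equipment replacement
    [6.0, 8.0, 2.0, 7.0],  # community awareness program
    [7.0, 6.0, 5.0, 5.0],  # EE program with a greater focus on low income
])
weights = np.array([0.4, 0.3, 0.2, 0.1])
benefit = np.array([True, True, False, True])  # third criterion treated as a cost (difficulty of implementation)
closeness = topsis(scores, weights, benefit)
print(closeness, "-> ranking:", np.argsort(-closeness))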
Procedia PDF Downloads 119
74 Analyzing the Effectiveness of Elderly Design and the Impact on Sustainable Built Environment
Authors: Tristance Kee
Abstract:
With an unprecedented increase in the elderly population around the world, the severe lack of quality housing and health-and-safety provisions to serve this cohort cannot be ignored any longer. Many elderly citizens, especially singletons, live in unsafe housing conditions with poorly executed planning and design. Some suffer from deteriorating mobility, sight and general alertness, and their sub-standard living conditions further hinder their daily existence. This research explains how concepts such as Universal Design and Co-Design operate in a high-density city such as Hong Kong, China, where innovative design can become an alternative solution when government and the private sector fail to provide quality elderly-friendly facilities that promote sustainable urban development. Unlike other elderly research, which focuses more on housing policies, nursing care and theories, this research takes a more progressive approach by providing an in-depth impact assessment of how innovative design can be a practical solution for creating a more sustainable built environment. The research objectives are to: 1) explain the relationship between innovative design for the elderly and a healthier and more sustainable environment; 2) evaluate the impact of human ergonomics with the use of universal design; and 3) explain how innovation can enhance the sustainability of a city by improving citizens' sight, sound, walkability and safety within the ageing population. The research adopts both qualitative and quantitative methodologies to examine ways to improve the elderly population's relationship to the built environment. In particular, the research utilizes data collected from a questionnaire survey and focus group discussions to obtain inputs from various stakeholders, including designers, operators and managers related to public housing, community facilities and overall urban development. In addition to feedback from end-users and stakeholders, a thorough analysis of existing elderly housing facilities and Universal Design provisions is conducted to evaluate their adequacy. To echo the theme of this conference on Innovation and Sustainable Development, this research examines the effectiveness of innovative design through a risk-benefit factor assessment. To test the hypothesis that innovation can cater for sustainable development, the research evaluated the health improvement of a sample of 150 elderly residents over a period of eight months. Their health performance, including mobility, speech and memory, was monitored and recorded on a regular basis to assess whether the use of innovation has an impact on improving health and home safety for an elderly cohort. This study was supported by district community centers under the auspices of the Home Affairs Bureau, which provided respondents for the questionnaire survey, a standardized evaluation mechanism, and professional health care staff for evaluating the performance impact. The research findings will be integrated to formulate design solutions, such as innovative home products, to improve the elderly's daily experience and safety, with a particular focus on the enhancement of sight, sound and mobility safety. Some policy recommendations and architectural planning recommendations related to Universal Design will also be incorporated into the research output for future planning of elderly housing and amenity provisions. Keywords: elderly population, innovative design, sustainable built environment, universal design
Procedia PDF Downloads 230
73 Computer-Integrated Surgery of the Human Brain, New Possibilities
Authors: Ugo Galvanetto, Piero G. Pavan, Mirco Zaccariotto
Abstract:
The discipline of Computer-integrated surgery (CIS) will provide equipment able to improve the efficiency of healthcare systems and, more importantly, clinical results. Surgeons and machines will cooperate in new ways that will extend surgeons’ ability to train, plan and carry out surgery. Patient-specific CIS of the brain requires several steps: 1 - Fast generation of brain models. Based on recognition of MR images and equipped with artificial intelligence, image recognition techniques should differentiate among all brain tissues and segment them. After that, automatic mesh generation should create the mathematical model of the brain in which the various tissues (white matter, grey matter, cerebrospinal fluid …) are clearly located in the correct positions. 2 - Reliable and fast simulation of the surgical process. Computational mechanics will be the crucial aspect of the entire procedure. New algorithms will be used to simulate the mechanical behaviour of cutting through cerebral tissues. 3 - Real-time provision of visual and haptic feedback. A sophisticated human-machine interface based on ergonomics and psychology will provide the feedback to the surgeon. The present work addresses in particular point 2. Modelling the cutting of soft tissue in a structure as complex as the human brain is an extremely challenging problem in computational mechanics. The finite element method (FEM), which accurately represents complex geometries and accounts for material and geometrical nonlinearities, is the most used computational tool to simulate the mechanical response of soft tissues. However, the main drawback of FEM lies in the mechanics theory on which it is based, classical continuum mechanics, which assumes matter is a continuum with no discontinuity. FEM must resort to complex tools such as pre-defined cohesive zones, external phase-field variables, and demanding remeshing techniques to include discontinuities. However, all approaches that equip FEM computational methods with the capability to describe material separation, such as interface elements with cohesive zone models, X-FEM, element erosion and phase-field, have drawbacks that make them unsuitable for surgery simulation. Interface elements require a priori knowledge of crack paths. The use of XFEM in 3D is cumbersome. Element erosion does not conserve mass. The phase-field approach adopts a diffusive crack model instead of describing the true tissue separation typical of surgical procedures. Modelling discontinuities, so difficult with computational approaches based on classical continuum mechanics, is instead easy for novel computational methods based on Peridynamics (PD). PD is a non-local theory of mechanics formulated with no use of spatial derivatives. Its governing equations are valid at points or surfaces of discontinuity, and it is therefore especially suited to describe crack propagation and fragmentation problems. Moreover, PD does not require any criterion to decide the direction of crack propagation or the conditions for crack branching or coalescence; in PD-based computational methods, cracks develop spontaneously in the way which is the most convenient from an energy point of view. Therefore, in PD computational methods, crack propagation in 3D is as easy as it is in 2D, with a remarkable advantage with respect to all other computational techniques. Keywords: computational mechanics, peridynamics, finite element, biomechanics
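To make the claim about spatial derivatives concrete, the governing equation of bond-based peridynamics, as introduced in Silling's original formulation, can be written in LaTeX form as

\rho(\mathbf{x})\,\ddot{\mathbf{u}}(\mathbf{x},t) = \int_{H_{\mathbf{x}}} \mathbf{f}\big(\mathbf{u}(\mathbf{x}',t)-\mathbf{u}(\mathbf{x},t),\; \mathbf{x}'-\mathbf{x}\big)\, dV_{\mathbf{x}'} + \mathbf{b}(\mathbf{x},t),

where H_x is the finite neighborhood (horizon) of the material point x, f is the pairwise bond force function and b is the body force density. Unlike the classical momentum balance, which involves the divergence of the stress tensor, this integral equation contains no spatial derivatives of the displacement field, so it remains well defined on crack surfaces and other discontinuities; this is the property the abstract relies on for simulating tissue cutting.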
Procedia PDF Downloads 81