Search results for: manning’s equation for open channel flow
438 Optimal Framework of Policy Systems with Innovation: Use of Strategic Design for Evolution of Decisions
Authors: Yuna Lee
Abstract:
In the current policy process, there has been growing interest in more open approaches that incorporate creativity and innovation, based on forecasting groups composed of the public and experts together, into scientific data-driven foresight methods in order to implement more effective policymaking. In particular, citizen participation as collective intelligence in policymaking with design and deep-scale innovation has developed at the global level, and human-centred design thinking is considered one of the most promising methods for strategic foresight. Yet there is a lack of a common theoretical foundation for a comprehensive approach to the current situation and the post-COVID-19 era, and substantial changes in policymaking practice remain insignificant, proceeding by trial and error. This project hypothesized that rigorously developed policy systems and tools that support strategic foresight by considering public understanding could maximize ways to create new possibilities for a preferable future; however, this must involve a better understanding of behavioural insights, including individual and cultural values, profit motives and needs, and psychological motivations, in order to implement holistic and multilateral foresight and create more positive possibilities. To what extent is a policymaking system theoretically possible that incorporates holistic and comprehensive foresight and policy process implementation, assuming that theory and practice, in reality, are different and not connected? What components and environmental conditions should be included in the strategic foresight system to enhance policymakers' capacity to predict alternative futures, or to detect uncertainties of the future more accurately? And, compared to the required environmental conditions, what are the environmental vulnerabilities of the current policymaking system? In this light, this research contemplates the question of how effectively policymaking practices have been implemented through the synthesis of scientific, technology-oriented innovation with strategic design for tackling complex societal challenges and devising more significant insights to make society greener and more liveable. Here, this study conceptualizes the notion of a new collaborative way of strategic foresight that aims to maximize mutual benefits between policy actors and citizens through cooperation stemming from evolutionary game theory. This study applies a mixed methodology, including interviews with policy experts, to cases in which digital transformation and strategic design provided future-oriented solutions or directions for cities' sustainable development goals and society-wide urgent challenges such as COVID-19. As a result, artistic and sensory interpreting capabilities enabled by strategic design promote a concrete form of ideas toward a stable connection from the present to the future and enhance understanding and active cooperation among decision-makers, stakeholders, and citizens. Ultimately, the improved theoretical foundation proposed in this study is expected to help strategically respond to the highly interconnected future changes of the post-COVID-19 world.
Keywords: policymaking, strategic design, sustainable innovation, evolution of cooperation
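As a side note on the evolutionary game theory invoked above, the following is a minimal sketch of replicator dynamics for a stylized cooperation game between policy actors and citizens; the payoff values are illustrative assumptions, not parameters from the study. In this prisoner's-dilemma-like setting, defection takes over, which is precisely the failure mode that a well-designed collaborative foresight system would aim to avoid.

```python
import numpy as np

# Payoff matrix for a stylized cooperation game (rows/cols: cooperate, defect).
# Values are illustrative only, chosen to give a prisoner's-dilemma structure.
A = np.array([[3.0, 0.0],
              [4.0, 1.0]])

x = np.array([0.5, 0.5])  # initial shares of cooperators / defectors
dt = 0.01
for _ in range(5000):
    f = A @ x                     # fitness of each strategy
    x = x + dt * x * (f - x @ f)  # replicator dynamics update
print("long-run strategy shares:", np.round(x, 3))  # defection dominates here
```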
Procedia PDF Downloads 195
437 Digital Image Correlation Based Mechanical Response Characterization of Thin-Walled Composite Cylindrical Shells
Authors: Sthanu Mahadev, Wen Chan, Melanie Lim
Abstract:
Anisotropy-dominated continuous-fiber composite materials have garnered attention in numerous mechanical and aerospace structural applications. Tailored mechanical properties in advanced composites can exhibit superiority in terms of stiffness-to-weight ratio, strength-to-weight ratio, and low-density characteristics, coupled with significant improvements in fatigue resistance compared to their metal structure counterparts. Extensive research has demonstrated their core potential as more than mere lightweight substitutes for conventional materials. Prior work by Mahadev and Chan focused on formulating a modified composite shell theory based prognosis methodology for investigating the structural response of thin-walled circular cylindrical shell type composite configurations under in-plane mechanical loads. The prime motivation for developing this theory stemmed from its capability to generate simple yet accurate closed-form analytical results that can efficiently characterize circular composite shell construction. It showcased the development of a novel mathematical framework to analytically identify the location of the centroid for thin-walled, open cross-section, curved composite shells characterized by circumferential arc angle, thickness-to-mean-radius ratio, and total laminate thickness. Ply stress variations for curved cylindrical shells were analytically examined under centric tensile and bending loading. This work presents a cost-effective, small-platform experimental methodology that takes advantage of the full-field measurement capability of digital image correlation (DIC) for an accurate assessment of key mechanical parameters such as in-plane mechanical stresses and strains and centroid location. Mechanical property measurement of advanced composite materials can be challenging due to their anisotropy and complex failure mechanisms. Full-field displacement measurements are well suited to characterizing the mechanical properties of composite materials because of the complexity of their deformation. This work encompasses the fabrication of a set of curved cylindrical shell coupons, the design and development of a novel test fixture, and an innovative experimental methodology that demonstrates the capability to very accurately predict the location of the centroid in such curved composite cylindrical strips by employing a DIC-based strain measurement technique. The experimental centroid measurements are observed to be in good agreement with the previously estimated analytical centroid results. The developed analytical modified shell theory provides the capability to understand the fundamental behavior of thin-walled cylindrical shells and offers the potential to open novel avenues for understanding the physics of such structures at the laminate level.
Keywords: anisotropy, composites, curved cylindrical shells, digital image correlation
Procedia PDF Downloads 318
436 Automatic Moderation of Toxic Comments in the Face of Local Language Complexity in Senegal
Authors: Edouard Ngor Sarr, Abel Diatta, Serigne Mor Toure, Ousmane Sall, Lamine Faty
Abstract:
Thanks to Web 2.0, we are witnessing a form of democratization of the spoken word: an exponential increase in the number of users on the web, but also, and above all, the accumulation of a daily flow of content that is becoming, at times, uncontrollable. Added to this is the rise of a violent social fabric characterised by hateful and racial comments, insults, and other content that contravenes social rules and the platforms' terms of use. Consequently, managing and regulating this mass of new content is proving increasingly difficult, requiring substantial human, technical, and technological resources. Without regulation, and with the complicity of anonymity, this toxic content can pollute discussions and make these online spaces highly conducive to abuse, which very often has serious consequences for certain internet users, ranging from anxiety to suicide, depression, or withdrawal. The toxicity of a comment is defined as anything that is rude, disrespectful, or likely to cause someone to leave a discussion or to take violent action against a person or a community. Two levels of measures are needed to deal with this deleterious situation. The first measures are being taken by governments through draft laws with a dual objective: (i) to punish the perpetrators of these abuses and (ii) to make online platforms accountable for the mistakes made by their users. The second measure comes from the platforms themselves. By assessing the content left by users, they can set up filters to block and/or delete content, or decide to suspend the user in question for good. However, the speed of discussions and the volume of data involved mean that platforms are unable to properly monitor the moderation of content produced by internet users. That is why they use human moderators, either through recruitment or outsourcing. Moderating comments on the web means assessing and monitoring users' comments on online platforms in order to strike the right balance between protection against abuse and users' freedom of expression. It makes it possible to determine which publications and users are allowed to remain online and which are deleted or suspended, how authorised publications are displayed, and what actions accompany content deletions. In this study, we look at the problem of automatic moderation of toxic comments in the face of local African languages and, more specifically, social network comments in Senegal. We review the state of the art, highlighting the different approaches, algorithms, and tools for moderating comments. We also study the issues and challenges of moderation in web ecosystems with lesser-known languages, such as local languages.
Keywords: moderation, local languages, Senegal, toxic comments
Procedia PDF Downloads 12
435 Hydrogeomatic System for the Economic Evaluation of Damage by Flooding in Mexico
Authors: Alondra Balbuena Medina, Carlos Diaz Delgado, Aleida Yadira Vilchis Fránces
Abstract:
In Mexico, news is disseminated each year about the ravages of floods: the total loss of housing, damage to fields, increases in food costs derived from lost harvests, and health problems such as skin infections, in addition to social problems such as delinquency and damage to educational institutions and the population in general. Flooding is a consequence of heavy rains, tropical storms, and/or hurricanes that generate excess water in drainage systems beyond their capacity. In urban areas, heavy rains can be one of the main factors causing flooding, alongside excessive precipitation, dam breakage, and human activities, for example, excessive garbage in the strainers. In agricultural areas, floods can affect large areas of cultivation. It should be mentioned that, in both kinds of areas, one of the significant impacts of floods is that they can permanently affect the livelihoods of many families and cause damage, for example, to workplaces such as farmland, commercial or industrial areas, and places where services are provided. In recent years, Information and Communication Technologies (ICT) have developed at an accelerated pace, reflected in exponential innovation and, as a result, the daily generation of new technologies, updates, and applications. Innovation in the development of information technology applications has impacted all areas of human activity. It influences every order of individuals' lives, reconfiguring the way of perceiving and analyzing the world, for instance in how people interrelate as individuals and as a society, in economic, political, social, cultural, educational, and environmental terms. Therefore, the present work describes the creation of a system for calculating flood costs for housing areas, retail establishments, and agricultural areas of the Mexican Republic, based on the use and application of geomatics tools, which can be useful to the public, education, and private sectors. To analyze hydrometeorological impacts, the geoinformatics tool was constructed from two different points of view: the geoinformatic one (design and development of GIS software) and the flood damage validation methodology, in order to integrate a tool that provides the user with a monetary estimate of the effects caused by floods. With information from the period 2000-2014, the functionality of the application was corroborated. For the years 2000 to 2009, only agricultural and housing areas were analyzed; information on commercial establishments was incorporated for the period 2010-2014. The method proposed for this research project is a fundamental contribution to society, in addition to the tool itself. In summary, problems in the physical-geographical environment, conceived from the point of view of spatial analysis, allow different solution alternatives to be offered and open new avenues for academia and research.
Keywords: floods, technological innovation, monetary estimation, spatial analysis
Procedia PDF Downloads 225
434 Geographical Information System and Multi-Criteria Based Approach to Locate Suitable Sites for Industries to Minimize Agriculture Land Use Changes in Bangladesh
Authors: Nazia Muhsin, Tofael Ahamed, Ryozo Noguchi, Tomohiro Takigawa
Abstract:
One of the most challenging issues for achieving sustainable development of food security is land use change. The crisis of land for agricultural production mainly arises from the unplanned transformation of agricultural land to infrastructure development, i.e., urbanization and industrialization. Land use without sustainability assessment can have an impact on food security and environmental protection. Bangladesh, a densely populated country with limited arable land, is now facing challenges in meeting sustainable food security. Agricultural land is being used for economic growth by establishing industries, which are spreading from urban areas to suburban areas and consuming agricultural land. To minimize agricultural land losses to unplanned industrialization, compact economic zones should be identified through a scientific approach. Therefore, the purpose of the study was to find suitable sites for industrial growth through land suitability analysis (LSA) using a Geographical Information System (GIS) and multi-criteria analysis (MCA). The goal of the study was to emphasize both agricultural land and industry for sustainable land use development. The study also analyzed agricultural land use changes in a suburban area, using statistical data on agricultural land and primary data on the existing industries of the study place. The criteria selected for the LSA were proximity to major roads, proximity to local roads, and distance to rivers, waterbodies, settlements, flood-flow zones, and agricultural land. The spatial datasets for the criteria were collected from the respective departments of Bangladesh. In addition, the elevation spatial dataset from the SRTM (Shuttle Radar Topography Mission) data source was used. The criteria were further analyzed with factors and constraints in ArcGIS®. Experts' opinions were applied to weight the criteria according to the analytical hierarchy process (AHP), a multi-criteria technique (a weighting sketch follows below). The decision rule was set using the 'weighted overlay' tool to aggregate the factors and constraints with the weights of the criteria. The LSA found that only 5% of the land was most suitable for industrial sites, and few compact lands for industrial zones. The developed LSA is expected to help land use policymakers and urban developers ensure the sustainability of land use and agricultural production.
Keywords: AHP (analytical hierarchy process), GIS (geographic information system), LSA (land suitability analysis), MCA (multi-criteria analysis)
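As a rough illustration of the AHP weighting step described in the abstract, the sketch below derives criteria weights from a pairwise comparison matrix via the principal eigenvector and checks the consistency ratio; the comparison values are hypothetical, not the ones elicited from the experts in the study.

```python
import numpy as np

# Hypothetical pairwise comparisons (Saaty 1-9 scale) for four of the criteria:
# major roads, local roads, distance to rivers, distance to settlements.
A = np.array([
    [1.0, 3.0, 5.0, 2.0],
    [1/3, 1.0, 3.0, 1/2],
    [1/5, 1/3, 1.0, 1/4],
    [1/2, 2.0, 4.0, 1.0],
])

# The principal eigenvector of the comparison matrix gives the criteria weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CR < 0.1 is conventionally acceptable (RI = 0.90 for n = 4).
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
CR = CI / 0.90
print("weights:", np.round(w, 3), "CR:", round(CR, 3))
```

The weighted overlay step then multiplies each reclassified criterion raster by its weight and sums the results, with the constraint cells masked out.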
Procedia PDF Downloads 263
433 Management and Genetic Characterization of Local Sheep Breeds for Better Productive and Adaptive Traits
Authors: Sonia Bedhiaf-Romdhani
Abstract:
The sheep (Ovis aries) was domesticated approximately 11,000 years before present (YBP) in the Fertile Crescent from the Asian mouflon (Ovis orientalis). Northern African (NA) sheep husbandry is 7,000 years old and represents a remarkable diversity of sheep populations reared under traditional and low input farming systems (LIFS) over millennia. The majority of small ruminants in developing countries are found in low input production systems, and the resilience of local communities in rural areas is often linked to the wellbeing of small ruminants. Notwithstanding the rich biodiversity of sheep ecotypes, there are four main sheep breeds in Tunisia, with the Barbarine (a fat-tailed breed) and the Queue Fine de l'Ouest (a thin-tailed breed) accounting for 61.6 and 35.4 percent, respectively. The Phoenicians introduced the Barbarine sheep from the steppes of Central Asia in the Carthaginian period, 3,000 years ago. The Queue Fine de l'Ouest is a thin-tailed meat breed heavily concentrated in the western and central semi-arid regions. The Noire de Thibar, a composite black-coated breed of mutton-fine wool producing animals found in the northern sub-humid region because of its higher nutritional requirements and intolerance of the prevailing harsher conditions, has been on the verge of extinction. The D'Man breed, originating from Morocco, is mainly located in the southern oases of the extremely arid ecosystem. A genetic investigation of Tunisian sheep breeds using a genome-wide scan of approximately 50,000 SNPs was performed. Genetic analysis of the relationships between breeds highlighted the genetic differentiation of the Noire de Thibar breed from the other local breeds, reflecting past events of introgression from the European gene pool. The Queue Fine de l'Ouest breed showed genetic heterogeneity and was close to the Barbarine. The D'Man breed shared considerable gene flow with the thin-tailed Queue Fine de l'Ouest breed. Native small ruminant breeds are capable of being efficiently productive if essential inputs and coherent breeding schemes are implemented and followed. Assessing the status of genetic variability of native sheep breeds could provide important clues for researchers and policymakers to devise better strategies for the conservation and management of genetic resources.
Keywords: sheep, farming systems, diversity, SNPs
Procedia PDF Downloads 147
432 Totally Implantable Venous Access Device for Long Term Parenteral Nutrition in a Patient with High Output Enterocutaneous Fistula Due to Advanced Malignancy
Authors: Puneet Goyal, Aarti Agarwal
Abstract:
Background and Objective: Nutritional support is an integral part of the palliative care of patients with advanced non-resectable abdominal malignancy, though it is a frequently neglected aspect. Non-healing high-output enterocutaneous fistulas sometimes require long-term parenteral nutrition to take care of catabolism and replacement of nutrients. We present a case of inoperable pancreatic malignancy with a high-output enterocutaneous fistula, in which parenteral nutritional support was provided using a Totally Implantable Venous Access Device (TIVAD). Method and Results: A 55-year-old man diagnosed with carcinoma of the pancreas had developed a high-output enterocutaneous fistula. His tumor was found to be inoperable, and he was on total parenteral nutrition through a routine central line. This line was difficult to maintain, as he required it for long-term TPN. He was scheduled to undergo TIVAD implantation. An 8 Fr single-lumen catheter with a Groshong non-return valve (Bard Access Systems, Inc., USA) was inserted through the right internal jugular vein under fluoroscopic guidance. The catheter was tunneled subcutaneously, brought towards the infraclavicular pocket, cut to the appropriate length, connected to the port, and locked. The port was sutured to the floor of the pocket. Free flow of blood was confirmed on aspiration, and the system was flushed with heparinized saline. No kink was observed along the entire length of the catheter under fluoroscopy. The skin over the infraclavicular pocket was sutured. Long-term catheter care and the associated risks were explained to the patient and his relatives. The patient continued to receive total parenteral nutrition as well as other supportive therapy through the TIVAD for the next 6 weeks, until his demise. Conclusion: TIVADs are the standard of care for long-term venous access in cancer patients requiring chemotherapy. In this case, we extended their use to providing parenteral nutrition and other supportive therapy. TIVADs can be implanted in advanced cancer patients to provide the venous access required for various palliative treatments and medications. This will help improve quality of life and satisfaction among terminally ill cancer patients.
Keywords: parenteral nutrition, totally implantable venous access device, long term venous access, interventions in anesthesiology
Procedia PDF Downloads 248
431 Robust Electrical Segmentation for Zone Coherency Delimitation Based on Multiplex Graph Community Detection
Authors: Noureddine Henka, Sami Tazi, Mohamad Assaad
Abstract:
The electrical grid is a highly intricate system designed to transfer electricity from production areas to consumption areas. The Transmission System Operator (TSO) is responsible for ensuring the efficient distribution of electricity and maintaining the grid's safety and quality. However, due to the increasing integration of intermittent renewable energy sources, there is a growing level of uncertainty, which requires a faster, more responsive approach. A potential solution is electrical segmentation, which creates coherence zones within which electrical disturbances mainly remain. Indeed, by means of coherent electrical zones, it becomes possible to focus solely on a sub-zone, reducing the range of possibilities and aiding in managing uncertainty. This allows faster execution of operational processes and easier learning for supervised machine learning algorithms. Electrical segmentation can serve various applications, such as electrical control, minimizing electrical loss, and ensuring voltage stability. Since the electrical grid can be modeled as a graph, where the vertices represent electrical buses and the edges represent electrical lines, identifying coherent electrical zones can be seen as a clustering task on graphs, generally called community detection. Nevertheless, a critical criterion for the zones is their ability to remain resilient to the electrical evolution of the grid over time. This evolution is due to the constant changes in electricity generation and consumption, which are reflected in graph structure variations as well as line flow changes. One approach to creating a resilient segmentation is to design robust zones under various circumstances. This issue can be represented through a multiplex graph, where each layer represents a specific situation that may arise on the grid. Consequently, resilient segmentation can be achieved by conducting community detection on this multiplex graph. The multiplex graph is composed of multiple graphs, and all the layers share the same set of vertices. Our proposal involves a model that utilizes a unified representation to compute a flattening of all layers. This unified situation can then be penalized to obtain K connected components representing the robust electrical segmentation clusters. We compare our robust segmentation to segmentation based on a single reference situation. The robust segmentation proves its relevance by producing clusters with high intra-zone electrical perturbation and low variance of electrical perturbation. The experiments show in which contexts robust electrical segmentation is beneficial.
Keywords: community detection, electrical segmentation, multiplex graph, power grid
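A minimal sketch of the flatten-then-cluster idea, assuming summed edge weights as the unified representation and NetworkX's greedy modularity communities as a stand-in for the penalized method of the paper; the layers, weights, and bus names below are toy data.

```python
import networkx as nx
from networkx.algorithms import community

def flatten_layers(layers):
    """Collapse a multiplex graph (weighted graphs sharing the same buses)
    into a single graph whose edge weights sum the per-layer weights."""
    flat = nx.Graph()
    for layer in layers:
        flat.add_nodes_from(layer.nodes)
        for u, v, data in layer.edges(data=True):
            w = data.get("weight", 1.0)
            if flat.has_edge(u, v):
                flat[u][v]["weight"] += w
            else:
                flat.add_edge(u, v, weight=w)
    return flat

# Two toy layers standing in for grid situations (weight ~ electrical coupling).
g1 = nx.Graph([("a", "b", {"weight": 2.0}), ("b", "c", {"weight": 1.0}),
               ("d", "e", {"weight": 2.0})])
g2 = nx.Graph([("a", "b", {"weight": 1.5}), ("d", "e", {"weight": 1.0}),
               ("c", "d", {"weight": 0.2})])

flat = flatten_layers([g1, g2])
zones = community.greedy_modularity_communities(flat, weight="weight")
print([sorted(z) for z in zones])  # candidate zones robust across both layers
```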
Procedia PDF Downloads 79
430 Coupling of Microfluidic Droplet Systems with ESI-MS Detection for Reaction Optimization
Authors: Julia R. Beulig, Stefan Ohla, Detlev Belder
Abstract:
In contrast to off-line analytical methods, lab-on-a-chip technology delivers direct information about the observed reaction. Microfluidic devices therefore make an important scientific contribution, e.g., in the field of synthetic chemistry, where the rapid generation of analytical data can be applied to the optimization of chemical reactions. These devices enable fast changes of reaction conditions as well as a resource-saving mode of operation. In the presented work, we focus on the investigation of multiphase regimes, more specifically on biphasic microfluidic droplet systems. Here, every single droplet is a reaction container with customized conditions. The biggest challenge is the rapid qualitative and quantitative readout of information, as most detection techniques for droplet systems are non-specific, time-consuming, or too slow. An exception is electrospray ionization mass spectrometry (ESI-MS). The combination of a reaction screening platform with a rapid and specific detection method is an important step in droplet-based microfluidics. In this work, we present a novel approach for synthesis optimization on the nanoliter scale with direct ESI-MS detection. We show the development of a droplet-based microfluidic device that enables the modification of different parameters while simultaneously monitoring their effect on the reaction within a single run. Using common soft- and photolithographic techniques, a polydimethylsiloxane (PDMS) microfluidic chip with different functionalities was developed. As an interface for MS detection, we use a steel capillary for ESI and improve the spray stability with a Teflon siphon tubing inserted underneath the steel capillary. By optimizing the flow rates, it is possible to screen the parameters of various reactions; this is exemplarily shown for a domino Knoevenagel hetero-Diels-Alder reaction. Different starting materials, catalyst concentrations, and solvent compositions were investigated. Due to the high repetition rate of the droplet production, each set of reaction conditions is examined hundreds of times. As a result of the investigation, we obtain suitable reagents, the ideal water-methanol ratio of the solvent, and the most effective catalyst concentration. The developed system can help determine important information about the optimal parameters of a reaction within a short time. With this novel tool, we take an important step in the field of combining droplet-based microfluidics with organic reaction screening.
Keywords: droplet, mass spectrometry, microfluidics, organic reaction, screening
Procedia PDF Downloads 302
429 Assessment of Potential Chemical Exposure to Betamethasone Valerate and Clobetasol Propionate in Pharmaceutical Manufacturing Laboratories
Authors: Nadeen Felemban, Hamsa Banjer, Rabaah Jaafari
Abstract:
One of the most common hazards in the pharmaceutical industry is the chemical hazard, where chronic exposure to hazardous substances can cause harm or lead to occupational diseases and illnesses. Therefore, a chemical agent management system is required, including hazard identification, risk assessment, controls for specific hazards, and inspections, to keep the workplace healthy and safe. Routine monitoring is also required to verify the effectiveness of the control measures. Betamethasone Valerate and Clobetasol Propionate are APIs (Active Pharmaceutical Ingredients) with a highly hazardous classification, Occupational Hazard Category (OHC) 4, which requires full containment (ECA-D) during handling to avoid chemical exposure. According to the Safety Data Sheets, these chemicals are reproductive toxicants (H360D), which may affect female workers' health and cause fatal damage to an unborn child or impair fertility. In this study, a qualitative chemical risk assessment (qCRA) was conducted to assess chemical exposure during the handling of Betamethasone Valerate and Clobetasol Propionate in pharmaceutical laboratories. The qCRA identified a risk of potential chemical exposure (risk rating 8, amber risk). Therefore, immediate actions were taken to ensure that interim controls (according to the hierarchy of controls) are in place and in use to minimize the risk of chemical exposure. No open handling should be done outside the Steroid Glove Box Isolator (SGB), and the required Personal Protective Equipment (PPE) must be worn. The PPE includes coveralls, nitrile hand gloves, safety shoes, and powered air-purifying respirators (PAPR). Furthermore, a quantitative assessment (personal air sampling) was conducted to verify the effectiveness of the engineering controls (SGB isolator) and to confirm whether there is chemical exposure, as indicated earlier by the qCRA. Three personal air samples were collected using an air sampling pump and filters (IOM2 filters, 25 mm glass fiber media). The collected samples were analyzed by HPLC in the BV lab, and the measured concentrations were reported in µg/m³ with reference to the 8-hour occupational exposure limits (8hr TWA OELs) for each analyte. The analytical results are expressed as 8-hour time-weighted averages (8hr TWA) and analyzed using Bayesian statistics (IHDataAnalyst). The resulting Bayesian likelihood graph indicates category 0, meaning exposures are de minimis, trivial, or non-existent and employees have little to no exposure. The results also indicate that the three samples are representative, with very low variation (SD = 0.0014). In conclusion, the engineering controls were effective in protecting the operators from such exposure. However, routine chemical monitoring is required every 3 years unless there is a change in the process or type of chemicals. Frequent management monitoring (daily, weekly, and monthly) is also required to ensure the control measures are in place and in use. Furthermore, a Similar Exposure Group (SEG) was identified in this activity and included in the annual health surveillance for health monitoring.
Keywords: occupational health and safety, risk assessment, chemical exposure, hierarchy of control, reproductive
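For reference, an 8-hour TWA combines task-level concentrations and durations as TWA = (C1*t1 + C2*t2 + ...) / 8 h. The sketch below illustrates the arithmetic with made-up sampling values and a hypothetical OEL; it does not reproduce the study's measurements or the IHDataAnalyst Bayesian analysis.

```python
# Hypothetical task-level samples: (concentration in ug/m3, duration in hours).
samples = [(0.020, 2.0), (0.005, 4.0), (0.000, 2.0)]

# 8-hour time-weighted average over the full shift.
twa_8hr = sum(c * t for c, t in samples) / 8.0

oel = 0.05  # hypothetical 8hr TWA OEL for the analyte, ug/m3
print(f"8hr TWA = {twa_8hr:.4f} ug/m3 ({twa_8hr / oel:.0%} of OEL)")
```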
Procedia PDF Downloads 173
428 Multiple Plant-Based Cell Suspension as a Bio-Ink for 3D Bioprinting Applications in Food Technology
Authors: Yusuf Hesham Mohamed
Abstract:
Introduction: Three-dimensional printing comprises procedures that fabricate three-dimensional objects by consecutively layering two-dimensional cross-sections on top of each other. 3D bioprinting is a promising branch of 3D printing that fabricates tissues and organs by accurately controlling the arrangement of diverse biological components. 3D bioprinting uses software to print biological materials and their supporting components layer by layer on a substrate or in a tissue culture plate to produce complex living tissues and organs. 3D food printing is an emerging application of 3D bioprinting in which the printed products are food products that are cheap, require less effort to produce, and have more desirable traits. Aim of the Study: The development of an affordable 3D bioprinter by altering a locally made CNC instrument with an open-source platform to suit 3D bioprinting purposes, followed by applying the prototype to several applications in food technology and drug testing, including organ-on-chip. Materials and Methods: An off-the-shelf 3D printer was modified by designing and fabricating the syringe unit, which was designed on the basis of a millifluidic system. Sodium alginate and gelatin hydrogels were prepared, followed by leaf cell suspension preparation from narrow sections of viable Fragaria leaves. The desired 3D structure was modeled, and 3D printing preparations took place. Cell-free and cell-laden hydrogels were printed at room temperature under sterile conditions, and a post-printing curing process was performed. The printed structure was further studied. Results: Positive results were achieved using the altered 3D bioprinter: a two-layer 3D hydrogel construct made of a combination of sodium alginate and gelatin (15%:0.5%) was printed. A DLP 3D printer was used to fabricate the syringe component with a transparent PLA-Pro resin, creating a microfluidic system with two channels adapted to the double extruder. The hydrogel extruder's design was based on peristaltic pumps utilizing a stepper motor. The design and fabrication were carried out using DIY 3D-printed parts, with hard PLA plastic as the printing material. SEM was used to image the porous 3D construct. Multiple physical and chemical tests were performed to ensure that the cell line was suitable for hosting. The Fragaria plant material was developed by suspending cells from its leaves using the 3D bioprinter. Conclusion: 3D bioprinting is an emerging scientific field that can facilitate and improve many scientific tests and studies; thus, having a 3D bioprinter in labs is an essential requirement. 3D bioprinters are very expensive; however, converting a 3D printer into a 3D bioprinter can lower the cost. The implemented 3D bioprinter made use of peristaltic pumps instead of syringe-based pumps in order to extend its ability to print multiple types of materials and cells.
Keywords: scaffold, organ on chip, 3D bioprinter, DLP printer
Procedia PDF Downloads 120
427 Project Management and International Development: Competencies for International Assignment
Authors: M. P. Leroux, C. Coulombe
Abstract:
Projects are popular vehicles through which international aid is delivered in developing countries. To achieve their objectives, many northern organizations develop projects with local partner organizations in developing countries through technical assistance projects. International aid and international development projects have long been criticized for poor results, although billions are spent every year. Little empirical research in the field of project management has focused on knowledge transfer in the international development context. This paper focuses particularly on the personal dimensions of international assignees participating in projects with local team members in the host country. We propose to explore the possible links with a human resource management perspective in order to shed light on the under-researched problem of knowledge transfer in development cooperation projects. Since the process leading to capacity building is complex, involving multiple dimensions and far from linear, we propose to assess whether traditional research on expatriates in multinational corporations pertains to the field of project management in developing countries. The following question is addressed: in the context of international development project cooperation, on what personal determinants should the selection process focus when looking to fill a technical assistance position in a developing country? To answer that question, we first reviewed the literature on expatriates in the context of inter-organizational knowledge transfer. Second, we proposed a theoretical framework combining perspectives from development studies and management to explore whether parallels can be drawn between traditional international assignments and technical assistance project assignments in developing countries. We conducted an exploratory study using case studies from technical assistance initiatives led in Haiti, a country in the Caribbean. Data were collected from multiple sources following qualitative research methods. Direct observations in the field were allowed by the local leaders of six organizations; individual interviews were held with present and past international assignees and with local team members, and focus groups were organized in order to triangulate the information collected. Contrary to empirical research on knowledge transfer in multinational corporations, the results tend to show that technical expertise ranks well behind many other characteristics. The results point to the importance of soft skills as a prerequisite for success in projects where local teams have to collaborate. More importantly, international assignees who spoke of knowledge sharing instead of knowledge transfer seemed to feel more satisfied at the end of their mandate than the others. Reciprocally, local team members who perceived that they had participated in a project with an expatriate looking to share, rather than aiming to transfer, knowledge tended to describe the results of the project in more positive terms than the others. The results obtained from this exploratory study open the way for a promising research agenda in the field of project management. They emphasize the urgent need for a better understanding of the complex set of soft skills that project managers or project chiefs would benefit from developing, in particular the ability to absorb knowledge and the willingness to share one's knowledge.
Keywords: international assignee, international project cooperation, knowledge transfer, soft skills
Procedia PDF Downloads 142
426 Optical Assessment of Marginal Sealing Performance around Restorations Using Swept-Source Optical Coherence Tomography
Authors: Rima Zakzouk, Yasushi Shimada, Yasunori Sumi, Junji Tagami
Abstract:
Background and purpose: Resin composite has become the main material for the restoration of caries in recent years due to its aesthetic characteristics, especially with the development of adhesive techniques. The quality of adhesion to tooth structures depends on an exchange process between inorganic tooth material and synthetic resin, and on micromechanical retention promoted by resin infiltration into partially demineralized dentin. Optical coherence tomography (OCT) is a noninvasive diagnostic method for obtaining cross-sectional images of biological tissue at high resolution on the micron scale. The aim of this study was to evaluate gap formation at the adhesive/tooth interface of a two-step self-etch adhesive applied with or without phosphoric acid pre-etching in different regions of teeth, using SS-OCT. Materials and methods: Round tapered cavities (2×2 mm) were prepared in the cervical part of bovine incisors and divided into two groups (n=10): in the first group (SE), the self-etch adhesive (Clearfil SE Bond) was applied; in the second group (PA), the cavities were treated with phosphoric acid etching before applying the self-etch adhesive. Subsequently, both groups were restored with Estelite Flow Quick flowable composite resin and observed under OCT. Following 5,000 thermal cycles, the same section was imaged again for each cavity using OCT at a 1310-nm wavelength. Scanning was repeated after two months to monitor gap progression. The gap length was then measured using image analysis software, and statistical analysis between both groups was performed using SPSS software. After that, the cavities were sectioned and observed under a confocal laser scanning microscope (CLSM) to confirm the OCT results. Results: Gaps formed at the bottom of the cavity were longer than those formed at the margin and the dento-enamel junction (DEJ) in both groups. On the other hand, the pre-etching treatment damaged the DEJ regions, creating longer gaps. After two months, the results showed significant progression of gap length in the bottom regions of both groups. In conclusion, phosphoric acid etching did not reduce the gap length in most regions of the cavity. Significance: The bottom region of the cavity was more prone to gap formation than the margin and DEJ regions, and the DEJ was damaged by the phosphoric acid treatment.
Keywords: optical coherence tomography, self-etch adhesives, bottom, dento enamel junction
Procedia PDF Downloads 227
425 Computer Aided Design Solution Based on Genetic Algorithms for FMEA and Control Plan in Automotive Industry
Authors: Nadia Belu, Laurenţiu Mihai Ionescu, Agnieszka Misztal
Abstract:
The automotive industry is one of the most important industries in the world, concerning not only the economy but also world culture. In the present financial and economic context, this field faces new challenges posed by the current crisis: companies must maintain product quality and deliver on time at a competitive price in order to achieve customer satisfaction. Two of the quality management techniques most strongly recommended by the specific standards of the automotive industry for product development are Failure Mode and Effects Analysis (FMEA) and the Control Plan. FMEA is a methodology for risk management and quality improvement aimed at identifying potential causes of failure of products and processes, quantifying them by risk assessment, ranking the identified problems according to their importance, and determining and implementing the related corrective actions. Companies use Control Plans, built from FMEA results, to evaluate a process or product for strengths and weaknesses and to prevent problems before they occur. Control Plans are written descriptions of the systems used to control and minimize product and process variation. In addition, Control Plans specify the process monitoring and control methods (for example, special controls) used to control special characteristics. In this paper, we propose a computer-aided solution based on genetic algorithms that streamlines the drafting of the reports (the FMEA analysis and the Control Plan) required at product launch and improves the knowledge available to development teams for future projects. The solution allows the design team to enter the data required for the FMEA. The actual analysis is performed using genetic algorithms to find the optimum between the RPN risk factor and the cost of production; a minimal sketch of this idea is given below. A feature of genetic algorithms is that they can be used to find solutions to multi-criteria optimization problems: in our case, the three specific FMEA risk factors are considered together with production cost. The analysis tool generates final reports for all FMEA processes, and the data obtained in the FMEA reports are automatically integrated with the other parameters entered in the Control Plan. The solution is implemented as an application running on an intranet on two servers: one containing the analysis and plan generation engine, and the other containing the database where the initial parameters and results are stored. The results can then be used as starting solutions in the synthesis of other projects. The solution was applied to the welding, laser cutting, and bending processes used to manufacture chassis for buses. The advantages of the solution are the efficient elaboration of documents in the current project, by automatically generating the FMEA and Control Plan reports using multi-criteria optimization of production, and the building of a solid knowledge base for future projects. The proposed solution is a cheap alternative to other solutions on the market, using open source tools in its implementation.
Keywords: automotive industry, FMEA, control plan, automotive technology
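A minimal sketch of the genetic-algorithm idea referenced above, assuming a toy encoding in which each gene selects one corrective action per failure mode and fitness combines the total RPN with the total action cost; all scores, costs, weights, and GA settings are illustrative, not the data or operators of the actual application.

```python
import random

# Candidate corrective actions per failure mode: (S, O, D, cost).
# S, O, D are severity/occurrence/detection scores (1-10); values are made up.
ACTIONS = [
    [(8, 6, 5, 0), (8, 3, 4, 120), (7, 2, 3, 300)],
    [(6, 7, 6, 0), (6, 4, 5, 80),  (5, 3, 3, 250)],
    [(9, 5, 7, 0), (9, 3, 5, 150), (8, 2, 4, 400)],
]

def fitness(chromosome, alpha=1.0, beta=0.5):
    """Lower is better: weighted sum of total RPN (S*O*D) and action cost."""
    rpn = sum(s * o * d
              for fm, g in zip(ACTIONS, chromosome)
              for s, o, d, _ in [fm[g]])
    cost = sum(ACTIONS[i][g][3] for i, g in enumerate(chromosome))
    return alpha * rpn + beta * cost

def evolve(pop_size=30, generations=100):
    pop = [[random.randrange(len(fm)) for fm in ACTIONS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]             # elitist selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(ACTIONS))  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                # point mutation
                i = random.randrange(len(ACTIONS))
                child[i] = random.randrange(len(ACTIONS[i]))
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print("action chosen per failure mode:", best, "fitness:", fitness(best))
```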
Procedia PDF Downloads 406
424 Scalable UI Test Automation for Large-scale Web Applications
Authors: Kuniaki Kudo, Raviraj Solanki, Kaushal Patel, Yash Virani
Abstract:
This research concerns optimizing UI test automation for large-scale web applications. The test target is the HHAexchange homecare management web application, which seamlessly connects providers, state Medicaid programs, managed care organizations (MCOs), and caregivers through one platform with large-scale functionality. This study focuses on user interface automation testing for the web application. The quality assurance team must execute many manual user interface test cases during the development process to confirm that there are no regression bugs. The team automated 346 test cases, and the UI automation test execution time exceeded 17 hours. The business requirement was to reduce the execution time in order to release high-quality products quickly, so the quality assurance automation team modernized the test automation framework to optimize it. The base of the web UI automation test environment is Selenium, and the test code is written in Python. Adopting a compiled language for test code leads to an inefficient workflow when introducing scalability into a traditional test automation environment, so a scripting language was adopted. The scalability is implemented mainly with AWS's serverless technology, Elastic Container Service (ECS). Scalability here means the ability to automatically set up machines for test automation and to increase or decrease the number of machines running those tests, so that test cases can run in parallel and test execution time is dramatically decreased; a minimal sketch of this parallel fan-out follows below. Introducing scalable test automation does more than reduce test execution time: challenging bugs such as race conditions may also be detected, since test cases can be executed at the same time. If API and unit tests are implemented, test strategies can be adopted more efficiently for this scalability testing. However, in web applications, as a practical matter, API and unit testing cannot cover 100% of functional testing, since they do not reach the front-end code. This study applied a scalable UI automation testing strategy to the large-scale homecare management system and confirmed the optimization of test case execution time and the detection of a challenging bug. The study first describes the detailed architecture of the scalable test automation environment, then the actual reduction in execution time and an example of challenging issue detection.
Keywords: aws, elastic container service, scalability, serverless, ui automation test
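A minimal sketch of the parallel fan-out idea mentioned above, using Python's concurrent.futures against a Selenium Grid endpoint; the grid URL, application URL, and test names are placeholders, and the ECS provisioning layer of the actual framework is not shown.

```python
from concurrent.futures import ThreadPoolExecutor
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

# Placeholder endpoint: in the scalable setup, each browser node would be
# provisioned on AWS ECS; here we simply point at a generic Grid URL.
GRID_URL = "http://selenium-grid.example.internal:4444/wd/hub"
TESTS = ["test_login", "test_create_patient", "test_schedule_visit"]  # hypothetical

def run_test(name: str) -> str:
    options = Options()
    options.add_argument("--headless=new")
    driver = webdriver.Remote(command_executor=GRID_URL, options=options)
    try:
        driver.get("https://app.example.com")  # placeholder application URL
        # ... the UI steps of the named test case would go here ...
        return f"{name}: PASS"
    except Exception as exc:
        return f"{name}: FAIL ({exc})"
    finally:
        driver.quit()

# Fan the test cases out across workers: wall-clock time approaches the
# longest single test instead of the sum of all tests.
with ThreadPoolExecutor(max_workers=len(TESTS)) as pool:
    for result in pool.map(run_test, TESTS):
        print(result)
```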
Procedia PDF Downloads 108
423 Determining the Thermal Performance and Comfort Indices of a Naturally Ventilated Room with Reduced Density Reinforced Concrete Wall Construction over Conventional M-25 Grade Concrete
Authors: P. Crosby, Shiva Krishna Pavuluri, S. Rajkumar
Abstract:
Purpose: Occupied built-up space can be broadly classified as air-conditioned or naturally ventilated. Regardless of the building type, the objective of all occupied built-up space is to provide a thermally acceptable environment for human occupancy. Air-conditioned spaces allow a greater degree of flexibility to control and modulate the comfort parameters during the operation phase. In the case of naturally ventilated space, however, a number of design features favoring indoor thermal comfort must be conceptualized from the design phase onwards. One such primary design feature that must be prioritized is the selection of the building envelope material, as it decides the flow of energy from the outside environment into the occupied space. Research Methodology: In India and many countries across the globe, the standard material for the building envelope is reinforced concrete (i.e., M-25 grade concrete). The comfort inside an RC built environment in a warm and humid climate (i.e., mid-day temperatures of 30-35˚C, diurnal variation of 5-8˚C, and RH of 70-90%) is unsatisfactory, to say the least. This study focuses on the impact of the mix design of conventional M-25 grade concrete on indoor comfort. In this mix design, air entrainment is introduced to reduce the density of the M-25 grade concrete to the range of 2000 to 2100 kg/m³. Thermal performance parameters and indoor comfort indices are analyzed for the proposed mix and compared with conventional M-25 grade concrete. Diverse methodologies govern indoor comfort calculation; in this study, three approaches are adopted to calculate comfort: a) the Indian Adaptive Thermal Comfort model, b) the Tropical Summer Index (TSI), and c) air temperature below 33˚C with RH below 70% (a simple check of this criterion is sketched below). The data required for the thermal comfort study were acquired by field measurement (for the new mix design) and by simulation using DesignBuilder (for the conventional concrete grade). Findings: The analysis indicates that the Tropical Summer Index is more stringent in determining the occupant comfort band, while also providing leverage in the thermally tolerable band over and above the other methodologies in the context of the study. Another important finding is that the new mix design ensures a 10% reduction in indoor air temperature (IAT) relative to the outdoor dry bulb temperature (ODBT) during the day. This translates to a significant temperature difference of 6˚C between IAT and ODBT.
Keywords: Indian adaptive thermal comfort, indoor air temperature, thermal comfort, tropical summer index
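A simple encoding of comfort criterion (c) and the IAT-versus-ODBT reduction, evaluated on hypothetical hourly readings; this is only the threshold check from the abstract, not the Indian Adaptive model or the TSI calculation.

```python
# Hypothetical hourly readings: (indoor air temp C, relative humidity %, outdoor DBT C).
readings = [(30.5, 65, 34.0), (31.2, 68, 34.8), (32.8, 72, 35.0)]

for iat, rh, odbt in readings:
    comfortable = iat < 33.0 and rh < 70.0   # criterion (c) from the study
    reduction = (odbt - iat) / odbt * 100.0  # % reduction of IAT vs ODBT
    print(f"IAT={iat}C RH={rh}% -> comfortable={comfortable}, "
          f"reduction vs ODBT={reduction:.1f}%")
```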
Procedia PDF Downloads 321
422 Automated Building Internal Layout Design Incorporating Post-Earthquake Evacuation Considerations
Authors: Sajjad Hassanpour, Vicente A. González, Yang Zou, Jiamou Liu
Abstract:
Earthquakes pose a significant threat to both structural and non-structural elements in buildings, putting human lives at risk. Effective post-earthquake evacuation is critical for ensuring the safety of building occupants. However, current design practices often neglect the integration of post-earthquake evacuation considerations into the early-stage architectural design process. To address this gap, this paper presents a novel automated internal architectural layout generation tool that optimizes post-earthquake evacuation performance. The tool takes an initial plain floor plan as input, along with specific requirements from the user/architect, such as minimum room dimensions, corridor width, and exit lengths. Based on these inputs, firstly, the tool randomly generates different architectural layouts. Secondly, the human post-earthquake evacuation behaviour will be thoroughly assessed for each generated layout using the advanced Agent-Based Building Earthquake Evacuation Simulation (AB2E2S) model. The AB2E2S prototype is a post-earthquake evacuation simulation tool that incorporates variables related to earthquake intensity, architectural layout, and human factors. It leverages a hierarchical agent-based simulation approach, incorporating reinforcement learning to mimic human behaviour during evacuation. The model evaluates different layout options and provides feedback on evacuation flow, time, and possible casualties due to earthquake non-structural damage. By integrating the AB2E2S model into the automated layout generation tool, architects and designers can obtain optimized architectural layouts that prioritize post-earthquake evacuation performance. Through the use of the tool, architects and designers can explore various design alternatives, considering different minimum room requirements, corridor widths, and exit lengths. This approach ensures that evacuation considerations are embedded in the early stages of the design process. In conclusion, this research presents an innovative automated internal architectural layout generation tool that integrates post-earthquake evacuation simulation. By incorporating evacuation considerations into the early-stage design process, architects and designers can optimize building layouts for improved post-earthquake evacuation performance. This tool empowers professionals to create resilient designs that prioritize the safety of building occupants in the face of seismic events.
Keywords: agent-based simulation, automation in design, architectural layout, post-earthquake evacuation behavior
Procedia PDF Downloads 105
421 A Semi-supervised Classification Approach for Trend Following Investment Strategy
Authors: Rodrigo Arnaldo Scarpel
Abstract:
Trend following is a widely accepted investment strategy that adopts a rule-based trading mechanism: rather than striving to predict market direction or relying on information gathering, it decides when to buy and when to sell a stock. Thus, in trend following, one must respond to market movements that have recently happened and that are currently happening, rather than to what will happen. Optimally, the trend following strategy catches a bull market at its early stage, rides the trend, and liquidates the position at the first evidence of the subsequent bear market. To apply the trend following strategy, one needs to find the trend and identify trade signals. In order to avoid false signals, i.e., to identify short-, mid-, and long-term fluctuations and to separate noise from real changes in the trend, most academic works rely on moving averages and other technical analysis indicators, such as the moving average convergence divergence (MACD) and the relative strength index (RSI), to uncover intelligible stock trading rules that follow the trend following philosophy. Recently, some works have applied machine learning techniques to trade rule discovery. In those works, the process of rule construction is based on evolutionary learning, which aims to adapt the rules to the current environment and searches for the globally optimal rules in the search space. In this work, instead of focusing on machine learning techniques for creating trading rules, a time series trend classification employing a semi-supervised approach was used to identify early both the beginning and the end of upward and downward trends. Such a classification model can be employed to identify trade signals, and the decision-making procedure is that if an up-trend (down-trend) is identified, a buy (sell) signal is generated. Semi-supervised learning is used for model training when only part of the data is labeled, and semi-supervised classification aims to train a classifier from both the labeled and unlabeled data such that it performs better than a supervised classifier trained only on the labeled data. To illustrate the proposed approach, daily trading information was employed, including the open, high, low, and closing values and volume, from January 1, 2000 to December 31, 2022, for the São Paulo Exchange Composite Index (IBOVESPA). Over this period, consistent changes in price, upwards or downwards, were visually identified for assigning labels, leaving the remaining days (with no consistent change in price) unlabeled. For training the classification model, a pseudo-label semi-supervised learning strategy was used, employing different technical analysis indicators; a minimal sketch of this strategy follows below. In this learning strategy, the core idea is to use unlabeled data to generate pseudo-labels for supervised training. For evaluating the achieved results, the annualized return and excess return and the Sortino and Sharpe ratios were considered. Over the evaluated time period, the obtained results were very consistent and can be considered promising for generating the intended trading signals.
Keywords: evolutionary learning, semi-supervised classification, time series data, trading signals generation
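A compact sketch of the pseudo-label strategy, assuming scikit-learn's SelfTrainingClassifier over moving-average features computed on synthetic prices; the feature set, labeling thresholds, and confidence threshold are illustrative, not the paper's exact configuration.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.semi_supervised import SelfTrainingClassifier

# Synthetic daily closes standing in for the IBOVESPA series (2000-2022).
rng = np.random.default_rng(0)
df = pd.DataFrame({"close": 100 + np.cumsum(rng.normal(0, 1, 1500))})

# Technical-indicator features: short/long moving averages and momentum.
df["ma_short"] = df["close"].rolling(10).mean()
df["ma_long"] = df["close"].rolling(50).mean()
df["momentum"] = df["close"].pct_change(20)
df = df.dropna()
X = df[["ma_short", "ma_long", "momentum"]].to_numpy()

# Labels: 1 = up-trend, 0 = down-trend, -1 = unlabeled (scikit-learn's
# semi-supervised convention). Only days with a clearly consistent forward
# 20-day move get a label, standing in for the visual labeling step.
fwd20 = df["close"].pct_change(20).shift(-20)
y = np.full(len(df), -1)
y[fwd20 > 0.05] = 1
y[fwd20 < -0.05] = 0

# Self-training: confident predictions on unlabeled days become pseudo-labels
# that are fed back into supervised training on later iterations.
model = SelfTrainingClassifier(RandomForestClassifier(n_estimators=200),
                               threshold=0.8)
model.fit(X, y)
print("trend class for the last day:", model.predict(X[-1:]))  # 1=buy, 0=sell
```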
Procedia PDF Downloads 90
420 Sustainability in Higher Education: A Case of Transition Management from a Private University in Turkey (Ongoing Study)
Authors: Ayse Collins
Abstract:
The Agenda 2030 puts Higher Education Institutions (HEIs) in the situation where they should emphasize ways to promote sustainability accordingly. However, it is still unclear: a) how sustainability is understood, and b) which actions have been taken in both discourse and practice by HEIs regarding the three pillars of sustainability, society, environment, and economy. There are models of sustainable universities developed by different authors from different countries; For Example, The Global Reporting Initiative (GRI) methodology which offers a variety of indicators to diagnose performance. However, these models have never been developed for universities in particular. Any model, in this sense, cannot be completed adequately without defining the appropriate tools to measure, analyze and control the performance of initiatives. There is a need to conduct researches in different universities from different countries to understand where we stand in terms of sustainable higher education. Therefore, this study aims at exploring the actions taken by a university in Ankara, Turkey, since Agenda 2030 should consider localizing its objectives and targets according to a certain geography. This university just announced 2021-2022 as “Sustainability Year.” Therefore, this research is a multi-methodology longitudinal study and uses the theoretical framework of the organization and transition management (TM). It is designed to examine the activities as being strategic, tactical, operational, and reflexive in nature and covers the six main aspects: academic community, administrative staff, operations and services, teaching, research, and extension. The preliminary research will answer the role of the top university governance, perception of the stakeholders (students, instructors, administrative and support staff) regarding sustainability, and the level of achievement at the mid-evaluation and final, end of year evaluation. TM Theory is a multi-scale, multi-actor, process-oriented approach with the analytical framework to explore and promote change in social systems. Therefore, the stages and respective methodology for collecting data in this research is: Pre-development Stage: a) semi-structured interviews with university governance, c) open-ended survey with faculty, students, and administrative staff d) Semi-structured interviews with support staff, and e) analysis of current secondary data for sustainability. Take-off Stage: a) semi-structured interviews with university governance, faculty, students, administrative and support staff, b) analysis of secondary data. Breakthrough stabilization a) survey with all stakeholders at the university, b) secondary data analysis by using selected indicators for the first sustainability report for universities The findings from the predevelopment stage highlight how stakeholders, coming from different faculties, different disciplines with different identities and characteristics, face the sustainability challenge differently. Though similar sustainable development goals ((social, environmental, and economic) are set in the institution, there are differences across disciplines and among different stakeholders, which need to be considered to reach the optimum goal. 
It is believed that the results will help HEIs change their organizational culture to embed sustainability values in their strategic planning and academic and managerial work, devoting enough time and resources to cope with sustainability successfully. Keywords: higher education, sustainability, sustainability auditing, transition management
Procedia PDF Downloads 109
419 Characterization of DOTA-Girentuximab Conjugates for Radioimmunotherapy
Authors: Tais Basaco, Stefanie Pektor, Josue A. Moreno, Matthias Miederer, Andreas Türler
Abstract:
Radiopharmaceuticals based on monoclonal antibodies (mAbs) functionalized via chemical linkers have become a potential tool in nuclear medicine because of their specificity and the large variability and availability of therapeutic radiometals. It is important to identify the conjugation sites and the number of chelators attached to the mAb in order to obtain radioimmunoconjugates with the required immunoreactivity and radiostability. The girentuximab antibody (G250) is a potential candidate for radioimmunotherapy of clear cell renal cell carcinomas (RCCs) because it is reactive with the CAIX antigen, a transmembrane glycoprotein overexpressed on the cell surface of most (>90%) RCCs. G250 was conjugated with the bifunctional chelating agent DOTA (1,4,7,10-tetraazacyclododecane-N,N',N'',N'''-tetraacetic acid) via an isothiocyanatobenzyl linker (p-SCN-Bn-DOTA). DOTA-G250 conjugates were analyzed by size exclusion chromatography (SE-HPLC) and by electrophoresis (SDS-PAGE). Potential site-specific conjugation was identified by liquid chromatography-tandem mass spectrometry (LC-MS/MS), and the number of linkers per molecule of mAb was calculated from the molecular weight (MW) measured by matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS). The average number obtained for the conjugates under non-reducing conditions was 8-10 molecules of DOTA per molecule of mAb. Under reducing conditions, the averages were 1-2 and 3-4 molecules of DOTA per molecule of mAb on the light chain (LC) and heavy chain (HC), respectively. Potential DOTA modification sites were identified at lysine residues. The biological activity of the conjugates was evaluated by flow cytometry (FACS) using CAIX-negative (SK-RC-18) and CAIX-positive (SK-RC-52) cells. The DOTA-G250 conjugates were labelled with 177Lu with a radiochemical yield >95%, reaching specific activities of 12 MBq/µg. The in vitro stability of different types of radioconstructs was analyzed in human serum albumin (HSA). The radiostability of 177Lu-DOTA-G250 at high specific activity was increased by the addition of sodium ascorbate after labelling. The immunoreactivity was evaluated in vitro and in vivo. Binding to CAIX-positive cells (SK-RC-52) at different specific activities was higher for conjugates with lower DOTA content. Protein dose was optimized in mice with subcutaneously growing SK-RC-52 tumors using different amounts of 177Lu-DOTA-G250. Keywords: mass spectrometry, monoclonal antibody, radiopharmaceuticals, radioimmunotherapy, renal cancer
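The chelator-to-antibody ratio reported above follows directly from the MALDI-TOF mass shift. The short sketch below illustrates the arithmetic only; the linker mass (~552 Da for p-SCN-Bn-DOTA, added in full because the isothiocyanate-amine coupling is an addition reaction) and the example masses are assumptions for illustration, not the study's measured values.

```python
MW_LINKER = 551.6   # Da, p-SCN-Bn-DOTA free base (assumed)

def chelators_per_mab(mw_conjugate: float, mw_naked: float,
                      mw_linker: float = MW_LINKER) -> float:
    """Average number of chelators per antibody (or per chain),
    estimated from the MALDI-TOF mass shift after conjugation."""
    return (mw_conjugate - mw_naked) / mw_linker

# Illustrative masses only (Da): an intact IgG and its light chain.
print(round(chelators_per_mab(151_500.0, 146_500.0), 1))  # ~9 DOTA per mAb
print(round(chelators_per_mab(24_200.0, 23_400.0), 1))    # ~1.5 DOTA per LC
```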
Procedia PDF Downloads 309
418 The End Justifies the Means: Using Programmed Mastery Drill to Teach Spoken English to Spanish Youngsters, without Relying on Homework
Authors: Robert Pocklington
Abstract:
Most current language courses expect students to be 'vocational', sacrificing their free time in order to learn. However, pupils with a full-time job, or bringing up children, hardly have a spare moment. Others just need the language as a tool or a qualification, as if it were book-keeping or a driving license. Then there are children in unstructured families whose stressful life makes private study almost impossible, and the countless parents whose evenings and weekends have become a nightmare of trying to get the children to do their homework. There are many arguments against homework being a necessity (rather than an optional extra for more ambitious or dedicated students), making a clear case for teaching methods which facilitate full learning of the key content within the classroom. A methodology which could be described as Programmed Mastery Learning has been used at Fluency Language Academy (Spain) since 1992 to teach English to over 4000 pupils yearly, with a staff of around 100 teachers, barely requiring homework. The course is structured according to the tenets of Programmed Learning: small manageable teaching steps, immediate feedback, and constant successful activity. For the Mastery component (not stopping until everyone has learned), the memorisation and practice are entrusted to flashcard-based drilling in the classroom, leading all students to progress together and develop a permanently growing knowledge base. Vocabulary and expressions are memorised using flashcards as stimuli, obliging the brain to constantly recover words from long-term memory and converting them into reflex knowledge before they are deployed in sentence building. The use of grammar rules is practised with 'cue' flashcards: the brain refers consciously to the grammar rule each time it produces a phrase until it comes easily. This automation of lexicon and correct grammar use greatly facilitates all other language and conversational activities. The full B2 course consists of 48 units, each of which takes a class an average of 17.5 hours to complete, allowing the vast majority of students to reach B2 level in 840 class hours, which is corroborated by an 85% pass rate in the Cambridge University B2 exam (First Certificate). In the past, studying for qualifications was just one of many options open to young people. Nowadays, youngsters need to stay at school and obtain qualifications in order to get any kind of job. There are many students in our classes who have little intrinsic interest in what they are studying; they just need the certificate. In these circumstances, and with increasing government pressure to minimise failure, teachers can no longer think 'If they don't study, and fail, it's their problem.' It is now becoming the teacher's problem. Teachers are ever more in need of methods which make their pupils successful learners; this means assuring learning in the classroom. Furthermore, homework is arguably the main divider between successful middle-class schoolchildren and failing working-class children who drop out: if everything important is learned at school, the latter will have a much better chance, favouring inclusiveness in the language classroom. Keywords: flashcard drilling, fluency method, mastery learning, programmed learning, teaching English as a foreign language
Procedia PDF Downloads 110
417 Owning (up to) the 'Art of the Insane': Re-Claiming Personhood through Copyright Law
Authors: Mathilde Pavis
Abstract:
From Schumann to Van Gogh, Frida Kahlo, and Ray Charles, the stories narrating the careers of artists with physical or mental disabilities are becoming increasingly popular. From the emergence of 'pathography' at the end of the 18th century to cinematographic portrayals, the work and lives of differently-abled creative individuals continue to fascinate readers, spectators and researchers. The achievements of those artists form the tip of an iceberg composed of complex politico-cultural movements which continue to advocate for wider recognition of disabled artists' contribution to western culture. This paper envisages copyright law as a potential tool to that end. It investigates the array of rights available to artists with intellectual disabilities to assert their position as authors of their artwork in the twenty-first century, looking at international and national copyright laws (UK and US). Put simply, this paper questions whether an artist's intellectual disability can be a barrier to asserting their intellectual property rights over their creation. From a legal perspective, basic principles of non-discrimination would contradict the representation of an artist's disability as an obstacle to authorship as granted by intellectual property laws. Yet empirical studies reveal that artists with intellectual disabilities are often denied the opportunity to exercise their intellectual property rights or any form of agency over their work. In practice, it appears that, unlike other non-disabled artists, the prospect for differently-abled creators to make use of their rights is contingent on the context in which the creative process takes place. The management of such rights will often rest with the institution, art therapist or mediator involved in the artist's work, as the latter will have necessitated greater support than their non-disabled peers for a variety of reasons, either medical or practical. Moreover, the financial setbacks suffered by medical institutions and private therapy practices have renewed administrators' and physicians' interest in monetising the artworks produced under their supervision. Adding to those economic incentives, the rise of criminal and civil litigation in psychiatric cases has also encouraged the retention of patients' work by therapists who feel compelled to keep comprehensive medical records to shield themselves from liability in the event of a lawsuit. Unspoken transactions, contracts, implied agreements and consent forms have thus progressively made their way into the relationship between those artists and their therapists or assistants, disregarding any notion of copyright. The question of artists' authorship finds itself caught in an unusually multi-faceted web of issues formed by tightening purse strings, ethical concerns and the fear of civil or criminal liability. Whilst those issues are playing out behind closed doors, the popularity of what was once called the 'Art of the Insane' continues to grow and open new commercial avenues. This socio-economic context exacerbates the need to devise a legal framework able to help practitioners, artists and their advocates navigate these issues in such a way that neither this minority nor our cultural heritage suffers from the fragmentation of the legal protection available to them. Keywords: authorship, copyright law, intellectual disabilities, art therapy and mediation
Procedia PDF Downloads 150
416 Rethinking the Languages for Specific Purposes Syllabus in the 21st Century: Topic-Centered or Skills-Centered
Authors: A. Knezović
Abstract:
The 21st century has transformed the labor market landscape, posing new and different demands on university graduates as well as university lecturers: the knowledge and academic skills students acquire in the course of their studies should be applicable and transferable from the higher education context to their future professional careers. In the context of the Languages for Specific Purposes (LSP) classroom, the teacher's objective is not only to teach the language itself, but also to prepare students to use that language as a medium to develop generic skills and competences. These include media and information literacy, critical and creative thinking, problem-solving and analytical skills, effective written and oral communication, as well as collaborative work and social skills, all of which are necessary to make university graduates more competitive in everyday professional environments. On the other hand, due to limitations of time and large numbers of students in classes, the frequently topic-centered syllabus of LSP courses places considerable focus on acquiring the subject matter and specialist vocabulary instead of sufficiently developing the skills and competences required by students' prospective employers. This paper intends to explore some of those issues as viewed both by LSP lecturers and by business professionals in their respective surveys. The surveys were conducted among more than 50 LSP lecturers at higher education institutions in Croatia, more than 40 HR professionals, and more than 60 university graduates with degrees in economics and/or business working in management positions, mainly in large and medium-sized companies in Croatia. Various elements of LSP course content were taken into consideration in this research, including reading and listening comprehension of specialist texts, acquisition of specialist vocabulary and grammatical structures, as well as presentation and negotiation skills. The ability to hold meetings, conduct business correspondence, write reports, academic texts and case studies, and take part in debates was also considered, as well as informal business communication, business etiquette and core courses delivered in a foreign language. The results of the surveys conducted among LSP lecturers will be analyzed with reference to the extent to which those elements are included in their courses and how consistently and thoroughly they are evaluated according to their course requirements. Their opinions will be compared to the results of the surveys conducted among professionals from a range of industries in Croatia so as to examine how useful and important they perceive the same elements of LSP course content to be in their working environments. Such comparative analysis will thus show to what extent the syllabi of LSP courses meet the demands of the employment market when it comes to students' language skills and competences, as well as transferable skills. Finally, the findings will also be compared to observations based on practical teaching experience and the relevant sources used in this research. In conclusion, the ideas and observations in this paper are merely open-ended questions that do not have conclusive answers but might prompt LSP lecturers to re-evaluate the content and objectives of their course syllabi. Keywords: languages for specific purposes (LSP), language skills, topic-centred syllabus, transferable skills
Procedia PDF Downloads 308
415 Usability Assessment of a Bluetooth-Enabled Resistance Exercise Band among Young Adults
Authors: Lillian M. Seo, Curtis L. Petersen, Ryan J. Halter, David Kotz, John A. Batsis
Abstract:
Background: Resistance-based exercises effectively enhance muscle strength, which is especially important in older populations as it reduces the risk of disability. Our group developed a Bluetooth-enabled handle for resistance exercise bands that wirelessly transmits relative force data through low-energy Bluetooth to a local smartphone or similar device. The system has the potential to measure home-based exercise interventions, allowing health professionals to monitor compliance. Its feasibility has already been demonstrated in both clinical and field-based settings, but it remained unclear whether the system's usability persisted upon repeated use. The current study sought to assess the usability of this system, and its users' satisfaction with repeated use, by deploying the device among younger adults to gather formative information that can ultimately improve the device's design for older adults. Methods: A usability study was conducted in which 32 participants used the above system. Participants executed 10 repetitions of four commonly performed exercises: bicep flexion, shoulder abduction, elbow extension, and triceps extension. Each completed three exercise sessions, separated by at least 24 hours to minimize muscle fatigue. At the study's conclusion, subjects completed an adapted version of the usefulness, satisfaction, and ease (USE) questionnaire, assessing the system across four domains: usefulness, satisfaction, ease of use, and ease of learning. The 20-item questionnaire examined how strongly a participant agrees with positive statements about the device on a seven-point Likert scale, with one representing 'strongly disagree' and seven representing 'strongly agree.' Participants' data were aggregated to calculate mean response values for each question and domain, effectively assessing the device's performance across different facets of the user experience. Summary force data were visualized using a custom web application. Finally, an optional prompt at the end of the questionnaire allowed for written comments and feedback from participants to elicit qualitative indicators of usability. Results: Of the n=32 participants, 13 (41%) were female; their mean age was 32.4 ± 11.8 years, and no participants had a physical impairment. No question received a mean score below 5 out of 7. The four domains' mean scores were: usefulness 5.66 ± 0.35; satisfaction 6.23 ± 0.06; ease of use 6.25 ± 0.43; and ease of learning 6.50 ± 0.19. Representative quotes from the open-ended feedback include: 'A non-rigid strap-style handle might be useful for some exercises,' and, 'Would need different bands for each exercise as they use different muscle groups with different strength levels.' General impressions were favorable, supporting the expectation that the device would be a useful tool in exercise interventions. Conclusions: A simple usability assessment of a Bluetooth-enabled resistance exercise band supports a consistent and positive user experience among young adults. This study provides adequate formative data, assuring that the next steps can be taken to continue testing and development for the target population of older adults. Keywords: Bluetooth, exercise, mobile health, mHealth, usability
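The aggregation step described in the Methods can be sketched as follows; the item-to-domain mapping and the placeholder responses are assumptions for illustration, not the study's data.

```python
import numpy as np

# Hypothetical mapping of the 20 USE items to the four domains (assumed).
domains = {
    "usefulness":       range(0, 5),
    "satisfaction":     range(5, 10),
    "ease_of_use":      range(10, 15),
    "ease_of_learning": range(15, 20),
}

# One row per participant, one column per item, 7-point Likert (placeholder).
rng = np.random.default_rng(1)
responses = rng.integers(4, 8, size=(32, 20))

for name, items in domains.items():
    per_person = responses[:, list(items)].mean(axis=1)  # domain score per person
    print(f"{name}: {per_person.mean():.2f} ± {per_person.std(ddof=1):.2f}")
```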
Procedia PDF Downloads 117
414 A Bottleneck-Aware Power Management Scheme in Heterogeneous Processors for Web Apps
Authors: Inyoung Park, Youngjoo Woo, Euiseong Seo
Abstract:
With the advent of WebGL, Web apps are now able to provide high-quality graphics by utilizing the underlying graphics processing units (GPUs). Although Web apps are becoming common and popular, the current power management schemes, which were devised for conventional native applications, are suboptimal for Web apps because of the additional layer, the Web browser, between the OS and the application. The Web browser, running on the CPU, issues GL commands to the GPU for rendering the images displayed by the currently running Web app, and the GPU processes them. The size and number of issued GL commands determine the processing load of the GPU. While the GPU is processing the GL commands, the CPU simultaneously executes the other compute-intensive threads. The actual user experience is determined by either CPU processing or GPU processing, depending on which of the two is the more demanded resource. For example, when the GPU work queue is saturated by outstanding commands, lowering the performance level of the CPU does not affect the user experience because it is already deteriorated by the retarded execution of GPU commands. Consequently, it is desirable to lower the CPU or GPU performance level to save energy when the other resource is saturated and becomes a bottleneck in the execution flow. Based on this observation, we propose a power management scheme that is specialized for the Web app runtime environment. This approach incurs two technical challenges: identification of the bottleneck resource, and determination of the appropriate performance level for the unsaturated resource. The proposed power management scheme uses the CPU utilization level of the Window Manager to tell which resource, if any, is the bottleneck. The Window Manager draws the final screen using the processed results delivered from the GPU. Thus, the Window Manager is on the critical path that determines the quality of the user experience and is executed purely by the CPU. The proposed scheme uses a weighted average of the Window Manager utilization to prevent excessive sensitivity and fluctuation. We classified Web apps into three categories using analysis results that measure frames-per-second (FPS) changes under diverse CPU/GPU clock combinations. The results showed that the capability of the CPU decides the user experience when the Window Manager utilization is above 90%, and consequently, the proposed scheme decreases the performance level of the CPU by one step. On the contrary, when its utilization is less than 60%, the bottleneck usually lies in the GPU, and it is desirable to decrease the performance of the GPU. Even for the processing unit that is not on the critical path, an excessive performance drop may adversely affect the user experience. Therefore, our scheme lowers the frequency gradually until it finds an appropriate level by periodically checking the CPU utilization. The proposed scheme reduced energy consumption by 10.34% on average in comparison to the conventional Linux kernel, while worsening FPS by only 1.07% on average. Keywords: interactive applications, power management, QoS, Web apps, WebGL
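The decision logic of the scheme can be sketched as a simple governor loop. The 90%/60% thresholds and the one-step adjustments come from the abstract; the smoothing weight, polling structure, and performance-level interface are assumptions for illustration.

```python
ALPHA = 0.3              # weight of the newest sample in the average (assumed)
HIGH, LOW = 0.90, 0.60   # Window Manager utilization thresholds (from the abstract)

class BottleneckAwareGovernor:
    def __init__(self, cpu_levels: int, gpu_levels: int):
        self.cpu = cpu_levels - 1        # start both units at the highest level
        self.gpu = gpu_levels - 1
        self.wm_avg = (HIGH + LOW) / 2   # smoothed WM utilization

    def tick(self, wm_util: float) -> None:
        """Called periodically with the Window Manager's CPU utilization."""
        # Weighted averaging prevents reacting to momentary spikes.
        self.wm_avg = ALPHA * wm_util + (1 - ALPHA) * self.wm_avg
        if self.wm_avg > HIGH:
            # Above 90%, CPU capability decides the user experience;
            # the scheme steps the CPU performance level down by one.
            self.cpu = max(self.cpu - 1, 0)
        elif self.wm_avg < LOW:
            # Below 60%, the bottleneck usually lies in the GPU; lower its
            # frequency gradually, one step per period, to find a fit level.
            self.gpu = max(self.gpu - 1, 0)
        # Between the thresholds, hold the current levels.

gov = BottleneckAwareGovernor(cpu_levels=8, gpu_levels=6)
for sample in (0.98, 0.98, 0.98, 0.98):  # fake utilization samples
    gov.tick(sample)
print(gov.cpu, gov.gpu)                  # -> 5 5: two gradual CPU down-steps
```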
Procedia PDF Downloads 193
413 Informational Habits and Ideology as Predictors for Political Efficacy: A Survey Study of the Brazilian Political Context
Authors: Pedro Cardoso Alves, Ana Lucia Galinkin, José Carlos Ribeiro
Abstract:
Political participation can be a somewhat tricky subject to define, in no small part due to the constant changes in the concept resulting from the effort to include new forms of participatory behavior that go beyond traditional institutional channels. With the advent of the internet and mobile technologies, defining political participation has become an even more complicated endeavor, given the amplitude of politicized behaviors that are expressed through these mediums, be it in the very organization of social movements, in the propagation of politicized texts, videos and images, or in the micropolitical behaviors that are expressed in daily interaction. In fact, the very frontiers that delimit physical and digital spaces have become ever more diluted due to technological advancements, leading to a hybrid existence that is simultaneously physical and digital, no longer limited, as it once was, by the temporal constraints of classic communications. Moving away from the institutionalized actions of traditional political behavior, an idea of constant and fluid participation, which occurs in our daily lives through conversations, posts, tweets and other digital forms of expression, is discussed. This discussion focuses on the factors that precede more direct forms of political participation, interpreting the relation between informational habits, ideology, and political efficacy. Though some informational habits can be considered political participation by some authors, a distinction is made to establish a logical flow of behaviors leading to participation; that is, one must gather and process information before acting on it. To reach this objective, a quantitative survey is currently being applied on Brazilian social media, evaluating feelings of political efficacy, social and economic issue-based ideological stances, and informational habits pertaining to collection, fact-checking, and the diversity of sources and ideological positions present in the participant's political information network. The measure being used for informational habits relies strongly on a mix of information literacy and political sophistication concepts, bringing a more up-to-date understanding of information and knowledge production and processing in contemporary hybrid (physical-digital) environments. Though data is still being collected, preliminary analysis points towards a strong correlation between informational habits and political efficacy, while ideology shows a weaker influence over efficacy. Moreover, social ideology and economic ideology seem to have a strong correlation in the sample; such intermingling between social and economic ideals is generally considered a red flag for political polarization. Keywords: political efficacy, ideology, information literacy, cyberpolitics
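A minimal sketch of the kind of correlation analysis the abstract alludes to, run here on synthetic placeholder scores since the study's data are still being collected; the variable names and effect sizes are assumptions, not findings.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n = 300                                   # hypothetical respondents

habits = rng.normal(0, 1, n)              # informational-habits composite
efficacy = 0.6 * habits + rng.normal(0, 0.8, n)   # political efficacy score
soc = rng.normal(0, 1, n)                 # social-issue ideology
eco = 0.7 * soc + rng.normal(0, 0.7, n)   # economic-issue ideology

r1, p1 = pearsonr(habits, efficacy)
r2, p2 = pearsonr(soc, eco)
print(f"habits vs. efficacy:          r = {r1:.2f} (p = {p1:.2g})")
print(f"social vs. economic ideology: r = {r2:.2f} (p = {p2:.2g})")
```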
Procedia PDF Downloads 235
412 Survey for Mango Seed Weevils and Pulp Weevil Sternochetus Species (Coleoptera:Curculionidae) on Mango, Mangifera indica in Shan State-South, Myanmar
Authors: Khin Nyunt Yee, Mu Mu Thein
Abstract:
A detection survey of mango seed and pulp weevils was undertaken in major mango production areas, Yat Sauk, Taunggyi, Nyaung Shwe and Hopong Townships, in Shan State (South) of Myanmar on two mango cultivars, Sein Ta Lone and Yinkwe, from May to August 2016, coinciding with the fruiting season, to assess the weevil populations. A total of 6,300 fruits of the two cultivars were sampled. Of these, 2,900 Sein Ta Lone fruits were collected from 5,674 fruit-bearing trees in five well-managed orchards, one unmanaged orchard and urban areas of Yat Sauk Township, 400 fruits from one well-managed orchard in Taunggyi Township, 400 fruits from two managed orchards in Nyaung Shwe Township, and 400 fruits from one managed orchard in Hopong Township, from May to June. For the Yinkwe cultivar, 2,200 fruits were collected from 4,043 fruit-bearing trees in four well-managed orchards, one unmanaged orchard and one wild tree, all in Yat Sauk Township, from July to August 2016. The minimum sample size was 200 fruits per orchard or per wild/volunteer tree. The pulp of each randomly sampled fruit was cut open longitudinally into three slices on each side of the fruit, and the seed was cut longitudinally, to inspect for the presence of mango weevils. The collected weevils were identified to species level at the Plant Quarantine Laboratory, Plant Protection Division, Department of Agriculture, Ministry of Agriculture, Livestock and Irrigation, Yangon, Myanmar. Mango pulp and seed weevils were found on the Sein Ta Lone cultivar in three of the four surveyed townships (all except Hopong), with infestation levels ranging from 0.0% to 3.5% of fruits per township and from 0.0% to 39.0% of fruits per orchard. The highest infestation rate per township was 3.5% of fruits (n=400) in Nyaung Shwe; at Yat Sauk, the rate was 2.47% (n=2,900). A well-managed orchard at Taunggyi had 0.75% (n=400), whereas Hopong was free of infestation (0.0%, n=400). The weevils were also recorded on the Yinkwe cultivar in Yat Sauk Township, where the infestation level was 12.63% of fruits (n=2,200), with 0.0% to 67.0% of fruits infested per orchard. These high levels reflect the inclusion of orchards under no integrated pest management (non-IPM) in both surveys, which showed infestation rates of 63.0% (n=200) and 67.0% (n=200) of fruits, respectively, the latter on the Yinkwe cultivar. Two species of the family Curculionidae (order Coleoptera) were recorded: the mango pulp weevil, Sternochetus frigidus, and the mango seed weevil, Sternochetus olivieri (Faust). Sternochetus mangiferae was not found during these surveys. Three developmental stages of the weevils (larva, pupa and adult) were detected from the first survey in the third week of May, and mostly adult stages were recorded in the following surveys in June, July and August. The number of mango pulp weevils was statistically higher than that of mango seed weevils at P < 0.001. More precise surveys should be carried out nationwide to detect the mango weevils. Keywords: mango pulp weevil Sternochetus frigidus, mango seed weevil Sternochetus olivieri (Faust), Sternochetus mangiferae (Fabricius), Sein Ta Lone and Yinkwe mango cultivars, Shan State (South), Myanmar
Procedia PDF Downloads 307
411 Multi-Objective Optimization of the Thermal-Hydraulic Behavior for a Sodium Fast Reactor with a Gas Power Conversion System and a Loss of off-Site Power Simulation
Authors: Avent Grange, Frederic Bertrand, Jean-Baptiste Droin, Amandine Marrel, Jean-Henry Ferrasse, Olivier Boutin
Abstract:
CEA and its industrial partners are designing a gas Power Conversion System (PCS) based on a Brayton cycle for the ASTRID sodium-cooled fast reactor. Investigations of the control and regulation requirements for operating this PCS during operational, incidental and accidental transients are necessary to adapt core heat removal. To this aim, we developed a methodology to optimize the thermal-hydraulic behavior of the reactor during normal operation, incidents and accidents. This methodology consists of a multi-objective optimization for a specific sequence, whose aim is to increase component lifetime by simultaneously reducing several thermal stresses and bringing the reactor into a stable state. Furthermore, the multi-objective optimization complies with safety and operating constraints. Operational, incidental and accidental sequences use specific regulations to control the thermal-hydraulic behavior of the reactor, each of them defined by a setpoint, a controller and an actuator. In the multi-objective problem, the parameters used to solve the optimization are the setpoints and the settings of the controllers associated with the regulations included in the sequence. In this way, the methodology allows designers to define an optimized, sequence-specific control strategy for the plant and hence to adapt PCS piloting at its best. The multi-objective optimization is performed by evolutionary algorithms coupled to surrogate models built on variables computed by the thermal-hydraulic system code CATHARE2. The methodology is applied to a loss of off-site power sequence. Three variables are controlled: the sodium outlet temperature of the sodium-gas heat exchanger, the turbomachine rotational speed, and the water flow through the heat sink. These regulations are chosen in order to minimize thermal stresses on the gas-gas heat exchanger, the sodium-gas heat exchanger and the vessel. The main results of this work are optimal setpoints for the three regulations. Moreover, Proportional-Integral-Derivative (PID) controller settings are considered, and efficient actuators for the controls are chosen based on sensitivity analysis results. Finally, the optimized regulation system and the reactor control procedure provided by the optimization process are verified through a direct CATHARE2 calculation. Keywords: gas power conversion system, loss of off-site power, multi-objective optimization, regulation, sodium fast reactor, surrogate model
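The coupling of surrogate models and evolutionary search described above can be illustrated with a compact sketch. Here a cheap analytic function stands in for CATHARE2, Gaussian-process regressors stand in for the surrogates, and a bare-bones mutation-plus-non-domination loop stands in for a full evolutionary algorithm such as NSGA-II; all names and settings are assumptions, not the authors' toolchain.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(3)

def expensive_simulation(x):
    """Stand-in for CATHARE2: maps regulation parameters (setpoints,
    controller settings) to two thermal-stress objectives to minimize."""
    f1 = np.sum((x - 0.3) ** 2, axis=1)   # e.g. stress on the Na-gas HX
    f2 = np.sum((x + 0.3) ** 2, axis=1)   # e.g. stress on the gas-gas HX
    return np.column_stack([f1, f2])

dim = 4                                   # number of tuned parameters (assumed)
X_train = rng.uniform(-1, 1, (60, dim))   # design of experiments
Y_train = expensive_simulation(X_train)

# One Gaussian-process surrogate per objective, trained on simulator runs.
surrogates = [GaussianProcessRegressor().fit(X_train, Y_train[:, k])
              for k in range(Y_train.shape[1])]

def predict(x):
    return np.column_stack([gp.predict(x) for gp in surrogates])

def pareto_mask(F):
    """True for points not dominated by any other point (minimization)."""
    mask = np.ones(len(F), dtype=bool)
    for i in range(len(F)):
        dominated = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        mask[i] = not dominated.any()
    return mask

# Simple evolutionary loop on the cheap surrogates (NSGA-II stand-in).
pop = rng.uniform(-1, 1, (40, dim))
for _ in range(50):
    children = pop + rng.normal(0, 0.1, pop.shape)   # Gaussian mutation
    union = np.vstack([pop, children])
    front = union[pareto_mask(predict(union))]       # non-dominated survivors
    pop = front[rng.integers(0, len(front), 40)]     # refill the population

print("Approximate Pareto set size:", pareto_mask(predict(pop)).sum())
```

The candidate trade-offs found this way would then be verified by a direct run of the expensive simulator, mirroring the final CATHARE2 verification step described in the abstract.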
Procedia PDF Downloads 309
410 Digital Transformation in Fashion System Design: Tools and Opportunities
Authors: Margherita Tufarelli, Leonardo Giliberti, Elena Pucci
Abstract:
The fashion industry's interest in virtuality is linked, on the one hand, to the emotional and immersive possibilities of digital resources and the resulting languages and, on the other, to the greater efficiency that can be achieved throughout the value chain. The interaction between digital innovation and deep-rooted manufacturing traditions today translates into a paradigm shift for the entire fashion industry where, for example, the traditional values of industrial secrecy and know-how give way to open and participatory experimentation, and to the complete emancipation of virtual reality from actual 'reality'. This contribution aims to investigate the theme of digitisation in the Italian fashion industry, analysing its opportunities and the criticalities that have hindered its diffusion. There are two reasons why the most common approach in the fashion sector is still analogue: (i) the fashion product lives in close contact with the human body, so the sensory perception of materials plays a central role in both the use and the design of the product, but current technology is not able to restore the sense of touch; (ii) volumes are obtained by stitching flat surfaces that, once assembled, given the flexibility of the material, can assume almost infinite configurations. Managing the fit and styling of virtual garments involves a wide range of factors, including mechanical simulation, collision detection, and user interface techniques for garment creation. After briefly reviewing some salient historical milestones in the resolution of problems related to the digital simulation of deformable materials and to the user interface for constructing the clothing system, the paper describes the operation and possibilities offered today by the latest generation of specialised software: parametric avatars and a digital sartorial approach; drawing tools optimised for pattern making; materials, both in terms of simulated physical behaviour and aesthetic performance; tools for checking wearability; renderings; and tools and procedures useful to companies both for dialogue with prototyping software and machinery and for managing the archive and the variants to be made. The article demonstrates how developments in technology and digital procedures now make it possible to intervene at different stages of design in the fashion industry, in an integrated and additive process in which the constructed 3D models are usable both in the prototyping and communication of physical products and in the possible exclusively digital uses of 3D models in the new generation of virtual spaces. Mastering such tools requires the acquisition of specific digital skills alongside traditional skills for the design of the clothing system, but the benefits are manifold and applicable to different business dimensions. We are only at the beginning of the global digital transformation: the emergence of new professional figures and design dynamics leaves room for imagination, but in addition to applying digital tools to traditional procedures, traditional fashion know-how needs to be transferred into emerging digital practices to ensure the continuity of the technical-cultural heritage beyond the transformation. Keywords: digital fashion, digital technology and couture, digital fashion communication, 3D garment simulation
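The mechanical simulation of deformable materials mentioned above is commonly approximated with mass-spring cloth models. The sketch below is a deliberately simplified toy, not the behavior of the commercial tools discussed: a grid of unit-mass particles joined by structural springs, two corners pinned, integrated with damped explicit Euler; real garment software adds implicit solvers, shear and bending springs, and collision detection.

```python
import numpy as np

# Toy mass-spring cloth: n-by-n particles, structural springs only.
n, rest, k, damping, dt = 10, 0.1, 500.0, 0.02, 0.002
g = np.array([0.0, 0.0, -9.81])              # gravity along -z

pos = np.zeros((n, n, 3))
pos[..., 0], pos[..., 1] = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
pos *= rest                                  # flat rest configuration
vel = np.zeros_like(pos)
pinned = [(0, 0), (0, n - 1)]                # two fixed corners

def spring_forces(p):
    """Hooke forces along the horizontal and vertical grid edges."""
    f = np.zeros_like(p)
    for axis in (0, 1):
        d = np.diff(p, axis=axis)            # edge vectors
        length = np.linalg.norm(d, axis=-1, keepdims=True)
        force = k * (length - rest) * d / np.maximum(length, 1e-9)
        head = [slice(None)] * 3; head[axis] = slice(0, -1)
        tail = [slice(None)] * 3; tail[axis] = slice(1, None)
        f[tuple(head)] += force              # pull the near endpoint
        f[tuple(tail)] -= force              # equal and opposite on the far one
    return f

for _ in range(2000):
    acc = spring_forces(pos) + g             # unit mass per particle
    vel = (1 - damping) * vel + dt * acc     # damped explicit Euler step
    pos += dt * vel
    for i, j in pinned:                      # re-impose pin constraints
        pos[i, j] = [i * rest, j * rest, 0.0]
        vel[i, j] = 0.0

print("lowest point after settling:", pos[..., 2].min())
```

Even this toy shows why the flexibility of the material makes the assembled configuration space so large: the draped shape emerges from the solver, not from the flat pattern alone.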
Procedia PDF Downloads 74
409 The Achievements and Challenges of Physics Teachers When Implementing Problem-Based Learning: An Exploratory Study Applied to Rural High Schools
Authors: Osman Ali, Jeanne Kriek
Abstract:
Introduction: The current instructional approach, entrenched in memorization, does not support conceptual understanding in science. Instructional approaches that encourage research, investigation, and experimentation, which depict how scientists work, should be encouraged. One such teaching strategy is problem-based learning (PBL). PBL has many advantages: enhanced self-directed learning and improved problem-solving and critical thinking skills. However, despite its many advantages, PBL has challenges. Research confirms that it is time-consuming and that ill-structured questions are difficult to formulate. Professional development interventions are needed for in-service educators to adopt the PBL strategy. The purposively selected educators had to implement PBL in their classrooms after the intervention to develop their practice and then reflect on the implementation, indicating their achievements and challenges. This study differs from previous studies in that the rural educators implemented PBL in their own classrooms and reflected on their experiences, beliefs, and attitudes regarding PBL. Theoretical Framework: The study is grounded in Vygotskian sociocultural theory. According to Vygotsky, a child's cognitive development is sustained by the interaction between the child and more able peers in their immediate environment. The theory suggests that social interactions in small groups create an opportunity for learners to form concepts and skills on their own better than working individually; PBL emphasizes learning in small groups. Research Methodology: An exploratory case study was employed, as the study did not seek conclusive evidence. Non-probability purposive sampling was adopted to choose eight schools from 89 rural public schools. In each school, two educators teaching physical sciences in grades 10 and 11 were approached (N = 16). The research instruments were questionnaires, interviews, and a lesson observation protocol. Two open-ended questionnaires were developed, administered before and after the intervention, and analyzed thematically; three themes were identified. The semi-structured interview responses were coded and transcribed into three themes. Subsequently, the Reformed Teaching Observation Protocol (RTOP) was adopted for lesson observation and analyzed using five constructs. Results: Evidence from analyzing the questionnaires before and after the intervention shows that, during the implementation, participants knew better what was required to develop an ill-structured problem. Furthermore, indications from the interviews are that participants had positive views about the PBL strategy. They stated that they acted only as facilitators and that learners' problem-solving and critical thinking skills were enhanced. They suggested a change in the curriculum to adopt the PBL strategy. However, most participants may not continue to apply the PBL strategy, stating that it is time-consuming and makes it difficult to complete the Annual Teaching Plan (ATP). They complained about materials and equipment and about learners' readiness to work. Evidence from the RTOP shows that, after the intervention, participants learned to encourage exploration and to use learners' questions and comments to determine the direction and focus of classroom discussions. Keywords: problem-solving, self-directed, critical thinking, intervention
Procedia PDF Downloads 121