Search results for: core/shell
7 Results Concerning the University-Industry Partnership for a Research Project Implementation (MUROS) in the Romanian Program STAR
Authors: Loretta Ichim, Dan Popescu, Grigore Stamatescu
Abstract:
The paper reports on the collaboration between a top university from Romania and three companies for the implementation of a research project in a multidisciplinary domain, focusing on the impact and benefits for both education and industry. The joint activities were developed under the Space Technology and Advanced Research Program (STAR), funded by the Romanian Space Agency (ROSA) for a university-industry partnership. The context was defined by linking the European Space Agency optional programs with the development and promotion of national research and with the educational and industrial capabilities in aeronautics, security and related areas, by increasing the collaboration between academic and industrial entities and by realizing high-level scientific production. The project, named Multisensory Robotic System for Aerial Monitoring of Critical Infrastructure Systems (MUROS), was carried out from 2013 to 2016. It included the University POLITEHNICA of Bucharest (coordinator) and three companies that manufacture and market unmanned aerial systems. The project's main objective was the development of an integrated system combining ground wireless sensor networks and UAV monitoring in various application scenarios for critical infrastructure surveillance. This included specific activities related to fundamental and applied research, technology transfer, prototype implementation and result dissemination. The core contributions lay in distributed data processing and communication mechanisms, advanced image processing and embedded system development. The paper gives special focus to analyzing the impact of the project implementation on the educational process, directly or indirectly, through the faculty members (professors and students) involved in the research team. Three main directions are discussed: a) enabling students to carry out internships at the partner companies, b) handling advanced topics and industry requirements at the master's level, c) experiments and concept validation for doctoral theses. The impact of the research work (as the educational component) developed by the faculty members on improving the performance of the companies' products is highlighted. The collaboration between the university and the companies was well balanced in both contributions and results. The paper also presents the outcomes of the project, which reveal the effective collaboration between higher education and industry: master's theses, doctoral theses, conference papers, journal papers, technical documentation for technology transfer, a prototype, and a patent. The experience can provide useful practices for blending research and education within an academia-industry cooperation framework, while the lessons learned represent a starting point in debating the new role of companies performing advanced research and development in association with higher education. This partnership, promoted at EU level, has a broad impact beyond the constrained scope of a single project and can develop into long-lasting collaboration benefiting all stakeholders: students, universities and the surrounding knowledge-based economic and industrial ecosystem. Due to the exchange of experience between the university (UPB) and the manufacturing company (AFT Design), a new project, SIMUL, was started under the Bridge Grant Program (Romanian executive agency UEFISCDI) for 2016-2017.
This project will continue the educational research for innovation at the master's and doctoral levels on the MUROS theme (collaborative multi-UAV applications for flood detection).
Keywords: education process, multisensory robotic system, research and innovation project, technology transfer, university-industry partnership
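To illustrate, in very rough terms, how readings from a ground wireless sensor network might be combined with UAV image detections in an integrated surveillance system of this kind, the sketch below shows a minimal spatial corroboration rule. All data structures, names, thresholds, and values are hypothetical illustrations and are not taken from the MUROS design.

```python
# Illustrative sketch only: a minimal fusion rule for combining ground wireless
# sensor readings with UAV image detections. All names, thresholds, and data
# structures are hypothetical and are not taken from the MUROS project.
from dataclasses import dataclass
from math import hypot

@dataclass
class SensorReading:
    x: float          # position of the ground node (m, local frame)
    y: float
    anomaly: float    # normalized anomaly score in [0, 1]

@dataclass
class Detection:
    x: float          # georeferenced position of the object seen by the UAV
    y: float
    label: str        # e.g. "person", "vehicle"
    confidence: float # detector confidence in [0, 1]

def corroborated_alerts(readings, detections,
                        sensor_threshold=0.7,
                        detector_threshold=0.6,
                        radius_m=50.0):
    """Return (reading, detection) pairs where a ground-sensor anomaly is
    spatially corroborated by a UAV detection within radius_m metres."""
    alerts = []
    for r in readings:
        if r.anomaly < sensor_threshold:
            continue
        for d in detections:
            if d.confidence >= detector_threshold and hypot(r.x - d.x, r.y - d.y) <= radius_m:
                alerts.append((r, d))
    return alerts

if __name__ == "__main__":
    readings = [SensorReading(100.0, 200.0, 0.85), SensorReading(400.0, 80.0, 0.2)]
    detections = [Detection(120.0, 210.0, "person", 0.9)]
    for r, d in corroborated_alerts(readings, detections):
        print(f"ALERT: {d.label} ({d.confidence:.2f}) near node at ({r.x}, {r.y})")
```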
Procedia PDF Downloads 238
6 Examining the Current Divisive State of American Political Discourse through the Lens of Peirce's Triadic Logical Structure and Pragmatist Metaphysics
Authors: Nathan Garcia
Abstract:
The polarizing dialogue of contemporary political America results from core philosophical differences. But these differences go beyond ideology and reach the level of metaphysical distinction. Good intellectual historians have theorized that fundamental concepts such as freedom, God, and nature have been sterilized of their intellectual vigor. They are partially correct. The 19th-century pragmatist Charles Sanders Peirce offers a penetrating philosophy which can yield greater insight into the contemporary political divide. Peirce argues that metaphysical and ethical issues are derivative of operational logic. His triadic logical structure, and the metaphysical principles constructed from it, are applicable to the contemporary situation for three reasons. First, Peirce's logic aptly scrutinizes the logical processes of liberal and conservative mindsets. Each group arrives at a cosmological root metaphor (abduction), resulting in a contemporary assessment (deduction), ultimately prompting attempts to verify the original abduction (induction). Peirce's system demonstrates that liberal citizens develop a cosmological root metaphor in the concept of fairness (abduction), resulting in a contemporary assessment of, for example, underrepresented communities being unfairly preyed upon (deduction), thereby inciting anger toward traditional socio-political structures suspected of purposefully destabilizing minority communities (induction). Similarly, conservative citizens develop a cosmological root metaphor in the concept of freedom (abduction), resulting in a contemporary assessment of, for example, liberal citizens advocating an expansion of governmental powers (deduction), thereby inciting anger towards liberal communities suspected of attacking the freedoms of ordinary Americans in a bid to empower their interests through the government (induction). The value of this triadic assessment is the categorization of distinct types of inferential logic by their purpose and boundaries. Only deductive claims can be concretely proven, while abductive claims are merely preliminary hypotheses, and inductive claims are accountable to interdisciplinary oversight. Liberal and conservative logical processes preclude constructive dialogue because of (a) an unshared abductive framework and (b) a misunderstanding of the rules and responsibilities of each type of claim. Second, Peircean metaphysical principles offer a fuller account of the contemporary, divisive political climate. His insights can weed through partisan theorizing to unravel the underlying philosophical problems. Corrosive nominalistic and essentialistic presuppositions weaken the ability to share experiences and communicate effectively, both requisite for any promising constructive dialogue. Peirce's pragmatist system can expose and evade fallacious thinking in pursuit of a refreshing alternative framework. Finally, Peirce's metaphysical foundation enables a logically coherent, scientifically informed orthopraxis well suited to American dialogue. His logical structure necessitates a radically different anthropology conducive to shared experiences and dialogue within a dynamic cultural continuum. Peirce's fallibilism and sensitivity to religious sentiment successfully navigate between liberal and conservative values. In sum, he provides a normative paradigm for intranational dialogue that privileges individual experience and values morally defensible notions of freedom, God, and nature.
Utilizing Peirce's thought yields fruitful analysis and offers a promising philosophical alternative for framing and engaging in contemporary American political discourse.
Keywords: Charles S. Peirce, American politics, logic, pragmatism
Procedia PDF Downloads 114
5 Circular Nitrogen Removal, Recovery and Reuse Technologies
Authors: Lina Wu
Abstract:
The excessive discharge of nitrogen in sewage greatly intensifies the eutrophication of water bodies and threatens water quality. Nitrogen pollution control has become a global concern. The concentration of nitrogen in water is reduced by converting ammonia nitrogen, nitrate nitrogen and nitrite nitrogen into nitrogen-containing gas through biological treatment, physicochemical treatment and oxidation technologies. However, some wastewater with a high ammonia nitrogen content, including landfill leachate, is difficult to treat by traditional nitrification and denitrification because of its high COD content. The core of denitrification is that denitrifying bacteria convert the nitrite produced by nitrification into nitrogen gas under anaerobic conditions, but the low carbon-to-nitrogen ratio of leachate does not meet the conditions for denitrification. Many studies have shown that naturally occurring autotrophic anammox bacteria can combine nitrite and ammonia nitrogen without a carbon source, through their functional genes, to achieve total nitrogen removal, which is very suitable for removing nitrogen from leachate. In addition, the process saves considerable aeration energy compared with the traditional nitrogen removal process, so anammox plays an important role in nitrogen conversion and energy saving. Shortcut nitrification-denitrification coupled with anammox ensures total nitrogen removal and improves removal efficiency, meeting society's need for an ecologically friendly and cost-effective nutrient removal technology. In recent years, research has found that the algal-bacterial symbiotic system offers further advantages for water treatment, because the process not only helps to improve the efficiency of wastewater treatment but also allows carbon dioxide reduction and resource recovery. Microalgae use carbon dioxide dissolved in water or released through bacterial respiration to produce oxygen for bacteria through photosynthesis under light, and bacteria, in turn, provide metabolites and inorganic carbon sources for the growth of microalgae, which may allow the algal-bacterial symbiotic system to save most or all of the aeration energy consumption. It has become a trend to make microalgae and light-avoiding anammox bacteria play synergistic roles by adjusting the light-to-dark ratio: microalgae in the outer, light-exposed layer of the granules block most of the light and provide cofactors and amino acids that promote nitrogen removal. In particular, Myxococcota MYX1 can degrade the extracellular proteins produced by microalgae, providing amino acids for the entire bacterial community and helping anammox bacteria save metabolic energy and adapt to light. As a result, initiating and maintaining a process that combines dominant algae with anaerobic denitrifying bacterial communities has great potential for treating landfill leachate. Chlorella has an excellent removal effect and can withstand extreme environments of high ammonia nitrogen, high salinity and low temperature. It is therefore urgent to study whether an algal-sludge mixture rich in denitrifying bacteria and Chlorella can greatly improve the efficiency of landfill leachate treatment in an anaerobic environment where photosynthesis is stopped. The optimal dilution of simulated landfill leachate can be found by determining the treatment performance of the same batch of bacteria-algae mixture under different initial ammonia nitrogen concentrations and making a comparison.
High-throughput sequencing technology was used to analyze the changes in microbial diversity, related functional genera and functional genes under the optimal conditions, providing a theoretical and practical basis for the engineering application of a novel bacteria-algae symbiosis system in biogas slurry treatment and resource utilization.
Keywords: nutrient removal and recovery, leachate, anammox, partial nitrification, algae-bacteria interaction
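As a rough illustration of the dilution-screening step described in the abstract, the short sketch below computes ammonia-nitrogen removal efficiency across a batch of reactors started at different initial concentrations and picks the best performer. The concentrations and effluent values are synthetic placeholders, not data from the study.

```python
# Illustrative sketch: screening simulated landfill-leachate dilutions by
# ammonia-nitrogen removal efficiency. Influent/effluent values here are
# synthetic placeholders, not measurements from the study.

def removal_efficiency(influent_mg_l, effluent_mg_l):
    """Fractional NH4+-N removal for one batch reactor."""
    return (influent_mg_l - effluent_mg_l) / influent_mg_l

def best_dilution(batches):
    """batches maps initial NH4+-N (mg/L) -> measured effluent NH4+-N (mg/L).
    Returns the initial concentration with the highest removal efficiency."""
    return max(batches, key=lambda c: removal_efficiency(c, batches[c]))

if __name__ == "__main__":
    # Hypothetical dilution series of a high-strength leachate
    batches = {100: 12.0, 300: 45.0, 600: 210.0, 1200: 780.0}
    for conc, eff in batches.items():
        print(f"NH4+-N {conc:5d} mg/L -> removal {removal_efficiency(conc, eff):.1%}")
    print("Best-performing initial concentration:", best_dilution(batches), "mg/L")
```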
Procedia PDF Downloads 38
4 Fabrication of Highly Stable Low-Density Self-Assembled Monolayers by Thiol-Yne Click Reaction
Authors: Leila Safazadeh, Brad Berron
Abstract:
Self-assembled monolayers have a tremendous impact on interfacial science, due to the unique opportunity they offer to tailor surface properties. Low-density self-assembled monolayers are an emerging class of monolayers in which the environment-interfacing portion of the adsorbate has a greater level of conformational freedom than in traditional monolayer chemistries. This greater range of motion and increased spacing between surface-bound molecules offer new opportunities for tailoring adsorption phenomena in sensing systems. In particular, we expect low-density surfaces to offer a unique opportunity to intercalate surface-bound ligands into the secondary structure of proteins and other macromolecules. Additionally, as many conventional sensing surfaces are built upon gold (SPR or QCM), these surfaces must be compatible with gold substrates. Here, we present the first stable method of generating low-density self-assembled monolayer surfaces on gold for the analysis of their interactions with protein targets. Our approach is based on 2:1 thiol-yne addition chemistry to develop new classes of Y-shaped adsorbates on gold, where the environment-interfacing group is spaced laterally from neighboring chemical groups. This technique involves the initial deposition of a crystalline monolayer of 1,10-decanedithiol on the gold substrate, followed by grafting of a loosely packed monolayer on top through a photoinitiated thiol-yne reaction in the presence of light. The orthogonality of the thiol-yne chemistry (commonly referred to as a click chemistry) allows for the preparation of low-density monolayers with a variety of functional groups. To date, carboxyl-, amine-, alcohol-, and alkyl-terminated monolayers have been prepared using this core technology. Results from surface characterization techniques such as FTIR, contact angle goniometry and electrochemical impedance spectroscopy confirm the proposed low chain-chain interactions of the environment-interfacing groups. Reductive desorption measurements suggest a higher stability for the click-LDMs compared to traditional SAMs, along with an equivalent packing density at the substrate interface, which confirms the proposed stability of the monolayer-gold interface. In addition, contact angle measurements change in the presence of an applied potential, supporting our description of a surface structure which allows the alkyl chains to freely orient themselves in response to different environments. We are studying the differences in protein adsorption phenomena between well-packed and our loosely packed surfaces, and we expect this data will be ready to present at the GRC meeting. This work aims to contribute to biotechnology science in the following manner: molecularly imprinted polymers are a promising recognition mode with several advantages over natural antibodies in the recognition of small molecules. However, because of their bulk polymer structure, they are poorly suited for the rapid diffusion desired for the recognition of proteins and other macromolecules. Molecularly imprinted monolayers are an emerging class of materials in which the surface is imprinted and there is no bulk material to impede mass transfer. Further, the short distance between the binding site and the signal transduction material improves many modes of detection. My dissertation project is to develop a new chemistry for protein-imprinted self-assembled monolayers on gold, for incorporation into SPR sensors.
Our unique contribution is the spatial imprinting not only of physical cues (as in current imprinted monolayer techniques) but also of complementary chemical cues. This is accomplished through photo-click grafting of preassembled ligands around a protein template. This conference is important for my development as a graduate student, broadening my appreciation of sensor development beyond surface chemistry.
Keywords: low-density self-assembled monolayers, thiol-yne click reaction, molecular imprinting
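For context on the reductive desorption comparison mentioned in the abstract, thiolate surface coverage is commonly estimated from the integrated desorption charge via the textbook relation Γ = Q/(nFA). The sketch below applies that relation to hypothetical charge and electrode-area values; it is not data or code from this work.

```python
# Illustrative sketch: estimating thiolate surface coverage from a reductive
# desorption charge using the standard relation gamma = Q / (n * F * A).
# The charge and electrode area below are hypothetical, not measured values.

FARADAY = 96485.0  # C/mol

def surface_coverage(charge_c, area_cm2, n_electrons=1):
    """Return coverage in mol/cm^2 for a one-electron desorption process."""
    return charge_c / (n_electrons * FARADAY * area_cm2)

if __name__ == "__main__":
    q = 7.5e-6   # C, hypothetical integrated desorption peak charge
    a = 0.10     # cm^2, hypothetical electrode area
    gamma = surface_coverage(q, a)
    print(f"Coverage: {gamma:.2e} mol/cm^2")
    # For reference, a densely packed alkanethiol SAM on Au(111) is commonly
    # cited near 7.6e-10 mol/cm^2; a comparable value would indicate similar
    # packing density at the gold interface.
```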
Procedia PDF Downloads 224
3 Supply Side Readiness for Universal Health Coverage: Assessing the Availability and Depth of Essential Health Package in Rural, Remote and Conflict Prone District
Authors: Veenapani Rajeev Verma
Abstract:
Context: Assessing facility readiness is paramount, as it can indicate the capacity of facilities to provide essential care and their resilience to health challenges. In the context of decentralization, estimating supply-side readiness indices at the sub-national level is imperative for effective evidence-based policy but remains a colossal challenge due to the lack of dependable and representative data sources. Setting: The district of Poonch in Jammu and Kashmir was selected for this study. It is a remote, rural district with unprecedented topographical barriers and is identified as high priority by the government. It is also a fragile area, as it is bounded by the Line of Control with Pakistan and bears the brunt of ceasefire violations, military skirmishes and sporadic militant attacks. Hilly geographical terrain, a rudimentary or absent road network and impoverishment are quintessential to this area. Objectives: The objectives of the study are to a) evaluate the service readiness of health facilities and create a concise index subsuming a plethora of discrete indicators and b) ascertain supply-side barriers to service provisioning via stakeholder analysis. The study also strives to expand the analytical domain by unravelling context- and area-specific intricacies associated with service delivery. Methodology: A mixed-methods approach was employed to triangulate quantitative analysis with qualitative nuances. A facility survey encompassing 90 subcentres, 44 primary health centres, 3 community health centres and 1 district hospital was conducted to gauge general service availability and service-specific availability (depth of coverage). A compendium of checklists was designed using the Indian Public Health Standards (IPHS) in the form of a standard core questionnaire, and a scorecard was generated for each facility. Information was collected across the dimensions of amenities, equipment, medicines, laboratory capacity and infection control protocols, as proposed in WHO's Service Availability and Readiness Assessment (SARA). A two-stage polychoric principal component analysis was employed to generate a parsimonious index by coalescing an array of tracer indicators. An OLS regression was then used to determine the factors explaining the composite index generated from the PCA. A stakeholder analysis was conducted to discern qualitative information, using a myriad of techniques such as observations, key informant interviews and focus group discussions based on semi-structured questionnaires administered to both leaders and laggards. Results: The general readiness score of health facilities was found to be 0.48. Results indicated the poorest readiness for subcentres and PHCs (the first points of contact), with composite scores of 0.47 and 0.41, respectively. For primary care facilities, the principal component was characterized by basic newborn care as well as preparedness for delivery. Results revealed that availability of equipment and surgical preparedness had the lowest scores (0.46 and 0.47) for facilities providing secondary care. The presence of contractual staff, a walk of more than one hour to the facility, location in zone A (most vulnerable to cross-border shelling) and inaccessibility due to snowfall and thick jungle were negatively associated with the readiness index. Nonchalant staff attitudes, unavailability of staff quarters, and leakages and constraints in the supply chain of drugs and consumables were other impediments identified. Conclusions/Policy Implications: It is pertinent to first strengthen primary care facilities in this setting.
Complex dimensions such as geographic barriers and user and provider behavior are not within the purview of this methodology.
Keywords: effective coverage, principal component analysis, readiness index, universal health coverage
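For readers interested in how a composite readiness index of this general kind can be assembled, the sketch below builds an index from the first principal component of facility tracer indicators and then regresses it on facility covariates. It uses ordinary PCA on synthetic binary data rather than the study's two-stage polychoric procedure, and every variable name is illustrative.

```python
# Illustrative sketch: a generic composite readiness index from the first
# principal component of facility tracer indicators, followed by an OLS
# regression of the index on facility characteristics. Ordinary PCA on
# synthetic data stands in for the study's two-stage polychoric PCA, and
# every variable name here is hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n_facilities = 120

# Synthetic binary tracer indicators (1 = item available), e.g. amenities,
# equipment, essential medicines, lab capacity, infection-control items.
indicators = rng.integers(0, 2, size=(n_facilities, 5)).astype(float)

# First principal component as the composite index.
X = indicators - indicators.mean(axis=0)
_, _, vt = np.linalg.svd(X, full_matrices=False)
index = X @ vt[0]
index = (index - index.min()) / (index.max() - index.min())  # rescale to [0, 1]

# OLS: regress the index on facility covariates (synthetic examples).
contractual_staff = rng.integers(0, 2, n_facilities)
over_1h_walk = rng.integers(0, 2, n_facilities)
design = np.column_stack([np.ones(n_facilities), contractual_staff, over_1h_walk])
coefs, *_ = np.linalg.lstsq(design, index, rcond=None)
print("Intercept and covariate coefficients:", np.round(coefs, 3))
```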
Procedia PDF Downloads 120
2 Modeling the Human Harbor: An Equity Project in New York City, New York, USA
Authors: Lauren B. Birney
Abstract:
The envisioned long-term outcome of this three-year research and implementation plan is for 1) teachers and students to design and build their own computational models of real-world environmental-human health phenomena occurring within the context of the "Human Harbor" and 2) project researchers to evaluate the degree to which these integrated Computer Science (CS) education experiences in New York City (NYC) public school classrooms (PreK-12) impact students' computational-technical skill development, job readiness, career motivations, and measurable abilities to understand, articulate, and solve the underlying phenomena at the center of their models. This effort builds on the partnership's successes over the past eight years in developing a benchmark model of restoration-based Science, Technology, Engineering, and Math (STEM) education for urban public schools and achieving relatively broad-based implementation in the nation's largest public school system. The Billion Oyster Project Curriculum and Community Enterprise for Restoration Science (BOP-CCERS STEM + Computing) curriculum, teacher professional development, and community engagement programs have reached more than 200 educators and 11,000 students at 124 schools, 84 waterfront locations, and Out of School Time (OST) programs. The BOP-CCERS Partnership is poised to develop a more refined focus on integrating computer science across the STEM domains; teaching industry-aligned computational methods and tools; and explicitly preparing students from the city's most under-resourced and underrepresented communities for upwardly mobile careers in NYC's ever-expanding "digital economy," in which jobs require computational thinking and an increasing percentage require discrete computer science technical skills. Project objectives include the following: 1. Computational Thinking (CT) Integration: integrate computational thinking core practices across the existing middle/high school BOP-CCERS STEM curriculum as a means of scaffolding toward long-term computer science and computational modeling outcomes. 2. Data Science and Data Analytics: enable researchers to perform interviews with teachers, students, community members, partners, stakeholders, and Science, Technology, Engineering, and Mathematics (STEM) industry professionals; collaborative analysis and data collection were also performed. As a centerpiece, the BOP-CCERS partnership will expand to include a dedicated computer science education partner. The New York City Department of Education (NYCDOE) Computer Science for All (CS4ALL) NYC will serve as the dedicated Computer Science (CS) lead, working in tandem with the consortium and advising it on integration and curriculum development. The BOP-CCERS Model™ also validates that, with the appropriate application of technical infrastructure, intensive teacher professional development, and curricular scaffolding, socially connected science learning can be mainstreamed in the nation's largest urban public school system. This is evidenced and substantiated in the initial phases of BOP-CCERS™. The BOP-CCERS™ student curriculum and teacher professional development have been implemented in approximately 24% of NYC public middle schools, reaching more than 250 educators and 11,000 students directly. BOP-CCERS™ is a fully scalable and transferable educational model, adaptable to all American school districts.
In all settings of the proposed Phase IV initiative, the primary beneficiary group will be underrepresented NYC public school students who live in high-poverty neighborhoods and are traditionally underrepresented in the STEM fields, including African Americans, Latinos, English language learners, and children from economically disadvantaged households. In particular, BOP-CCERS Phase IV will explicitly prepare underrepresented students for skilled positions within New York City's expanding digital economy, computer science, computational information systems, and innovative technology sectors.
Keywords: computer science, data science, equity, diversity and inclusion, STEM education
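To give a concrete, if simplified, sense of the kind of student-built computational model the project envisions, here is a toy simulation of oysters filtering suspended particles from a fixed volume of harbor water. The filtration rate, oyster count, and starting concentration are placeholder assumptions for illustration, not BOP-CCERS curriculum material.

```python
# Illustrative sketch of a student-style computational model: oysters filtering
# suspended particles out of a fixed volume of harbor water over time. The
# filtration rate, oyster count, and starting concentration are placeholder
# assumptions, not values from the BOP-CCERS curriculum.

def simulate_filtration(concentration_mg_l, oysters, volume_l,
                        litres_per_oyster_per_hour=4.0, hours=48):
    """Return the particle concentration after each hour of filtration,
    assuming each filtered litre is returned to the water particle-free."""
    history = [concentration_mg_l]
    for _ in range(hours):
        filtered_fraction = min(1.0, oysters * litres_per_oyster_per_hour / volume_l)
        concentration_mg_l *= (1.0 - filtered_fraction)
        history.append(concentration_mg_l)
    return history

if __name__ == "__main__":
    trace = simulate_filtration(concentration_mg_l=50.0, oysters=200,
                                volume_l=100000, hours=48)
    print(f"Start: {trace[0]:.1f} mg/L  After 48 h: {trace[-1]:.2f} mg/L")
```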
Procedia PDF Downloads 58
1 An Intelligent Search and Retrieval System for Mining Clinical Data Repositories Based on Computational Imaging Markers and Genomic Expression Signatures for Investigative Research and Decision Support
Authors: David J. Foran, Nhan Do, Samuel Ajjarapu, Wenjin Chen, Tahsin Kurc, Joel H. Saltz
Abstract:
The large-scale data and computational requirements of investigators throughout the clinical and research communities demand an informatics infrastructure that supports both existing and new investigative and translational projects in a robust, secure environment. In some subspecialties of medicine and research, the capacity to generate data has outpaced the methods and technology used to aggregate, organize, access, and reliably retrieve this information. Leading health care centers now recognize the utility of establishing an enterprise-wide clinical data warehouse. The primary benefits that can be realized through such efforts include cost savings, efficient tracking of outcomes, advanced clinical decision support, improved prognostic accuracy, and more reliable clinical trial matching. The overarching objective of the work presented here is the development and implementation of a flexible Intelligent Retrieval and Interrogation System (IRIS) that exploits the combined use of computational imaging, genomics, and data-mining capabilities to facilitate clinical assessments and translational research in oncology. The proposed System includes a multi-modal Clinical & Research Data Warehouse (CRDW) that is tightly integrated with a suite of computational and machine-learning tools to provide insight into underlying tumor characteristics that are not apparent from human inspection alone. A key distinguishing feature of the System is a configurable Extract, Transform and Load (ETL) interface that enables it to adapt to different clinical and research data environments. This project is motivated by the growing emphasis on establishing Learning Health Systems, in which cyclical hypothesis generation and evidence evaluation become integral to improving the quality of patient care. To facilitate iterative prototyping and optimization of the algorithms and workflows for the System, the team has already implemented a fully functional Warehouse that can reliably aggregate information originating from multiple data sources including EHRs, Clinical Trial Management Systems, Tumor Registries, Biospecimen Repositories, Radiology PACS, Digital Pathology archives, unstructured clinical documents, and Next Generation Sequencing services. The System enables physicians to systematically mine and review the molecular, genomic, image-based, and correlated clinical information about patient tumors, individually or as part of large cohorts, to identify patterns that may influence treatment decisions and outcomes. The CRDW core system has facilitated peer-reviewed publications and funded projects, including an NIH-sponsored collaboration to enhance the cancer registries in Georgia, Kentucky, New Jersey, and New York with machine-learning-based classifications and quantitative pathomics feature sets. The CRDW has also resulted in a collaboration with the Massachusetts Veterans Epidemiology Research and Information Center (MAVERIC) at the U.S. Department of Veterans Affairs to develop algorithms and workflows to automate the analysis of lung adenocarcinoma. Those studies showed that combining computational nuclear signatures with traditional WHO criteria through the use of deep convolutional neural networks (CNNs) led to improved discrimination among tumor growth patterns. The team has also leveraged the Warehouse to support studies investigating the potential of utilizing a combination of genomic and computational imaging signatures to characterize prostate cancer.
The results of those studies show that integrating image biomarkers with genomic pathway scores is more strongly correlated with disease recurrence than using standard clinical markers.
Keywords: clinical data warehouse, decision support, data-mining, intelligent databases, machine-learning
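As a schematic of the kind of integration described above, where image-derived (pathomics) features are combined with genomic pathway scores to model recurrence, the sketch below fits a logistic model on synthetic features. The feature names, the data, and the choice of model are assumptions for illustration and do not reflect the IRIS system's actual implementation.

```python
# Illustrative sketch: combining image-derived (pathomics) features with genomic
# pathway scores in a single recurrence model. The feature names, the synthetic
# data, and the choice of logistic regression are assumptions for illustration;
# they do not describe the IRIS system's actual models.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_patients = 200

# Hypothetical computational imaging markers (e.g., nuclear morphometry summaries)
pathomics = rng.normal(size=(n_patients, 3))
# Hypothetical genomic pathway activity scores
pathway_scores = rng.normal(size=(n_patients, 2))
# Synthetic recurrence labels loosely driven by both feature groups
logits = 0.8 * pathomics[:, 0] + 0.6 * pathway_scores[:, 1] + rng.normal(scale=0.5, size=n_patients)
recurrence = (logits > 0).astype(int)

# Concatenate both feature groups and estimate cross-validated discrimination.
combined = np.hstack([pathomics, pathway_scores])
model = LogisticRegression(max_iter=1000)
auc = cross_val_score(model, combined, recurrence, cv=5, scoring="roc_auc")
print("Cross-validated AUC (combined features):", np.round(auc.mean(), 3))
```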
Procedia PDF Downloads 126