Tackling the Decontamination Challenge: Nanorecycling of Plastic Waste
Authors: Jocelyn Doucet, Jean-Philippe Laviolette, Ali Eslami
Abstract:
The end-of-life management and recycling of polymer wastes remains a key environmental issue in ongoing efforts to increase resource efficiency and attain GHG emission reduction targets. Half of all the plastics ever produced were made in the last 13 years, yet only about 16% of plastic waste is collected for recycling, while 25% is incinerated, 40% is landfilled, and 19% is unmanaged and leaks into the environment and waterways. In addition to the collection issue, the UN recently published a report on chemicals in plastics, which adds another layer of challenge when integrating recycled content containing toxic substances into new products. Tackling these issues requires innovative solutions. Chemical recycling of plastics provides new, complementary alternatives to the current recycled-plastic market by converting waste material into high-value chemical commodities that can be reintegrated into a variety of applications, making the total market size of the output (virgin-like, high-value products) larger than that of the input (plastic waste). Access to high-quality feedstock also remains a major obstacle, primarily due to material contamination. Pyrowave approaches this challenge with its innovative nano-recycling technology, which purifies polymers at the molecular level, removing undesirable contaminants and restoring the resin to its virgin state without having to depolymerise it. This breakthrough approach expands the range of plastics that can be effectively recycled, including mixed plastics with contaminants such as lead, inorganic pigments, and flame retardants. The technology achieves residual contaminant levels below 100 ppm, and purity can be tuned to the customer's specifications.
The separation of the polymer and contaminants in Pyrowave's nano-recycling process offers the unique ability to customize the process for targeted additives and contaminants, which are removed based on differences in molecular size. This precise control enables a final polymer purity equivalent to virgin resin. The patented process involves dissolving the contaminated material in a specially formulated solvent, purifying the mixture at the molecular level, and subsequently extracting the solvent to yield a purified polymer resin that can be directly reintegrated into new products without further treatment. Notably, this technology offers simplicity, effectiveness, and flexibility while minimizing environmental impact and preserving valuable resources in the manufacturing circuit. Pyrowave has successfully applied this nano-recycling technology to decontaminate polymers and supply purified, high-quality recycled plastics to critical industries, including those requiring food-contact compliance. The technology is low-carbon, electrified, and provides 100% traceable resins with properties identical to those of virgin resins. Additionally, the issue of low recycling rates and the limited market for traditionally hard-to-recycle plastic waste has fueled the need for new complementary alternatives. Chemical recycling, such as Pyrowave's microwave depolymerization, presents a sustainable and efficient solution by converting plastic waste into high-value commodities. By employing microwave catalytic depolymerization, Pyrowave enables a truly circular economy of plastics, particularly in treating polystyrene waste to produce virgin-like styrene monomers. This approach boasts low energy consumption, high yields, and a reduced carbon footprint. Pyrowave offers a portfolio of sustainable, low-carbon, electric solutions to give plastic waste a second life and paves the way to the new circular economy of plastics.
Here, particularly for polystyrene, we show that styrene monomer yields from Pyrowave's polystyrene microwave depolymerization reactor are 1.5 to 2.2 times higher than those of conventional thermal pyrolysis. In addition, we provide a detailed understanding of microwave-assisted depolymerization by analyzing the effects of microwave power, pyrolysis time, microwave receptor, and temperature on styrene yields. Furthermore, we present a life-cycle environmental impact assessment of microwave-assisted pyrolysis of polystyrene at commercial scale. Finally, it is worth pointing out that Pyrowave is able to treat several tons of polystyrene to produce virgin styrene monomers and to manage waste and contaminated polymeric materials in a truly circular economy.
Keywords: nanorecycling, nanomaterials, plastic recycling, depolymerization
Procedia PDF Downloads 65
The Integration of Digital Humanities into the Sociology of Knowledge Approach to Discourse Analysis
Authors: Gertraud Koch, Teresa Stumpf, Alejandra Tijerina García
Abstract:
Discourse analysis research approaches belong to the central research strategies applied throughout the humanities; they focus on the countless forms and ways digital texts and images shape present-day notions of the world. Despite the constantly growing number of relevant digital, multimodal discourse resources, digital humanities (DH) methods are thus far not systematically developed and accessible for discourse analysis approaches. Specifically, the significance of multimodality and the modelling of meaning plurality have yet to be sufficiently addressed. In order to address this research gap, the D-WISE project aims to develop a prototypical working environment as digital support for the sociology of knowledge approach to discourse analysis, together with new IT-analysis approaches based on context-oriented embedding representations. Playing an essential role throughout our research endeavor is the constant optimization of hermeneutic methodology in the use of (semi-)automated processes, along with its epistemological reflection. Among discourse-analytical approaches, the sociology of knowledge approach to discourse analysis is characterised by reconstructive and accompanying research into the formation of knowledge systems in social negotiation processes. The approach analyses how dominant understandings of a phenomenon develop, i.e., the way they are expressed and consolidated by various actors in specific arenas of discourse until a specific understanding of the phenomenon and its socially accepted structure are established. This article presents insights and initial findings from D-WISE, a joint research project running since 2021 between the Institute of Anthropological Studies in Culture and History and the Language Technology Group of the Department of Informatics at the University of Hamburg.
As an interdisciplinary team, we develop central innovations with regard to the availability of relevant DH applications by building up a uniform working environment, which supports the procedure of the sociology of knowledge approach to discourse analysis within open corpora and heterogeneous, multimodal data sources for researchers in the humanities. We are hereby expanding the existing range of DH methods by developing contextualized embeddings for improved modelling of the plurality of meaning and the integrated processing of multimodal data. The alignment of this methodological and technical innovation is based on the epistemological working methods according to grounded theory as a hermeneutic methodology. In order to systematically relate, compare, and reflect the approaches of structural-IT and hermeneutic-interpretative analysis, the discourse analysis is carried out both manually and digitally. Using the example of current discourses on digitization in the healthcare sector and the associated issues regarding data protection, we have manually built an initial data corpus of which the relevant actors and discourse positions are analysed in conventional qualitative discourse analysis. At the same time, we are building an extensive digital corpus on the same topic based on the use and further development of entity-centered research tools such as topic crawlers and automated newsreaders. In addition to the text material, this consists of multimodal sources such as images, video sequences, and apps. In a blended reading process, the data material is filtered, annotated, and finally coded with the help of NLP tools such as dependency parsing, named entity recognition, co-reference resolution, entity linking, sentiment analysis, and other project-specific tools that are being adapted and developed. The coding process is carried out (semi-)automated by programs that propose coding paradigms based on the calculated entities and their relationships. 
Simultaneously, these programs can be specifically trained through manual coding in a close-reading process and refined according to the content issues at hand. Overall, this approach enables purely qualitative, fully automated, and semi-automated analyses to be compared and reflected upon.
Keywords: entanglement of structural IT and hermeneutic-interpretative analysis, multimodality, plurality of meaning, sociology of knowledge approach to discourse analysis
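The (semi-)automated coding step described above — entities are detected in the corpus and their relationships are proposed to the researcher as candidate coding paradigms — can be sketched schematically. The mini-lexicon, example corpus, and naive substring matching below are illustrative assumptions; the actual D-WISE pipeline relies on trained NER, entity-linking, and co-reference models.

```python
from itertools import combinations
from collections import Counter

# Hypothetical mini-lexicon of discourse actors/concepts (illustrative only).
LEXICON = {
    "health insurer": "ACTOR",
    "patient data": "CONCEPT",
    "data protection": "CONCEPT",
    "ministry of health": "ACTOR",
}

def annotate(text):
    """Return lexicon entities found in a document (naive substring match)."""
    return [(ent, label) for ent, label in LEXICON.items() if ent in text.lower()]

def propose_codes(corpus):
    """Propose candidate coding paradigms from entity co-occurrence counts."""
    pair_counts = Counter()
    for doc in corpus:
        ents = sorted({ent for ent, _ in annotate(doc)})
        pair_counts.update(combinations(ents, 2))
    # The most frequent pairs are surfaced to the researcher as code candidates.
    return pair_counts.most_common()

corpus = [
    "The Ministry of Health defended the transfer of patient data.",
    "Health insurer platforms raise data protection concerns over patient data.",
]
print(propose_codes(corpus))
```

In the blended-reading workflow, such machine-proposed pairs would then be accepted, rejected, or refined manually, closing the loop between structural-IT and hermeneutic-interpretative analysis.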
Design and Construction of a Solar Dehydration System as a Technological Strategy for Food Sustainability in Difficult-to-Access Territories
Authors: Erika T. Fajardo-Ariza, Luis A. Castillo-Sanabria, Andrea Nieto-Veloza, Carlos M. Zuluaga-Domínguez
Abstract:
The growing emphasis on sustainable food production and preservation has driven the development of innovative solutions to minimize postharvest losses and improve market access for small-scale farmers. This project focuses on designing, constructing, and selecting materials for solar dryers in certain regions of Colombia where inadequate infrastructure limits access to major commercial hubs. Postharvest losses pose a significant challenge, impacting food security and farmer income. Addressing these losses is crucial for enhancing the value of agricultural products and supporting local economies. A comprehensive survey of local farmers revealed substantial challenges, including limited market access, inefficient transportation, and significant postharvest losses. For crops such as coffee, bananas, and citrus fruits, losses range from 0% to 50%, driven by factors such as labor shortages, adverse climatic conditions, and transportation difficulties. To address these issues, the project prioritized selecting effective materials for the solar dryer. Various covering materials (recovered acrylic, original acrylic, glass, and polystyrene) were tested for their performance. The tests showed that recovered acrylic and glass were most effective at increasing the temperature difference between the interior and the external environment. The solar dryer was designed using Fusion 360® software (Autodesk, USA) and adhered to architectural guidelines from Architectural Graphic Standards. It features up to sixteen aluminum trays, each with a maximum load capacity of 3.5 kg, arranged in two levels to optimize drying efficiency. The constructed dryer was then tested with two locally available plant materials: green plantains (Musa paradisiaca L.) and snack bananas (Musa AA Simonds). To monitor performance, thermo-hygrometers and an Arduino system recorded internal and external temperature and humidity at one-minute intervals.
Despite challenges such as adverse weather conditions and delays in local government funding, the active involvement of local producers was a significant advantage, fostering ownership and understanding of the project. The solar dryer operated at a dry-bulb temperature of 31°C, 55% relative humidity, and a wet-bulb temperature of 21°C. The drying curves showed a consistent drying period, with the critical moisture content observed between 200 and 300 minutes, followed by a sharp decrease in moisture loss and an equilibrium point after 3,400 minutes. Although the solar dryer requires more time and is highly dependent on atmospheric conditions, it can approach the efficiency of an electric dryer when properly optimized. The successful design and construction of solar dryer systems in difficult-to-access areas represent a significant advancement in agricultural sustainability and postharvest loss reduction. By choosing effective materials such as recovered acrylic and implementing a carefully planned design, the project provides a valuable tool for local farmers. The initiative not only improves the quality and marketability of agricultural products but also offers broader environmental benefits, such as reduced reliance on fossil fuels and decreased waste. Additionally, it supports economic growth by enhancing the value of crops and potentially increasing farmer income. The successful implementation and testing of the dryer, combined with the engagement of local stakeholders, highlight its potential for replication and positive impact in similar contexts.
Keywords: drying technology, postharvest loss reduction, solar dryers, sustainable agriculture
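The drying-curve analysis above can be sketched numerically. The snippet below computes dry-basis moisture content and the normalized moisture ratio, MR = (X − Xe) / (X0 − Xe), from tray mass readings; the dry mass and the mass values are illustrative assumptions, not data from the study.

```python
def moisture_content_db(mass, dry_mass):
    """Dry-basis moisture content X = (m - m_dry) / m_dry (kg water / kg solid)."""
    return (mass - dry_mass) / dry_mass

def moisture_ratio(x, x0, xe):
    """Normalized moisture ratio MR = (X - Xe) / (X0 - Xe), commonly used
    to compare drying curves across operating conditions."""
    return (x - xe) / (x0 - xe)

# Illustrative tray readings: total mass (kg) at selected times (min),
# mirroring the reported equilibrium at 3,400 minutes.
dry_mass = 0.8
readings = {0: 3.5, 200: 2.4, 300: 1.9, 3400: 0.95}

x0 = moisture_content_db(readings[0], dry_mass)      # initial moisture
xe = moisture_content_db(readings[3400], dry_mass)   # equilibrium moisture

curve = {t: moisture_ratio(moisture_content_db(m, dry_mass), x0, xe)
         for t, m in readings.items()}
print(curve)  # MR falls from 1.0 at t=0 toward 0.0 at equilibrium
```

Plotting MR against time from the one-minute Arduino logs yields the drying curve from which the constant-rate period and the critical moisture content can be read off.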
Open Science Philosophy, Research and Innovation
Authors: C. Ardil
Abstract:
Open Science translates the understanding and application of various theories and practices in open science philosophy, systems, paradigms and epistemology. Open Science originates with the premise that universal scientific knowledge is a product of a collective scholarly and social collaboration involving all stakeholders and knowledge belongs to the global society. Scientific outputs generated by public research are a public good that should be available to all at no cost and without barriers or restrictions. Open Science has the potential to increase the quality, impact and benefits of science and to accelerate advancement of knowledge by making it more reliable, more efficient and accurate, better understandable by society and responsive to societal challenges, and has the potential to enable growth and innovation through reuse of scientific results by all stakeholders at all levels of society, and ultimately contribute to growth and competitiveness of global society. Open Science is a global movement to improve accessibility to and reusability of research practices and outputs. In its broadest definition, it encompasses open access to publications, open research data and methods, open source, open educational resources, open evaluation, and citizen science. The implementation of open science provides an excellent opportunity to renegotiate the social roles and responsibilities of publicly funded research and to rethink the science system as a whole. Open Science is the practice of science in such a way that others can collaborate and contribute, where research data, lab notes and other research processes are freely available, under terms that enable reuse, redistribution and reproduction of the research and its underlying data and methods. 
Open Science represents a novel systematic approach to the scientific process, shifting from the standard practice of publishing research results in scientific publications towards sharing and using all available knowledge at an earlier stage of the research process, based on cooperative work and the diffusion of scholarly knowledge with no barriers or restrictions. Open Science refers to efforts to make the primary outputs of publicly funded research (publications and research data) publicly accessible in digital format with no limitations. Open Science is about extending the principles of openness to the whole research cycle, fostering sharing and collaboration as early as possible, thus entailing a systemic change to the way science and research are done. Open Science is the ongoing transition in how open research is carried out, disseminated, deployed, and transformed to make scholarly research more open, global, collaborative, creative, and closer to society. Open Science involves various movements aiming to remove the barriers to sharing any kind of output, resource, method, or tool, at any stage of the research process. Open Science embraces open access to publications, research data, source software, collaboration, peer review, notebooks, educational resources, monographs, citizen science, and research crowdfunding. The recognition and adoption of open science practices, including open science policies that increase open access to scientific literature and encourage data and code sharing, is increasing. Open science policies are motivated by ethical, moral, or utilitarian arguments, such as the right to access digital research literature, science data accumulation, research indicators, transparency in academic practice, and reproducibility. Open science philosophy is adopted primarily to demonstrate the benefits of open science practices.
Researchers adopt open science practices for their own advantage: to increase citations and attract media attention, potential collaborators, career opportunities, donations, and funding. In open science philosophy, open data findings are evidence that open science practices provide significant benefits to researchers in research creation, collaboration, communication, and evaluation compared with more traditional closed science practices. Open science also raises concerns, such as the rigor of peer review, practical matters of financing and career development, and the protection of author rights. Therefore, researchers are advised to implement open science research within the framework of existing academic evaluation and incentives. As a result, open science research issues are addressed in the areas of publishing, financing, collaboration, resource management and sharing, career development, and the discussion of open science questions and conclusions.
Keywords: Open Science, Open Science Philosophy, Open Science Research, Open Science Data
Modeling the Human Harbor: An Equity Project in New York City, New York USA
Authors: Lauren B. Birney
Abstract:
The envisioned long-term outcome of this three-year research and implementation plan is for 1) teachers and students to design and build their own computational models of real-world environmental-human health phenomena occurring within the context of the “Human Harbor” and 2) project researchers to evaluate the degree to which these integrated Computer Science (CS) education experiences in New York City (NYC) public school classrooms (PreK-12) impact students’ computational-technical skill development, job readiness, career motivations, and measurable abilities to understand, articulate, and solve the underlying phenomena at the center of their models. This effort builds on the partnership’s successes over the past eight years in developing a benchmark model of restoration-based Science, Technology, Engineering, and Math (STEM) education for urban public schools and achieving relatively broad-based implementation in the nation’s largest public school system. The Billion Oyster Project Curriculum and Community Enterprise for Restoration Science (BOP-CCERS STEM + Computing) curriculum, teacher professional development programs, and community engagement programs have reached more than 200 educators and 11,000 students at 124 schools, with 84 waterfront locations and Out-of-School Time (OST) programs. The BOP-CCERS Partnership is poised to develop a more refined focus on integrating computer science across the STEM domains; teaching industry-aligned computational methods and tools; and explicitly preparing students from the city’s most under-resourced and underrepresented communities for upwardly mobile careers in NYC’s ever-expanding “digital economy,” in which jobs require computational thinking and an increasing percentage require discrete computer science technical skills. Project objectives include the following: 1.
Computational Thinking (CT) Integration: Integrate computational thinking core practices across the existing middle/high school BOP-CCERS STEM curriculum as a means of scaffolding toward long-term computer science and computational modeling outcomes. 2. Data Science and Data Analytics: Enable researchers to conduct interviews with teachers, students, community members, partners, stakeholders, and Science, Technology, Engineering, and Mathematics (STEM) industry professionals, alongside collaborative analysis and data collection. As a centerpiece, the BOP-CCERS partnership will expand to include a dedicated computer science education partner: New York City Department of Education (NYCDOE) Computer Science for All (CS4ALL) NYC will serve as the dedicated Computer Science (CS) lead, advising the consortium on integration and curriculum development and working in tandem with the partnership. The BOP-CCERS Model™ also validates that, with appropriate application of technical infrastructure, intensive teacher professional development, and curricular scaffolding, socially connected science learning can be mainstreamed in the nation’s largest urban public school system. This is evidenced and substantiated in the initial phases of BOP-CCERS™. The BOP-CCERS™ student curriculum and teacher professional development have been implemented in approximately 24% of NYC public middle schools, reaching more than 250 educators and 11,000 students directly. BOP-CCERS™ is a fully scalable and transferable educational model, adaptable to all American school districts. In all settings of the proposed Phase IV initiative, the primary beneficiary group will be underrepresented NYC public school students who live in high-poverty neighborhoods and are traditionally underrepresented in the STEM fields, including African Americans, Latinos, English language learners, and children from economically disadvantaged households.
In particular, BOP-CCERS Phase IV will explicitly prepare underrepresented students for skilled positions within New York City’s expanding digital economy, computer science, computational information systems, and innovative technology sectors.
Keywords: computer science, data science, equity, diversity and inclusion, STEM education
“MaxSALIVA”: A Nano-Sized Dual-Drug Delivery System for Salivary Gland Radioprotection and Repair in Head and Neck Cancer
Authors: Ziyad S. Haidar
Abstract:
Background: Saliva plays a major role in maintaining oral and dental health (and, consequently, general health and well-being), normally bathing the oral cavity and acting as a clearing agent. This becomes most apparent when the amount and quality of saliva are significantly reduced due to medications, salivary gland neoplasms, disorders such as Sjögren’s syndrome, and especially ionizing radiation therapy for tumors of the head and neck, the fifth most common malignancy worldwide, during which the salivary glands are included within the radiation field or zone. Clinically, patients affected by salivary gland dysfunction often opt to terminate their radiotherapy course prematurely because they become malnourished and experience a significant decrease in their quality of life. Accordingly, the development of an alternative treatment to restore or regenerate damaged salivary gland tissue is eagerly awaited. Likewise, the formulation of a radioprotection modality and early damage prevention strategy is also highly desirable. Objectives: To assess the pre-clinical radio-protective effect as well as the reparative/regenerative potential of layer-by-layer self-assembled lipid-polymer-based core-shell nanocapsules designed and fine-tuned in this experimental work for the sequential (ordered) release of dual cytokines, following a single local administration (direct injection) into a murine sub-mandibular salivary gland model of irradiation. Methods: The formulated core-shell nanocapsules were characterized physically, chemically, and mechanically, pre- and post-loading with the drugs (in solution and powder formats), followed by optimization of the pharmaco-kinetic profile. Then, nanosuspensions were administered directly into the salivary glands 24 hours pre-irradiation (PBS, unloaded nanocapsules, and individual and combined vehicle-free cytokines were injected into the control glands for an in-depth comparative analysis).
External irradiation at an elevated dose of 18 Gy (revised from our previous 15 Gy model) was delivered to the head-and-neck region of C57BL/6 mice. Salivary flow rate (unstimulated) and salivary protein content/excretion were regularly assessed using an enzyme-linked immunosorbent assay (3-month period). Histological and histomorphometric evaluation and apoptosis/proliferation analysis, followed by local versus systemic bio-distribution and immuno-histochemical assays, were then performed on all harvested major organs (at the distinct experimental end-points). Results: Monodisperse, stable, and cytocompatible nanocapsules capable of maintaining the bioactivity of the encapsulant within the core and shell compartments, and with controlled/customizable pharmaco-kinetics, resulted, as illustrated in the graphical abstract (Figure) below. The experimental animals demonstrated a significant increase in salivary flow rates when compared to the controls. Herein, salivary protein content was comparable to the pre-irradiation (baseline) level. Histomorphometry further confirmed the biocompatibility and localization of the nanocapsules, in vivo, at the site of injection. Acinar cells showed fewer vacuoles and nuclear aberrations in the experimental group, while the amount of mucin was higher in controls. Overall, fewer apoptotic activities were detected by a Terminal deoxynucleotidyl Transferase (TdT) dUTP Nick-End Labeling (TUNEL) assay, and proliferative rates were similar to the controls, suggesting an interesting reparative and regenerative potential for irradiation-damaged/-dysfunctional salivary glands. The Figure below exemplifies some of these findings. Conclusions: A biocompatible, reproducible, and customizable self-assembling layer-by-layer core-shell delivery system is formulated and presented.
Our findings suggest that localized sequential bioactive delivery of dual cytokines (in a specific dose and order) can prevent irradiation-induced damage by reducing apoptosis and also has the potential to promote in situ proliferation of salivary gland cells; maxSALIVA is scalable (Good Manufacturing Practice or GMP production for human clinical trials) and patent-pending.
Keywords: saliva, head and neck cancer, nanotechnology, controlled drug delivery, xerostomia, mucositis, biopolymers, innovation
Modern Day Second Generation Military Filipino Amerasians and Ghosts of the U.S. Military Prostitution System in West Central Luzon's 'AMO Amerasian Triangle'
Authors: P. C. Kutschera, Elena C. Tesoro, Mary Grace Talamera-Sandico, Jose Maria G. Pelayo III
Abstract:
Second generation military Filipino Amerasians comprise a formidable contemporary segment of the estimated 250,000-plus biracial Amerasians in the Philippines today. Overall, they are a stigmatized and socioeconomically marginalized diaspora; historically, they were abandoned or estranged by U.S. military personnel fathers assigned during the century-long Colonial, Post-World War II, and Cold War era of permanent military basing (1898-1992). Indeed, U.S. military personnel remain stationed in smaller numbers in the Philippines today. This inquiry is an outgrowth of two recent small-sample studies. The first surfaced the impact of the U.S. military prostitution system on the formation of the ‘Derivative Amerasian Family Construct’ among first generation Amerasians; a second, qualitative case study suggested the continued effect of the prostitution system's destructive impetus on second generation Amerasians. The intent of this current qualitative, multiple-case study was to actively seek out second generation sex industry toilers. The purpose was to focus further on this human phenomenon in the post-basing and post-military prostitution system eras. As background, the former military prostitution apparatus has transformed into a modern dynamic of rampant sex tourism and prostitution nationwide. This is characterized by hotels and resorts offering unrestricted carnal access, urban and provincial brothels (casas), discos, bars and pickup clubs, massage parlors, local barrio karaoke bars, and street prostitution. A small case study sample (N = 4) of female and male second generation Amerasians was selected. Sample formation employed a non-probability ‘snowball’ technique drawing respondents from the notorious Angeles, Metro Manila, Olongapo City ‘AMO Amerasian Triangle,’ where most former U.S. military installations were sited and modern sex tourism thrives.
A six-month study and analysis of in-depth interviews with female and male sex laborers, their families, and peers revealed a litany of disturbing and troublesome experiences. Results showed profiles of debilitating human poverty, histories of family disorganization, stigmatization, social marginalization, and the ghost of the military prostitution system and its harmful legacy on Amerasian family units. Emerging were testimonials of wayward young people ensnared in a maelstrom of deep economic deprivation, familial dysfunction, psychological desperation, and societal indifference. The paper recommends that more study is needed and that the unstudied psychosocial and socioeconomic experiences of distressed younger generations of military Amerasians require specific research. Heretofore apathetic or disengaged U.S. institutions need to confront the issue and formulate activist, solution-oriented social welfare, human services, and immigration easement policies and alternatives. These institutions specifically include academic and social science research agencies, corporate foundations, the U.S. Congress, and the Departments of State, Defense, Health and Human Services, and Homeland Security (i.e., Citizenship and Immigration Services). It is these institutions that continue to endorse a laissez-faire policy of non-involvement over the entire Filipino Amerasian question. Such apathy, the paper concludes, relegates this consequential but neglected blood progeny to the status of humiliating destitution and exploitation. Amerasians thus remain entrapped in their former colonial, and neo-colonial, habitat. Ironically, they are unwitting victims of a U.S. American homeland that fancies itself geo-politically a strong and strategic military treaty ally of the Philippines in the Western Pacific.
Keywords: Asian Americans, diaspora, Filipino Amerasians, military prostitution, stigmatization
Times2D: A Time-Frequency Method for Time Series Forecasting
Authors: Reza Nematirad, Anil Pahwa, Balasubramaniam Natarajan
Abstract:
Time series data consist of successive data points collected over a period of time. Accurate prediction of future values is essential for informed decision-making in many real-world applications, including electricity load demand forecasting, lifetime estimation of industrial machinery, traffic planning, weather prediction, and the stock market. Due to their critical relevance and wide application, there has been considerable interest in time series forecasting in recent years. However, the proliferation of sensors and IoT devices, real-time monitoring systems, and high-frequency trading data introduces intricate temporal variations, rapid changes, noise, and non-linearities, making time series forecasting more challenging. Classical methods such as Autoregressive Integrated Moving Average (ARIMA) and exponential smoothing aim to extract pre-defined temporal variations, such as trends and seasonality. While these methods are effective for capturing well-defined seasonal patterns and trends, they often struggle with the more complex, non-linear patterns present in real-world time series data. In recent years, deep learning has made significant contributions to time series forecasting. Recurrent Neural Networks (RNNs) and their variants, such as Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks, have been widely adopted for modeling sequential data. However, their sequential processing makes it difficult to capture local trends and rapid fluctuations. Convolutional Neural Networks (CNNs), particularly Temporal Convolutional Networks (TCNs), leverage convolutional layers to capture temporal dependencies by applying convolutional filters along the temporal dimension. Despite their advantages, TCNs struggle to capture relationships between distant time points due to the locality of one-dimensional convolution kernels.
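As a concrete illustration of the classical baselines mentioned above, the sketch below implements simple exponential smoothing as a one-step-ahead forecaster; the smoothing factor and the load values are illustrative, not drawn from the study's datasets.

```python
def ses_forecast(series, alpha=0.3):
    """Simple exponential smoothing: the forecast is a weighted blend of the
    latest observation and the previous smoothed value,
        s_t = alpha * y_t + (1 - alpha) * s_{t-1},
    returned as the one-step-ahead forecast after consuming the series."""
    s = series[0]                     # initialize with the first observation
    for y in series[1:]:
        s = alpha * y + (1 - alpha) * s
    return s

# Illustrative hourly load values (MW). A fixed-level model like this tracks
# the mean level but, as noted above, cannot express trend or seasonality.
load = [102.0, 98.5, 101.2, 99.8, 103.4]
print(round(ses_forecast(load), 2))
```

Holt-Winters extends this recursion with separate trend and seasonal components, which is exactly the kind of pre-defined structure that breaks down on the intricate, non-linear patterns motivating the deep learning methods discussed next.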
Transformers have revolutionized time series forecasting with their powerful attention mechanisms, effectively capturing long-term dependencies and relationships between distant time points. However, the attention mechanism may struggle to discern dependencies directly from scattered time points due to intricate temporal patterns. Lastly, Multi-Layer Perceptrons (MLPs) have also been employed, with models like N-BEATS and LightTS demonstrating success. Despite this, MLPs often face high volatility and computational complexity challenges in long-horizon forecasting. To address intricate temporal variations in time series data, this study introduces Times2D, a novel framework that integrates 2D spectrogram and derivative heatmap techniques in parallel. The spectrogram focuses on the frequency domain, capturing periodicity, while the derivative patterns emphasize the time domain, highlighting sharp fluctuations and turning points. This 2D transformation enables the use of powerful computer vision techniques to capture various intricate temporal variations. To evaluate the performance of Times2D, extensive experiments were conducted on standard time series datasets and compared with various state-of-the-art algorithms, including DLinear (2023), TimesNet (2023), Non-stationary Transformer (2022), PatchTST (2023), N-HiTS (2023), Crossformer (2023), MICN (2023), LightTS (2022), FEDformer (2022), FiLM (2022), SCINet (2022a), Autoformer (2021), and Informer (2021), under the same modeling conditions. The initial results demonstrated that Times2D achieves consistent state-of-the-art performance in both short-term and long-term forecasting tasks. 
Furthermore, the generality of the Times2D framework allows it to be applied to various tasks such as time series imputation, clustering, classification, and anomaly detection, offering potential benefits in any domain that involves sequential data analysis.
Keywords: derivative patterns, spectrogram, time series forecasting, Times2D, 2D representation
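The 2D transformation this abstract describes can be illustrated with a minimal, stand-alone sketch. The windowed-DFT spectrogram and the difference-based derivative heatmap below are illustrative assumptions (function names, window size, and hop are hypothetical), not the authors' implementation:

```python
import cmath

def sliding_dft_spectrogram(x, win=8, hop=4):
    """Magnitude spectrogram: DFT of each sliding window (frequency x time)."""
    frames = [x[i:i + win] for i in range(0, len(x) - win + 1, hop)]
    spec = []
    for k in range(win // 2 + 1):            # non-negative frequency bins
        row = []
        for frame in frames:
            coeff = sum(v * cmath.exp(-2j * cmath.pi * k * n / win)
                        for n, v in enumerate(frame))
            row.append(abs(coeff))
        spec.append(row)
    return spec

def derivative_heatmap(x):
    """First and second discrete differences, stacked as a 2-row 'heatmap'."""
    d1 = [b - a for a, b in zip(x, x[1:])]
    d2 = [b - a for a, b in zip(d1, d1[1:])]
    return [d1, d2]

# toy series: a flat segment followed by a ramp, i.e. one sharp turning point
series = [0.0] * 16 + [float(i) for i in range(16)]
spec = sliding_dft_spectrogram(series)   # frequency-domain view (periodicity)
d = derivative_heatmap(series)           # time-domain view (turning points)
```

The second difference row spikes exactly once, at the turning point, which is the kind of sharp-fluctuation cue the derivative heatmap is meant to expose.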
Procedia PDF Downloads 41
6 A Comprehensive Study of Spread Models of Wildland Fires
Authors: Manavjit Singh Dhindsa, Ursula Das, Kshirasagar Naik, Marzia Zaman, Richard Purcell, Srinivas Sampalli, Abdul Mutakabbir, Chung-Horng Lung, Thambirajah Ravichandran
Abstract:
These days, wildland fires, also known as forest fires, are more prevalent than ever. Wildfires have major repercussions that affect ecosystems, communities, and the environment in several ways. Wildfires lead to habitat destruction and biodiversity loss, affecting ecosystems and causing soil erosion. They also contribute to poor air quality by releasing smoke and pollutants that pose health risks, especially for individuals with respiratory conditions. Wildfires can damage infrastructure, disrupt communities, and cause economic losses. The economic impact of firefighting efforts, combined with their direct effects on forestry and agriculture, causes significant financial difficulties for the areas impacted. This research explores different forest fire spread models and presents a comprehensive review of various techniques and methodologies used in the field. A forest fire spread model is a computational or mathematical representation that is used to simulate and predict the behavior of a forest fire. By applying scientific concepts and data from empirical studies, these models attempt to capture the intricate dynamics of how a fire spreads, taking into consideration a variety of factors like weather patterns, topography, fuel types, and environmental conditions. These models assist authorities in understanding and forecasting the potential trajectory and intensity of a wildfire. Emphasizing the need for a comprehensive understanding of wildfire dynamics, this research explores the approaches, assumptions, and findings derived from various models. By using a comparison approach, a critical analysis is provided by identifying patterns, strengths, and weaknesses among these models. The purpose of the survey is to further wildfire research and management techniques. Decision-makers, researchers, and practitioners can benefit from the useful insights that are provided by synthesizing established information. 
Fire spread models provide insights into potential fire behavior, helping authorities make informed decisions about evacuation activities, allocating resources for firefighting efforts, and planning preventive actions. Wildfire spread models are also useful in post-wildfire mitigation strategies, as they help in assessing a fire's severity, determining high-risk regions for post-fire dangers, and forecasting soil erosion trends. The analysis highlights the importance of customized modeling approaches for various circumstances and advances our understanding of how forest fires spread. Some of the well-known models in this field are Rothermel's wildland fuel model, FARSITE, WRF-SFIRE, FIRETEC, FlamMap, FSPro, cellular automata models, and others. The key characteristics these models consider include weather (factors such as wind speed and direction), topography (factors such as landscape elevation), and fuel availability (factors such as types of vegetation), among others. The models discussed are physics-based, data-driven, or hybrid, some also utilizing ML techniques such as attention-based neural networks to enhance model performance. To lessen the destructive effects of forest fires, this initiative aims to promote the development of more precise prediction tools and effective management techniques. The survey expands its scope to address the practical needs of numerous stakeholders. Access to enhanced early warning systems enables decision-makers to take prompt action. Emergency responders benefit from improved resource allocation strategies, strengthening the efficacy of firefighting efforts.
Keywords: artificial intelligence, deep learning, forest fire management, fire risk assessment, fire simulation, machine learning, remote sensing, wildfire modeling
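Of the model families surveyed, a cellular automaton is the simplest to sketch. The deterministic toy rule below, with a crude wind bias, is an illustrative assumption only and does not reproduce any of the named models (which add fuel moisture, slope, and probabilistic spread):

```python
# Cell states: 0 = empty, 1 = fuel, 2 = burning, 3 = burnt
def step(grid, wind=(0, 1)):
    """One CA step: each burning cell burns out and ignites fuel in the
    4-neighbourhood, except the cell directly upwind (toy wind bias)."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 2:
                new[r][c] = 3                               # burns out
                for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols and grid[rr][cc] == 1:
                        if (dr, dc) != (-wind[0], -wind[1]):  # skip upwind cell
                            new[rr][cc] = 2
    return new

forest = [[1] * 5 for _ in range(5)]
forest[2][2] = 2                 # ignition point in a uniform fuel bed
after = step(forest)             # fire spreads N, S and downwind (E), not W
```

Real models replace the all-or-nothing ignition rule with spread rates derived from weather, topography, and fuel type, but the update loop has the same shape.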
Procedia PDF Downloads 81
5 Municipal Solid Waste Management in Ethiopia: Systematic Review of Physical and Chemical Compositions and Generation Rate
Authors: Tsegay Kahsay Gebrekidan, Gebremariam Gebrezgabher Gebremedhin, Abraha Kahsay Weldemariam, Meaza Kidane Teferi
Abstract:
Municipal solid waste management (MSWM) in Ethiopia is a complex issue with institutional, social, political, environmental, and economic dimensions, impacting sustainable development. Effective MSWM planning necessitates understanding the generation rate and composition of waste. This systematic review synthesizes qualitative and quantitative data from various sources to aggregate current knowledge, identify gaps, and provide a comprehensive understanding of municipal solid waste management in Ethiopia. The findings reveal that the generation rate of municipal solid waste in Ethiopia is 0.38 kg per capita per day, with the waste composition being predominantly food waste, followed by ash, dust, and sand, and yard waste. Over 85% of this MSW is either reusable or recyclable, with a significant portion being organic matter (73.13% biodegradable) and 11.78% recyclable materials. Physicochemical analyses reveal that Ethiopian MSW is suitable for composting and biogas production, offering opportunities to reduce environmental pollution and greenhouse gas emissions, support urban agriculture, and create job opportunities. However, challenges persist, including a lack of political will, weak municipal planning, limited community awareness, and inadequate waste management infrastructure; only 31.8% of MSW is collected legally, leading to inefficient and harmful disposal practices. To improve MSWM, Ethiopia should focus on public awareness, increased funding, infrastructure investment, private sector partnerships, and implementation of the 3R principles (reduce, reuse, and recycle). An integrated approach involving government, industry, and civil society is essential. Further research on the physicochemical properties and strategic uses of MSW is needed to enhance management practices. 
Implications: The comprehensive study of municipal solid waste management (MSWM) in Ethiopia reveals the intricate interplay of institutional, social, political, environmental, and economic factors that influence the nation's sustainable development. The findings underscore the urgent need for tailored, integrated waste management strategies informed by a thorough understanding of MSW generation rates, composition, and current management practices. Ethiopia's lower per capita MSW generation compared to developed countries and the predominantly organic composition of its waste present significant opportunities for sustainable waste management practices such as composting and recycling. These practices can not only minimize environmental impact but also support urban greening, agriculture, and renewable energy production. The high organic content, physicochemical properties of MSW suitable for composting, and potential for biogas and briquette production highlight pathways for creating employment, reducing waste, and enhancing soil fertility. Despite these opportunities, Ethiopia faces substantial challenges due to inadequate political will, weak municipal planning, limited community awareness, insufficient waste management infrastructure, and poor policy implementation. The high rate of illegal waste disposal further exacerbates environmental and health issues, emphasizing the need for a more effective and integrated MSWM approach. To address these challenges and harness the potential of MSW, Ethiopia must prioritize increasing public awareness, investing in infrastructure, fostering private sector partnerships, and implementing the principles of reduce, reuse, and recycle (3R). Developing strategies that involve all stakeholders and turn waste into valuable resources is crucial. 
Government, industry, and civil society must collaborate to implement integrated MSWM systems that focus on waste reduction at the source, alternative material use, and advanced recycling technologies. Further research at both federal and regional levels is essential to optimize the physicochemical analysis and strategic use of MSW. Prompt action is required to transform waste management into a pillar of sustainable urban development, ultimately improving environmental quality and human health in Ethiopia.
Keywords: biodegradable, healthy environment, integrated solid waste management, municipal
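The reported figures (0.38 kg/capita/day generation, 73.13% biodegradable, 11.78% recyclable, 31.8% legally collected) can be scaled to a single city as a back-of-envelope check; the city population below is hypothetical:

```python
def msw_profile(population, rate_kg_cap_day=0.38,
                biodegradable=0.7313, recyclable=0.1178, collected=0.318):
    """Scale the review's national per-capita figures to a city (tonnes/year)."""
    annual_t = population * rate_kg_cap_day * 365 / 1000.0
    return {
        "annual_t": annual_t,                       # total generated
        "biodegradable_t": annual_t * biodegradable,  # compost/biogas feedstock
        "recyclable_t": annual_t * recyclable,
        "legally_collected_t": annual_t * collected,  # reaches formal disposal
    }

# hypothetical city of 500,000 residents
profile = msw_profile(500_000)
```

Under these figures such a city would generate roughly 69,350 t of MSW per year, of which about 50,700 t would be compostable but fewer than a third of the tonnes would be legally collected, which is the gap the review highlights.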
Procedia PDF Downloads 11
4 An Artificial Intelligence Framework to Forecast Air Quality
Authors: Richard Ren
Abstract:
Air pollution is a serious danger to international well-being and economies: it kills an estimated 7 million people every year and is projected to cost world economies $2.6 trillion annually by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution's detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (e.g., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to reduce the inaccuracies, weaknesses, and biases of any one individual model. Over time, the proposed framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework's predictions and real-life observations, with an overall 92% model accuracy. The combined model predicts more accurately than any of the individual models, and it reliably forecasts season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy. 
This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms
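The abstract's averaging of the three top-performing models can be sketched as follows. The model stand-ins, their validation accuracies, and the class labels are hypothetical, since the study does not publish its code:

```python
def ensemble_predict(models, x):
    """Average the class-probability outputs of the three models with the
    highest validation accuracy, then threshold at 0.5."""
    top3 = sorted(models, key=lambda m: m["val_acc"], reverse=True)[:3]
    p = sum(m["predict"](x) for m in top3) / len(top3)
    return ("unhealthy" if p >= 0.5 else "good"), p

# toy stand-ins for trained models; each returns P(air quality is 'unhealthy')
models = [
    {"name": "logistic", "val_acc": 0.90, "predict": lambda x: 0.7},
    {"name": "forest",   "val_acc": 0.93, "predict": lambda x: 0.8},
    {"name": "mlp",      "val_acc": 0.91, "predict": lambda x: 0.6},
    {"name": "weak",     "val_acc": 0.55, "predict": lambda x: 0.1},  # excluded
]
label, prob = ensemble_predict(models, x=None)
```

Dropping the weakest model before averaging is one way an ensemble can beat each individual model, as the abstract reports; the study's actual selection and averaging rules may differ.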
Procedia PDF Downloads 124
3 The Impact of the Macro-Level: Organizational Communication in Undergraduate Medical Education
Authors: Julie M. Novak, Simone K. Brennan, Lacey Brim
Abstract:
Undergraduate medical education (UME) curricula notably address micro-level communication (e.g., patient-provider, intercultural, inter-professional), yet frequently under-examine the role and impact of organizational communication at the more macro level. Organizational communication, however, operates as a foundation and through the systemic structures of an organization, thereby serving as a hidden curriculum that influences learning experiences and outcomes. Yet little research fully examines how students experience organizational communication while in medical school. Extant literature and best practices provide insufficient guidance for UME programs in particular. The purpose of this study was to map and examine current organizational communication systems and processes in a UME program. Employing a phenomenology-grounded and participatory approach, this study sought to understand the organizational communication system from medical students' perspective. The research team consisted of a core team and 13 medical student co-investigators. The research employed multiple methods, including focus groups, individual interviews, and two surveys (one reflective of the focus group questions, the other requesting students to submit 'examples' of communications). To provide context for student responses, non-student participants (faculty, administrators, and staff) were sampled, as they too express concerns about communication. Over 400 students across all cohorts and 17 non-students participated. Data were iteratively analyzed and checked for triangulation. Findings reveal the complex nature of organizational communication and student-oriented communications. They reveal program-impactful strengths, weaknesses, gaps, and tensions, and speak to the role of organizational communication practices in influencing both climate and culture. 
With regard to communications, students receive multiple, simultaneous communications from multiple sources and channels, both formal (e.g., official email) and informal (e.g., social media). Students identified organizational strengths, including the desire to improve student voice and message frequency. They also identified weaknesses related to over-reliance on email, numerous platforms with inconsistent utilization, incorrect information, insufficient transparency, assessment/input fatigue, tacit expectations, scheduling/deadlines, responsiveness, and mental health confidentiality concerns. Moreover, they noted gaps related to lack of coordination/organization, ambiguous point-persons, student 'voice-only' input, open communication loops, lack of core centralization and consistency, and mental health bridges. Findings also revealed organizational identity and cultural characteristics as impactful on the medical school experience. Cultural characteristics included program size, diversity, urban setting, student organizations, community engagement, crisis framing, learning for exams, inefficient bureaucracy, and professionalism. Moreover, students identified system structures that do not always leverage cultural strengths or reduce cultural problematics. Based on the results, opportunities for productive change are identified. These include leadership visibly supporting and enacting overall organizational narratives, making greater efforts to consistently 'close the loop', regularly sharing how student input effects change, employing crisis communication strategies more often, strengthening communication infrastructure, ensuring structures facilitate effective operations and change efforts, and highlighting change efforts in informational communication. Organizational communication and communications are not soft skills or of secondary concern within organizations; rather, they are foundational in nature and serve to educate and inform all stakeholders. 
As primary stakeholders, students and their success directly affect the accomplishment of organizational goals. This study demonstrates how inquiry into how students navigate their educational experience extends research-based knowledge and provides actionable knowledge for the improvement of organizational operations in UME.
Keywords: medical education programs, organizational communication, participatory research, qualitative mixed methods
Procedia PDF Downloads 112
2 Impacts of Transformational Leadership: Petronas Stations in Sabah, Malaysia
Authors: Lizinis Cassendra Frederick Dony, Jirom Jeremy Frederick Dony, Cyril Supain Christopher
Abstract:
The purpose of this paper is to improve devotion to leadership through the implementation of HR practices at PETRONAS stations. It emphasizes the importance of personal grooming and customer-care hospitality training for front-line individuals and teams at PETRONAS stations in Sabah. Following the International Leadership Journal (Thomas Edison), theory, research, education, and development practice and their application to all organizational phenomena may affect or be affected by leadership. FINDINGS - PETRONAS, short for Petroliam Nasional Berhad, is a Malaysian oil and gas company founded on August 17, 1974. Wholly owned by the Government of Malaysia, the corporation is vested with the entire oil and gas resources of Malaysia and is entrusted with the responsibility of developing and adding value to these resources. Fortune ranked PETRONAS as the 68th-largest company in the world in 2012, as well as the 12th most profitable company in the world and the most profitable in Asia. As of the end of March 2005, the PETRONAS Group comprised 103 wholly owned subsidiaries, 19 partly owned outfits, and 57 associated companies. The group is engaged in a wide spectrum of petroleum activities, including upstream exploration and production of oil and gas; downstream oil refining; marketing and distribution of petroleum products; trading; gas processing and liquefaction; gas transmission pipeline network operations; marketing of liquefied natural gas; petrochemical manufacturing and marketing; shipping; automotive engineering; and property investment. PETRONAS has been growing its marketing channels in a competitive market, combining its resources to pursue common goals. PETRONAS provides university students in Malaysia the opportunity to carry out industrial-training job placements of 6-8 months.
The industrial training exposed the students to a real working environment, acting on behalf of the General Manager for almost one year. The management education and reward-incentive schemes have thus inspired the working teams to develop good leadership. Furthermore, knowledge and experience are very important in human capital development and transformation. SPSS was used to analyze PETRONAS's achievement through 280 questionnaires, plus 81 questionnaires tabulated in Excel, distributed via face-to-face interviews with customers, PETRONAS dealers, and front-desk staff at the 17 stations in Kota Kinabalu, Sabah. This research study will therefore help improve service-quality innovation and optimize business sustainability performance. ORIGINALITY/VALUE - Transformational leadership practices have influenced the working teams' behaviour as brand ambassadors of PETRONAS. The correlation findings indicated that PETRONAS stations need more HR practices to deploy additional customer-care retention resources in mitigating the business challenges of the oil and gas industry. As the business operates amid stiff global competition (Cooper, 2006; Marques and Simon, 2006), it is crucial that team management be capable of minimizing noise risk, financial risk, and any other risks as a whole at the optimum level. CONCLUSION - This research found that both transformational and transactional contingent-reward leadership were positively correlated with ratings of platoon potency, and that ratings of leadership for the platoon leader and sergeant were moderately intercorrelated. Given these findings, we recommend that PETRONAS management offer quality team management at PETRONAS stations through a broader variety of leadership training specializing in operational efficiency in front-desk customer-care hospitality.
Through the reliability and validity of job experiences, this leverages diverse teamwork and cross-collaboration. Beyond that, PETRONAS will also strengthen the interpersonal effectiveness of front-liners and enhance the quality of interaction through effective communication. Finally, numerous CSR correlation studies regress PETRONAS's performance on corporate social performance and several control variables. CSR model activities can be mis-specified if R&D is not controlled for, as evidenced by feedback collected from local communities; the younger generation is inclined to higher financial expectations of PETRONAS. Nevertheless, the company has created a huge impact on nation building as part of its social adaptability, overreaching its business stakeholders' satisfaction in Sabah.
Keywords: human resources practices implementation (HRPI), source of competitive advantage in people's development (SOCAIPD), corporate social responsibility (CSR), service quality at front desk stations (SQAFD), impacts of PETRONAS leadership (IOPL)
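The correlation analysis this abstract reports amounts to computing Pearson's r between questionnaire scores. A minimal sketch with hypothetical 1-5 survey scores (not the study's data, which are not published here):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical 1-5 survey scores: leadership rating vs. service-quality rating
leadership = [3, 4, 5, 2, 4, 5, 3, 4]
service    = [2, 4, 5, 2, 3, 5, 3, 4]
r = pearson_r(leadership, service)   # strongly positive for this toy sample
```

A value of r near +1, as in this toy sample, is what a "positively correlated" finding like the one reported would look like numerically.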
Procedia PDF Downloads 347
1 A Study on the Use Intention of Smart Phone
Authors: Zhi-Zhong Chen, Jun-Hao Lu, Jr., Shih-Ying Chueh
Abstract:
Based on the Unified Theory of Acceptance and Use of Technology (UTAUT), the study investigates people's intention to use smartphones. The study additionally incorporates two new variables: 'self-efficacy' and 'attitude toward using'. Samples were collected by questionnaire survey, of which 240 were valid. After correlation analysis, reliability testing, ANOVA, t-tests, and multiple regression analysis, the study finds that social influence and self-efficacy have a positive effect on use intention, and that use intention in turn has a positive effect on use behavior.
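The multiple regression step of a UTAUT study like this one can be sketched as ordinary least squares on Likert-scale predictors. The data below are hypothetical and constructed to fit exactly; they are not the study's sample:

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y,
    solved by Gaussian elimination with partial pivoting.
    Each row of X starts with a 1 for the intercept."""
    k = len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(len(X))) for j in range(k)]
         for i in range(k)]
    b = [sum(X[r][i] * y[r] for r in range(len(X))) for i in range(k)]
    for i in range(k):                      # forward elimination
        piv = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            A[r] = [a - f * p for a, p in zip(A[r], A[i])]
            b[r] -= f * b[i]
    beta = [0.0] * k                        # back substitution
    for i in reversed(range(k)):
        beta[i] = (b[i] - sum(A[i][j] * beta[j]
                              for j in range(i + 1, k))) / A[i][i]
    return beta

# hypothetical Likert rows: [intercept, social_influence, self_efficacy]
X = [[1, 5, 6], [1, 3, 4], [1, 6, 6], [1, 2, 3], [1, 4, 5], [1, 5, 5]]
y = [7, 3, 8, 1, 5, 6]                      # use intention (constructed)
beta = ols(X, y)                            # [intercept, b_social, b_selfeff]
```

Positive coefficients on both predictors, as in this constructed example, correspond to the study's finding that social influence and self-efficacy positively affect use intention.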
Procedia PDF Downloads 408