Dissertations/Theses

Click here to access the files directly from the Biblioteca Digital de Teses e Dissertações da UNIFEI (UNIFEI's digital library of theses and dissertations)

2024
Dissertations
1
  • FLÁVIA GONTIJO CUNHA
  • Modeling and Simulation for Economic Viability Analysis of Reverse Logistics for Waste Electronic and Electrical Equipment (WEEE)

  • Advisor : RENATO DA SILVA LIMA
  • COMMITTEE MEMBERS :
  • MARCELLA BERNARDO PINTO
  • RAFAEL DE CARVALHO MIRANDA
  • RENATO DA SILVA LIMA
  • Date: Feb 6, 2024


  • Abstract:
  • The technological advancements of recent decades have brought numerous improvements to daily life; however, they have also raised concerns about the management of Waste Electrical and Electronic Equipment (WEEE), since waste generation is inevitable. A large portion of this waste is improperly disposed of, a practice that not only harms the environment and human health but also results in the loss of valuable materials contained in this equipment. This situation has driven the development of legislation and guidelines to achieve Circular Economy goals, aided by tools such as Reverse Logistics (RL), aiming to mitigate environmental impacts and build a more sustainable system. Currently, however, only 17.40% of the WEEE generated worldwide is recycled. Many studies on WEEE management have been published in the scientific literature, but few focus on economic feasibility analysis. Therefore, this work aims to develop a mathematical model and employ Monte Carlo Simulation (MCS) to evaluate the economic viability of RL applied to WEEE in two situations. The first involves only the collection and sale of intact parts of WEEE, referred to as RInt, while the second includes collection, pre-treatment, shredding, and sale, designated as RDesmont. The study was conducted using the IDEF-SIM technique, MCS, and the development of Incremental Cash Flow (ICF). Additionally, the economic feasibility indicators Net Present Value (NPV) and Internal Rate of Return (IRR) were used. The proposed model was applied to a waste management company located in southern Minas Gerais. To conduct the analyses, nine scenarios were established, in which price and distance variables were modified. The results revealed that, at the current collection volume of 3,529 kg of WEEE, neither situation, RInt nor RDesmont, proved economically viable. To make both viable, it would be necessary to increase collection by 76.54% for RInt and by 186.31% for RDesmont. The analysis also reveals that a higher investment in a project does not necessarily result in a higher NPV: if investments and recycling costs remain high, RDesmont presents lower NPV values than RInt. An improvement suggestion for RDesmont concerns the disassembly phase: by reducing disassembly time, RDesmont becomes more economically viable than RInt. Thus, adopting ecodesign when designing Electrical and Electronic Equipment (EEE) so that it is easily disassembled can contribute significantly to reducing recycling costs. Finally, it was also verified that reducing taxes improves NPV values. However, implementing economic incentives such as tax exemptions or reductions, while beneficial, is not sufficient on its own to make RDesmont more economically viable than RInt.
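  • As a rough illustration of the approach, the sketch below runs a Monte Carlo NPV evaluation in Python; only the 3,529 kg collection volume comes from the abstract, while the investment, prices, costs, horizon, and rate are hypothetical stand-ins, not the study's data.

      import numpy as np

      rng = np.random.default_rng(42)
      n_runs = 10_000
      horizon = 5          # years of incremental cash flow (assumed)
      rate = 0.12          # assumed minimum attractive rate of return

      investment = 50_000.0                                  # assumed upfront outlay
      volume_kg = 3_529.0                                    # annual collection (from the abstract)
      price_per_kg = rng.triangular(1.5, 2.5, 3.5, n_runs)   # resale price (assumed)
      cost_per_kg = rng.triangular(1.0, 1.8, 2.8, n_runs)    # collection + processing (assumed)

      annual_cf = (price_per_kg - cost_per_kg) * volume_kg   # incremental cash flow
      annuity = np.sum((1 + rate) ** -np.arange(1, horizon + 1))
      npv = -investment + annual_cf * annuity

      print(f"mean NPV: {npv.mean():,.0f}")
      print(f"P(NPV > 0): {(npv > 0).mean():.1%}")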

2
  • JULIANA CÁSSIA DE SOUZA CALIARI
  • Accreditation and quality assessment of dental services

  • Advisor : JULIANA HELENA DAROZ GAUDENCIO
  • COMMITTEE MEMBERS :
  • CARLOS EDUARDO SANCHES DA SILVA
  • JOÃO EDÉRSON CORRÊA
  • JULIANA HELENA DAROZ GAUDENCIO
  • RICARDO COSER MERGULHAO
  • Date: Feb 6, 2024


  • Abstract:
  • The National Accreditation Organization (ONA) developed an evaluation manual for health services whose main criteria focus on Administrative Management, People Management, and Care. Initially, in 1998, this program focused on evaluating hospitals; only in 2012 were the assessment criteria adapted for dental services. With this in mind, the general aim of this study is to draw up an assessment of the quality of dental services, using as references the criteria established by ONA's Brazilian Accreditation Manual and the Servqual model. To this end, the research method was defined as a case study, validated in both the public and private networks of a city located in the south of Minas Gerais. The results were evaluated using statistical tests and the Analytic Hierarchy Process. With the participation of 159 patients and seven dentists interviewed in the Municipal Network, the Servqual model showed that, for both groups of interviewees, the quality sub-criteria were Equipment and Service provided correctly and on time, and the intervention sub-criterion was whether the Dentist was polite to the patient. In relation to the ONA model, patients pointed to the sub-criterion Channel (the means of reporting abnormalities in the service) as needing improvement; for dentists, the improvement concerned the sub-criterion Hygiene of offices. In the comparison between seven dentists from the Municipal Network and five dentists from the Private Network, the criteria coincided. The results pointed to improvements, in relation to the Servqual model, in the Tangibles, Safety in care, and Professional responsibility criteria, and to intervention strategies in the Empathy criterion. In relation to the ONA model, the criteria Organizational Leadership and Infrastructure Management had to be worked on together with the other criteria used in the assessment to fill all the gaps and guarantee the effectiveness of the service. These results suggest that the criteria selected from the Servqual model and the ONA model were complementary in the evaluation of both networks. As a practical action, we propose the promotion of events to raise awareness of oral health, thereby reducing the indicators for seeking dental care.
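  • For reference, Servqual quality gaps are computed as perception minus expectation per dimension; a minimal sketch with made-up ratings (not the survey data):

      import numpy as np

      dimensions = ["Tangibles", "Reliability", "Responsiveness", "Assurance", "Empathy"]
      # rows = respondents, columns = dimensions, 1-7 Likert scale (made-up ratings)
      expectations = np.array([[7, 6, 6, 7, 6],
                               [6, 7, 6, 6, 5],
                               [7, 7, 5, 6, 6]])
      perceptions = np.array([[6, 6, 5, 7, 4],
                              [5, 6, 6, 6, 4],
                              [6, 7, 5, 7, 5]])

      gaps = (perceptions - expectations).mean(axis=0)  # negative = service falls short
      for dim, gap in sorted(zip(dimensions, gaps), key=lambda t: t[1]):
          print(f"{dim:15s} gap = {gap:+.2f}")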

2023
Dissertations
1
  • RODRIGO DE PAULA OLIVEIRA
  • Support for decision-making in a logistics e-marketplace through Agent-Based Simulation

  • Advisor : ALEXANDRE FERREIRA DE PINHO
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • RENATO DA SILVA LIMA
  • ROSINEI BATISTA RIBEIRO
  • Date: Feb 27, 2023


  • Abstract:
  • The growth of the internet favored the development of electronic marketplaces, online platforms that act as intermediaries in the negotiation process between two independent parties. When inserted in the context of load transport, they are known as logistics e-marketplaces, connecting the agents involved in the movement of various goods. Despite the advantages, several electronic marketplaces have failed recently due to the low performance of the platform. In this context, correct decision making, mainly at the tactical and strategic levels, can be a determining factor for the organization to present a satisfactory performance in its market. One of the techniques that supports the decision-making process is Agent-Based Simulation (ABS). ABS is a computer simulation method recommended when the analysis to be performed involves many individuals who have well-defined decisions and behaviors. The literature, despite containing several works that developed an ABS model to assist the decision-making process, still lacks studies that use the method in the logistics e-marketplace environment. In fact, the dynamics of electronic marketplaces – not just the freight transport model – is still a field to be explored in academia. Thus, this work aims to develop an ABS model able to analyze the impact that different customer acquisition channels have on the final number of new users of a logistics e-marketplace, in order to support the decision making of the organization's managers. In general, three important contributions of this research can be highlighted: 1 - the elaboration of a conceptual model, as suggested by the methodology, favored the visualization of the process to be modeled; 2 - the ODD Protocol (Overview, Design Concepts and Details) was adequate to fulfill all the activities proposed in the first phase of the simulation methodology, in addition to facilitating the replication of the model by other researchers; 3 - the validation of the model, performed by combining two methods, was developed in a more robust and reliable way. Regarding the results of the simulation, it was possible to establish the scenarios with the greatest impact on activating new customers in the logistics e-marketplace, in addition to defining an order of priority for the acquisition channels when making new investments. In the alternative scenario with the best observed performance, there were improvements of up to 20% in the conversion rate of new users compared to the initial scenario. In addition, the "indicação interna" (internal referral) acquisition channel was defined as the priority for investment and may improve the conversion rate of potential users by up to 16.9%. As future work, a cost-benefit study is suggested, as well as the analysis of the causalities between the acquisition channels.
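  • A toy agent-based sketch of the idea: potential users arrive through channels with different reach and conversion probabilities, and each agent decides whether to activate. Channel names and all numbers below are hypothetical, not the platform's data.

      import random

      random.seed(7)
      CHANNELS = {                      # monthly reach and conversion probability (assumed)
          "internal referral": (200, 0.17),
          "paid ads":          (600, 0.05),
          "organic search":    (400, 0.08),
      }

      class PotentialUser:
          def __init__(self, channel, p_convert):
              self.channel = channel
              self.active = random.random() < p_convert   # the agent's decision rule

      def simulate_month():
          counts = {name: 0 for name in CHANNELS}
          for name, (reach, p) in CHANNELS.items():
              for _ in range(reach):
                  if PotentialUser(name, p).active:
                      counts[name] += 1
          return counts

      activations = simulate_month()
      print(activations, "| total new users:", sum(activations.values()))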

2
  • YAGO TOLEDO LIMA
  • Utilizing Machine Learning in Order to Include Human Emotional Factors in Simulation Projects

  • Advisor : JOSE ARNALDO BARRA MONTEVECHI
  • COMMITTEE MEMBERS :
  • JOSE ARNALDO BARRA MONTEVECHI
  • MONA LIZA MOURA DE OLIVEIRA
  • RAFAEL DE CARVALHO MIRANDA
  • Date: Mar 7, 2023


  • Abstract:
  • Computational simulation is a widely used tool for monitoring and optimizing performance indicators in both academia and industry. However, human operators are commonly represented in simulation models as resources with constant nominal production. Such models are often insufficient, as they disregard the inherent variations due to human physiology. Although the academic literature offers ways to represent such variations in human performance, emotional factors are rarely addressed. Moreover, the studies that seek to model the operator's psychophysical state, which includes emotions, usually require sophisticated equipment or time-consuming forms to be filled in by the operators. To present an alternative for including the emotional human factor in simulation models, this work uses a machine learning model to identify the apparent mood of operators from footage of a manufacturing production line in operation. This approach does not interfere with the workload of the operators nor require any equipment other than a common video camera, filling a gap observed in the literature. The study revealed 66 statistically significant correlations between the mood variables estimated by the machine learning model and operation time on the assembly line. It was also shown that the probability distribution functions for the operation time differ significantly when considering different classes of mood. Lastly, those different curves were demonstrated in a simulation model. The results show an improvement trend in the model's results, thus demonstrating the viability of this technique. This study also provides instructions for applying the proposed technique that can be used in similar projects. Besides simulation, this technique can be applied to a variety of fields, including defect prevention and occupational health and safety.
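  • One way to carry this finding into a simulation model is to fit a separate operation-time distribution per mood class and test whether the classes differ; a sketch on synthetic data (the mood classes and times are illustrative assumptions):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      # synthetic operation times (s) under two apparent-mood classes
      t_neutral = rng.lognormal(mean=3.40, sigma=0.15, size=300)
      t_tired = rng.lognormal(mean=3.55, sigma=0.20, size=300)

      # fit one lognormal per class for use inside the simulation model
      shape_n, _, scale_n = stats.lognorm.fit(t_neutral, floc=0)
      shape_t, _, scale_t = stats.lognorm.fit(t_tired, floc=0)

      ks = stats.ks_2samp(t_neutral, t_tired)   # do the classes share one curve?
      print(f"median neutral ~ {scale_n:.1f}s, median tired ~ {scale_t:.1f}s")
      print(f"KS p-value: {ks.pvalue:.4f}")     # small p -> model the classes separately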

3
  • RENAN RYUJI MURASHITA TAKENAKA
  • A COMPARATIVE STUDY BETWEEN THE SIMULTANEOUS CONTROL CHART (X-BAR, S) AND THE SHEWHART CONTROL CHARTS FOR MEAN AND STANDARD DEVIATION

  • Advisor : PAULO HENRIQUE DA SILVA CAMPOS
  • COMMITTEE MEMBERS :
  • ALEXANDRE FONSECA TORRES
  • ANDERSON PAULO DE PAIVA
  • ANTONIO FERNANDO BRANCO COSTA
  • JULIANA HELENA DAROZ GAUDENCIO
  • PAULO HENRIQUE DA SILVA CAMPOS
  • Date: Mar 16, 2023


  • Abstract:
  • Control charts are Statistical Process Control tools widely used to monitor key quality characteristics of a product or service. Because they are extremely efficient in monitoring and improving quality, research investigating different approaches is needed. However, the diversity of control charts and application methodologies makes selecting the best combination to implement in a process harder. Furthermore, some works show little or no comparison between related subjects, making the selection even harder. The simultaneous (X-bar, S) control chart is a case where the published comparisons raise doubts about its effectiveness even against the Shewhart charts for mean (X-bar) and standard deviation (S). Therefore, this study aims to compare the Shewhart control charts for mean (X-bar) and standard deviation (S) with the simultaneous (X-bar, S) chart in terms of ease of identifying the process state, performance measured by the average run length (ARL), and ability to determine when the deviation occurred in the sample series. To do so, the procedures for constructing the control charts were presented, data were simulated to calculate the ARL, and a visual evaluation of the information presented in the charts was carried out. As a result, the innovations presented in the simultaneous (X-bar, S) chart show few advantages compared to the traditional Shewhart charts.
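  • The ARL comparison rests on a standard result: for a Shewhart X-bar chart, run lengths are geometric, so ARL = 1/P(signal). A small sketch (not the thesis' simulation code) evaluating the ARL under sustained mean shifts:

      from scipy.stats import norm

      def arl_xbar(delta, n, L=3.0):
          """ARL of an X-bar chart with +/- L*sigma/sqrt(n) limits after a
          sustained mean shift of delta process sigmas."""
          z = delta * n ** 0.5
          p_signal = norm.cdf(-L - z) + 1 - norm.cdf(L - z)
          return 1.0 / p_signal          # run lengths are geometric

      for delta in (0.0, 0.5, 1.0, 1.5, 2.0):
          print(f"shift = {delta:.1f} sigma -> ARL = {arl_xbar(delta, n=5):8.1f}")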

4
  • ANTÔNIO ALVES DOS SANTOS JÚNIOR
  • ANALYSIS OF BUSINESS PARTNERSHIP DEVELOPMENT PROJECTS BETWEEN STARTUPS AND LARGE COMPANIES IN AN OPEN INNOVATION CONTEXT: CASE STUDY

  • Advisor : CARLOS HENRIQUE PEREIRA MELLO
  • COMMITTEE MEMBERS :
  • ANDREA APARECIDA DA COSTA MINEIRO
  • CARLOS HENRIQUE PEREIRA MELLO
  • EDUARDO GOMES SALGADO
  • JULIANA HELENA DAROZ GAUDENCIO
  • Date: Apr 5, 2023


  • Abstract:
  • This work proposes a study to guide the business relationship between Startups and Large Companies, analyzing the network and orchestration mechanisms that govern their interactions, identifying bottlenecks that compromise existing relationships, and discussing best practices that can guide future relationships. To this end, the concept of Startups and the mechanisms of interaction between them and large corporations in an Open Innovation context were explored. The adopted methodology was a field study, with bibliographical research and interviews for data collection. As expected results, some aspects identified in the theory on the subject can be confirmed through the interactions with the interviewees. Others will be less evident and will demonstrate, from the data collected and the analyses, the influence of specific characteristics and realities on this process, which are also targets to be uncovered in this study. The conclusion of the work should present important contributions for the different actors in this process: the network and orchestration mechanisms between the actors involved; large corporations, with the challenge of operationalizing the relationship; and startups, with the challenge of effectively transforming such a relationship into a partnership for successful innovation.

5
  • LEONARDO LOURENÇO DE SOUZA
  • Performance measurement system analysis: case study in public transport service notices

  • Advisor : JULIANA HELENA DAROZ GAUDENCIO
  • COMMITTEE MEMBERS :
  • FABIANE LETÍCIA LIZARELLI
  • CARLOS HENRIQUE PEREIRA MELLO
  • JOAO BATISTA TURRIONI
  • JULIANA HELENA DAROZ GAUDENCIO
  • Date: May 3, 2023


  • Abstract:
  • The pursuit of excellence in services and the interest in measuring service quality are considered important strategies used by organizations to achieve competitive advantage. In the same context, public services are characterized as all activities that the State performs to satisfy public needs through the typical procedures of public law. Today, through concession contracts, many public services are provided indirectly. The Federal Constitution of 1988 establishes that public transport is a type of public service that must be offered to the population. Public transport is a democratic way for society to have access to goods, services, and activities; however, its use is decreasing due to several factors, mainly the low quality of the service provided. To meet the needs of users and verify that the service meets contractual requirements, measuring quality performance is important, both for service providers and for the inspection body. In this context, the general objective of the research is the analysis of the performance measurement system with a focus on improving the quality parameters to be integrated into the public notice for contracting the public transport service offered in the city of Itajubá-MG, a city in which, due to users' complaints about public transport, the parliamentarians of the City Council requested that measures be taken in relation to the service provided and that an inspection of the concessionaire be carried out, thus improving the quality of the service. The method applied was the multiple case study, through the following steps: initially, a bibliometric analysis and literature review were carried out, both dealing with the review and context of the available literature on performance measurement systems and quality with a focus on public services, establishing the theoretical support and the research problem; then the cases were selected, including the case of Itajubá-MG and two cities in the state of São Paulo; the research protocol was elaborated and validated by specialists in the area; and then data collection, reporting of individual cases, and cross-case reporting were conducted. Some relationships and information between the analyzed cases were identified. Among the results, we highlight the proposal of a performance measurement system involving 15 important parameters to measure the quality of the services provided, to be integrated into the public transport hiring notice: adaptation for people with special needs, accessibility, existence of benches and coverage at bus stops, frequency, drivers' skill, information, punctuality, fare, fare integration, cleanliness, age of fleet, cordiality of employees, public safety, and environment. In addition, some actions were presented to assist in the evaluation of this service for the management of the contract, along with guidelines on the importance of monitoring the indicators to maintain the quality of the service provided. It is expected that the results obtained will help the competent body evaluate and supervise the services provided. The performance measurement system allows better efficiency and effectiveness in actions by both the concessionaire and the managing body, impacting the quality of services provided to users. The results obtained can contribute to the city's transport system and be used as a model to monitor the quality of any public service offered by the municipal body of Itajubá-MG, as well as by other cities. It is also expected that the renewal and the next concession contracts for this service may incorporate the guidelines listed here, in order to reverse the trend of reduced demand for collective public transport by bus and improve users' satisfaction.

6
  • THAIS FERNANDA SOUSA PIRES
  • Multi-objective Robust Optimization of Helical Milling of Super Duplex Stainless Steel UNS S32760

  • Advisor : JOAO ROBERTO FERREIRA
  • COMMITTEE MEMBERS :
  • JOAO ROBERTO FERREIRA
  • LINCOLN CARDOSO BRANDÃO
  • PAULO HENRIQUE DA SILVA CAMPOS
  • ROBSON BRUNO DUTRA PEREIRA
  • Date: May 3, 2023


  • Abstract:
  • Super duplex stainless steels have a mixed microstructure consisting of ferrite and austenite phases and are used especially in the oil and gas industry, but have low machinability. The helical milling process consists of rotating the cutter around its own axis combined with a helical feed; it presents greater processing efficiency and lower cost compared to the conventional drilling process and allows holes of different diameters to be obtained. The process also has advantages such as high levels of surface quality, greater dimensional accuracy and shape quality, lower cutting efforts, and greater smoothness in the machining operation. This work consists of a robust multi-objective optimization of the helical milling of super duplex stainless steel UNS S32760. The experimental design was carried out with a central composite arrangement, considering as control factors the axial feed per tooth, the tangential feed per tooth, and the cutting speed. Mean roughness, axial thrust force, and circularity deviation responses were evaluated. Robust parameter design was used together with the response surface methodology for conducting the experiments and for the analysis and modeling of the responses of interest, and multi-objective optimization was performed through multi-objective particle swarm optimization with crowding distance (MOPSO-CD) and the non-dominated sorting genetic algorithm (NSGA-II). For the robust parameter design, the overhang length of the cutter (lto), the measured height of the machined hole (lb), and the cutting fluid flow rate (Q) were considered as noise variables. Robust multi-objective evolutionary optimization allows the evaluation of process factor levels. Different weightings of the objective functions were analyzed in order to obtain optimal solutions on the Pareto frontier for the evaluated responses.
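  • The core of the multi-objective step is Pareto dominance: a candidate is kept only if no other candidate is at least as good on all responses and strictly better on one. A plain-Python sketch with illustrative response values (MOPSO-CD and NSGA-II add their own search mechanics on top of this filter):

      import numpy as np

      def pareto_front(points):
          """Return indices of non-dominated rows (all objectives minimized)."""
          keep = []
          for i, p in enumerate(points):
              dominated = any(
                  np.all(q <= p) and np.any(q < p)
                  for j, q in enumerate(points) if j != i
              )
              if not dominated:
                  keep.append(i)
          return keep

      # columns: mean roughness Ra (um), thrust force Fz (N), circularity (um)
      candidates = np.array([
          [0.42, 110.0, 8.5],
          [0.38, 150.0, 9.0],
          [0.55,  95.0, 7.0],
          [0.60, 160.0, 9.5],   # dominated by the first row
      ])
      print("Pareto-optimal candidates:", pareto_front(candidates))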

7
  • JAÍNE CÁSSIA FONSECA AMARAL
  • Multicriteria model for benefit/cost analysis in a constructive process of urban mobility - case study in Santa Rita do Sapucaí, MG

  • Advisor : JOSIANE PALMA LIMA
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • JANAINA ANTONINO PINTO
  • JOSIANE PALMA LIMA
  • Date: May 4, 2023


  • Abstract:
  • Since the industrial revolution generated accelerated urbanization, cities have faced problems resulting from the need to build infrastructure for the population while disregarding evaluations of resource use across different areas, such as the social, environmental, economic, historical, and cultural aspects of urbanized places. Some authors address the disorderly growth of urban areas, together with the discontinuity of works, which results in waste of resources at the municipal, state, and federal levels and fails to serve the population. Despite the creation of the Ministry of Cities and the National Urban Mobility Policy, there is a lack of a structured evaluation method with sustainability criteria for selecting infrastructure construction processes that maximize benefits/advantages and minimize costs/disadvantages for the population. Thus, the objective of this work was to develop a combined model of benefit-cost analysis, using a multicriteria tool to evaluate the use of economic, environmental, and social resources in the construction of transport and urban mobility infrastructure. Through the theoretical foundation, several international initiatives for process evaluation were identified, with emphasis on the United Kingdom's Green Book, and it was found that Brazil does not select proposals and processes by criteria beyond strictly objective ones. We then sought to build and test the applicability of the model in the Vila Criativa, Vila Feliz project in Santa Rita do Sapucaí, Minas Gerais, creator of the Movimento Cidade Criativa, Cidade Feliz, which uses the city as a platform to increase people's quality of life. Twenty-six (26) criteria were defined for the combined Analytic Hierarchy Process (AHP) and Benefit-Cost Analysis (ABC) methodology, organized into comparison matrices for evaluation by nine (9) specialists. The criteria and their respective three-valued metrics were defined to support the application of data in the ABC of Infrastructure Construction Processes tool (ABCINFRA). The tool generates an individual assessment report for each construction process and each sustainability resource, in both the benefit/advantage and cost/disadvantage classes. The application of the model to the construction process of Vila Criativa, Vila Feliz allowed the validation of the technical criteria, expert weighting, determination of critical points, and the entire construction of the tool through combined methods, contributing to managers' decision-making in the application of public resources connected with the purposes of sustainability and mobility. The tool made it possible to determine that the construction process under study minimized the costs/disadvantages and maximized the benefits/advantages, being viable for the city and its people, mainly in the social resource, which performed at 100% in both classes, followed by the economic resource and finally the environmental resource. In this way, the work met its main and specific objectives and provided a scientific contribution with the definition of technical performance criteria and metrics through the ABCINFRA data processing tool, which delivers the quantitative analysis of the AHP and the critical points and can also be used by public managers/decision makers involved with construction processes. As future work, the application of the ABCINFRA model in different cities is suggested, contributing to new comparisons and analyses, as well as the expansion of the panel of specialists to represent the different Brazilian states.
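  • A minimal sketch of how AHP weights can aggregate benefit and cost scores into a single ratio, in the spirit of the combined AHP/ABC model; all weights and scores below are hypothetical, not ABCINFRA outputs:

      # weighted benefit/cost aggregation (illustrative values only)
      resources = ["social", "economic", "environmental"]
      weights = {"social": 0.48, "economic": 0.31, "environmental": 0.21}  # from AHP (assumed)

      # criterion scores for one construction process, normalized to [0, 1] (assumed)
      benefits = {"social": 1.00, "economic": 0.72, "environmental": 0.55}
      costs = {"social": 0.10, "economic": 0.30, "environmental": 0.35}

      b = sum(weights[r] * benefits[r] for r in resources)
      c = sum(weights[r] * costs[r] for r in resources)
      print(f"aggregate benefit = {b:.3f}, aggregate cost = {c:.3f}")
      print(f"benefit/cost ratio = {b / c:.2f}  (>1 favours the project)")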

8
  • JOANA DARC TEODORO BONALDI
  • Implementation of a Traceability System in a Technology-Based Company with Emphasis on Industry 4.0 Technologies

  • Advisor : JULIANA HELENA DAROZ GAUDENCIO
  • COMMITTEE MEMBERS :
  • CARLOS EDUARDO SANCHES DA SILVA
  • CARLOS HENRIQUE PEREIRA MELLO
  • JOSE CARLOS DE TOLEDO
  • JOSE HENRIQUE DE FREITAS GOMES
  • JULIANA HELENA DAROZ GAUDENCIO
  • Date: May 25, 2023


  • Abstract:
  • With the advent of Industry 4.0 and the approach of a connected production system using Industry 4.0 technologies, the adoption of traceability systems becomes an essential application of this new industrial era. The work takes into account the need for small and medium-sized companies to improve their processes through the use of technologies mentioned in the pillars of Industry 4.0. In this context, the work develops a traceability system applied to the production chain of stainless-steel capacitor tanks, with bar-code reading on AISI 304 and AISI 409 stainless steel, at one of the units of a Brazilian company located in Itajubá, in the south of Minas Gerais. Initially, a systematic literature review based on the PRISMA protocol was conducted, covering the Scopus and Web of Science databases, which allowed the selection of the most relevant articles on this topic. The methodology used was Action Research, through which it was possible to develop and implement the proposed traceability system, called RAST 4.0. Through a pilot batch, it was possible to collect data and analyze them through the stages of the PDCA cycle; an Action Plan was then developed to address the flaws found. Continuing with the Action Research stages, with the application of the Action Plan the company achieved improvements through simple actions, generating the 2nd pilot batch, which was monitored and tested within the process, recording the time taken for each task and maintaining internal traceability from the beginning of production to the shipment of the capacitor tank to the customer.

9
  • RAPHAEL LOPES MARTINS
  • MODELING OF PROCESSES AND DECISIONS: A STUDY APPLIED TO THE PROCESS OF DRUG TREATMENT IN ONCOLOGY

  • Advisor : FABIANO LEAL
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • ALEXANDRE FONSECA TORRES
  • FABIANO LEAL
  • Date: Jun 23, 2023


  • Abstract:
  • This research studies a healthcare process through the modeling of processes and decisions. Process and decision modeling applied to the health area can help in several ways, including the verification of control and traceability points, which reduce risks to the patient's health and increase the quality of the service provided. The main objective of this research is to describe the logic of an oncology drug treatment process, from end to end, in order to indicate its control points and describe the logic of the decisions made at critical points of this process. The research also presents some specific objectives, detailed in the introduction. The theoretical foundation clarifies the terms used throughout the research and covers the themes related to process and decision modeling, as well as their relationship with the health area. For the scientific conduct of this research, the modeling method is used. This work presents the creation of business process and decision models related to drug treatment in oncology through the use of Business Process Model and Notation (BPMN) and Decision Model and Notation (DMN). In addition, it highlights the control points mapped in the process and addresses them using the 5W1H tool. The model created in BPMN was evaluated against a set of comprehensibility guidelines, assessing which ones the model followed and the work done to adapt the model to comply with the others. A method of verification and validation of diagrammatic models, still under development, was also applied and is detailed in this research. Finally, this work concluded that the BPMN notation was effective in mapping the cancer treatment process, demonstrating the established control points and the risks at certain points in the process. The application of the DMN notation and its relationship with BPMN was also discussed, which showed the difficulty of applying it in processes where decisions have a high level of personalization.

10
  • BRUNO STORCH DE ALMEIDA CALIXTO
  • ANALYSIS OF THE USE OF THE AHP METHOD IN THE PROCESS OF EVALUATION AND PRIORITIZATION OF OCCUPATIONAL RISKS IN A TEACHING AND RESEARCH LABORATORY OF A FEDERAL INSTITUTION OF HIGHER EDUCATION

  • Advisor : CARLOS HENRIQUE PEREIRA MELLO
  • COMMITTEE MEMBERS :
  • CARLOS HENRIQUE PEREIRA MELLO
  • DENISE RANSOLIN SORANSO
  • JULIANA HELENA DAROZ GAUDENCIO
  • LUCIANO JOSÉ MINETTE
  • Date: Jun 29, 2023


  • Abstract:
  • The activities carried out in laboratories of Federal Institutions of Higher Education (IFES) cover several areas of knowledge and are essential for the development of scientific research, teaching activities, and university extension. However, the laboratory environment exposes individuals to various occupational risks that are inherent in carrying out these activities. In view of this, risk management becomes fundamental in these environments, as it reduces the risk of accidents and promotes the health and satisfaction of workers, in addition to improving operational results and the institution's image. This research presents an analysis and a proposal for the use of the Analytic Hierarchy Process (AHP) technique in the risk assessment stage of risk management, based on the ISO 31000:2018 standard, applying methods of identification, analysis, evaluation, and control of occupational risks in an academic laboratory of an IFES in southern Brazil, in order to assist that institution's occupational health and safety specialists in prioritizing risks, making the elaboration and implementation of occupational health and safety prevention programs more assertive. For this, the Preliminary Risk Analysis method was used to identify the risks and the Risk Index method to analyze their severity and probability; in the evaluation stage, to prioritize the environmental risks and assist the specialists' decision-making, the AHP indicated the order of importance of the environmental risks. The results indicated that the laboratory's environmental risks with the highest priority and importance for the implementation of control measures, as analyzed by the specialists, were Accident Risks (39.28%) and Chemical Risks (34.08%), followed by Physical (14.46%) and Ergonomic (12.17%) Risks.
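  • A short sketch of the AHP step: derive the priority vector from a pairwise comparison matrix via its principal eigenvector and check Saaty's consistency ratio. The judgments below are illustrative, not the experts':

      import numpy as np

      risks = ["Accident", "Chemical", "Physical", "Ergonomic"]
      A = np.array([[1,   2,   3,   3],
                    [1/2, 1,   3,   3],
                    [1/3, 1/3, 1,   1],
                    [1/3, 1/3, 1,   1]], dtype=float)   # hypothetical judgments

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                                      # priority vector

      CI = (eigvals.real[k] - len(A)) / (len(A) - 1)
      CR = CI / 0.90                                    # Saaty's random index for n = 4
      for name, wi in zip(risks, w):
          print(f"{name:10s} weight = {wi:.3f}")
      print(f"consistency ratio = {CR:.3f}  (acceptable if < 0.10)")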

11
  • JONATHAN SERAFIM LÚCIO
  • Discrete Event Simulation; Lean Six Sigma; DMAIC structure; integrated framework.

  • Advisor : RAFAEL DE CARVALHO MIRANDA
  • COMMITTEE MEMBERS :
  • JOSE ARNALDO BARRA MONTEVECHI
  • MARCELO MACHADO FERNANDES
  • RAFAEL DE CARVALHO MIRANDA
  • Date: Jul 3, 2023


  • Abstract:
  • There are many Modeling and Simulation methods available in the literature, which are commonly used in Discrete Event Simulation (DES) projects. However, most of these methods focus on constructing computational models, which results in opportunities for improvement in the problem definition and results analysis stages, as they are often overlooked. One way to address this gap is by incorporating alternative methods to DES. In this regard, the integration of the Lean Six Sigma (LSS) method, based on the Define, Measure, Analyze, Improve, and Control (DMAIC) framework, with the DES method has shown positive results in various application areas. However, the scientific literature provides few studies addressing this integration. Given this context, the present work aims to propose a framework that allows the integration of a robust and systematic problem-solving method (LSS) with a modeling and simulation tool (DES), aiming for more accurate and effective execution of discrete event simulation projects. To achieve this objective, a systematic literature review (SLR) was conducted to gather information regarding the state of the art on the topics addressed, to identify the focus of the works (DES or LSS), their structure, usage, and the main tools involved in the process, to determine in which phases of DMAIC the DES models were applied, and finally, the main elements found in works that presented some kind of framework. Among the analyzed works, only seven proposed an integrated framework, all of which were evaluated to identify past and future trends in the main practices presented. Based on these analyses, associations were established between the concepts of the two methods to relate common objectives within the same stage. Thus, the initial version of the framework was developed and subsequently subjected to evaluation through an online questionnaire, answered by experts in the areas of DES and LSS, with the purpose of being statistically validated. After considerations and tests, the final version of the framework was proposed, representing the outcome of this work.

12
  • ANDRÉ AOUN MONTEVECHI
  • FRAMEWORKS FOR CREDIT RISK MODELLING USING CLASSIFICATION ALGORITHMS

  • Advisor : RAFAEL DE CARVALHO MIRANDA
  • COMMITTEE MEMBERS :
  • ANDRE LUIZ MEDEIROS
  • ANEIRSON FRANCISCO DA SILVA
  • JOSE HENRIQUE DE FREITAS GOMES
  • RAFAEL DE CARVALHO MIRANDA
  • Date: Jul 4, 2023


  • Abstract:
  • Granting credit is a vital activity in the financial industry. For the success of financial institutions, as well as the equilibrium of the credit system as a whole, it is important that credit risk management systems efficiently evaluate the probability of default of potential debtors based on their historical data. Classification algorithms are an interesting approach to this problem in the form of credit scoring models. Since the emergence of quantitative analytical methods for this purpose, statistical models have persisted as the most commonly chosen method, given their easier implementation and inherent interpretability. However, advances in Machine Learning (ML) have produced new and more complex algorithms capable of handling larger amounts of data, often with an increase in predictive power. These new approaches, although not always readily transferable to practical applications in the financial industry, present an opportunity for the development of credit risk modeling and have piqued the interest of researchers in the field. Nonetheless, researchers tend to focus on model performance without appropriately setting up guidelines to optimize the modeling process or considering the current regulations for model implementation. Thereby, this dissertation establishes frameworks for consumer credit risk modeling based on classification algorithms, guided by a systematic literature review on the topic. The proposed frameworks incorporate ML techniques, data preprocessing and balancing, feature selection (FS), and hyperparameter optimization (HPO). In addition to the bibliographic research, which introduces the main classification algorithms and appropriate modeling steps, the development of the frameworks is also based on experiments with hundreds of models for credit risk classification, using Logistic Regression (LR), Decision Trees (DT), Support Vector Machines (SVM), Random Forest (RF), as well as boosting and stacking ensembles, to efficiently guide the construction of robust and parsimonious models for credit risk analysis in consumer lending.
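  • A baseline of the kind such frameworks build on: a scaled logistic-regression credit scorer evaluated by AUC on an imbalanced synthetic set (scikit-learn; a sketch, not the dissertation's experiments):

      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      # synthetic stand-in for a credit dataset: ~10% defaults (class imbalance)
      X, y = make_classification(n_samples=5000, n_features=20, weights=[0.9],
                                 random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

      model = make_pipeline(StandardScaler(),
                            LogisticRegression(class_weight="balanced", max_iter=1000))
      model.fit(X_tr, y_tr)
      auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
      print(f"test AUC = {auc:.3f}")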

13
  • LUCAS GOMES PEREIRA
  • Clustering as a Decision Support Technique for a Logistics E-marketplace

  • Advisor : RENATO DA SILVA LIMA
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • ANDRE LUIZ BARBOSA NUNES DA CUNHA
  • FABIO FAVARETTO
  • RENATO DA SILVA LIMA
  • Date: Sep 21, 2023


  • Abstract:
  • Road transport is one of the most impactful modes in the global transportation matrix. Despite its advantages of flexibility and availability, the sector is characterized by high fragmentation. In the past, the intermediation process in this logistics chain was inefficiently handled by freight agents in terms of time and cost. An effective solution to address the need for agility and ease is through electronic logistics marketplaces, which are systems allowing carriers to advertise their loads to truck drivers searching for freight. However, the ease of automating load and capacity matching has resulted in technology providers dealing with an unprecedented volume of data. Valuable insights about user behavior can be derived from this diverse dataset. Despite the popularity of logistics marketplaces, scientific literature has not kept pace with their growth. Given these opportunities, this study aims to identify patterns in a cargo advertisement database of a logistics marketplace using clustering, which can assist in decision-making. Following the CRISP-DM procedure, data on load postings from 2019 to 2021 were collected, and the clustering trend of the database was confirmed using RStudio software. The CLARA algorithm was subsequently employed, and the quality of clusters was assessed using the Silhouette index. The most representative group at the national level consisted of freight within the state of São Paulo, featuring full loads, covering distances of around 500 km, and requiring heavy-duty vehicles for transportation. In the context of São Paulo, the most significant partition comprised full freight journeys of just over five hours, also requiring heavy-duty vehicles. The higher frequency of full freight was attributed to its benefits, such as efficiency in space and resource utilization. The main strategies identified for the national context involve offering progressive discounts on additional services to carriers conducting a high volume of trips and targeted promotion of the logistics marketplace to potential customers interested in transporting heavy loads over medium distances. An interesting business opportunity was identified in Acre, where the company could expand its operations and provide support in a region of Brazil where road freight is less common. Additionally, encouraging the use of the platform in São Paulo for posting fractional loads was suggested, highlighting the advantages for owners of small vehicles. In conclusion, CLARA produced satisfactory results by reducing the computational complexity of a database with over three million entries, and the study revealed data clusters as potential opportunities for platform growth. However, there were instances of overlaps of structures clearly seen as distinct in the scatterplots.
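  • The cluster-quality step can be reproduced in miniature: pick k by the silhouette index. The thesis used CLARA (k-medoids on samples) in R; the sketch below substitutes scikit-learn's k-means on synthetic data just to show the evaluation loop:

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.datasets import make_blobs
      from sklearn.metrics import silhouette_score

      # synthetic stand-in for the load-posting feature matrix
      X, _ = make_blobs(n_samples=3000, centers=4, cluster_std=1.2, random_state=1)

      for k in range(2, 7):
          labels = KMeans(n_clusters=k, n_init=10, random_state=1).fit_predict(X)
          print(f"k = {k}: silhouette = {silhouette_score(X, labels):.3f}")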

14
  • JORGE YURI OZATO
  • N/a

  • Advisor : EDSON DE OLIVEIRA PAMPLONA
  • COMMITTEE MEMBERS :
  • EDSON DE OLIVEIRA PAMPLONA
  • GIANCARLO AQUILA
  • VICTOR EDUARDO DE MELLO VALERIO
  • WILSON TOSHIRO NAKAMURA
  • Date: Sep 29, 2023


  • Abstract:
  • The interest in offshore wind power in Brazil is a recent development, and there is a lack of research on the actual competitiveness of this energy source in the country. This study proposes a stochastic approach to compare the levelized cost of electricity (LCOE) for offshore wind farms at five different locations along the Brazilian coast. Various possibilities of corporate taxation and the commercialization of carbon credits are also considered. To validate the comparison, the offshore wind potential of the evaluated locations is assessed by modeling the uncertainties associated with monthly variations in wind speed, as well as the economic uncertainties related to Capital Expenditure (CAPEX), Operational Expenditure (OPEX), and Decommissioning Expenditure (DECEX). Subsequently, 10,000 iterations of the Monte Carlo Simulation are conducted to calculate the LCOE for different scenarios, including taxation based on the Actual Profit Method (APM) and the Presumed Profit Method (PPM), with and without Tradable Green Certificates (TGC). The results indicate that the application of the PPM for taxation and the identification of the optimal location for wind power exploitation are the most influential factors in reducing the LCOE and the financial risk associated with offshore wind farm investments in Brazil. On the other hand, the possibility of trading TGC contributes to lowering the LCOE of projects but does not significantly mitigate the financial risk. Furthermore, the study reveals that the coastal region near the Northeast of Brazil exhibits the highest offshore wind potential, while locations in Rio Grande do Sul and Rio de Janeiro demonstrate some competitiveness, although to a lesser extent than the Northeast.
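  • The underlying calculation is LCOE = discounted lifetime costs / discounted lifetime energy. A compact Monte Carlo sketch with assumed (illustrative) distributions for CAPEX, OPEX, DECEX, and energy yield:

      import numpy as np

      rng = np.random.default_rng(3)
      n, life, r = 10_000, 25, 0.08                 # iterations, years, discount rate (assumed)

      capex = rng.normal(3.0e6, 3.0e5, n)           # USD per MW installed (assumed)
      opex = rng.normal(9.0e4, 1.5e4, (n, life))    # USD per MW per year (assumed)
      decex = 0.15 * capex                          # decommissioning share (assumed)
      energy = rng.normal(4200, 500, (n, life))     # MWh per MW per year (assumed)

      df = (1 + r) ** -np.arange(1, life + 1)       # discount factors per year
      disc_costs = capex + (opex * df).sum(axis=1) + decex * (1 + r) ** -life
      disc_energy = (energy * df).sum(axis=1)
      lcoe = disc_costs / disc_energy               # USD/MWh

      print(f"LCOE mean = {lcoe.mean():.1f}, "
            f"P5 = {np.percentile(lcoe, 5):.1f}, P95 = {np.percentile(lcoe, 95):.1f} USD/MWh")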


15
  • BRENDA DE QUEIROZ VIANA
  • STRATEGIC PLANNING: CONDUCTION METHODOLOGY, APPLICATION AND FOLLOW-UP PROPOSED FOR THE SECRETARY OF HIGHER EDUCATION (SESu)

  • Advisor : JULIANA HELENA DAROZ GAUDENCIO
  • COMMITTEE MEMBERS :
  • DANY FLAVIO TONELLI
  • CARLOS EDUARDO SANCHES DA SILVA
  • JULIANA HELENA DAROZ GAUDENCIO
  • Date: Oct 26, 2023


  • Abstract:
  • It is important to plan strategically to optimize resources within educational institutions, seek continuous improvement of the quality of the services provided, and strategically manage the main requirements related to teaching, research, and extension. Strategic Planning (SP) is developed as a way to assist in setting objectives and future goals, decision-making, monitoring, control, and strategic organization. It draws on resources such as mission, vision, values, the Strategic Map, and SWOT Analysis. In Higher Education Institutions (HEI), there is a need to develop an effective SP that aligns with budget planning and is always prepared to adapt to possible budget changes and political issues, being considered a legal obligation according to the Federal Constitution of 1988 and related decrees and laws. In Brazil, the planning, orientation, coordination, and supervision of the entire National Higher Education Policy is the responsibility of the Secretary of Higher Education (SESu), which requires the formalization and systematization of its own SP. Based on this premise, this work's primary objective is to evaluate the actions carried out during the development of SESu's SP in order to identify the means of conducting, implementing, and monitoring proposed for the Secretary to meet all necessary requirements in accordance with the guidelines of the Ministry of Education (MEC). In this context, the case study method was employed, using interviews and remote meetings as data collection instruments. Among the main results, it was possible to track the progress of the actions developed in all stages executed for SESu's SP, compare what was executed with the MEC's SP and with what is proposed by the Technical Guide for Strategic Management, identify possible changes caused by the government transition, as well as model the planning process using Business Process Model and Notation (BPMN) and analyze the comprehensibility of the model.

Theses
1
  • ERIVELTON ANTONIO DOS SANTOS
  • What Matters in Hiring Professionals for Global Software Development or Gig Economy?

  • Advisor : CARLOS EDUARDO SANCHES DA SILVA
  • COMMITTEE MEMBERS :
  • DALESSANDRO SOARES VIANNA
  • CARLOS EDUARDO SANCHES DA SILVA
  • CARLOS HENRIQUE PEREIRA MELLO
  • DALTON GARCIA BORGES DE SOUZA
  • DANIEL JUGEND
  • RAFAEL DE CARVALHO MIRANDA
  • Date: Feb 2, 2023


  • Abstract:
  • Supply chains are susceptible to uncertainties, such as large-scale natural disasters, manufacturing fires, terrorist attacks, widespread electrical shutdowns, financial and political tension, and wars. In addition, rising unemployment rates have driven the workforce into short-term contracts or the on-demand market known as the gig economy. However, selecting skilled professionals is difficult and risky when organizations are immersed in fast-paced environments. In this context, we investigated the scenario of contracting professionals in Global Software Development (GSD). This thesis aims to develop clusters of criteria for hiring self-employed professionals in the Global Software Development or gig economy context. We systematically reviewed 319 criteria in 65 papers and grouped them in two innovative ways, obtaining 25 criteria clusters and a hierarchical structure of their relationships, in which only 40% of the clusters act as causes. We propose two innovative criteria-grouping methods: the first delivers fast aggregation clustering, and the second captures the relationships between the criteria clusters. This tool can be handy for researchers exploring new data via literature reviews or even surveys. Practitioners can also easily use the spreadsheet with all the data, remove or add criteria, and run the algorithm to create new clusters on their own. The main results were, firstly, for applicants: in software development, project requirements are gathered from clients and stakeholders, a process that involves rich, iterative communication. Secondly, enterprises first check the criteria clusters and then the list of criteria; taking into account the job position or profile, they choose how to conduct the hiring process, reflecting on the cause/effect relationships between the criteria clusters. Finally, these results also suggest the design of new subjects for computer science courses, mainly concerning soft skills, as highlighted in the Communication criteria cluster, which contains a list of criteria highly cited in the SLR.

2
  • GUSTAVO DOS SANTOS LEAL
  • Optimization of SARIMA-DEA models with ensembles and Mixed Design

  • Advisor : PEDRO PAULO BALESTRASSI
  • COMMITTEE MEMBERS :
  • LUPERCIO FRANÇA BESSEGATO
  • CARLOS HENRIQUE PEREIRA MELLO
  • JULIANA HELENA DAROZ GAUDENCIO
  • PEDRO PAULO BALESTRASSI
  • WESLEY VIEIRA DA SILVA
  • Date: Feb 27, 2023


  • Abstract:
  • Accurate forecasting is crucial for several areas of knowledge, such as Economics, Management, Engineering, and Statistics. There are several approaches to forecasting: time series analysis, regression analysis, artificial neural networks, etc. However, researchers and analysts must beware of overfitting when applying any of these techniques – overfitting occurs when a model has so many parameters that it fits the training set well but predicts the test set very poorly. Recently, model combination techniques have become widespread, since ensembles of models are proven to improve forecast metrics. However, the overfitting problem may still occur in these cases. To overcome this, this thesis suggests an intermediate step between the selection of models for the ensemble and the optimization of their weights: the application of a Data Envelopment Analysis (DEA) model suitable for the presence of fractional variables, so as not to harm the assumption of convexity. To analyze this method, this thesis applies Box & Jenkins models. Decision Making Units (DMU) are created through a full factorial arrangement, modifying the computational parameters. Super-efficiency analysis is applied, and the 4 DMUs with the highest efficiency indexes are retained for later combination through Response Surface Methodology (RSM) optimization in the context of Mixture Design. The application of multivariate statistical techniques for dimensionality reduction is also proposed, in order to make the problem computationally smaller. To validate the proposed method, a simulation study was created, comparing the results with the Naive method. The simulation showed that the method proposed in this thesis presents, on average, better results. Finally, the method was applied to electricity demand series from Brazil and its five geographic regions.
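  • A reduced sketch of the ensemble idea: fit several SARIMA configurations and combine their forecasts. Equal weights below stand in for the DEA/mixture-design weighting step, and the series is synthetic:

      import numpy as np
      from statsmodels.tsa.statespace.sarimax import SARIMAX

      rng = np.random.default_rng(5)
      t = np.arange(120)
      y = 100 + 0.3 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, 120)
      train, test = y[:108], y[108:]

      orders = [((1, 1, 1), (1, 1, 1, 12)),     # candidate SARIMA configurations
                ((2, 1, 0), (0, 1, 1, 12)),
                ((0, 1, 2), (1, 1, 0, 12))]
      forecasts = []
      for order, sorder in orders:
          fit = SARIMAX(train, order=order, seasonal_order=sorder).fit(disp=False)
          forecasts.append(fit.forecast(steps=12))

      ensemble = np.mean(forecasts, axis=0)     # equal-weight combination (stand-in)
      mape = np.mean(np.abs((test - ensemble) / test)) * 100
      print(f"ensemble MAPE on hold-out: {mape:.2f}%")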

3
  • CARLOS HENRIQUE DOS SANTOS
  • Monitoring of Digital Twin Simulation Models: An approach based on Machine Learning and Control Chart

  • Advisor : JOSE ARNALDO BARRA MONTEVECHI
  • COMMITTEE MEMBERS :
  • ANDERSON RODRIGO DE QUEIROZ
  • ANTONIO FERNANDO BRANCO COSTA
  • EDUARDO GOMES SALGADO
  • JOSE ANTONIO DE QUEIROZ
  • JOSE ARNALDO BARRA MONTEVECHI
  • RAFAEL DE CARVALHO MIRANDA
  • Date: Jun 27, 2023


  • Abstract:
  • The use of simulation models as Digital Twins (DTs) has been standing out in recent years and represents a revolution in decision-making in production processes, being a key solution in the context of the so-called Industry 4.0. In this sense, we highlight increasingly faster and more efficient decisions from the mirroring of the behavior of physical systems through sensors, intelligent equipment, management systems and databases. The models used as DTs are updated periodically, in real or near real time according to physical changes, and provide guidelines or commands for decision making. On the other hand, despite the great applicability of this approach, challenges related to the validity of simulation models over time stand out, since traditional validation approaches do not consider the periodic update of the model. Ensuring the validity of DTs is essential, since it usually involves decisions of great impact for production systems. In addition, although it is a field of research with great importance for both researchers and professionals, we noted that there is still a gap in terms of methods aimed at monitoring the validity of DTs. Therefore, in order to contribute to the literature and fill this gap, the present work proposes an approach based on the periodic evaluation of simulation models used as DTs through Machine Learning and control chart. We suggest a monitoring tool based on the K-Nearest Neighbors (K-NN) classifier, combined with the p control chart, in order to periodically assess the validity of DT models. Initially, the proposed approach was tested in several theoretical cases in order to evaluate the functioning of the tool in situations where the physical environment differs significantly from the virtual one, a fact that would represent a possible case where the DT is not valid. In this case, data corresponding to the physical and digital environments were emulated considering standardized probability distributions. Furthermore, the tool was also implemented in two real objects of study, acting as a supplement to make DTs more robust and reliable. In this case, DTs already implemented and in the operational phase were adopted. The first object of study refers to a model that supports operational planning decisions in a medium-sized company of a clothing industry, whose processes are mostly manual. The second object of study refers to a DT implemented in an automated production cell that operates in near real time, allowing the evaluation of the main process parameters. The tool proved to be capable of monitoring the functioning of both DTs and identifying possible special causes that could compromise its results and, consequently, its validity. Finally, the broad applicability of the tool is highlighted, which can be used in different approaches of DT, including simulation models with different characteristics of connection, integration, and complexity. In this case, the proposed approach operates independently of the characteristics of the DTs, including models that operate in real or near real time, considering automated or manual physical systems and covers systems with different levels of complexity.
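  • One plausible miniature of the monitoring idea (not the thesis' implementation): each period, a K-NN classifier tries to separate physical from digital samples, and the resulting classification rate is tracked against a p-chart limit. Data, sample sizes, and the injected drift are synthetic and illustrative:

      import numpy as np
      from sklearn.neighbors import KNeighborsClassifier

      rng = np.random.default_rng(11)
      n = 200  # samples per period from each environment (assumed)

      def period_rate(shift):
          real = rng.normal(10 + shift, 1.0, (n, 1))   # physical cycle times
          sim = rng.normal(10.0, 1.0, (n, 1))          # digital-twin cycle times
          X = np.vstack([real, sim])
          y = np.r_[np.ones(n), np.zeros(n)]
          knn = KNeighborsClassifier(n_neighbors=15).fit(X, y)
          return knn.score(X, y)   # ~0.5 when twin matches reality, higher if not

      rates = [period_rate(0.0) for _ in range(20)] + [period_rate(1.5)]  # drift at end
      p_bar = np.mean(rates[:20])
      ucl = p_bar + 3 * np.sqrt(p_bar * (1 - p_bar) / (2 * n))   # p-chart upper limit
      flags = [i for i, p in enumerate(rates) if p > ucl]
      print(f"center = {p_bar:.3f}, UCL = {ucl:.3f}, periods above UCL: {flags}")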

4
  • JOÃO PAULO BARBIERI
  • ELEMENTAL ANALYSIS OF THE COMPLEXITY OF DISCRETE AND HYBRID SIMULATION MODELS

  • Advisor : ALEXANDRE FERREIRA DE PINHO
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • FABIANO LEAL
  • FERNANDO DESCHAMPS
  • JOAO JOSE DE ASSIS RANGEL
  • JOSE ARNALDO BARRA MONTEVECHI
  • RENATO DA SILVA LIMA
  • Date: Sep 19, 2023


  • Abstract:
  • Under the influence of technological development, real systems are experiencing a significant increase in size and complexity. In this context of technological advances, simulation maintains its scientific relevance, supporting business decision-making. Through simulation, real systems and their idiosyncrasies are analyzed and improved. During a simulation project, the specialist needs to make a series of decisions, which includes defining the model's level of detail. Furthermore, in the computational model development phase, the specialist also needs to decide which simulation approach to use. Amid these decisions, a dilemma emerges: real systems are progressively becoming larger and more complex as a result of technological progress, yet the scientific literature states that a computational model is an abstraction of reality and should be as simple as possible. In the context of this dilemma, this thesis has the general objective of deepening the discussion on introducing a greater level of detail in computational models, considering the Discrete Event Simulation (DES) and Hybrid Simulation (HS) approaches, i.e., the DES approach combined with the Agent-Based Simulation (ABS) approach. The need to deepen this discussion generated the iDAV method, a method used to measure computational models. With the application of the iDAV method, it was found that discrete models are simpler to develop when the level of detail is lower. On the other hand, when the scope and level of detail are increased, hybrid models are more suitable.

5
  • BRUNA STÉFANY COSTA
  • Performance analysis of control charts with supplementary signaling rules

  • Advisor : ANTONIO FERNANDO BRANCO COSTA
  • COMMITTEE MEMBERS :
  • ANTONIO FERNANDO BRANCO COSTA
  • CARLOS HENRIQUE PEREIRA MELLO
  • PEDRO PAULO BALESTRASSI
  • ROBERTO DA COSTA QUININO
  • ROGÉRIO SANTANA PERUCHI
  • Date: Oct 20, 2023


  • Abstract:
  • The most important objectives of industrial and non-industrial companies are related to cost, time, and quality. On the one hand, control charts represent an important quality tool, showing whether the process is in control or out of control. On the other hand, Markov chains can be used with control charts to model the process under analysis. Thus, this research aims to analyze the performance of control charts with supplementary signaling rules by comparing their average run lengths (ARL). To do so, the process from the literature under investigation was analyzed and modeled through a Markov chain. Following that, the average run length was calculated for distinct scenarios. The control charts under analysis are a capability index chart with a supplementary signaling rule, an X-bar chart with a supplementary signaling rule, and an X-bar chart with the basic signaling rule. The results obtained allowed two distinct analyses. The first demonstrated that, for mean shifts, the calculation complexity introduced by the capability index does not always generate the best results. The second demonstrated that, for mean shifts, the calculation complexity introduced by the supplementary signaling rule does, in general, generate the best results. In some cases the capability index chart with the supplementary signaling rule performed best among the three charts, and in other cases the X-bar chart with the supplementary signaling rule performed best. Finally, the results confirmed that the X-bar control chart does not display good performance for small mean shifts.
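  • The Markov-chain machinery behind the ARL is compact: with Q the transition matrix among transient (non-signal) states, the expected run length starting from each state is (I - Q)^-1 * 1. A toy two-state example with illustrative probabilities, not the thesis' model:

      import numpy as np

      # states: 0 = no warning, 1 = one point in the warning zone;
      # the absorbing signal state is omitted, so rows of Q sum to less than 1
      Q = np.array([[0.90, 0.08],    # from state 0: stay, or enter warning zone
                    [0.85, 0.05]])   # from state 1: back to normal, or stay warned
      arl = np.linalg.solve(np.eye(2) - Q, np.ones(2))
      print(f"ARL starting in control: {arl[0]:.1f} samples")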

6
  • ALINE CUNHA ALVIM
  • Robust optimization in turning femoral heads for hip arthroplasty: a comparative tool analysis

  • Advisor : JOAO ROBERTO FERREIRA
  • COMMITTEE MEMBERS :
  • ANTONIO FERNANDO BRANCO COSTA
  • JOAO ROBERTO FERREIRA
  • JOSE HENRIQUE DE FREITAS GOMES
  • LINCOLN CARDOSO BRANDÃO
  • MARCOS VALERIO RIBEIRO
  • MATHEUS BRENDON FRANCISCO
  • Data: Oct 30, 2023


  • Show Abstract
  • The large elderly population has naturally led to an increase in the number of orthopedic surgeries, such as knee and hip replacements. In developing countries, the growing demand for this type of surgery is already a reality, highlighting the need to develop machining technologies that meet the quality requirements of the prosthesis. ABNT 316L austenitic stainless steel is used in the manufacture of joint prostheses and, although it is considered a material with low machinability, it is an economical alternative to other biomaterials such as titanium alloys and ceramics. In this context, this work presents the optimization of the turning of femoral heads for a total hip prosthesis, comparing two different tool geometries: round and ISO rhombic. To this end, it used response surface methodology and robust parameter design to model and optimize the main process responses: roughness and sphericity. The experiments were carried out based on a combined array considering three control variables and two noise variables. The control variables studied were cutting speed, feed rate and depth of cut; the noise variables considered were the fixed length of the workpiece and the cutting fluid flow. As quality characteristics, the surface finish and the sphericity of the femoral heads were analyzed using the mean roughness and the total circularity deviation, respectively. Robust optimization was performed by combining two methods: mean square error and normal boundary intersection. The optimization problem was thus formulated as minimizing the roughness in the turning of ABNT 316L stainless steel while limiting the sphericity to 10 μm. The results showed that the rhombic tool is preferable when it is desired to simultaneously obtain components with the best surface quality and the least shape deviation.
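A minimal sketch of the mean-square-error side of the robust optimization described above (the NBI step is omitted). The quadratic response models, target, constraint limit and bounds below are hypothetical placeholders, not the fitted models from the thesis.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical fitted response-surface models in coded units:
# x = (cutting speed, feed rate, depth of cut).
def ra_mean(x):     return 0.8 + 0.3*x[0] - 0.5*x[1] + 0.2*x[1]**2
def ra_var(x):      return 0.05 + 0.02*(x[0] + x[2])**2   # from noise propagation
def sphericity(x):  return 8.0 + 1.5*x[1] + 0.8*x[2]**2

def mse(x):  # robust criterion: squared bias to target + variance
    return (ra_mean(x) - 0.0)**2 + ra_var(x)

cons = [{"type": "ineq", "fun": lambda x: 10.0 - sphericity(x)}]  # <= 10 um
res = minimize(mse, x0=np.zeros(3), method="SLSQP",
               bounds=[(-1.68, 1.68)] * 3, constraints=cons)
print(res.x, res.fun)
```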

7
  • MILENA SILVA DE OLIVEIRA
  • Discrete event simulation in industrial processes: an approach involving facilitated modeling and hybrid meetings

  • Advisor : FABIANO LEAL
  • COMMITTEE MEMBERS :
  • MARCO AURÉLIO DE MESQUITA
  • ALEXANDRE FERREIRA DE PINHO
  • FABIANO LEAL
  • JOSE ARNALDO BARRA MONTEVECHI
  • MARCELO MACHADO FERNANDES
  • RAFAEL DE CARVALHO MIRANDA
  • Data: Oct 31, 2023


  • Show Abstract
  • Discrete Event Simulation (DES) stands as one of the primary and most significant simulation
    techniques to assist decision-making in various areas of Industrial Engineering. Industrial
    enterprises of various sizes can significantly benefit from DES as it can assist in comprehending
    and analyzing systems, decision-making, improving operations and designing changes in the
    real system with less costly errors. However, despite a broad and growing literature on DES
    applications, it has been observed that industrial companies may encounter certain constraints
    in utilizing this technique in its traditional mode. These constraints include financial limitations
    that hinder the adoption of mechanisms for collecting extensive data, as well as the hiring of
    qualified personnel to process and explore their data. Additionally, data deficiencies, where
    limited or even unavailable data are collected, can lead to an inability to engage in simulation
    model development and scenario creation. There is also hesitancy in using DES due to the
    substantial time required for design and complexity in its use. Therefore, the objective of this
    thesis is to create a framework using facilitated modeling in conjunction with DES. In other
    words, proposing a simpler project management method. Facilitated DES offers advantages
    that address these aforementioned issues, as it allows for working with data estimated by
    experts in the process. It also advocates for the use of a simple computational model with few
    details, yet useful in generating understanding and fostering discussions about the problem
    situation, aiding in the pursuit of improvements. The FaMoSim (Facilitated Modeling
    Simulation) framework was developed following the steps of the Action Research method, and
    its implementation took the form of remote applications through hybrid meetings. With this
    meeting format, it is understood that the use of facilitated DES can be expanded beyond in-person meetings. Thus, FaMoSim brings unique features to the conduct of facilitated DES
    studies in industrial enterprises. By applying FaMoSim to four different case studies, its
    effectiveness in providing stakeholders with a better understanding of the studied processes
    using a simplified computational model with fewer data and fewer details was evident. It also
    assisted stakeholders in decision-making and identifying improvements.

8
  • JESSICA TITO VIEIRA
  • Multi-Objective Evolutionary Optimization of Internal Turning of PEEK Tubes through Extreme Gradient Boosting Models

  • Advisor : JOAO ROBERTO FERREIRA
  • COMMITTEE MEMBERS :
  • ALESSANDRO ROGER RODRIGUES
  • JOAO ROBERTO FERREIRA
  • JOSE HENRIQUE DE FREITAS GOMES
  • MATHEUS BRENDON FRANCISCO
  • MESSIAS BORGES SILVA
  • ROBSON BRUNO DUTRA PEREIRA
  • Data: Dec 13, 2023


  • Show Abstract
  • Turning is one of the most widely used manufacturing processes in the industry. Its
    extensive application means that turning processes are increasingly focused on producing
    high-quality parts, aiming to combine efficiency, precision, and productivity. The
    challenges of achieving high-precision surface finishes are even greater when internal
    turning is applied to modern materials such as polyetheretherketone (PEEK). To achieve
    the best process conditions, predictive models must be estimated, and optimization must be
    conducted. This work presents a statistical learning approach for modeling and optimizing
    the internal turning process in PEEK tubes. Average roughness and roundness of the hole
    were measured to quantify the hole quality. The cutting force, considered an important
    indicator of machinability, was also measured. Cutting speed, feed rate, and fixture position
    were considered as input parameters. For modeling, a learning procedure was proposed,
    considering polynomial response surface regression, generalized additive methods, tree-based
    methods, support vector regression and extreme gradient boosting. Cross-validation
    was used for learning and model selection, including k-fold and bootstrap approaches.
    The results indicated that the extreme gradient boosting model was the best for all
    predictors. For Ra the final prediction metrics results were RMSE = 0.1395, MAE =
    0.1126, and R2 = 1.0000, for Fc, RMSE = 1.8609, MAE = 0.9311, and R2 = 0.9280, and for
    Ront, RMSE = 21.3084, MAE = 17.8053, and R2 = 0.6562. Multi-objective evolutionary
    optimization was performed, considering the extreme gradient boosting models for average
    roughness, roundness, and cutting force, in addition to the deterministic model of material
    removal rate. The NSGA-II method was selected considering the hypervolume for the
    three-objective optimizations. The pseudo-weight approach is used to select high trade-off
    solutions, facilitating selection in practical production scenarios. For optimization of Ra
    vs Ront vs MRR, the balance between the three responses was achieved with a higher
    vc, f = 0.12 mm/rev, and fp = 15.14 mm. For optimization of Fc vs Ront vs MRR, the
    balance between the three responses was achieved with vc = 378.78 m/min, f = 0.10
    mm/rev, and fp = 13.00 mm. The proposed learning and optimization approach enabled
    the achievement of the best results for the internal turning process in PEEK and can be
    applied to other intelligent manufacturing applications.
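A minimal sketch of one building block described above: cross-validated training of an extreme gradient boosting regressor, assuming the xgboost and scikit-learn packages. The data are synthetic stand-ins for the experimental runs, not the thesis measurements.

```python
import numpy as np
import xgboost as xgb
from sklearn.model_selection import KFold, cross_val_score

# Synthetic stand-in data: X = [cutting speed, feed rate, fixture position],
# y = average roughness Ra (arbitrary illustrative relationship).
rng = np.random.default_rng(0)
X = rng.uniform([200, 0.05, 10], [400, 0.15, 20], size=(30, 3))
y = 0.5 + 0.002 * X[:, 0] * X[:, 1] + 0.01 * X[:, 2] + rng.normal(0, 0.02, 30)

model = xgb.XGBRegressor(n_estimators=300, learning_rate=0.05, max_depth=3)
cv = KFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(model, X, y, cv=cv,
                         scoring="neg_root_mean_squared_error")
print("CV RMSE:", -scores.mean())
```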

9
  • PEDRO ALBERTO CHAIB DE SOUSA BERNARDES
  • Approach for the optimal configuration of investments in biogas-solar PV microgeneration projects based on multi-objective optimization

  • Advisor : EDSON DE OLIVEIRA PAMPLONA
  • COMMITTEE MEMBERS :
  • ANDERSON RODRIGO DE QUEIROZ
  • ANDRE LUIZ MEDEIROS
  • BENEDITO DONIZETI BONATTO
  • EDSON DE OLIVEIRA PAMPLONA
  • GIANCARLO AQUILA
  • MARCELO NUNES FONSECA
  • Data: Dec 14, 2023


  • Show Abstract
  • Brazil is one of the countries that seek to encourage distributed generation (DG) through a net-metering scheme, combined with the exemption from the trading tax (ICMS), offered by different Brazilian states at different levels, and financing lines from the national development bank. However, incentivized renewable energy sources (RES) have advantages in certain attributes that many other RES do not have. In this sense, combining two or more sources in a hybrid system can minimize the disadvantages of renewable sources, including investment, operation and maintenance costs, intermittency, and the land area occupied by the generation system. Therefore, the objective of this work is to contribute to the configuration of hybrid biogas-solar photovoltaic (PV) systems for pig farms, using multi-objective optimization. For this, a design of experiments technique was adopted to define the objective functions, the Pareto frontier was constructed using the Normal Boundary Intersection (NBI) method, and the Pareto-optimal solution was located by the ratio between Entropy and the Mahalanobis distance. The data obtained through the experimental design were also used to construct confidence ellipses and in the Multivariate Analysis of Variance (MANOVA) technique, to compare the results obtained in three cities in three different Brazilian states. The input variables were operational data from biogas and solar photovoltaic (PV) generation, and the response variables were the mean and standard deviation of the Net Present Value (NPV). The main results obtained were the optimal configuration of the biogas-PV hybrid system and the comparison of the investment made in the three selected cities. The contributions generated are intended to assist the decisions of renewable energy market regulators and investors.
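A minimal Monte Carlo sketch of the NPV response underlying variables such as the mean and standard deviation of NPV. All cash-flow figures, rates and horizons below are hypothetical.

```python
import numpy as np

# Simulate uncertain yearly net cash flows from a hybrid biogas-PV system
# and discount them over the project horizon (illustrative numbers only).
rng = np.random.default_rng(42)
n, years, rate, invest = 10_000, 20, 0.10, 450_000.0
cash = rng.normal(loc=65_000, scale=8_000, size=(n, years))
discount = (1 + rate) ** -np.arange(1, years + 1)
npv = cash @ discount - invest
print(f"mean NPV = {npv.mean():,.0f}  std = {npv.std():,.0f}  "
      f"P(NPV < 0) = {(npv < 0).mean():.2%}")
```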

2022
Dissertations
1
  • MARCELA XAVIER TEREZA DE MELLO
  • Empirical investigation of the ISO 9001:2015 standard in Brazil: Motivations, benefits and difficulties

  • Advisor : CARLOS HENRIQUE PEREIRA MELLO
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • CARLOS HENRIQUE PEREIRA MELLO
  • OTAVIO JOSE DE OLIVEIRA
  • Data: Feb 16, 2022


  • Show Abstract
  • The ISO 9001 standard is recognized as an important alternative for organizations to implement their quality management system and generate competitive advantage in their organizational routines. Currently in its 5th edition, published in September 2015, the ISO 9001:2015 standard represents a milestone in the history of quality management, introducing new approaches that have resulted in important changes and opportunities. Although more than three years have passed since the transition process, whose adaptation period ended on September 15, 2018, the ISO 9001:2015 standard continues to be a prominent topic for organizations and, as it deals with an update considered recent, there is still a lack of knowledge about its main benefits and difficulties, the topic having not been intensively addressed in the Brazilian literature. Given this context, the present work empirically investigated the impact of the ISO 9001:2015 standard in Brazil, identifying its main motivations, benefits and difficulties. For that, the survey method was used, with a structured self-administered online questionnaire as the data collection instrument and, for data analysis, quantitative methods such as descriptive analysis, cluster analysis and sentiment analysis. The survey was carried out over three months and obtained 103 responses. The results enabled a series of conclusions on the subject, highlighting, in particular, the potential value of the ISO 9001:2015 standard for organizations, recognizing that the standard is even more aligned with reality and modern management practices, and the relevance of the new requirements, as well as its High Level Structure (HLS). The main motivations reported by organizations for the implementation/maintenance of the ISO 9001 standard were presented, which remain the same over time; and the benefits and difficulties of the ISO 9001:2015 standard, showing that some of the new requirements, especially risk-based thinking, were seen both as benefits and as difficulties. Finally, the groups of motivations, benefits and difficulties for the implementation/maintenance of the ISO 9001:2015 standard were related to the profile of the respondent companies, making it possible to observe associations between some variables.

2
  • CECÍLIA APARECIDA PEREIRA
  • Disaster-related buying behavior and its impacts on supply chain management: the case of the COVID-19 pandemic in Brazil

  • Advisor : RENATO DA SILVA LIMA
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • CLAUDIO BARBIERI DA CUNHA
  • RENATO DA SILVA LIMA
  • Data: Feb 16, 2022


  • Show Abstract
  • The COVID-19 pandemic, tragic in terms of confirmed cases and deaths, impacted not only the health system, but also several other sectors of society. Given the distancing and social isolation measures, the population's consumption patterns changed. Empty retailer shelves and full consumer grocery carts brought focus to panic buying. Meanwhile, e-commerce showed growth never seen before in several countries, configuring itself as a type of commerce that increased the sanitary security of consumers and the financial security of store owners during the pandemic. In this sense, among other unforeseen changes in consumer behavior, the supply chain management of different businesses was exposed to possible disturbances and/or disruptions. Consequently, people's access to certain products, particularly basic supplies, was at risk of being limited during the crisis, potentially affecting the population's well-being. Given the above, this research aims to analyze the manifestations and influencing factors of disaster-related buying behaviors in Brazil during the COVID-19 pandemic. Based on the survey research method, a questionnaire was designed and applied throughout the country, being available from April 15th, 2020 to July 14th, 2020. In all, 601 responses were obtained, which were treated using the Iterative Proportional Adjustment method to guarantee a sample that represented the Brazilian population as closely as possible. A preliminary analysis of the results shows that a considerable portion of respondents started to buy in greater quantities (40.8%) and online (36.8%) after the beginning of the COVID-19 pandemic; these were the main changes identified in the sample's consumption pattern during this period. As a result of these changes, increases in the stock-days of basic supplies were identified, which, consequently, increased the respondents' perception of scarcity in relation to some basic supplies, such as hand sanitizer and face masks. Finally, it was possible to observe that no single country's experience represents Brazil in relation to the manifestations of disaster-related buying behaviors, which shows the importance of studies such as this one.

3
  • ÍTALO DE ABREU GONÇALVES
  • The Use of the Cpk, Cpm and Cpmk Indices to Monitor Processes

  • Advisor : ANTONIO FERNANDO BRANCO COSTA
  • COMMITTEE MEMBERS :
  • ANTONIO FERNANDO BRANCO COSTA
  • PEDRO PAULO BALESTRASSI
  • ROBERTO CAMPOS LEONI
  • Data: Feb 24, 2022


  • Show Abstract
  • The Cpk, Cpm and Cpmk indices were originally created to measure the ability of processes to produce products meeting specifications but, more recently, they have also been used to control processes. In these new applications of the Cpk, Cpm and Cpmk indices, the specification limits are no longer limits that decide whether an item is conforming or nonconforming; they are tuning parameters that increase or decrease the speed with which the control charts signal process changes in the mean and/or the variance. Therefore, this research proposes the use of control charts based on the three process capability indices as an alternative to the traditional X̄ and R charts. Through Monte Carlo simulation, it was shown that the Cpk, Cpm and Cpmk control charts are great alternatives to the joint use of the X̄ and R charts, since they signal mean shifts, variance increases, and combined mean shifts with variance increases faster.
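A minimal sketch of a capability-index-based chart evaluated by Monte Carlo simulation, in the spirit of the abstract above. The specification limits, subgroup size and chart limit are illustrative, not the thesis design.

```python
import numpy as np

# For each subgroup, estimate Cpk from the sample and signal when it drops
# below a lower control limit; the ARL is estimated by repeated simulation.
rng = np.random.default_rng(1)
LSL, USL, n, LCL = -3.0, 3.0, 5, 0.8   # spec limits, subgroup size, chart limit

def run_length(shift=0.0):
    t = 0
    while True:
        t += 1
        x = rng.normal(shift, 1.0, n)
        s = x.std(ddof=1)
        cpk = min(USL - x.mean(), x.mean() - LSL) / (3 * s)
        if cpk < LCL:
            return t

arl = np.mean([run_length(shift=1.0) for _ in range(2000)])
print("estimated ARL under a 1-sigma mean shift:", arl)
```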

4
  • MÍRIAN BENTO DE ALMEIDA
  • Criteria used in the decision-making process for venture capital investments in startups

  • Advisor : CARLOS EDUARDO SANCHES DA SILVA
  • COMMITTEE MEMBERS :
  • ANDRÉA MARIA ACCIOLY FONSECA MINARDI
  • ANA CAROLINA OLIVEIRA SANTOS
  • CARLOS EDUARDO SANCHES DA SILVA
  • CARLOS HENRIQUE PEREIRA MELLO
  • JEAN MARCEL SOUSA LIRA
  • Data: Apr 12, 2022


  • Show Abstract
  • Venture Capital has a prominent role in the innovation scenario, supporting
    and financing the evolution of startups through the provision of resources. Brazil
    stands out in Latin America in innovation and in the large volume of investments
    received in recent years, with a perspective of growth, yet it remains below
    more developed countries. It is therefore necessary to improve investment
    decision-making strategies so that the criteria are efficient. In this context,
    this work analyzes the criteria that determine the decision of specialists in
    the Venture Capital context to invest their resources. To this end, a survey
    was carried out with 14 experts, with data collected through an online
    questionnaire. Data analysis comprised the characterization of the respondents,
    the detection of outliers, the analysis of the relevance attributed to the
    investment decision criteria, an agreement analysis and, finally, a comparison
    between the results of the Systematic Literature Review (SLR) and the survey.
    In total, 121 criteria, 51 subcategories and 7 categories were identified.
    From the comparison of practical and theoretical results, it was found that
    the category entrepreneur characteristics is the most relevant, corroborating
    other works identified in the SLR, where human capital is the highlight,
    reinforcing the importance of developing these characteristics through
    educational systems. As for the other categories, despite appearing in a
    different order, the dispersion between the values is low. This dissertation
    contributes to boosting the development of venture capital ecosystems in
    Brazil, helping more startups obtain capital contributions through the
    criteria presented here. It also contributes to the global literature on
    this topic.

5
  • ANDERSON LINO DE PAULA MARTINS
  • Integrating Machine Learning into an agent-based simulation model

  • Advisor : ALEXANDRE FERREIRA DE PINHO
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • FERNANDO AUGUSTO SILVA MARINS
  • JOSE ANTONIO DE QUEIROZ
  • PAULO HENRIQUE DA SILVA CAMPOS
  • Data: May 10, 2022


  • Show Abstract
  • Industrial competitiveness has been increasing in recent years, and one of the alternatives for facing this competition is the use of Industry 4.0 technologies, among them Simulation and Big Data. Big Data involves the generation of large volumes of data that need interpretation, which can be done using reinforcement learning algorithms, possibly in conjunction with simulation. Computer simulation incorporates the real world into a virtual system, capturing its fundamental characteristics; one of the simulation methods is Agent-Based Simulation, in which the agent is the focus of the system. In this context, this work explains how machine learning can be integrated into an agent-based simulation system, serving as an aid to the modeler and showing two ways to carry out this implementation in the AnyLogic® software. The first way uses an external tool, Pathmind: a system was created that randomly generates boxes of three different colors (red, green and blue), represented by vectors, and the system must identify the color of each box, with the focus on describing the steps needed to carry out the implementation with this tool. The tool's efficiency was assessed by the number of correct matches the machine is capable of making, and the results showed high efficiency: before the implementation of machine learning, the system acted randomly, matching the colors at the probability predicted for this problem, 12.5%, and after the implementation the system reached a 100% hit rate. The second way works directly in the AnyLogic® software, using the Java programming language and the Q-learning reinforcement learning algorithm developed in this research. It uses the same basis as the previous computational model, but with boxes of five different colors (red, green, blue, white and black) represented as strings, in which the system seeks to identify the right box color through Q-learning, using the resulting Q matrix. As with the external tool, the emphasis is on demonstrating all the steps needed to complete this implementation. The system again proved efficient, identifying the color correctly in all attempts. This work thus showed two efficient ways to implement reinforcement learning in the AnyLogic® software, using an external tool and in a direct way: the first requires a lower level of knowledge of machine learning and programming, proving simpler, but is a black box, while the second is the opposite, requiring a high level of knowledge of machine learning and programming, but is open source.
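Since the thesis' second implementation is written in Java inside AnyLogic®, the sketch below restates the same tabular Q-learning idea in Python for illustration only; the color-matching task, rewards and hyper-parameters are simplified stand-ins.

```python
import numpy as np

# Tabular Q-learning for a one-step task: observe a box color (state) and
# guess the color (action); reward +1 for a correct match, -1 otherwise.
n_colors, alpha, gamma, eps = 5, 0.5, 0.0, 0.1   # gamma=0: one-step episodes
Q = np.zeros((n_colors, n_colors))
rng = np.random.default_rng(0)

for episode in range(5000):
    state = rng.integers(n_colors)                   # a box appears
    if rng.random() < eps:
        action = rng.integers(n_colors)              # explore
    else:
        action = int(Q[state].argmax())              # exploit current policy
    reward = 1.0 if action == state else -1.0
    Q[state, action] += alpha * (reward + gamma * Q[state].max()
                                 - Q[state, action])

# After training, the greedy policy should map each color to itself.
print((Q.argmax(axis=1) == np.arange(n_colors)).all())
```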

6
  • JONAS MARCELO DE CARVALHO SIMÃO
  • Evaluation and Analysis of Public Management System in the Warehouse of a Higher Education Institution: An Approach Based on Excellence Models

  • Advisor : JOSE HENRIQUE DE FREITAS GOMES
  • COMMITTEE MEMBERS :
  • CARLOS HENRIQUE PEREIRA MELLO
  • JOSE HENRIQUE DE FREITAS GOMES
  • JOÃO EDÉRSON CORRÊA
  • Data: Jul 1, 2022


  • Show Abstract
  • Regardless of nature, area of activity or target audience, organizations are increasingly requested to present improvements in their performance, results and product/service quality. To help public entities adopt management excellence practices, the Model of Excellence in Public Management (MEGP) was created, aiming at results and at the improvement of public service quality. Considering this, through a case study, the purpose of this thesis is to evaluate and analyze the warehouse management performance of a Federal Institution of Higher Education (IFES) using an adapted evaluation instrument based on the MEGP. Before data collection, the MEGP's own assessment instrument underwent an adaptation process, so that it was aligned with the characteristics of public warehouses and, at the same time, objective and easier to use and interpret. The use of the adapted instrument showed that, in addition to verifying the alignment of the unit's management practices with the MEGP, it is possible to gain a comprehensive understanding of the evaluated sector, uncovering aspects not addressed in internal audits that directly influence the warehouse's performance. With regard to the warehouse's evaluation, according to the research carried out with the adapted instrument, out of a total of 100 possible points, the warehouse reached 76.5 points. This demonstrates that the warehouse has a good level of management practices, presenting work standards that are aligned with the fundamentals of excellence. In addition, it is within the scope of the unit to implement the improvement opportunities identified in the evaluation process.

Thesis
1
  • JOSE CLAUDIO ISAIAS
  • Methodology for Project Portfolio Selection using Multicriteria from the CAPM, Semivariation, and the Gini Risk Coefficient

  • Advisor : PEDRO PAULO BALESTRASSI
  • COMMITTEE MEMBERS :
  • ANTONIO FERNANDO BRANCO COSTA
  • CARLOS HENRIQUE PEREIRA MELLO
  • CLAUDIMAR PEREIRA DA VEIGA
  • GUILHERME AUGUSTO BARUCKE MARCONDES
  • PEDRO PAULO BALESTRASSI
  • WESLEY VIEIRA DA SILVA
  • Data: Mar 3, 2022


  • Show Abstract
  • Criteria from the Gini-CAPM and Gini-semivariation metrics are good options for composing methods for project portfolio selection. They are all the more adequate when the trade-off between return and risk and the covariations involved in the selection are considered. These methods can help significantly because they have more robust risk coefficients for assessing non-normal probability distributions, which are very common in project portfolio selection. However, searches for methods that meet these selection needs using such criteria are unsuccessful, including for renewable solar energy generation projects using photovoltaic panels, which have stood out among the options. Thus, this work seeks to help close this gap by presenting selection methods that use criteria from the Gini-CAPM and Gini-semivariations, with significant novelties. Stochastic evaluations on historical and simulated data indicate that the portfolios selected by the methods are attractive options for implementation. These portfolios have reasonable probabilistic expectations for the trade-off between risk and return and satisfactory protection against mistakes caused by not considering covariations in the return on investment. These are significant advances on the current knowledge frontier and will likely allow increased use of the concept. The methods also present theoretical contributions in adaptations of the benchmark models, which help to close the underlying gap in the literature, in addition to a financial class structure that considers most of the scenario variables.
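A minimal sketch of one common formulation of the Gini risk measure, the Gini mean difference, applied to simulated portfolio returns. The thesis' exact coefficient and data may differ; the returns and weights below are placeholders.

```python
import numpy as np

rng = np.random.default_rng(7)
returns = rng.normal(0.01, 0.05, size=(1000, 3))   # 3 candidate projects
w = np.array([0.5, 0.3, 0.2])                      # portfolio weights
port = returns @ w

def gini_mean_difference(x):
    # Mean absolute difference over all pairs; for sorted data this equals
    # 2 / (n (n - 1)) * sum_i (2 i - n - 1) * x_(i), an O(n log n) identity.
    x = np.sort(x)
    n = len(x)
    i = np.arange(1, n + 1)
    return 2.0 / (n * (n - 1)) * np.sum((2 * i - n - 1) * x)

print("expected return:", port.mean(),
      " Gini risk:", gini_mean_difference(port))
```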

2
  • LEANDRO FRAMIL AMORIM
  • Confidence Ellipses for Multivariate Pareto-Optimal Solutions

  • Advisor : ANDERSON PAULO DE PAIVA
  • COMMITTEE MEMBERS :
  • ANDERSON PAULO DE PAIVA
  • ANTONIO FERNANDO BRANCO COSTA
  • FABRICIO JOSÉ PONTES
  • MARCELA APARECIDA GUERREIRO MACHADO DE FREITAS
  • PEDRO PAULO BALESTRASSI
  • Data: Mar 21, 2022


  • Show Abstract
  • This study presents a non-linear bi-objective optimization method for correlated responses in Robust Parameter Design (RPD) using the Normal Boundary Intersection (NBI) method. Even in a capable region, Pareto frontiers for multiple and conflicting objectives can be formed by indistinguishable points, which may require large confirmatory sample sizes to verify non-dominance. Taking advantage of uniformly spread Pareto frontiers, some propositions are established to treat the trade-off between mean and variance. In this approach, Response Surface Methodology (RSM) is applied to model the quality characteristics of the process, using propagation of error to extract the implicit variance. Moreover, in order to avoid correlated variables in the subsequent optimization, Factor Analysis with Equimax rotation is applied, replacing the original data by factor score regressions. To distinguish Pareto solutions, a (1-α) confidence ellipse region is formed for the centrality and dispersion of every solution, whose variability is quantified by the variance-covariance matrix. These ellipses are especially important to understand the stochastic nature of the Pareto-optimal solutions obtained when NBI is coupled with RSM. As a key result, this study conceives a Fuzzy decision-maker, a smart Pareto filter based on Fuzzy logic that combines confidence ellipse volume (variability) and Mahalanobis distance (mean shift) as a quality indicator. This approach makes it possible to synchronously optimize accuracy and precision. The adequacy of the proposal is illustrated with two real cases of the hardened steel turning process, optimizing cost and tool life. The quality of the practical results suggests the method may be extended to similar manufacturing process problems.
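A minimal sketch of the two quantities combined by the Fuzzy decision-maker described above, a (1-α) confidence ellipse and a Mahalanobis distance, computed from hypothetical replicated observations of one Pareto solution.

```python
import numpy as np
from scipy.stats import chi2

# Replicated (cost, tool life) observations for one solution (illustrative).
rng = np.random.default_rng(3)
obs = rng.multivariate_normal([12.0, 40.0], [[0.4, 0.1], [0.1, 0.9]], size=30)
mean, cov = obs.mean(axis=0), np.cov(obs, rowvar=False)

# Ellipse boundary: (x - m)' S^-1 (x - m) <= chi2 quantile with 2 dof.
alpha = 0.05
radius2 = chi2.ppf(1 - alpha, df=2)
eigval, eigvec = np.linalg.eigh(cov)
print("ellipse semi-axes:", np.sqrt(eigval * radius2))

# Mahalanobis distance of the solution's centroid to a target point.
target = np.array([11.5, 42.0])
d2 = (mean - target) @ np.linalg.solve(cov, mean - target)
print("Mahalanobis distance to target:", np.sqrt(d2))
```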

3
  • FLÁVIO FRAGA VILELA
  • PROPOSAL OF A MATHEMATICAL EQUATION FOR THE QUANTITATIVE INTEGRATION OF THE HUMAN FACTOR IN DISCRETE EVENT SIMULATION PROJECTS

  • Advisor : JOSE ARNALDO BARRA MONTEVECHI
  • COMMITTEE MEMBERS :
  • ANEIRSON FRANCISCO DA SILVA
  • FABIANO LEAL
  • JOSE ARNALDO BARRA MONTEVECHI
  • LUIZ RICARDO PINTO
  • RAFAEL DE CARVALHO MIRANDA
  • Data: Apr 20, 2022


  • Show Abstract
  • In discrete event simulation (DES) projects, generally, some computational validation criteria are not defined assertively and human factors are not integrated and considered in the input data modeling phase. An erroneous assumption made during the input data modeling step is that workers operate at an unchanging rate. This becomes a problem for the modeling of production systems, especially if the process to be modeled involves a large amount of manual work. In this context, the objective of this thesis is to propose a mathematical equation to integrate three human factors into a DES project. The innovation of this research lies in the proposed mathematical equation, which models and integrates the human factors of circadian rhythm, performance and learning and applies them, through the equation, in three different DES projects. The adherence of the proposed equation was verified with the help of validation tests used to validate the developed computational model. The equation was conceived to represent the human factor through the processing times of each activity considered in each object of study. In the methodological stage, the system conceived in this thesis was presented and six scenarios were planned: CIA, CIB and CIC, which do not consider the human factors of circadian rhythm, performance and learning, and CIIA, CIIB and CIIC, which consider these three human factors through the proposed equation. Finally, a quantitative validation was performed using the two-sample t-test and a qualitative validation using the Turing Test on the results from the computational model. As a conclusion, it was found that the validation of the computational model conducted through the two techniques mentioned above did not occur for the CIA, CIB and CIC scenarios, whereas it occurred, partially or in full, for CIIA, CIIB and CIIC. Therefore, computational validation in a DES project is influenced when these three human factors are considered, and the proposed mathematical equation is a novelty for researchers in the field of discrete event simulation.
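A minimal sketch of the quantitative validation step mentioned above, a two-sample t-test between real and simulated processing times. The data are illustrative, not the thesis scenarios.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(5)
real  = rng.normal(120.0, 8.0, 40)   # observed processing times (min)
model = rng.normal(121.5, 8.5, 40)   # simulated processing times (min)

stat, p = ttest_ind(real, model, equal_var=False)  # Welch's t-test
print(f"t = {stat:.2f}, p = {p:.3f}")  # p > 0.05: no evidence of a difference
```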

4
  • SIMONE CARNEIRO STREITENBERGER
  • Predictive model of total oil and grease in produced water quantified by the gravimetric method

  • Advisor : ANDERSON PAULO DE PAIVA
  • COMMITTEE MEMBERS :
  • ALOISIO EUCLIDES ORLANDO JÚNIOR
  • ANDERSON PAULO DE PAIVA
  • ANTONIO FERNANDO BRANCO COSTA
  • CLAUDIMAR PEREIRA DA VEIGA
  • PEDRO PAULO BALESTRASSI
  • WESLEY VIEIRA DA SILVA
  • Data: Jun 10, 2022


  • Show Abstract
  • The produced water generated by primary oil processing on offshore oil platforms, which contains a certain total oil and grease (TOG) content, is usually reinjected or disposed of into the open ocean. This disposal is monitored by environmental regulatory agencies that determine maximum TOG values. In Brazil, the gravimetric method is the one homologated for measuring TOG, and it must be carried out in onshore laboratories. Due to the logistics of transferring samples from the platform to the laboratory, the measurement result is available approximately 20 days after the day of collection. This work proposes the development of a predictive model of gravimetric TOG (TOG-G) from process variables, together with a variable extracted from the response variable, which can be used offshore and in real time to more quickly guide possible preventive or corrective actions and avoid non-compliance. For this, the observations were grouped into clusters associated with TOG-G ranges, through which the base balancing was performed. Training and test sets were generated and a classifier was built for the cluster, based on the process variables most significant for the prediction of TOG-G, identified through linear regression. Subsequently, TOG-G was modeled from the significant process variables and the cluster. The results obtained for the test set were evaluated by means of Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE), coefficient of determination (R²) and Pearson's correlation coefficient (ρ), and proved superior both to the forecasts generated by a predictive model built from the same predictors but disregarding the cluster, and to the real values of the spectrophotometric TOG (TOG-S) measurements, which constitute the real-time method currently used as a reference on the platform. To validate the gains in accuracy, the proposed method was also applied to a classical linear regression dataset for predicting fish weight. Thus, the inclusion of the cluster information in the TOG-G model proved to be an innovative and efficient approach to increase prediction accuracy from information available on the platform, which may considerably benefit the oil industry in terms of process control.
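A minimal sketch of the cluster-then-regress idea described above, on synthetic stand-in data: cluster by TOG range, train a classifier to predict the cluster from process variables, and add the predicted cluster as a regression feature.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(11)
X = rng.normal(size=(300, 4))                              # process variables
tog = 30 + 5*X[:, 0] - 3*X[:, 2] + rng.normal(0, 2, 300)   # gravimetric TOG

# (1) cluster observations by TOG range; (2) learn to predict the cluster
# from process variables available offshore; (3) regress TOG on variables
# plus the predicted cluster label.
cluster = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    tog.reshape(-1, 1))
clf = RandomForestClassifier(random_state=0).fit(X, cluster)

X_aug = np.column_stack([X, clf.predict(X)])
model = LinearRegression().fit(X_aug, tog)
print("R^2 with cluster feature:", model.score(X_aug, tog))
```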

5
  • ESTEVÃO LUIZ ROMÃO
  • Impact of meteo-oceanographic variables and total oil and grease content on the formation of oil sheen during primary oil processing

  • Advisor : PEDRO PAULO BALESTRASSI
  • COMMITTEE MEMBERS :
  • ALOISIO EUCLIDES ORLANDO JÚNIOR
  • ANDERSON PAULO DE PAIVA
  • ANTONIO FERNANDO BRANCO COSTA
  • CLAUDIMAR PEREIRA DA VEIGA
  • PEDRO PAULO BALESTRASSI
  • WESLEY VIEIRA DA SILVA
  • Data: Jun 10, 2022


  • Show Abstract
  • The appearance of oil sheens in the ocean is a challenge for companies that perform primary oil processing on offshore platforms. After the separation of the gas, oil and water present in crude oil, part of the water is returned to the ocean with a certain content of oil and grease. The total oil and grease (TOG) value, together with the values of meteo-oceanographic variables such as wind direction (WD), wind speed (WS), current direction (CD), current speed (CS), wind wave direction (WWD) and peak period (PP), creates scenarios that favor or hinder the appearance of oil sheens. In Brazil, these oil sheens can lead to sanctions for companies if they exceed 500 meters in length. In view of this, the present work studies how such variables influence the probability of occurrence and detection of oil sheens via satellite, as well as their extent, applying machine learning techniques (random forest, k-nearest neighbors, artificial neural networks, logistic regression and support vector machines), factor analysis, design of experiments (DOE) and the desirability optimization algorithm. The main conclusions of the study were: (i) random forest outperformed the other analyzed classifiers, achieving a model whose area under the Receiver Operating Characteristic (ROC) curve was 0.93; (ii) the methodology used, combining the classifiers with the aforementioned techniques, proved satisfactory; (iii) the higher the values of WS, WD and CS, the lower the probability of occurrence and detection of oil sheens, whereas the higher the values of TOG, PP, WWD and CD, the higher this probability; (iv) variables such as CS and TOG contribute positively to increasing the extent of the oil sheens, while high values of WD, WS and PP reduce it.
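A minimal sketch of the best-performing step reported above, a random forest classifier scored by the area under the ROC curve. The seven predictors and the response are synthetic stand-ins, not the thesis data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for TOG, WD, WS, CD, CS, WWD, PP (standardized).
rng = np.random.default_rng(13)
X = rng.normal(size=(500, 7))
logit = 1.2 * X[:, 0] - 0.8 * X[:, 2] + 0.6 * X[:, 6]
y = rng.random(500) < 1 / (1 + np.exp(-logit))     # sheen detected or not

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(Xtr, ytr)
print("ROC AUC:", roc_auc_score(yte, clf.predict_proba(Xte)[:, 1]))
```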

6
  • GUSTAVO TEODORO GABRIEL
  • Computer model validation: a study integrating Generative Adversarial Networks and Discrete Event Simulation

  • Advisor : JOSE ARNALDO BARRA MONTEVECHI
  • COMMITTEE MEMBERS :
  • JOSE ARNALDO BARRA MONTEVECHI
  • FABIANO LEAL
  • JOSE HENRIQUE DE FREITAS GOMES
  • RAFAEL DE CARVALHO MIRANDA
  • ANDERSON RODRIGO DE QUEIROZ
  • FERNANDO AUGUSTO SILVA MARINS
  • Data: Jul 18, 2022


  • Show Abstract
  • Computer model validation in Discrete Event Simulation (DES) is essential for project success, since this stage guarantees that the simulation model corresponds to the real system. Nevertheless, it is not possible to assure that the model represents 100% of the real system. The literature suggests using more than one validation technique, with statistical tests being preferable. However, these have limitations: they usually test the mean or standard deviation individually and do not consider that the data may lie within a pre-established tolerance limit. Generative Adversarial Networks (GANs) can be used to train, evaluate and discriminate data and thus validate DES models, because they consist of two competing neural networks, one generating data and the other discriminating them. The proposed method is divided into two phases. The first, the “Training Phase”, aims to train on the data; the second, the “Test Phase”, aims to discriminate the data. In the second phase an Equivalence Test is also performed, which statistically analyzes whether the difference between the judgments is within the tolerance range determined by the modeler. To validate the proposed method and verify its power, experiments were carried out with continuous, discrete and conditional distributions and with a DES model. From the tests, power curves were generated considering a real tolerance of 5.0%, 10.0% and 20.0%. The results showed that it is more efficient to use the dataset with the larger sample in the “Test Phase”, while the set with the smaller sample size should be used in the “Training Phase”. In addition, the confidence of the power test increases with larger datasets in the first phase, presenting smaller confidence intervals. Also, the more metrics are evaluated at once, the greater the amount of data required in the GANs' training. The method suggests classifying a validation based on the achieved tolerance: Very Strong, Strong, Satisfying, Marginal, Deficient and Unsatisfying. Finally, the method was applied to three real models, two in manufacturing and one in the health sector. We conclude that the proposed method was efficient and able to show the degree of validation of the models that represent the real system.
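A minimal sketch of an equivalence test of the TOST (two one-sided tests) family, one common way to check that a mean difference lies within a modeler-defined tolerance. The thesis' exact test statistics may differ; the data and tolerance below are illustrative.

```python
import numpy as np
from scipy import stats

def tost(a, b, delta):
    """Equivalence is declared if the returned p-value is below alpha."""
    diff = a.mean() - b.mean()
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    df = len(a) + len(b) - 2                            # simple df approximation
    p_low  = 1 - stats.t.cdf((diff + delta) / se, df)   # H0: diff <= -delta
    p_high = stats.t.cdf((diff - delta) / se, df)       # H0: diff >= +delta
    return max(p_low, p_high)

rng = np.random.default_rng(2)
real, sim = rng.normal(100, 5, 50), rng.normal(101, 5, 50)
print("TOST p-value at tolerance 5.0:", tost(real, sim, delta=5.0))
```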

7
  • AFONSO TEBERGA CAMPOS
  • Generative Adversarial Networks: an alternative for modeling input data in simulation projects

  • Advisor : JOSE ARNALDO BARRA MONTEVECHI
  • COMMITTEE MEMBERS :
  • ANIBAL TAVARES DE AZEVEDO
  • ALEXANDRE FERREIRA DE PINHO
  • ANEIRSON FRANCISCO DA SILVA
  • FABIANO LEAL
  • JOSE ARNALDO BARRA MONTEVECHI
  • Data: Aug 31, 2022


  • Show Abstract
  • In general, stochastic simulation consists of input data and logic, the former being the basic source of uncertainty in a simulation model. For this reason, data modeling is an essential step in the development of stochastic simulation projects. Many advances have been observed in recent years in simulation software and data collection tools; however, the methods for input data modeling have remained largely unchanged for over 30 years. In their daily work, modelers face difficulties related to the choice of input data models, mainly due to the challenge of modeling non-Independent and Identically Distributed (non-IID) data, which requires specific tools not offered by simulation software and their data modeling packages. For this reason, few studies consider elements of complexity such as heterogeneities, dependencies and autocorrelations, underestimating the uncertainty of the stochastic system. Given the new developments in Artificial Intelligence, it is possible to seek synergies to solve this problem. The present study aims to evaluate the results of applying Generative Adversarial Networks (GANs) to input data modeling. Such networks constitute one of the most recent architectures of artificial neural networks, being able to learn complex distributions and, therefore, to generate synthetic samples with the same behavior as real data. This thesis proposes a method for Input Data Modeling based on GANs (MDE-GANs) and implements it in the Python language. Considering a series of theoretical and real study objects, the results are evaluated in terms of the representation quality of the input models, and comparisons are made with traditional modeling methods. As a main conclusion, the application of MDE-GANs yields input data models with high accuracy, surpassing the results of traditional methods in cases of non-IID data. The present thesis thus contributes a new alternative for input data modeling, capable of overcoming some of the challenges faced by modelers.
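A minimal GAN sketch for one-dimensional input data, illustrating the generator/discriminator training loop described above. This is a toy PyTorch architecture, not the MDE-GANs implementation itself; the "real" data stand in for processing times a classical distribution fit might miss.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
# Stand-in "real" data: lognormal processing times.
real_sampler = lambda n: torch.exp(torch.randn(n, 1) * 0.4 + 2.0)

G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = real_sampler(64)
    fake = G(torch.randn(64, 8))
    # Discriminator: push real toward 1, generated toward 0.
    loss_d = (bce(D(real), torch.ones(64, 1))
              + bce(D(fake.detach()), torch.zeros(64, 1)))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # Generator: try to fool the discriminator.
    loss_g = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

# Compare the means of real and synthetic samples as a crude sanity check.
print(real_sampler(10000).mean().item(),
      G(torch.randn(10000, 8)).mean().item())
```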

8
  • RENATA PINTO RIBEIRO MIRANDA
  • PROPOSAL OF A SYSTEMATIC APPROACH OF GOOD PRACTICES FOR THE SPECIFICATION AND IMPLEMENTATION OF ALARMS DURING THE DEVELOPMENT OF MEDICAL DEVICES WITH A FOCUS ON USABILITY

  • Advisor : CARLOS HENRIQUE PEREIRA MELLO
  • COMMITTEE MEMBERS :
  • CARLOS EDUARDO SANCHES DA SILVA
  • CARLOS HENRIQUE PEREIRA MELLO
  • JANAINA MASCARENHAS HORNOS DA COSTA
  • MARCELO GITIRANA GOMES FERREIRA
  • RENATA APARECIDA RIBEIRO CUSTODIO
  • RODRIGO MAXIMIANO ANTUNES DE ALMEIDA
  • Data: Nov 25, 2022


  • Show Abstract
  • Intensive care units are highly complex environments, given the need for care and monitoring of critically ill patients. In order to ensure survival and quality of care, bedside monitoring by medical devices is crucial to provide essential life support measures. Alarms exist so that monitoring is uninterrupted and signals technical or clinical changes to the user/operator. However, what has been noticed in clinical practice and in the literature is a considerable increase in alarms without clinical or technical necessity, which may lead the user/operator to become fatigued and lose perception of them. According to the literature, one of the reasons for this increase in the number of alarms is the difficulty developers have in meeting the regulatory requirements of the alarm standard (IEC 60601-1-8), because this activity requires manufacturers to have much more than technical knowledge of the product, given the complexity of information in the standard, the technical language, and the lack of detail on how to fulfill each requirement. In order to remedy this gap in the literature, this thesis developed a systematic approach to assist in specifying and implementing alarms to be incorporated into the development process of medical devices, focusing on usability, aiming to minimize alarm fatigue and maximize the safety of users/operators and patients. Thirteen systematic procedures were developed, based on the literature, the standard and the cases studied. To create and validate the developed material, exploratory multiple case studies were carried out with three companies that develop medical devices with alarms, first to understand the companies and identify their work processes in developing equipment and alarms, and later to evaluate their perceptions of the constructed procedures and compare them with the organizations' reality. Our findings enabled us to conclude that, after the creation of the procedures and their evaluation by the companies, the materials created were positively evaluated and showed potential for use by the companies, serving as a visual and objective step-by-step guide for the development of medical products with alarms. Lastly, it is worth mentioning that the material can also be used by conformity assessment laboratories or certifiers to help certify products more effectively and quickly, being a valid material to unify the understanding of and compliance with the standard by companies and laboratories.

9
  • NATÁLIA MARIA PUGGINA BIANCHESI
  • A Nonlinear Time-Series Prediction Methodology Based on Neural Networks and Tracking Signals

  • Advisor : ANTONIO FERNANDO BRANCO COSTA
  • COMMITTEE MEMBERS :
  • ANTONIO FERNANDO BRANCO COSTA
  • CARLOS HENRIQUE PEREIRA MELLO
  • CLAUDIMAR PEREIRA DA VEIGA
  • JULIANA HELENA DAROZ GAUDENCIO
  • PEDRO PAULO BALESTRASSI
  • WESLEY VIEIRA DA SILVA
  • Data: Dec 12, 2022


  • Show Abstract
  • Nonlinear time series forecasting is widely used in several areas to make good inferences about the future and to support decisions. Examples of nonlinear time series include medical observations, financial recordings and weather data. The accuracy of forecasts is determined by considering how well a model performs on new data that were not used when fitting the model, and the monitoring of forecast errors is essential to ensure forecasting accuracy. Therefore, this thesis presents a nonlinear time series prediction methodology using Neural Networks and Tracking Signals to detect bias and to assess their responsiveness to non-random changes in the time series. Datasets were generated to simulate different nonlinear time series by changing the error of the series. The datasets were predicted by an Artificial Neural Network (Multilayer Perceptron), and the forecast errors were monitored by Cumulative Sum Tracking Signals. Unlike many studies published in the area, the statistical methodology of Design of Experiments was applied to evaluate the tracking signals based on the Average Run Length. Afterwards, the methodology was applied to data on total oil and grease and compared with traditional methodologies. The results showed that the proposed prediction methodology is an effective way to detect bias in the process when an error is introduced in the nonlinear time series, because the mean and the standard deviation of the error have a significant impact on the Average Run Length. This study contributes to the discussion on time series prediction methodology, since this new technique could be widely used in several areas to improve forecast accuracy.
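A minimal sketch of a cumulative-sum tracking signal over forecast errors, the monitoring device described above. The control limit and error streams are illustrative.

```python
import numpy as np

def tracking_signal(errors, limit=4.0):
    """Flag bias when |cumulative error| / MAD exceeds the control limit."""
    cusum, mad, flags = 0.0, 0.0, []
    for t, e in enumerate(errors, start=1):
        cusum += e
        mad += (abs(e) - mad) / t        # running mean absolute deviation
        flags.append(abs(cusum) / mad > limit if mad > 0 else False)
    return flags

rng = np.random.default_rng(9)
errors = np.concatenate([rng.normal(0.0, 1, 50),    # unbiased forecasts
                         rng.normal(1.5, 1, 30)])   # bias introduced at t=51
print("first signal at t =", int(np.argmax(tracking_signal(errors))) + 1)
```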

10
  • YASMIN SILVA MARTINS XAVIER
  • Analysis of risk management in the medical device sector by means of a Multiple Case Study

  • Advisor : CARLOS EDUARDO SANCHES DA SILVA
  • COMMITTEE MEMBERS :
  • PAULO ALEXANDRE DA COSTA ARAÚJO SAMPAIO
  • CARLOS EDUARDO SANCHES DA SILVA
  • CARLOS HENRIQUE PEREIRA MELLO
  • JEAN MARCEL SOUSA LIRA
  • JOSE HENRIQUE DE FREITAS GOMES
  • JULIANA HELENA DAROZ GAUDENCIO
  • Data: Dec 14, 2022


  • Show Abstract
  • Considering its potential for innovation and the large number of micro and small businesses, the medical device sector has become a relevant object of study. Due to the characteristics of their activities, companies are subject to several standards and regulations, such as ISO 13485, which defines the requirements for Quality Management Systems (QMS) and, being based on ISO 9001, implies the need to manage risks. The adoption of risk management practices is considered a major challenge for small companies, and the need for scientific and methodological support for such organizations is evidenced in the literature. In this context, this research aims to propose a systematic approach to risk management, suitable for the QMS of innovative companies in the medical device sector, and to compare the proposal with the practices performed by companies, in order to verify its suitability for medical device SMEs (small and medium-sized enterprises). The method applied was the Multiple Case Study, through the following steps: initially, a systematic literature review was conducted to identify how companies have performed risk management in the normative and practical contexts, establishing the basis of the proposal and analyzing the aspects discussed in the literature; then 11 cases were selected, six Brazilian companies and five Portuguese companies; the case study protocol was elaborated and validated through a pilot test; finally, data collection and the elaboration of individual and cross-case reports were performed. Some relations were identified among the cases, suggesting the existence of potential patterns. Among the results, we highlight that the normative process in the national scenario is considered bureaucratic by companies, and meeting the requirements becomes subject to the auditors' interpretation, which leads companies to opt for the use of FMEA (Failure Modes and Effects Analysis), something not observed in the international scenario. The study shows that previous experience is always a requirement for risk management, no matter how it is implemented, and it can be a challenge for companies. Regarding the proposed approach, it was verified that it can positively impact the guidance and preparation of the team implementing the process, although it may face resistance to change in companies with already structured risk management. The analyses suggest that the approach may be more beneficial to SMEs that are at the beginning of the risk management implementation process.

11
  • CARLOS HENRIQUE DE OLIVEIRA
  • Robust multivariate optimization in end milling of UNS S32205 duplex stainless steel

  • Advisor : JOAO ROBERTO FERREIRA
  • COMMITTEE MEMBERS :
  • ANDERSON PAULO DE PAIVA
  • JOAO ROBERTO FERREIRA
  • MANOEL CLEBER DE SAMPAIO ALVES
  • MESSIAS BORGES SILVA
  • PAULO HENRIQUE DA SILVA CAMPOS
  • TARCISIO GONCALVES DE BRITO
  • Data: Dec 14, 2022


  • Show Abstract
  • Duplex stainless steel belongs to a class of materials with low machinability due to its high work-hardening rate, low thermal conductivity and high ductility. This characteristic represents a significant challenge in the manufacture of components, especially in the end milling process. Optimization is a viable alternative for determining the best process parameters and obtaining higher production values with sustainability and quality. The presence of noise variables is an additional complicating factor during the machining of this material, since they increase variability during the process; their effect can be mitigated by employing robust modeling methods. This thesis presents the robust multivariate optimization of the end milling of UNS S32205 duplex stainless steel. The tests were carried out using a central composite design combining the input variables (cutting speed, feed per tooth, milled width and depth of cut) and the noise variables (tool flank wear, fluid flow and overhang length). The concepts of robust parameter design, response surface methodology, factor analysis, optimization of the multivariate mean square error for robust factors and the normal boundary intersection were applied. The combination of all these methodologies gave rise to the EQMMFR-NBI method. As a result of the factor analysis, the response variables were grouped into three latent variables: the first referring to the roughness parameters Ra, Rq, Rt and Rz (quality indicator); the second to electricity consumption and CO2 emissions (sustainability indicator); and the third to the material removal rate (productivity indicator). Multivariate robust optimization was performed considering the sustainability and productivity indicators, while quality was used as a constraint in the nonlinear optimization problem. By applying the EQMMFR-NBI method, Pareto-optimal solutions were obtained and an equispaced frontier was constructed. Confirmation tests were performed using Taguchi's L9 arrangement. The results showed that the optimal setups found were able to neutralize the influence of the noise variables on the response variables, proving the adequacy of the proposal and the application of the method.

2021
Dissertations
1
  • JOÃO VICTOR SOARES DO AMARAL
  • Metamodel-Based Optimization: An Approach to Metamodeling in Discrete Event Simulation

  • Advisor : JOSE ARNALDO BARRA MONTEVECHI
  • COMMITTEE MEMBERS :
  • ANEIRSON FRANCISCO DA SILVA
  • JOSE ARNALDO BARRA MONTEVECHI
  • RAFAEL DE CARVALHO MIRANDA
  • TABATA NAKAGOMI FERNANDES PEREIRA
  • Data: Feb 9, 2021


  • Show Abstract
  • In the context of Industry 4.0, optimization via simulation (OvS) emerges as one of the most powerful tools in modern industry, allowing decision-makers to allocate their resources more assertively. However, in very complex systems, the use of conventional OvS techniques requires computational time that frequently makes their application unfeasible. In recent years, developments in machine learning have produced algorithms with high learning capacity, making optimization via simulation with metamodeling (OvSM) a promising field of study for solving complex problems. In this sense, the present study proposes a framework for OvSM based on the insights and analyses derived from a systematic literature review. The proposed framework incorporates discrete event simulation, design of experiments, machine learning algorithms, and hyper-parameter optimization via genetic algorithm for OvS problems. To validate the proposed method, this dissertation tested and compared six machine learning algorithms (Support Vector Machine, Artificial Neural Networks, Gradient-Boosted Trees, Random Forest, Polynomial Regression and Gaussian Process), with and without the hyper-parameter optimization step, in two experimental designs (Latin Hypercube Design and random) applied to the problem of resource allocation in three real industrial cases. With the application of the method to the study objects presented, the best-performing metamodels obtained solutions that reached, respectively, 100%, 96.17% and 100% of the optimal benchmark, demanding, on average, 35.22% less computational time. Also, the incorporation of the hyper-parameter optimization step in the proposed metamodeling method allowed a 31.28% reduction in the root mean square error of the metamodels compared to the traditional method, which does not include this step.
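A minimal sketch of the metamodeling loop described above, assuming scipy's Latin Hypercube sampler and scikit-learn's Gaussian Process regressor. The "simulation" is a cheap stand-in function, and the genetic-algorithm hyper-parameter step is omitted.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor

def simulation(x):  # placeholder for an expensive DES run
    return (x[:, 0] - 3)**2 + 2*(x[:, 1] - 1)**2 + np.random.normal(0, .05, len(x))

# Latin Hypercube design over the 2-D decision space, then fit a metamodel.
sampler = qmc.LatinHypercube(d=2, seed=0)
X = qmc.scale(sampler.random(n=30), [0, 0], [6, 4])
y = simulation(X)
gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)

# Search the cheap metamodel instead of the expensive simulation.
grid = qmc.scale(sampler.random(n=5000), [0, 0], [6, 4])
best = grid[gp.predict(grid).argmin()]
print("metamodel optimum near:", best)   # true optimum is around (3, 1)
```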

2
  • LARYSSA DE ANDRADE MAIRINQUE
  • THE COVID-19 PANDEMIC AND ITS IMPACTS ON URBAN MOBILITY: A CASE STUDY USING SPATIAL STATISTICAL ANALYSIS

  • Advisor : JOSIANE PALMA LIMA
  • COMMITTEE MEMBERS :
  • CIRA SOUZA PITOMBO
  • ALEXANDRE FERREIRA DE PINHO
  • JOSIANE PALMA LIMA
  • Data: Feb 11, 2021


  • Show Abstract
  • Due to the rapid advance of Covid-19 and its spread at a global level, the new coronavirus has significantly impacted people's daily activities and created an unprecedented scenario, since several measures were implemented to reduce the contagion and spread of the disease. Thus, a deeper theoretical understanding of the social variables that can influence the spread of the disease is important for Covid-19 control measures to be effective both now and in the future. Therefore, this work aims to assess the impact of travel patterns, land use and socioeconomic aspects on the spatial distribution of Covid-19 cases. The methodology consists of modeling using the Ordinary Least Squares (OLS) and Geographically Weighted Regression (GWR) methods. A case study was carried out in São João del Rei, a medium-sized municipality in the state of Minas Gerais. Initially, a descriptive and spatial study was developed in the cities of Itajubá and São João del Rei, both medium-sized cities located in Minas Gerais, to assess how the Covid-19 pandemic impacted the travel behavior and daily activities of their inhabitants, as well as public and road safety. For this, data from a survey on the daily activities of the population before and during the pandemic, applied in 2020 in both cities, were used, along with data on traffic accidents and assaults on public roads collected from the Military Police of each city. Through the application of Pearson's chi-square test, associations were identified between sociodemographic variables and the place where respondents carried out their main activity during the pandemic. Through an exploratory analysis, both cities showed a percentage reduction in the use of buses as a means of transport, an increase in short trips of up to 10 minutes, and a reduction in the frequency of accidents and robberies on public roads during the analysis period. In São João del Rei, the sample showed a better spatial distribution and information on the number of Covid-19 cases was available for the entire population, enabling a more in-depth statistical study using the OLS and GWR methods with the variables determined for the city. The results show a strong association between the number of Covid-19 cases and several travel behavior, socioeconomic and land use variables. The GWR approach proved to be an important tool to explain the spatial distribution of Covid-19 cases in the municipality, showing in most cases a better fit than the OLS method. The study of the association between social variables and the spread of the disease was important and remains necessary. The results serve as a subsidy for urban mobility planning, with measures aimed at health security and service to the population during and after this period of crisis, making more efficient use of public resources with a view to sustainable development.
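A minimal sketch of the core GWR computation, a locally weighted least-squares fit with a Gaussian distance kernel. The coordinates, predictors and bandwidth are illustrative placeholders, not the case-study data.

```python
import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    """Fit one weighted least-squares regression per location."""
    X1 = np.column_stack([np.ones(len(X)), X])        # add intercept
    betas = []
    for c in coords:
        d = np.linalg.norm(coords - c, axis=1)
        w = np.exp(-(d / bandwidth) ** 2)             # Gaussian kernel weights
        W = np.diag(w)
        beta = np.linalg.solve(X1.T @ W @ X1, X1.T @ W @ y)
        betas.append(beta)
    return np.array(betas)                            # local coefficients

rng = np.random.default_rng(4)
coords = rng.uniform(0, 10, size=(100, 2))            # sector centroids
X = rng.normal(size=(100, 2))                         # e.g., bus use, income
y = 5 + (coords[:, 0] / 5) * X[:, 0] - X[:, 1] + rng.normal(0, .5, 100)
print(gwr_coefficients(coords, X, y, bandwidth=2.0)[:3])
```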

3
  • MARIANNA LUCINDA DE OLIVEIRA
  • Study of socioeconomic, built environment and perceived quality variables as contributions to the demand for Urban Public Transport

  • Advisor : JOSIANE PALMA LIMA
  • COMMITTEE MEMBERS :
  • BARBARA STOLTE BEZERRA
  • JOSIANE PALMA LIMA
  • JULIANA HELENA DAROZ GAUDENCIO
  • Data: Feb 18, 2021


  • Show Abstract
  • Cities, increasingly populated, need efficient transport systems to achieve improvements in the population's quality of life. Factors such as air pollution, congestion and road accidents have been aggravated by the constant increase in private vehicles, and promoting the use of sustainable modes of transport is essential to mitigate problems related to the sector. Thus, Public Transport (PT) is a great ally in the search for more sustainable urban mobility, mainly for long-distance travel and for transporting a greater number of people. However, PT has faced a continuous drop in demand in recent years, further aggravated by the Covid-19 pandemic, and understanding the factors related to its use is extremely important. This work aims to evaluate the influence of socioeconomic, travel mode, perceived quality, built environment and security variables on the demand for PT, through the dependent variable frequency of use. Multinomial Logistic Regression was used to determine the significance of the variables in relation to the dependent variable. Some variables were developed in a Geographic Information System (GIS) environment, portraying their relationship with the users' spatial location. The results on satisfaction levels show that users are most dissatisfied with the fare value, the frequency of buses, and issues related to the characteristics of bus stops, such as information and shelter. Factors related to human aspects, such as cordiality and driver ability, were the most satisfactory for users. Regarding the characteristics of the built environment, it was observed that most users have good accessibility to PT, with only 2.5% of the interviewees being at a distance greater than 500 meters from a bus stop. Regarding safety aspects, most crimes occur in the downtown region and its surroundings, and also in specific neighborhoods of the city, while road accidents occur around the downtown region and along the main streets. Finally, regarding significant influence on the dependent variable frequency of use, the variables choice of PT for work and study, payment using transportation vouchers, the age group of 36 to 59 years, a greater distance to the business center, and good accessibility to the bus stop are related to frequent use of PT. Male gender, possession of a private vehicle, and dissatisfaction with the fare and punctuality of the buses were the variables related to rare or occasional use of PT. The results can be used as a subsidy to direct public authorities toward improving the PT system and, consequently, increasing demand, for example through financial subsidies to reduce the fare value, inspection of service provision, and limits on individual vehicle access. Good internal planning by the operating company is also necessary, ensuring a quality, continuous and efficient service. The results provide technical support in identifying factors that affect the frequency of use of PT.
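
    A hedged sketch of a multinomial logit for frequency of PT use, assuming the statsmodels package; the predictors and the three-level outcome (rare/occasional/frequent) are illustrative stand-ins for the survey variables, not the study's data.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 400
        df = pd.DataFrame({
            "male": rng.integers(0, 2, n),
            "owns_vehicle": rng.integers(0, 2, n),
            "dist_to_stop_m": rng.uniform(50, 800, n),
            "fare_satisfaction": rng.integers(1, 6, n),  # 1 = very dissatisfied
        })
        # Synthetic outcome: 0 = rare, 1 = occasional, 2 = frequent use of PT.
        score = (-0.8 * df.male - 1.0 * df.owns_vehicle
                 - 0.002 * df.dist_to_stop_m + 0.4 * df.fare_satisfaction
                 + rng.normal(0, 1, n))
        df["freq_use"] = pd.cut(score, bins=[-np.inf, -1, 1, np.inf], labels=[0, 1, 2])

        X = sm.add_constant(df[["male", "owns_vehicle", "dist_to_stop_m",
                                "fare_satisfaction"]])
        model = sm.MNLogit(df["freq_use"].astype(int), X).fit(disp=False)
        print(model.summary())  # coefficients for each category vs. the baseline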

4
  • JADE DE SOUZA BORDÓN
  • INTEGRATION OF SYSTEMATIC LAYOUT PLANNING, LEAN THINKING AND DISCRETE EVENT SIMULATION FOR THE LAYOUT DESIGN OF A HOSPITAL EMERGENCY CARE DEPARTMENT

  • Advisor : CARLOS HENRIQUE PEREIRA MELLO
  • COMMITTEE MEMBERS :
  • CARLOS HENRIQUE PEREIRA MELLO
  • FERNANDO AUGUSTO SILVA MARINS
  • JOSE ANTONIO DE QUEIROZ
  • JOSE HENRIQUE DE FREITAS GOMES
  • Data: Feb 22, 2021


  • Show Abstract
  • The importance of healthcare facilities is unquestionable and, in times of the Covid-19 pandemic and its uncertainties, applying Production Engineering knowledge in studies that can bring improvements and better understanding of this area is even more crucial. A study is carried out on the layout design of a hospital department from the perspective of the integration of three tools: Systematic Layout Planning, Lean and Discrete Event Simulation. The first, used to address problems in hospital departments, provides a script for the layout project or its modification. The second brings Lean Thinking and its concepts into healthcare. The last allows multiple health-related analyses to be carried out without real risk and at low cost. The research proposes to build a script based on this integration, describing each of its stages and showing how they were used for the layout design of the emergency care unit of a philanthropic hospital in the southern region of Minas Gerais. The use of the proposed systematic approach on the object of study is intended to produce results that support safer decisions regarding physical changes. Thus, in the final stage of its application, layout alternatives were tested for three levels of possible future demand. The parameters of interest were defined as lead time, waiting time and number of visits. It was found that, depending on the variation in demand, it is possible to add processes or change activities so that the workload can be divided, waste from movement and waiting reduced, and human resources better used. Having patients move without accompaniment minimizes the doctor's wasted movement. Inserting triage into the current layout is not advantageous, since the number of visits only goes from an average of 25 to 28 patients per day, with large increases in the values of the parameters of interest. The scenario with the highest capacity for daily care, when compared to the previous layout, shows a small increase in the parameters, which is justified by its ability to attend an average of 30 more patients per day. The guide was successfully used to study the proposals for physical changes in the department of interest; it allows assessing the current status of the service, the possible layout changes and their implications, as well as when they should be made.
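
    An illustrative SimPy sketch of a discrete-event model of emergency care, tracking the parameters named in the study (lead time, waiting time, number of visits); the arrival and consultation rates and the two-doctor capacity are assumptions for illustration, not the hospital's figures.

        import simpy
        import random

        random.seed(3)
        waits, leads, visits = [], [], 0

        def patient(env, doctors):
            global visits
            arrival = env.now
            with doctors.request() as req:        # queue for a doctor
                yield req
                waits.append(env.now - arrival)   # waiting time
                yield env.timeout(random.expovariate(1 / 15))  # ~15 min consult
            leads.append(env.now - arrival)       # lead time (door to discharge)
            visits += 1

        def arrivals(env, doctors):
            while True:
                yield env.timeout(random.expovariate(1 / 20))  # ~1 arrival / 20 min
                env.process(patient(env, doctors))

        env = simpy.Environment()
        doctors = simpy.Resource(env, capacity=2)
        env.process(arrivals(env, doctors))
        env.run(until=12 * 60)  # one 12-hour shift, in minutes

        print(f"visits: {visits}, mean wait: {sum(waits)/len(waits):.1f} min, "
              f"mean lead time: {sum(leads)/len(leads):.1f} min")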

5
  • ANISSA SASSE CARDOSO
  • A Systematic Approach for the Evaluation of Undergraduate Courses: Students' Perspective

  • Advisor : CARLOS EDUARDO SANCHES DA SILVA
  • COMMITTEE MEMBERS :
  • CARLOS EDUARDO SANCHES DA SILVA
  • JULIANA HELENA DAROZ GAUDENCIO
  • MILTON VIEIRA JÚNIOR
  • VICTOR EDUARDO DE MELLO VALERIO
  • Data: Feb 22, 2021


  • Show Abstract
  • Performance measurement systems allow managers to identify points for improvement to ensure organizations' competitive advantage. ENADE (the Brazilian National Student Performance Exam) is one of the legal forms of evaluation of Brazilian Higher Education Institutions. This evaluation, carried out every three years, measures graduating students' perception of the course over the period of their studies. Often, students only come into contact with the questions that assess their perception of the course when they take ENADE. Evaluating students' perceptions of the undergraduate course at shorter intervals can enhance improvement actions. The objective of this work is to adapt the questions of the Student Questionnaire, which is part of ENADE, into a systematic approach that allows the self-assessment of undergraduate courses. Its object of study was the current performance evaluation system of the Production Engineering course at UNIFEI, Itajubá-MG campus, where two cycles of action research were carried out. The proposed approach has the following steps: plan the application of the questionnaire; tabulate the data and generate the report; analyze the report; propose and monitor actions; and evaluate the results. The Net Promoter Score (NPS) scale and classification were used in the application of the current assessment questionnaire, aimed at both concluding and non-concluding students. After tabulating the collected data, a cluster analysis was carried out to identify possible groupings of questions. After a critical analysis of each proposed grouping and validation with specialists, it was possible to reduce the questionnaire assigned to concluding students from 42 to 37 questions and the one designated for non-concluding students from 42 to 34 questions, a reduction of approximately 12% and 19%, respectively. Actions were implemented to increase the response rate, which went from 22.16% in the first cycle to 29.61% in the second. The results identified that most improvement actions can be carried out autonomously by teachers, with few falling to higher hierarchical levels. In both cycles, despite the second being impacted by the pandemic, the course obtained NPS values of 73% and 61.4%, respectively, being classified as a quality service in the students' perception.
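
    A simple worked example of the Net Promoter Score classification used in the study: on a 0-10 scale, scores of 9-10 count as promoters, 7-8 as passives, 0-6 as detractors, and NPS = %promoters - %detractors. The scores below are fabricated for illustration.

        def nps(scores):
            promoters = sum(1 for s in scores if s >= 9)
            detractors = sum(1 for s in scores if s <= 6)
            return 100 * (promoters - detractors) / len(scores)

        sample = [10, 9, 9, 8, 7, 10, 6, 9, 10, 8, 9, 5, 10, 9, 7]
        print(f"NPS = {nps(sample):.1f}%")  # higher values indicate a quality service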

6
  • THALITA RAMIRES DA SILVA
  • EVALUATION OF AN ALGOTRADING SYSTEM BASED ON DEEP LEARNING FOR THE CAPITAL MARKETS USING RISK MANAGEMENT

  • Advisor : EDSON DE OLIVEIRA PAMPLONA
  • COMMITTEE MEMBERS :
  • CARLOS EDUARDO SANCHES DA SILVA
  • EDSON DE OLIVEIRA PAMPLONA
  • RODOLFO CARNEIRO CAVALCANTE
  • Data: Feb 26, 2021


  • Show Abstract
  • Financial time series prediction is a challenge due to its nonlinear and chaotic nature. In recent decades, many researchers and investors have studied methods to improve quantitative analysis. In the field of artificial intelligence, sophisticated machine learning techniques, such as deep learning, have shown better performance. In this work, an automated trading system (algotrading) to predict future trends of the Ibovespa stock index is presented and evaluated. Using an LSTM-based (Long Short-Term Memory) agent to learn temporal patterns in the data, the algorithm triggers automatic trades according to historical data, technical analysis indicators, and risk management. Initially, five different strategies were developed using the LSTM algorithm as a basis, and the model with the best performance was selected. During the experimental tests, it was possible to show that the use of a trading strategy and risk management techniques helped to minimize losses and reduce operating costs, which have a direct influence on profitability. Subsequently, the model that obtained the best result, the LSTM-RMODV, underwent several improvements, among them the implementation of the break-even and trailing-stop techniques and a series of optimizations of the trading strategy. It was then possible to obtain a set of parameters that brought better results to the ATS (Automated Trading System), giving rise to the new model, called Algo-LSTM. In the last step, the evaluation of slippage allowed inferring that, in the long term, the impact of slippage under reasonable market conditions is not significant for the final result. Finally, the results demonstrated that the proposed method, Algo-LSTM, shows better performance when compared with other methods, including the buy-and-hold technique. The proposed method also works in bear or bull market conditions, showing a rate of net income over invested capital of 208.23% in 2019 and 112.81% in 2015. That is, despite the low accuracy, the algorithm is capable of generating consistent profits when all transaction costs and the income tax over net revenue are considered.
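
    A hedged sketch of an LSTM trend classifier of the kind the study builds on, using TensorFlow/Keras; the random-walk "prices", the 20-step window and the tiny network are assumptions for illustration, not the Ibovespa data or the thesis model.

        import numpy as np
        import tensorflow as tf

        rng = np.random.default_rng(4)
        prices = np.cumsum(rng.normal(0, 1, 1200)) + 100  # synthetic price series
        returns = np.diff(prices) / prices[:-1]

        window = 20
        X = np.array([returns[i:i + window] for i in range(len(returns) - window)])
        y = (returns[window:] > 0).astype(int)            # 1 = next return is up
        X = X[..., None]                                   # (samples, timesteps, 1)

        model = tf.keras.Sequential([
            tf.keras.layers.LSTM(32, input_shape=(window, 1)),
            tf.keras.layers.Dense(1, activation="sigmoid"),
        ])
        model.compile(optimizer="adam", loss="binary_crossentropy",
                      metrics=["accuracy"])
        model.fit(X[:900], y[:900], epochs=5, batch_size=32, verbose=0)
        print("test accuracy:", model.evaluate(X[900:], y[900:], verbose=0)[1])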

7
  • BRENO SILVA RODRIGUES
  • Proposition of improvements in services aimed at the implementation of Lean Thinking in agribusiness

  • Advisor : EDSON DE OLIVEIRA PAMPLONA
  • COMMITTEE MEMBERS :
  • ANDREI BONAMIGO
  • EDSON DE OLIVEIRA PAMPLONA
  • FABIANO LEAL
  • PAULO FERNANDO MACHADO
  • Data: Mar 3, 2021


  • Show Abstract
  • The importance of agribusiness for the Brazilian economy brings the need to implement techniques for managing processes and people that make it possible to maintain profitability and competitiveness, even in a scenario of significantly rising production costs. In this context, the adoption of the Lean philosophy can bring to agriculture and livestock the same benefits experienced in other economic sectors, such as industry, where it is known as Lean Manufacturing, and services, where Lean Office is applied. Despite this potential, there are few publications on this subject compared to the other applications mentioned above. Another barrier is the resistance of rural landowners to starting a lean journey, as it is not known whether management models such as the one proposed by the Agro + Lean® Management School meet the needs of producers at different levels of managerial maturity, or whether they are applicable in a more rustic environment, such as a farm. In this sense, this research aims to propose improvements to the dairy farm management model of the Agro + Lean® Management School, the benchmark in Brazil for the implementation of the lean philosophy in agribusiness. To achieve this objective, a case study was carried out on a farm where the model is implemented, and information was collected on the courses offered by the School in order to substantiate the proposals made in this work. Through the analysis of data collected in interviews, documents, observations, workplace layout analysis and informal conversations, it was possible to cross-check the evidence, both among the sources and against theoretical references, in addition to applying Pearson's correlation and analyzing the reliability of the questionnaires by calculating their Cronbach's Alpha. This study allowed the author to make seven concrete proposals for improvement in the process of implementing the model and in the perception of the problems faced by the students of the Agro + Lean® Management School, allowing better targeting of the contents taught.
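
    A worked sketch of the questionnaire reliability check mentioned above: Cronbach's alpha computed directly from its definition, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The 5-point Likert responses below are fabricated for illustration.

        import numpy as np

        items = np.array([  # rows = respondents, columns = questionnaire items
            [4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4],
            [2, 3, 2, 3], [4, 4, 5, 4], [3, 2, 3, 3],
        ])

        def cronbach_alpha(data):
            k = data.shape[1]
            item_var = data.var(axis=0, ddof=1).sum()   # sum of per-item variances
            total_var = data.sum(axis=1).var(ddof=1)    # variance of total scores
            return k / (k - 1) * (1 - item_var / total_var)

        print(f"alpha = {cronbach_alpha(items):.3f}")   # >= 0.7 is usually acceptable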

8
  • JOSÉ EUCLIDES FERNANDES GIGLIO
  • DESIGN OF EXPERIMENTS IN TRAINING ARTIFICIAL NEURAL NETWORKS FOR THE PROBLEM OF PREDICTING NON-LINEAR TIME SERIES

  • Advisor : PEDRO PAULO BALESTRASSI
  • COMMITTEE MEMBERS :
  • JULIANA HELENA DAROZ GAUDENCIO
  • PEDRO PAULO BALESTRASSI
  • WESLEY VIEIRA DA SILVA
  • Data: Jul 29, 2021


  • Show Abstract
  • A time series is defined as a collection of observations of a variable over time, whose data order has fundamental importance due to the dependence between consecutive values. The analysis of these data, and the understanding of this correlation, is an important tool for understanding phenomena in various sciences, such as Economics, Engineering and Operations Management, where prices, demands and other quantities are such variables. Modeling this data sequence allows predictions for future periods to be made based on historical data. This consecutive relationship can be complex and, not uncommonly, non-linear. The use of Artificial Neural Networks has proven increasingly effective for pattern recognition, modeling and predicting future values. The statistical programs available on the market provide user-friendly tools, with results demonstrated in several available scientific publications, but the number of factors and levels available for use during the training of Artificial Neural Networks is so large that executing every possible combination could require hundreds of years. In this study, the statistical methodology of Design of Experiments (DOE) is applied to determine the best parameters of an Artificial Neural Network for the prediction of non-linear time series and, thus, significantly reduce the time needed to choose the best Artificial Neural Network capable of solving the prediction problem. Instead of the most common technique for training an Artificial Neural Network, that is, the empirical method, DOE is proposed as the better methodology. The main motivation for this dissertation was the prediction of non-linear seasonal time series, which is related to many real problems, such as short-term electrical load, daily prices and returns, water consumption, etc. A case study is presented. The objective was fulfilled when the Artificial Neural Network configured via DOE was shown to reach smaller errors between prediction and real value than the empirically configured model.
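
    An illustrative sketch of using a designed experiment over ANN training factors instead of ad-hoc trial and error: a small full factorial over three factors (hidden neurons, activation, learning rate) with sklearn's MLPRegressor. The factors, levels and the toy seasonal series are assumptions for illustration, not the thesis's design.

        import itertools
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(5)
        t = np.arange(300)
        series = 10 + 0.02 * t + 3 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.3, 300)

        lags = 12  # use the previous 12 observations to predict the next one
        X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
        y = series[lags:]
        X_tr, X_te, y_tr, y_te = X[:250], X[250:], y[:250], y[250:]

        factors = {
            "hidden": [5, 20],
            "activation": ["tanh", "relu"],
            "lr": [1e-3, 1e-2],
        }
        results = []
        for hidden, act, lr in itertools.product(*factors.values()):
            net = MLPRegressor(hidden_layer_sizes=(hidden,), activation=act,
                               learning_rate_init=lr, max_iter=2000, random_state=0)
            net.fit(X_tr, y_tr)
            rmse = np.sqrt(np.mean((net.predict(X_te) - y_te) ** 2))
            results.append(((hidden, act, lr), rmse))

        for combo, rmse in sorted(results, key=lambda r: r[1]):
            print(combo, f"RMSE={rmse:.3f}")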

9
  • LAERCIO ALMEIDA DE SIQUEIRA JUNIOR
  • Use of Machine Learning to Classify Suppliers in the Context of Data Science

  • Advisor : ALEXANDRE FERREIRA DE PINHO
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • JORGE MUNIZ JUNIOR
  • RAFAEL DE CARVALHO MIRANDA
  • Data: Sep 3, 2021


  • Show Abstract
  • Decision-making, in public or private groups, is indispensable to the development of organizations, and searching for mechanisms that support managers more assertively is fundamental to this goal. Knowing how to transform raw data into knowledge allows these decisions to be based on data rather than purely on intuition. Among the important decisions taken by any organization, the classification and selection of suppliers is an important practice in industrial engineering, and Data Science is an ascendant field that studies data and how to accomplish this transformation of raw data into knowledge. This research used real data from the suppliers of an enterprise in the aeronautical sector. It therefore sits between Data Science and the classification and selection of suppliers, focusing on a problem known as clustering, that is, the segmentation of data into regions as homogeneous as possible when no previous categories exist, aiming to solve this problem in support of supplier management. In practice, this is done using Data Science tools known as Machine Learning: algorithms that can segment groups without an initial classification. The CRISP-DM procedure was used for the development, which helps elucidate analysis problems and structure scientific thinking. By using this procedure, this dissertation had as its general objective the use of Machine Learning techniques to help in the classification and selection of suppliers of that organization. Two specific objectives were set: the first consisted of analyzing the operation and behavior of classic clustering algorithms on the real database; the second consisted of analyzing those clustering algorithms in search of the most appropriate one for the supplier base, culminating in the creation and suggestion of a framework that can be used for future clustering analyses. The clustering models were built and, through internal and stability validations, had their efficiency tested, allowing the data to be split into clusters. The use of CRISP-DM allowed the clustering framework to be proposed.
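
    A hedged sketch of the clustering step: segmenting suppliers with k-means and checking internal validity with the silhouette score. The two features (on-time delivery rate, defect rate) and the data are illustrative assumptions, not the enterprise's supplier base.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.metrics import silhouette_score
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(6)
        suppliers = np.vstack([
            rng.normal([0.95, 0.01], [0.02, 0.005], size=(30, 2)),  # strong group
            rng.normal([0.80, 0.05], [0.04, 0.010], size=(30, 2)),  # average group
            rng.normal([0.60, 0.12], [0.05, 0.020], size=(20, 2)),  # weak group
        ])
        X = StandardScaler().fit_transform(suppliers)

        # Internal validation: pick k with the best silhouette (closer to 1 is better).
        for k in range(2, 6):
            labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
            print(f"k={k}: silhouette={silhouette_score(X, labels):.3f}")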

10
  • ARTHUR AURÉLIO DE ALMEIDA FREITAS
  • Analysis and Multiobjective Optimization of an Industrial Shoestring Potato Frying Process

  • Advisor : JOSE HENRIQUE DE FREITAS GOMES
  • COMMITTEE MEMBERS :
  • FLAVIO SOARES SILVA
  • MARCELO MACHADO FERNANDES
  • PEDRO PAULO BALESTRASSI
  • Data: Nov 4, 2021


  • Show Abstract
  • Fried potatoes are one of the most consumed snacks in the world. The global potato chip market reached a value of 31.2 billion dollars in 2020, making it one of the most important products in the world snack industry. The main problem in the manufacturing process of this product is the high lipid content of the finished product, which can cause increases in blood pressure and cholesterol levels. In this context, this study aimed to optimize the frying of shoestring potatoes using the deep-fat frying process. The aim was to identify the optimal combination of process parameters, providing better results for the product characteristics, namely moisture content, fat content and color, as well as to optimize process productivity by reducing cycle time. The parameters analyzed were temperature, duration of the deep-fat frying process and duration of the oil drainage period after frying. Potatoes of the species Solanum tuberosum, Asterix variety, and palm oil were used. The Response Surface Methodology was used for the planning, data collection and analysis of the experiments, and the model optimization was performed using the Multivariate Mean Square Error. At the end of this study, it was possible to obtain the optimized values of the analyzed inputs, which were 171.38 °C for temperature, 4.13 min for process duration and 32.27 s for oil drainage duration. The optimized response values were 3.00% for moisture content, 29.57% for fat content, 1.75 for total color difference and 280.05 s for cycle time.
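
    An illustrative sketch of the response-surface step: fitting a full quadratic model to designed-experiment data and minimizing it with scipy. The response function, coded factor ranges and coefficients are fabricated stand-ins for the frying data, and this single-response minimization stands in for the study's multivariate optimization.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(7)

        # Coded factors: x1 = temperature, x2 = frying time, x3 = drainage time.
        X = rng.uniform(-1, 1, size=(40, 3))
        fat = (30 + 2 * X[:, 0] - 4 * X[:, 1] - 1.5 * X[:, 2]
               + 3 * X[:, 1] ** 2 + rng.normal(0, 0.2, 40))   # % fat content

        def design_matrix(X):
            """Full quadratic model: intercept, linear, squared and interaction terms."""
            x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
            return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                    x1**2, x2**2, x3**2, x1*x2, x1*x3, x2*x3])

        beta, *_ = np.linalg.lstsq(design_matrix(X), fat, rcond=None)

        def predicted_fat(x):
            return (design_matrix(np.atleast_2d(x)) @ beta)[0]

        # Minimize predicted fat content inside the experimental region.
        res = minimize(predicted_fat, x0=np.zeros(3), bounds=[(-1, 1)] * 3)
        print("optimal coded settings:", res.x.round(2), "fat:", round(res.fun, 2))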

11
  • BRUNO DE CASTRO FARIA
  • Location Of Delivery Lockers For Urban Logistics In A Midsize Brazilian City: The Case Of Divinópolis, Minas Gerais

  • Advisor : RENATO DA SILVA LIMA
  • COMMITTEE MEMBERS :
  • RENATO DA SILVA LIMA
  • ALEXANDRE FERREIRA DE PINHO
  • RENATA LUCIA MAGALHAES DE OLIVEIRA
  • Data: Dec 9, 2021


  • Show Abstract
  • With the advancement of technology and the spread of the internet around the world, there has been a strong expansion of transactions carried out through electronic commerce in all areas. Consequently, the flow of logistical operations responsible for delivering the products acquired through this channel has grown, causing problems related to the transport of these goods. Among the transport problems in the last stage of the distribution chain, known as the last mile, the following stand out: failed product deliveries, excessive travel, high operating costs, and poorly sized transport resources. Research in the area therefore evaluates possibilities to alleviate these difficulties in cargo distribution so that logistical operations are more efficient and offer a good level of service to their users. An alternative that has been used in different regions of the world for this purpose is collection and delivery points, stations where customers pick up products purchased over the internet on their own. Collection and delivery points (PCEs) can be automated, in which case they are known as Delivery Lockers (DLs), or not. The literature indicates that one of the difficulties in installing DLs is defining which location will best serve consumers in each specific region. Therefore, this study aimed to propose suitable locations for the installation of DLs in a medium-sized city in Minas Gerais, as well as to analyze the most influential factors for the use of these devices according to the opinion of local consumers. It was observed that opening hours, distance from central regions and the safety of these operations are the main factors mentioned by consumers. A multicriteria mathematical model based on the AHP method was developed to help choose the establishments that best meet the evaluated criteria. It was concluded that several scenarios can satisfy the problem; however, those with alternatives located in central regions or in small commercial centers were preferred by the model. Establishments that operate outside conventional hours, such as supermarkets and hypermarkets, were positive highlights in the analyzed scenarios, unlike bank branches, which did not show good results due to their limited space and hours. Finally, the use of DLs is also valid for medium-sized cities; however, for this to work effectively, e-commerce users should be aware of the benefits that this practice can bring, both in financial and operational terms.
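
    A minimal AHP sketch of the kind of multicriteria weighting used in the study: deriving criteria weights from a pairwise comparison matrix via the principal eigenvector, plus the consistency ratio. The 3x3 judgments (opening hours, distance to center, safety) are invented for illustration.

        import numpy as np

        # Saaty-scale pairwise comparisons: A[i, j] = importance of criterion i over j.
        A = np.array([
            [1,   3,   1/2],
            [1/3, 1,   1/4],
            [2,   4,   1  ],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)   # consistency index
        cr = ci / 0.58                          # random index RI = 0.58 for n = 3
        print("weights:", weights.round(3), "CR:", round(cr, 3))  # CR < 0.1 is acceptable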

12
  • Fernando Helton Sanches da Silva
  • ANALYSIS OF DIFFICULTIES FOUND DURING THE KANSEI ENGINEERING IMPLEMENTATION

  • Advisor : CARLOS HENRIQUE PEREIRA MELLO
  • COMMITTEE MEMBERS :
  • CARLOS HENRIQUE PEREIRA MELLO
  • JULIANA HELENA DAROZ GAUDENCIO
  • RICARDO COSER MERGULHAO
  • Data: Dec 17, 2021


  • Show Abstract
  • Kansei Engineering is a product development methodology introduced by professor Mitsuo Nagamachi in Japan, in 1970, which seeks to translate the Kansei (impressions, feelings and emotional demands) of users, allowing the development of new products that satisfy consumer needs. In this context, this work presents the results of an applied research study, using a quantitative approach, with the objective of identifying the difficulties encountered during the implementation of this methodology, evaluating the degree of these difficulties and how they relate to each other. The research contributes in terms of improved performance, improved product design, less waste, and increased quality and success of Kansei products, in addition to providing the academic community and professionals in the field with knowledge about the main difficulties that may occur during the implementation of the methodology. To achieve the proposed objective, after a systematic literature review addressing the Product Development Process and its Critical Success Factors (FCS), as well as the Kansei Engineering methodology and its generic steps, the main authors of articles on the subject were identified on the Web of Science and Scopus platforms, accessed through the CAPES journal portal. Subsequently, through a survey, the authors were asked about the degree of difficulty encountered in applying each step of the methodology and in relation to the FCS. First, the sample of respondents was analyzed for the presence of outliers using the Mahalanobis multivariate distance method, resulting in the exclusion of five respondents. Nagamachi, the creator of Kansei Engineering, and a second experienced Kansei Engineering respondent state that the implementation of the methodology does not present difficulties, but the agreement analysis (Kappa coefficient) with the other respondents in the sample showed that they do not agree with these two authors, indicating that there are difficulties in specific stages of the implementation process. Comparing a group that performed the implementation inside the business environment with a second group that performed it outside, through the Mann-Whitney test, no statistically significant difference was observed between the two groups regarding the difficulty of implementing the methodology. Finally, through Exploratory Factor Analysis (EFA), the 28 initial variables were reduced to 15 (8 External Variables and 7 Methodological Variables). The External Variables formed three factors: "Leadership and Structure", "Development Team" and "Importance of the Product". The Methodological Variables formed two groups, called "Evaluate and Create" and "Plan and Validate". Thus, it can be concluded that the factors formed show more adequately the difficulties of implementing the methodology as perceived by the respondents, allowing a more accurate diagnosis of these difficulties before starting a Kansei product development project.
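
    A hedged sketch of two of the tests named above, with fabricated ratings: Cohen's kappa for rater agreement (sklearn) and the Mann-Whitney U test for comparing two independent groups' difficulty scores (scipy).

        from sklearn.metrics import cohen_kappa_score
        from scipy.stats import mannwhitneyu

        # Two raters scoring the same 10 items on difficulty (1 = easy ... 5 = hard).
        rater_a = [1, 2, 2, 3, 4, 4, 5, 3, 2, 1]
        rater_b = [1, 2, 3, 3, 4, 5, 5, 3, 2, 2]
        print("kappa:", round(cohen_kappa_score(rater_a, rater_b), 3))

        # Difficulty reported by in-company vs. out-of-company implementers.
        inside = [3, 4, 2, 3, 3, 4, 2]
        outside = [3, 3, 2, 4, 3, 3, 2, 4]
        stat, p = mannwhitneyu(inside, outside)
        print(f"Mann-Whitney U = {stat}, p = {p:.3f}")  # large p: no clear difference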

Thesis
1
  • LEOVANI MARCIAL GUIMARÃES
  • Predictive model of Active Learning in Engineering based on Structural Equation Modeling with Partial Least Squares (PLS-SEM)

  • Advisor : RENATO DA SILVA LIMA
  • COMMITTEE MEMBERS :
  • RENATO DA SILVA LIMA
  • FABIANO LEAL
  • PEDRO PAULO BALESTRASSI
  • RUI MANUEL DE SÁ PEREIRA DE LIMA
  • CARLOS NAZARETH MOTTA MARINS
  • Data: Apr 22, 2021


  • Show Abstract
  • The Brazilian National Curriculum Guidelines (DCN) for Undergraduate Engineering Courses of 2019 presented new pedagogical demands, one of them being the application of Active Learning (AL). Scientific studies on AL applied to Engineering courses have grown significantly in recent years, and their results have uncovered challenges and opportunities for future research. Classroom observation instruments designed for AL environments have emerged and have supported research to objectively assess the behaviors and attitudes that characterize such environments. However, Engineering Higher Education Institutions (EHEIs) have lacked predictability regarding learning gains when applying AL techniques within a challenging process of changing teaching practices. In this context, the objective of this thesis was to propose a predictive mathematical model that demonstrates the relationship between the students' degree of learning and the application or not of AL techniques in the classroom (measured by the level of activity captured by an observation protocol). To achieve this objective, a strict and systematic methodological process was established, using controlled experimental research in an EHEI over three years, in two dimensions of analysis. The first, intraclass, used a repeated-measures experimental design to demonstrate the probable cause-effect relationship in a two-level, one-factor approach; in addition, it allowed a qualitative analysis of AL application in individual courses. The second, interclass, involved independent class samples in subsequent semesters and used Partial Least Squares Structural Equation Modeling (PLS-SEM) to test and identify the best predictive model for learning based on the application of AL. The intraclass results demonstrated, in a positive cause-effect relationship, that the global average academic performance was 14% better in the post-AL assessment than in the first assessment, without AL techniques, representing 40% of the standard deviation of the grades. In addition, the individual analysis of performance in each course revealed the most and least successful strategies and allowed recommending the most viable AL strategies for specific groups of courses. In the interclass dimension, the improvement was 10%, and the PLS-SEM predictive model was positively validated by several performance indexes, demonstrating a significant, non-linear positive relationship between the latent constructs, with moderate to high relevance for learning prediction (Q² > 0.344). In the demonstration of predictive relevance, the best-fitting curve of the relationship allowed predicting, from an average score between 0 and 35.97 in the level of AL adherence (NAA), an average Learning score (AP) between 45.89 and 74.90 on the scale of performance degrees. The β coefficients were positive and significant, with p-values < 0.01. The systematic methodological design and the results obtained are intended to be the main contributions of this research to the literature and to the ongoing discussion of the effectiveness of active learning methods in Engineering Education.
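
    The thesis uses PLS-SEM, which is typically run in dedicated tools such as SmartPLS. As a rough, hedged illustration of the partial-least-squares idea only, the sketch below uses sklearn's PLSRegression to relate an observed indicator block to an outcome; it is plain PLS regression, not full structural equation modeling, and the synthetic data only mimic the NAA-to-AP relationship described above.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(8)
        n = 120
        # Indicator block: observed items measuring the level of AL adherence.
        activity = rng.uniform(0, 36, size=(n, 1))
        X = activity + rng.normal(0, 3, size=(n, 3))         # three noisy indicators
        y = 45 + 0.8 * activity[:, 0] + rng.normal(0, 5, n)  # learning score

        pls = PLSRegression(n_components=1)
        pls.fit(X, y)
        print("R2 on training data:", round(pls.score(X, y), 3))
        print("predicted learning at indicators = 30:",
              pls.predict(np.array([[30, 30, 30]])).round(1))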

2
  • FABRICIO ALVES DE ALMEIDA
  • Discriminatory power improvement of ellipsoidal functions modified by rotated factor loadings in the optimized formation of clusters

  • Advisor : JOSE HENRIQUE DE FREITAS GOMES
  • COMMITTEE MEMBERS :
  • ANDERSON PAULO DE PAIVA
  • ANTONIO FERNANDO BRANCO COSTA
  • MARCELA APARECIDA GUERREIRO MACHADO DE FREITAS
  • PEDRO PAULO BALESTRASSI
  • ROBERTO DA COSTA QUININO
  • Data: Apr 30, 2021


  • Show Abstract
  • Technological advances have driven the rise of data collection in companies, governments and various industrial segments. In this respect, techniques that seek to form and discriminate clusters are widely used in datasets with multiple variables, bringing the need for specific tools that account for the existing variance-covariance structure. Based on this, this work presents a proposal to improve the discriminatory power of confidence regions in the formation and estimation of optimal clusters, using multivariate and experimental techniques to extract information in an optimized way from correlated datasets. Factor analysis was used as the exploratory multivariate method, tuning the rotation of the factor loadings through a mixture design and then agglutinating the total-explained-variance functions by the mean square error. The optimization of this step is performed through the sequential quadratic programming algorithm. Given the optimal scores, a multilevel factorial design is formed to contemplate all combinations of linkage methods and types of analysis, seeking the parameterization with the least variability and generating confidence ellipses with better discrimination between groups. A strategy to analyze the levels of agreement and the existence of inversions in the formation of clusters is proposed using the Kappa and Kendall indicators. Motivated by the need for strategies to classify substations with respect to voltage sag phenomena, which cause faults in the distribution of electricity, the method was applied to a set of real data representing the power quality indexes of substations located in southeastern Brazil. Optimal values were found for the factor loading rotation, and the parameterization "Ward-analysis of covariance" was defined as the ideal strategy to create the clusters in this dataset. Thus, low-variability clusters and precise confidence ellipses were generated to estimate the voltage sag patterns, promoting better discriminatory power in the classification of clusters through the confidence regions. The confirmatory analysis indicated that the "Ward" linkage proved to be the most robust method for this dataset, even under the influence of disturbances in the original data.
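
    An illustrative sketch of the confidence-ellipse idea used to discriminate clusters: a (1 - alpha) region for a bivariate normal cluster derived from its mean and covariance via the chi-square quantile. The cluster data are synthetic, not the power-quality indexes of the thesis.

        import numpy as np
        from scipy.stats import chi2

        rng = np.random.default_rng(9)
        cluster = rng.multivariate_normal([0, 0], [[2.0, 0.8], [0.8, 1.0]], size=200)

        mean = cluster.mean(axis=0)
        cov = np.cov(cluster, rowvar=False)

        # 95% ellipse: semi-axis lengths scale with sqrt(eigenvalue * chi2 quantile).
        q = chi2.ppf(0.95, df=2)
        eigvals, eigvecs = np.linalg.eigh(cov)
        axes = np.sqrt(eigvals * q)
        angle = np.degrees(np.arctan2(eigvecs[1, 1], eigvecs[0, 1]))
        print("center:", mean.round(2), "semi-axes:", axes.round(2),
              "orientation (deg):", round(angle, 1))

        # A point lies inside the ellipse iff its squared Mahalanobis distance <= q.
        d2 = ((cluster - mean) @ np.linalg.inv(cov) * (cluster - mean)).sum(axis=1)
        print("fraction inside 95% ellipse:", (d2 <= q).mean())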

3
  • ANDRÉ MARQUES MANCILHA DA SILVA
  • Definition of a gamified model for organizational management.

  • Advisor : ALEXANDRE FERREIRA DE PINHO
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • ANDRE LUIS RIBEIRO LIMA
  • ANEIRSON FRANCISCO DA SILVA
  • CARLOS HENRIQUE PEREIRA MELLO
  • FABIANO LEAL
  • Data: Apr 30, 2021


  • Show Abstract
  • With the behavioral change of professionals in the labor market, increasingly influenced by new generations and the technological evolution of the 21st century, this research proposes an organizational management model based on gamification, aimed at obtaining better performance from employees in organizations. Its unprecedented aspect is to translate the competition among players that results from a traditional gamified intervention into a collective result, rather than one that benefits only the individual, in corporate environments. Using the action-research methodology, gamification was applied in two real companies from different market segments. As a result of the application of the gamified model, it was possible to identify its contribution regarding knowledge management, a sense of critical and risk analysis, interactivity between people, organizational climate, employees' commitment and engagement and, above all, uniformity and synergy of actions toward a positive and common result for the companies, making the work challenging and consistently interesting; however, the results obtained highlight that the proposed model still requires adjustments and new applications in other companies.

4
  • LUIZ GUSTAVO DE MELLO
  • OPTIMAL COMBINATION OF FORECASTING METHODS ACCORDING TO THE FACTORIAL PAYOFF-JOLLIFFE CRITERION: A MULTIVARIATE APPROACH TO ESTIMATING NATURAL GAS DEMAND

  • Advisor : ANDERSON PAULO DE PAIVA
  • COMMITTEE MEMBERS :
  • ANDERSON PAULO DE PAIVA
  • ANTONIO FERNANDO BRANCO COSTA
  • FABRICIO JOSÉ PONTES
  • JULIANA HELENA DAROZ GAUDENCIO
  • PAULO ROTELLA JUNIOR
  • PEDRO PAULO BALESTRASSI
  • Data: Jul 15, 2021


  • Show Abstract
  • This study presents a nonlinear multi-objective optimization method for defining optimal weights when combining time series forecasting methods used to estimate annual natural gas demand. The weight allocation approach employs mixture experimental arrangements to model the relationship between various predictive performance metrics and the weights assigned to the prediction residuals of the individual time series methods chosen for the combinations. The Double Exponential Smoothing (DES) method, the additive Holt-Winters (WA) method and the multiplicative Holt-Winters (WM) method were used in this study. Various performance metrics related to location, dispersion and diversity were modeled using canonical polynomials for mixtures, which were then individually optimized to form a Payoff matrix of the individual solutions. These were then grouped according to the minimum distance between optimal points and the Jolliffe criterion, defined by Principal Component Analysis (PCA) and applied to each identified group for a first selection of non-redundant metrics (the Payoff-Jolliffe criterion). Factor analysis (FA) was applied to the remaining metrics, via principal component extraction and varimax rotation, storing the rotated factor scores. After modeling these scores with the same class of canonical mixture polynomials, the Normal Boundary Intersection (NBI) optimization method was used, modified by adding an auxiliary class of elliptic constraints. The set of results was compared with the results of the best individual forecasting methods, of traditional combination methods, and of the FA-NBI method and its variants according to the three applied Jolliffe rules, in order to verify the reasonableness of the data treatment. The results of all methods were compared on a test set not used in the modeling and optimization stages, i.e., an out-of-sample set, which verified the remarkable efficiency of the method proposed here relative to the other methods. Although the results are limited to the series studied, the adequacy of the methods suggests that other types of time series, or combinations of methods, might yield similarly significant improvements in forecast assertiveness.
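
    A hedged sketch of the core idea of combining forecasting methods with weights that sum to one: fit DES and additive/multiplicative Holt-Winters with statsmodels, then search the weight simplex for the lowest in-sample RMSE. The monthly demand series is synthetic, and this coarse grid search stands in for the thesis's far richer optimization (mixture designs, factor analysis, NBI).

        import numpy as np
        from statsmodels.tsa.holtwinters import ExponentialSmoothing, Holt

        rng = np.random.default_rng(10)
        t = np.arange(120)
        demand = 100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, 120)

        fits = [
            Holt(demand).fit(),  # double exponential smoothing (trend, no season)
            ExponentialSmoothing(demand, trend="add", seasonal="add",
                                 seasonal_periods=12).fit(),
            ExponentialSmoothing(demand, trend="add", seasonal="mul",
                                 seasonal_periods=12).fit(),
        ]
        F = np.column_stack([f.fittedvalues for f in fits])

        best = None
        for w1 in np.linspace(0, 1, 21):            # coarse grid over the simplex
            for w2 in np.linspace(0, 1 - w1, 21):
                w = np.array([w1, w2, 1 - w1 - w2])
                rmse = np.sqrt(np.mean((F @ w - demand) ** 2))
                if best is None or rmse < best[1]:
                    best = (w, rmse)
        print("weights:", best[0].round(3), "RMSE:", round(best[1], 3))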

5
  • EDUARDO RIVELINO DA LUZ
  • A New Multiobjective Optimization Technique of Multivariate Probabilistic Models of a MIG Welding Process in Aluminum Tubes AA6063

  • Advisor : ANDERSON PAULO DE PAIVA
  • COMMITTEE MEMBERS :
  • ANDERSON PAULO DE PAIVA
  • ANTONIO FERNANDO BRANCO COSTA
  • JULIANA HELENA DAROZ GAUDENCIO
  • MARCELO MACHADO FERNANDES
  • PEDRO PAULO BALESTRASSI
  • ROGÉRIO SANTANA PERUCHI
  • Data: Jul 26, 2021


  • Show Abstract
  • To help solve the problem of process improvement, restrictions and better welding operation conditions, this work jointly applies the Design of Experiments (DOE), Multiobjective Optimization and Multivariate Statistics methodologies to support the management of the MIG (Metal Inert Gas) welding production process for anti-corona protection rings, manufactured from aluminum alloy 6063 (AA6063) tubes, T4 temper, 100 mm in diameter and 2 mm thick. This type of process can be controlled by a relatively small number of input variables: the wire feed rate (WF), voltage (V), welding speed (Fr) and the contact-tip-to-workpiece distance (Cf). In addition, many outputs can be evaluated and optimized simultaneously. In the present work, the variables yield (Y), dilution (D), reinforcement index (IR) and penetration index (PI) were investigated. To account for the multivariate nature of the problem, techniques such as Factor Analysis and Bonferroni simultaneous confidence intervals were applied, combined with elliptical constraints. The response variables were modeled mathematically using Poisson regression, and the results obtained were satisfactory, since accurate models were achieved. The Normal Boundary Intersection (NBI) method produced a set of viable configurations for the input variables that allows the experimenter to find the best configuration of the system according to the level of importance of each response. The application demonstrated the optimal parameter solution for the welding process in AA6063 and presented characteristics of minimizing the weld bead geometry, contributing to better efficiency and effectiveness in the productive management of the welding process. An experimental confirmation procedure was successfully performed to validate the theoretical results obtained with the prediction model.
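
    A hedged sketch of the Poisson-regression modeling step using statsmodels' GLM; the welding factors and the count-like response below are illustrative assumptions, not the experimental data of the thesis.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(11)
        n = 60
        wf = rng.uniform(5, 10, n)      # wire feed rate
        v = rng.uniform(18, 24, n)      # voltage
        fr = rng.uniform(30, 60, n)     # welding speed

        # Synthetic count-like response whose log-mean is linear in the factors.
        mu = np.exp(0.2 * wf + 0.05 * v - 0.02 * fr)
        y = rng.poisson(mu)

        X = sm.add_constant(np.column_stack([wf, v, fr]))
        model = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        print(model.summary())
        print("predicted mean at WF=8, V=20, Fr=45:",
              float(model.predict([[1, 8, 20, 45]])[0]))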

6
  • BETANIA MAFRA KAIZER
  • Multivariate learning assessment model in emergency remote higher education

  • Advisor : ANDERSON PAULO DE PAIVA
  • COMMITTEE MEMBERS :
  • ANDERSON PAULO DE PAIVA
  • CARLOS EDUARDO SANCHES DA SILVA
  • EDUARDO GOMES SALGADO
  • FABIANO LEAL
  • MESSIAS BORGES SILVA
  • THAIS ZERBINI
  • Data: Sep 14, 2021


  • Show Abstract
  • Even before the COVID-19 pandemic, university managers had been showing interest in identifying the factors that lead individuals to learn better, to drop out of a course, or to fail a subject. Finding answers to these questions is all the more pressing in undergraduate Engineering courses, which have high dropout rates. However, higher education institutions still lack protocols or, at least, indicators or instruments that allow managers to know these factors and the underlying problems in order to act preventively or make decisions. After the adoption of Emergency Remote Learning (ERE), and facing the uncertainties and new challenges of online education, the absence of this information can further compromise the quality of new distance education initiatives. Moreover, not knowing these factors makes it impossible to produce a current diagnosis of the benefits and losses that the pandemic has brought to Engineering students. This thesis therefore seeks to identify which factors, here called predictor variables, impacted the learning processes of undergraduates who took Calculus 2, one of the curricular components with the highest retention rates in the initial semesters of Exact Science courses. To this end, a predictive multivariate model was proposed and tested. The research was conducted at a Brazilian federal public university in the second semester of 2020. A total of 507 individuals participated in the study, representing 51% of the target population. Primary data (three psychometric scales measuring students' psychosocial and contextual variables, as well as variables referring to teachers' teaching procedures) and secondary data (official documents of the institution involved) were used. Summative evaluation was performed, with after-the-fact analysis of results. Multivariate statistical techniques and methodological procedures based on psychometrics were used. Based on the proposed model, the predictors that significantly impacted learning in Calculus 2 were: family income, self-regulatory and cognitive learning strategies, and instructional events, the latter referring to the learning conditions provided by teachers during the academic semester. The multivariate model of this thesis is replicable and can guide managers in future decisions about offering remote courses and subjects in any field of knowledge. The originality of this work lies, above all, in the discovery of new variables that may compose future psychometric scales to assess the learning outcomes of Engineering students in any discipline.

7
  • CLARA MOREIRA SENNE
  • Sustainability and Integration Index of Urban Transport and Logistics (ISITransLog)

  • Advisor : JOSIANE PALMA LIMA
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • BARBARA STOLTE BEZERRA
  • FABIO FAVARETTO
  • JOSIANE PALMA LIMA
  • RAFAEL DE CARVALHO MIRANDA
  • ROBERTA ALVES
  • Data: Dec 8, 2021


  • Show Abstract
  • Millions of people daily seek opportunities for a better quality of life in cities. What makes cities such attractive places, today concentrating more than 50% of the world's population, is the capacity of large urban centers to promote social interaction and, therefore, catalyze the development of the city and of its people. One of the objectives of developed societies in terms of mobility is to evolve toward low-carbon, less energy-intensive models, always with criteria of social equity and fair distribution of wealth; in short, the goal of sustainability. A narrow definition of sustainable transport tends to favor individual technological solutions, while a broader definition tends to favor more integrated solutions, including better travel options, economic incentives, institutional reforms and land use changes, as well as technological innovation. Sustainability planning may require a change in the way people think about and solve transport problems. The objective of this work is to understand the interactions and functioning of the transport of people and goods in the urban environment, and to propose an evaluation model in terms of sustainability and transport integration. First, an in-depth literature review made it possible to understand the interactions and functioning of the transport of people and goods in the urban environment, identifying the main initiatives to promote sustainability. These initiatives, together with public policies, constitute the proposed hierarchical model. The hierarchy was then submitted to a multi-criteria decision analysis methodology consisting of the application of the Analytic Hierarchy Process (AHP) in a system specially developed for remote evaluation of the hierarchical model. This weighting of the initiatives resulted in the model's impact factor which, together with the degree of sustainability and integration and the stage of implementation of the initiatives, composes the ISITransLog index through a weighted linear combination. The index was then applied to São Paulo, considering two distinct periods, 2010 and 2020, and the results provided an assessment of the city's evolution regarding the sustainability and integration of the passenger and freight systems. Among the results, the following stand out: policies to reduce the use of private vehicles; education and awareness of the population regarding sustainable urban transport and logistics; investment in clean technologies for transporting people and goods; investment in the integration of transport multimodality; and logistics management policies that promote a balance between operational efficiency and sustainability. The conclusions indicate an improvement in the sustainability of urban transport and logistics in the city, highlighting the importance of incentives for the use of active modes of transport and of the communication channel with the population.

8
  • CARLOS ALBERTO DE ALBUQUERQUE
  • Sustainable product development: Analysis of the replacement of aggregates by electronic waste in the production of concrete

  • Advisor : CARLOS HENRIQUE PEREIRA MELLO
  • COMMITTEE MEMBERS :
  • AMANDA FERNANDES XAVIER
  • CARLOS EDUARDO SANCHES DA SILVA
  • CARLOS HENRIQUE PEREIRA MELLO
  • JOAO PAULO MARTINS
  • JOSE HENRIQUE DE FREITAS GOMES
  • VALQUIRIA CLARET DOS SANTOS
  • Data: Dec 10, 2021


  • Show Abstract
  • Among the main environmental concerns, this work highlights two. The first is the residue of the non-metallic fraction of printed circuit boards (PCB). This residue, despite having one of the highest rates of growth in generation, arouses little interest from recyclers. The second is the concrete production process: not only does it use many natural resources, but the Portland cement manufacturing process is responsible for generating a large share of greenhouse gases. In the survey carried out in this work, no research was identified that investigated replacing part of the natural aggregates with the non-metallic fraction of PCB in the production of concrete. Therefore, this work investigated the replacement of part of the natural aggregates by the non-metallic fraction of PCB in the production of concrete, as well as the replacement of part of the Portland cement with fly ash. To carry out these investigations, a literature review was conducted to identify the types of waste being used as aggregates in concrete production and the percentages of replacement of Portland cement by fly ash. The data obtained in the literature review informed the experimental design, which was carried out using the mixture experiment methodology and the extreme vertices method. With this design, concrete mixes were produced to investigate the percentage of replacement of natural aggregates by electronic waste that still yields concrete classified as structural. This investigation was carried out both for mixes in which part of the Portland cement was replaced by fly ash and for mixes without this substitution, using slump and compressive strength tests. With the experimental data, a hypothesis test (two-sample t-test) was performed to compare the average compressive strength of the specimens whose mixes replaced 20% of the Portland cement with fly ash against the specimens whose mixes did not. The result showed no statistically significant difference between the two means. Another result of the experiments was the development of a mix using 20% fly ash as a substitute for Portland cement and part of the aggregates replaced by e-waste, with a strength of 23.010 MPa. Four mixes were also developed without fly ash and with electronic waste, whose specimens reached compressive strengths of 21.804 MPa, 23.329 MPa, 23.614 MPa and 24.637 MPa. These five mixes therefore produced structural concrete, since their strengths are greater than or equal to 20 MPa. Finally, leaching tests were carried out: scanning electron microscopy was performed on the samples, and conductivity and pH tests were carried out on the water in which the samples were soaked, at 24, 72, 168 and 384 hours. The leaching results showed that the specimens of the experiments contained no chemical elements different from those found in the control mixes, and no toxic elements, such as lead and mercury. With these results, this work concludes that, from an environmental and natural-resource-saving point of view, replacing part of the natural aggregates with electronic waste, and 20% of the Portland cement with fly ash, is reasonable and significant. The objectives of this work do not include minimizing the cost of concrete production or improving the compressive strength of concrete; they are to mitigate the problems caused to the environment by the inadequate disposal of electronic waste and to reduce the construction industry's generation of greenhouse gases. As future work, it is intended to investigate the replacement of part of the natural aggregates by the non-metallic fraction of PCB with replacement percentages higher than those investigated here, since the statistical analysis indicated that a higher percentage of this replacement may improve the strengths obtained in this work.
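
    A worked sketch of the two-sample t-test used to compare mean compressive strengths (mixes with 20% fly ash versus without); the MPa values below are fabricated for illustration, not the thesis measurements.

        from scipy.stats import ttest_ind

        with_fly_ash = [22.8, 23.4, 22.5, 23.9, 23.1]     # MPa
        without_fly_ash = [23.2, 22.1, 23.6, 24.0, 22.7]  # MPa

        stat, p = ttest_ind(with_fly_ash, without_fly_ash, equal_var=False)
        print(f"t = {stat:.3f}, p = {p:.3f}")
        # A large p-value means no statistically significant difference between means.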

9
  • CLAUDIA ELIANE DA MATTA
  • METHODOLOGY OF DESIGN AND REDUCTION OF THE NUMBER OF EXPERIMENTS IN ACTIVE SEARCH PROBLEMS

  • Advisor : PEDRO PAULO BALESTRASSI
  • COMMITTEE MEMBERS :
  • CLAUDIMAR PEREIRA DA VEIGA
  • ANTONIO FERNANDO BRANCO COSTA
  • ELIANE VALENCA NASCIMENTO DE LORENCI
  • PEDRO PAULO BALESTRASSI
  • WESLEY VIEIRA DA SILVA
  • Data: Dec 16, 2021


  • Show Abstract
  • Many engineering problems involve the optimization of an unknown objective function. Recently, active search has emerged as a powerful tool to solve problems of this nature, in which evaluating the objective function involves high costs, whether computational or experimental. This thesis seeks to find an object (x) with an optimal value of a given property (y). However, directly determining this property of interest across all available objects may not be viable given the resources, workload and/or time required. Thus, an active machine learning approach, called active search, is proposed to find an optimal solution, using design of experiments for the initial search. To apply this method, two regression techniques were used: k-nearest neighbours and Gaussian processes. Furthermore, a stopping criterion was defined for the Gaussian regression technique to reduce the algorithm's processing time. The originality of the work lies in the proposed methodology: in the use of experimental design, in the active search algorithm using regression techniques that quickly converge to a global optimum, and in the use of a statistically based stopping criterion for the algorithm. The studies were carried out with simulated data and with real data on the production of medicines and agrochemicals and on an application in electrical microgrids. In all cases, active search reduced the number of experiments and simulations needed to obtain the property of interest, compared to traditional algorithms such as Optimal Experiment Design and Kennard-Stone.
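
    A hedged sketch of an active-search loop with a Gaussian-process surrogate: start from a small initial design, then repeatedly evaluate the candidate with the highest upper confidence bound. The 1-D objective, acquisition rule and stopping condition are illustrative assumptions, not the thesis's algorithm or its statistical stopping criterion.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def expensive_experiment(x):
            return -(x - 2.7) ** 2 + np.sin(5 * x)  # unknown to the searcher

        candidates = np.linspace(0, 5, 200).reshape(-1, 1)
        rng = np.random.default_rng(12)
        idx = list(rng.choice(len(candidates), size=4, replace=False))  # initial design
        y = [expensive_experiment(candidates[i, 0]) for i in idx]

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
        for step in range(10):
            gp.fit(candidates[idx], np.array(y))
            mean, std = gp.predict(candidates, return_std=True)
            ucb = mean + 2.0 * std                  # upper confidence bound
            nxt = int(np.argmax(ucb))
            if nxt in idx:                          # simple stopping condition
                break
            idx.append(nxt)
            y.append(expensive_experiment(candidates[nxt, 0]))

        best = int(np.argmax(y))
        print(f"best x = {candidates[idx[best], 0]:.3f}, y = {y[best]:.3f}")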

2020
Dissertations
1
  • NATÁLIA MARIA PUGGINA BIANCHESI
  • A Design of Experiments Comparative Study on Clustering Methods

  • Advisor : PEDRO PAULO BALESTRASSI
  • COMMITTEE MEMBERS :
  • ANTONIO FERNANDO BRANCO COSTA
  • PAULO ROTELLA JUNIOR
  • PEDRO PAULO BALESTRASSI
  • Data: Feb 14, 2020


  • Show Abstract
  • Cluster analysis is a multivariate data mining technique that is widely used in several areas. It aims to automatically group the n elements of a database into k clusters, using only the information of the variables of each case. However, the accuracy of the final clusters depends on the clustering method used. In this work, we present an evaluation of the performance of the main methods for cluster analysis: Ward, K-means and Self-Organizing Maps. Unlike many studies published in the area, we generated the datasets using the Design of Experiments (DOE) technique, in order to reach reliable conclusions about the methods through the generalization of the different possible data structures. We considered the number of variables and clusters, dataset size, sample size, cluster overlap, and the presence of outliers as the DOE factors. The datasets were analyzed by each clustering method, and the clustering partitions were compared by Attribute Agreement Analysis, providing valuable information about the effects of the considered factors individually and about their interactions. The results showed that the number of clusters, the overlap, and the interaction between sample size and the number of variables significantly affect all the studied methods. Moreover, it is possible to state that the methods have similar performances, at a significance level of 5%, and it is not possible to affirm that one outperforms the others.

2
  • ANDREZA DE AGUIAR HUGO
  • DECISION SUPPORT MODEL FOR EVALUATION OF REVERSE LOGISTICS AND HEALTHCARE WASTE MANAGEMENT IN HOSPITALS

  • Advisor : RENATO DA SILVA LIMA
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • RENATO DA SILVA LIMA
  • RICARDO ANDRE FIOROTTI PEIXOTO
  • Data: Feb 17, 2020


  • Show Abstract
  • Hospitals are the main generators of the infectious and non-infectious waste called Healthcare Waste (HCW). Healthcare Waste Management (HCWM) is considered a challenge for hospital administration, since HCW represents great risk to both human health and the environment. In this context, developing tools to help health centers evaluate HCWM is necessary. Some evaluation proposals can be found in the literature; nevertheless, none of them covers the operational, human, economic and environmental areas in a single study. Thus, this study aims to develop a support tool that helps identify opportunities for improving hospital waste management. A Hospital Healthcare Waste Management Index (IGeReS) was developed, based on indicators from this area. These indicators were organized into three dimensions: Operational, Human Resources and Environmental/Economic Management. Multi-Criteria Decision Analysis was used to assign the degree of importance of the indicators. The index was applied to hospitals in three distinct regions of Minas Gerais: the south of the state, the Belo Horizonte metropolitan region and the Jequitinhonha Valley. The application proved very satisfactory, since it was possible to quantify the reality of each hospital investigated, helping to understand how efficient the HCWM of each establishment is. In general, it was observed that, even though most of the evaluations were considered adequate, hospitals find it more difficult not only to properly train their staff, making them aware of the importance of HCWM, but also to encourage more sustainable practices, especially regarding the avoidance of waste generation. Thus, since the HCWM assessment tool developed in this research was effective in portraying the situation of the hospitals, the application of IGeReS can be extended to any Brazilian hospital. Thereby, it would be possible to analyze whether the difficulties

3
  • SIMONE CARNEIRO STREITENBERGER
  • MULTIOBJECTIVE STOCHASTIC OPTIMIZATION OF MANUFACTURING AND RECOVERY PROCESSES: AN
    APPROACH TO QUALITY AND SUSTAINABILITY

  • Advisor : ANDERSON PAULO DE PAIVA
  • COMMITTEE MEMBERS :
  • ANDERSON PAULO DE PAIVA
  • JOSE HENRIQUE DE FREITAS GOMES
  • ROBERTO DA COSTA QUININO
  • Data: Feb 17, 2020


  • Show Abstract
  • The search for improvements in the quality of industrial products is a constant concern in manufacturing and recovery processes. Recovery, recycling, and reuse are recovery processes that are also in the spotlight because of environmental and sustainability issues. These processes generally involve a number of input variables that can be adjusted in order to optimize relevant responses, producing considerable benefits. Nevertheless, the complexity of contemplating distinct output variables with different but simultaneous goals means that this kind of problem is frequently neglected. Applying analysis, modeling, and optimization tools in an adequate way may produce interesting results. This study proposes a two-phase optimization method based on the use of factor analysis, the Normal Boundary Intersection method, and stochastic programming. The first phase focuses on the process quality characteristics and the second on its sustainability. To exemplify this approach, a real application was developed in a cladding process of ABNT 1020 carbon steel plate using austenitic ABNT 316L stainless steel cored wire. The first stage of the method focused on optimizing the geometric characteristics of the weld bead in order to improve the quality of the final product. In the second stage, which focuses on a sustainability aspect, a multiobjective stochastic problem was solved, aiming at minimizing material waste (scrap and rework) jointly with energy consumption. It was possible to conclude that the method may provide consistent results when dealing with a large number of responses, and that it also allows embodying relevant external information, such as the variation in electricity cost, producing important data to support decision making.

4
  • LAILA ALVES DA SILVA
  • Nested Design with mixed effects: a quantitative approach applied to optimal tool selection of AISI H13 steel turning.

  • Advisor : RAFAEL CORADI LEME
  • COMMITTEE MEMBERS :
  • ANDERSON PAULO DE PAIVA
  • PAULO HENRIQUE DA SILVA CAMPOS
  • RAFAEL CORADI LEME
  • ROGÉRIO SANTANA PERUCHI
  • Data: Mar 2, 2020


  • Show Abstract
  • This dissertation presents the development and application of Nested Design with mixed effects for optimization problems with multiple correlated responses. The proposal combines the techniques of response surface methodology, factor analysis, the normal boundary intersection method, and hierarchical clustering. With this new approach, this work presents a way of analyzing and structuring the data and information, as well as ensuring the accuracy of the mathematical calculations of the analysis of variance and the parameter estimates of the different arrangement models and variable levels to be tested. To demonstrate the applicability of the proposed quantitative approach, the turning process of hardened AISI H13 steel was considered, machined with the wiper tools CC 6050W, CC 650W, and PCBN 7025W and the conventional tools CC 6050, CC 650, and PCBN 7025. Three input variables (x) and seven responses (Y) were considered: total cost, tool life, machining force, average surface roughness, total roughness, process noise ratio, and specific cutting energy. The original data were adequate for the application of factor analysis. The quadratic models for the two factors, productivity/sustainability and quality, presented excellent values for the adjustment coefficients. The NBI method generated a Pareto frontier with equidistant solutions, allowing a good exploration of the feasible region of the objective functions. This enabled the application of the Nested Design, considering the nested factors A, B, and C as, respectively, tool geometry, tool class, and the optimal solutions in clusters. Nested and crossed arrangements were compared for the proposed problem, and it was shown statistically that better results are obtained when the correct nested or crossed data structure is respected, avoiding a biased perspective on information and results.
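Since the NBI method is central here, a toy two-objective sketch may help fix ideas; the quadratic objectives and the evenly spaced weights are illustrative assumptions, not the machining models of the study:

```python
# Two-objective NBI sketch: each subproblem pushes from a point on the
# line between the anchors along the quasi-normal direction until it
# reaches the Pareto frontier.
import numpy as np
from scipy.optimize import minimize

def f1(x): return (x[0] - 1) ** 2 + x[1] ** 2
def f2(x): return x[0] ** 2 + (x[1] - 1) ** 2
def F(x):  return np.array([f1(x), f2(x)])

x1 = minimize(f1, [0, 0]).x                      # anchor of f1
x2 = minimize(f2, [0, 0]).x                      # anchor of f2
U = np.array([f1(x1), f2(x2)])                   # utopia point
Phi = np.column_stack([F(x1) - U, F(x2) - U])    # (shifted) payoff matrix
n_hat = -Phi @ np.ones(2)                        # quasi-normal direction

frontier = []
for w in np.linspace(0, 1, 11):                  # evenly spaced weights
    beta = np.array([w, 1 - w])
    # variables z = [x1, x2, t]; maximize t == minimize -t
    cons = {"type": "eq",
            "fun": lambda z, b=beta: Phi @ b + z[2] * n_hat - (F(z[:2]) - U)}
    sol = minimize(lambda z: -z[2], [0.5, 0.5, 0.0],
                   constraints=[cons], method="SLSQP")
    frontier.append(F(sol.x[:2]))
print(np.round(frontier, 3))                     # equispaced Pareto points
```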

5
  • MARINA FERNANDES BRANCO PITANGA LOPES
  • Comparative analysis between the performance of nonlinear regression and artificial neural networks methods through the design of experiments

  • Advisor : PEDRO PAULO BALESTRASSI
  • COMMITTEE MEMBERS :
  • GABRIELA BELINATO
  • JULIANA HELENA DAROZ GAUDENCIO
  • PEDRO PAULO BALESTRASSI
  • Data: Apr 30, 2020


  • Show Abstract
  • Data modeling is a technique that assists decision making and the resolution of several types of problems in real systems. However, choosing the best technique for modeling a real system is not always an easy task, as each system has its specific characteristics, and it is not possible to infer that the best method for a given situation will be the best in other contexts. Given this need for generalization, the present work provides a comparative analysis of the performance of nonlinear regression (NLR) and artificial neural network (ANN) methods in modeling data sets generated through a design of experiments (DOE), in order to simulate different scenarios in which the methods may be applied. The data sets were modeled by each of the methods, and their responses were assessed using the coefficient of determination (R²). The results showed that the studied methods present statistically significant differences at a significance level of 5%. In addition, this research concluded that, if the experimenters know a priori the nonlinear model that defines the relationship between the input variables and the response variable, nonlinear regression outperforms artificial neural networks in terms of R².
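A minimal sketch of such a comparison, assuming the experimenter knows the true nonlinear form (the exponential model, data, and network size are illustrative, not the dissertation's DOE scenarios):

```python
# Fit the same synthetic data with nonlinear regression (known model
# form) and with a small neural network, then compare R^2.
import numpy as np
from scipy.optimize import curve_fit
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
x = np.linspace(0, 4, 80)
y = 2.5 * np.exp(-1.3 * x) + rng.normal(0, 0.05, x.size)

def model(x, a, b):                       # nonlinear form known a priori
    return a * np.exp(b * x)
(a, b), _ = curve_fit(model, x, y, p0=(1.0, -1.0))
r2_nlr = r2_score(y, model(x, a, b))

ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                   random_state=0).fit(x.reshape(-1, 1), y)
r2_ann = r2_score(y, ann.predict(x.reshape(-1, 1)))

print(f"R2 NLR = {r2_nlr:.4f}, R2 ANN = {r2_ann:.4f}")
```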


6
  • FLAVIO CIRINO GASPAR JUNIOR
  • A methodology for applying the shop floor management method for sustaining lean manufacturing tools and philosophies: a study of an automotive company in Brazil

  • Advisor : FABIANO LEAL
  • COMMITTEE MEMBERS :
  • FABIANO LEAL
  • JOSE ANTONIO DE QUEIROZ
  • MARCELO MACHADO FERNANDES
  • Data: May 20, 2020


  • Show Abstract
  • Many companies from various industries, especially manufacturing, have become highly specialized at applying the Toyota Production System (TPS), thereby developing and optimizing their processes in order to reduce costs and increase market competitiveness. Direct management is needed to develop a culture of continuous improvement among both processes and personnel, so that methods may be continually improved, especially on the shop floor, as new opportunities emerge. One structured way to do this is via the Shop Floor Management (SFM) method. The objective of this work is to analyze the applicability of the SFM implementation model presented by Hanenkamp (2013). This model was outlined, detailed, and applied in practice. Opportunities for improvement identified during the application of the model led to changes in its steps. These improvements were applied following Action Research (AR) methodology procedures, conducted in two cycles. In addition, the influence of the SFM method on the sustainability of lean manufacturing tools on the shop floor was also evaluated. The object of study in this research was an automobile company located in southeastern Brazil. After conducting the Action Research, we have shown that the SFM model, as proposed in this work, can indeed assist managers in applying and maintaining lean manufacturing practices on the shop floor. Within this context, this study contributes by helping to fill the gap between practical applications and the sustainability of lean manufacturing concepts and solutions.

7
  • JOÃO VICTOR RIBEIRO SANTOS
  • Social Skills and the Performance of the Quality Management System based on ISO 9001

  • Advisor : CARLOS EDUARDO SANCHES DA SILVA
  • COMMITTEE MEMBERS :
  • CARLOS EDUARDO SANCHES DA SILVA
  • JULIANA HELENA DAROZ GAUDENCIO
  • LAERCIO ANTÔNIO GONÇALVES JACOVINE
  • Data: May 26, 2020


  • Show Abstract
  • Quality is recognized as an essential component of competitiveness and is currently seen as a necessity for organizations facing the demands and dynamics of the market. The recognition that it is an effective strategic factor has forced companies to adopt strategies to manage it, one such strategy being the implementation of a Quality Management System (QMS). Despite its strategic value, research on Quality Management generally focuses its efforts on technical issues, leaving aside the human factor. The same occurs in the scientific literature on the subject, which, despite recognizing the importance of human resources, is dominated by work involving technical tools. In order to fill the knowledge gap on how the human factor interferes with quality results in organizations, this research identified Social Skills (SS) as a little-explored field with high potential to generate gains, both in knowledge for academia and in performance for organizations. Thus, the objective of the research is to verify the relationship between Social Skills and the performance of the Quality Management System based on the ISO 9001 standard. For this purpose, a multiple case study was carried out with 34 employees from 3 small companies with QMS certified by ISO 9001 that produce high-value-added goods. Data were collected on the organizations, on the SS of each employee (using the Inventário de Habilidades Sociais 2 Del Prette), and on the performance of each employee in the QMS. Subsequently, the data were analyzed by statistical techniques in order to determine whether there was a relationship between the two types of variables collected. As a result, the ability to express positive feelings was found to be correlated with the ability to generate motivation and satisfaction, with effective communication, and with a lower number of conflicts generated. It was found that quality leaders in the companies have similar behavior, with high scores in SS and in performance. A possible dependency relationship between the relevance of SS and the maturity of the QMS was also identified. Thus, for organizations similar to the objects of study that need to improve communication, motivation, and satisfaction and to reduce the number of conflicts, it is suggested to develop the SS of expressing positive feelings through Social Skills Training. Other companies are recommended to assess their maturity level, perform statistical tests to determine the impact of skills on the performance of the QMS, and subsequently invest in tools to improve the relevant SS repertoires in order to obtain performance gains.

8
  • ANDREZZA DE FATIMA LEAL MACHADO
  • Contribution of competence management in the strategies of graduate programs

  • Advisor : CARLOS EDUARDO SANCHES DA SILVA
  • COMMITTEE MEMBERS :
  • BRUNO HENRIQUES ROCHA FERNANDES
  • CARLOS EDUARDO SANCHES DA SILVA
  • RENATO DA SILVA LIMA
  • Data: Jul 9, 2020


  • Show Abstract
  • Competence Management (CM) is understood as an important resource in the strategic management process. It makes it possible to generate competitive advantage for organizations, especially those in which intelligence is a preponderant factor. In Brazil, universities are mainly responsible for research activities, which take place within graduate programs in particular. This research has the general objective of proposing the use of competence management in the strategic planning of graduate programs. First, a bibliometric analysis was carried out, followed by a literature review on the themes of competences, their management, and their applications in the public sector, in universities, and in graduate programs in Federal Institutions of Higher Education. The research method used was the case study. Based on the models identified in the literature, this investigation uses a competence management model adapted to graduate programs, consisting of five stages: A) define the purpose of the graduate program; B) diagnose the competences necessary for the position/function of professors and their work processes, and diagnose the competences already existing in the faculty; C) formulate competence development strategies; D) monitor and evaluate competences; E) analyze the cost-benefit of development actions and ways of rewarding professors. Phases A and B were performed in the Graduate Program in Industrial Engineering at UNIFEI. The results of the first stage identified the need to formalize strategic elements (mission, vision, values, and objectives). In the second stage, mainly secondary data were collected, considering that the program's lines of research are the categories of the organizational competences; the information obtained made it possible to identify the categories and subcategories of competences in the program through publications of papers, completed master's and doctoral supervisions, research projects with funding, and disciplines taught. In Phases C, D, and E, strategy propositions were identified through the analysis of actions carried out by graduate programs in the area with concept 7. Finally, the main difficulties identified for CM implementation were, in order: overwork and lack of staff in the graduate program administrative team (coordinator and secretary); bureaucratic management (structure and processes of the institution); limitations of current legislation and legal restrictions on implementing the model; and the mapping of competences itself. As for the prioritized benefits expected from the implementation: improvement in management and organizational processes; provision of technical subsidies for personnel planning, in particular for hiring and for professor accreditation and disaccreditation processes; clearer communication of what is expected from the program's professors; contribution to the alignment between the people management subsystems and the strategy of the graduate program; incentive to continuing education (post-doctorate); and improved service to society through the training of human resources, the development of research, and the socialization of results through scientific publications and intellectual property.

9
  • TANITA CAROLINE PIRES MACIEL
  • A framework proposal to carry out discrete event simulation projects

  • Advisor : JOSE ARNALDO BARRA MONTEVECHI
  • COMMITTEE MEMBERS :
  • FABIANO LEAL
  • JOAO JOSE DE ASSIS RANGEL
  • JOSE ARNALDO BARRA MONTEVECHI
  • JOSE HENRIQUE DE FREITAS GOMES
  • Data: Jul 9, 2020


  • Show Abstract
  • Several frameworks can be found in the literature whose objective is to help modelers through a step-by-step sequence of activities to be followed in the development of simulation projects. Some studies show that a simulation project is iterative, making it possible to return to several stages to add aspects that had not been perceived before, because simulation is a process of obtaining knowledge over time. Therefore, it is important for frameworks to represent the iterations that normally occur in a simulation project, in order to make it more understandable for modelers, especially those with little experience in the area. Despite the research found on the subject, few studies compare and integrate the characteristics of the frameworks. Therefore, we analyzed the main frameworks and performed a bibliometric analysis of how discrete event simulation is being used in the Brazilian scenario, finding that the framework proposed by Montevechi et al. (2010) is the most complete and detailed, in addition to being the most used in Brazil in the last five years. This framework was therefore chosen to be revised, through its application in a simulation project, in order to test and verify steps that could be added and the iterations that could occur. In addition, unstructured interviews were conducted with specialists from simulation consulting companies, who reported on how a simulation project is carried out and how iterations normally happen. It was thus possible to modify the framework of Montevechi et al. (2010) and propose a new framework for conducting discrete event simulation projects, representing the iterations between stages. Subsequently, a questionnaire was sent to simulation specialists, in both academic and professional environments, to evaluate the proposed framework. Based on the questionnaire's answers, some changes were made to make the framework visually cleaner, so that it can be used effectively by students and professionals, especially those with little experience.

10
  • ALEXANDRE LABEGALINI
  • FREIGHT TRIP GENERATION MODELS AND DEMAND FOR PARKING SPACES FOR URBAN FREIGHT TRANSPORTATION IN MEDIUM-SIZED CITIES: THE CASE OF ITAJUBÁ-MG

  • Advisor : RENATO DA SILVA LIMA
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • RENATO DA SILVA LIMA
  • RODRIGO DE ALVARENGA ROSA
  • WILFREDO YUSHIMITO DEL VALE
  • Data: Aug 3, 2020


  • Show Abstract
  • The relevance of urban freight transportation has been growing due to the increase in the population living in urbanized regions. This fact increases the number of vehicles in urban centers and also the demand for goods, which generates negative externalities such as congestion, air pollution, and a lack of parking spaces, among others. These problems are present not only in large cities but also in small and medium-sized ones. Despite this, urban freight transportation is a fundamental factor for urban development, influencing both urban mobility and the interactions between society, the environment, and commerce. However, freight transportation is often not considered in the planning and regulation of urban transportation, where the main focus is on passenger vehicles. These gaps motivated the emergence of City Logistics, a concept that aims to reduce the disorders caused by the transportation of goods while promoting cooperation between the parties involved in this sector. Regarding this scope in medium-sized cities, the literature is still scarce in both the national and international scenarios. Thus, the objective of this study is to develop and use freight trip generation models to determine the number of freight trips attracted by commercial establishments in medium-sized cities, with the city of Itajubá-MG as the object of study, quantifying the supply of and demand for loading/unloading parking spaces in this region. First, a questionnaire was administered in person at 200 retail establishments to collect statistically reliable data. With these data, freight trip generation models were developed, estimating that the studied region receives around 539 daily trips and showing that the 19 existing loading/unloading parking spaces correspond to only 34.5% of the 55 parking spaces needed to meet the estimated daily trips. Therefore, it was proposed to implement 36 new loading/unloading parking spaces located in the areas of greatest demand for goods deliveries. It was also found that the greater the distance between loading/unloading parking spaces and areas with a higher concentration of deliveries, the longer freight vehicles remain parked in these spaces. This increase may reach 75.42% for small vehicles (Pickup/Van/Truck), 30.17% for medium vehicles (VUC), and 18.86% for large vehicles (Truck). In addition, the simulation of some scenarios showed how three City Logistics measures can optimize the use of these parking spaces. Compared to the current scenario, freight consolidation (CCU), off-hour delivery (OHD), and staggered delivery (ITS) would reduce the demand for parking spaces per hour by 65%, 20%, and 8%, respectively. The freight trip generation models developed proved effective in quantifying the trip attraction potential of the establishments and can also assist decision making aimed at solving urban freight transportation problems. A need for greater integration between local retailers and public authorities was also observed, for a better functioning of local goods transportation. It is also suggested that local authorities increase the inspection of the use of public loading/unloading parking spaces. Finally, the methodology adopted in this work can be used in other medium-sized cities with characteristics similar to those of Itajubá. However, when implementing City Logistics initiatives, attention should be paid to local particularities, adjusting these initiatives according to the reality of each city studied.
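A minimal sketch of the trip-generation idea (synthetic survey data and assumed dwell-time parameters, not the Itajubá sample):

```python
# Regress weekly deliveries on establishment size, then convert the
# fitted trips into an aggregate loading/unloading space estimate.
import numpy as np

rng = np.random.default_rng(3)
employees = rng.integers(1, 30, 200)                    # surveyed establishments
trips = 2 + 0.8 * employees + rng.normal(0, 2, 200)     # deliveries/week (synthetic)

b1, b0 = np.polyfit(employees, trips, 1)                # fitted trip model
daily_trips = (b0 + b1 * employees).sum() / 6           # 6 working days (assumed)
dwell_h, window_h = 0.5, 10                             # assumed dwell time, delivery window
spaces = daily_trips * dwell_h / window_h               # space-hours / available hours
print(f"~{daily_trips:.0f} trips/day -> ~{np.ceil(spaces):.0f} spaces needed")
```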

11
  • RÓBSON RAIMUNDO CARDOSO RIBEIRO
  • ANALYSIS AND OPTIMIZATION OF THE RESISTANCE SPOT WELDING PROCESS OF 22MNB5-GALVANNEALED STEEL

  • Advisor : JOSE HENRIQUE DE FREITAS GOMES
  • COMMITTEE MEMBERS :
  • GILMAR FERREIRA BATALHA
  • JOAO ROBERTO FERREIRA
  • JOSE HENRIQUE DE FREITAS GOMES
  • SEBASTIAO CARLOS DA COSTA
  • Data: Aug 21, 2020


  • Show Abstract
  • The main goals of the global automotive industry are to combine energy efficiency, reduced greenhouse gas emissions, high passive safety, and durability. To achieve these goals, the automotive industry invests in the development of new materials that can replace conventional steels. Advanced High Strength Steels (AHSS) are promising alternatives; the boron steel 22MnB5-galvannealed is the most extensively used steel grade in the hot stamping process to produce car anti-collision structural parts. This alloy achieves a fully martensitic microstructure and a tensile strength over 1500 MPa after hot stamping, in addition to having an intermetallic layer composed of iron and zinc, which furthers cathodic protection and protects the steel substrate against weathering during the hot stamping process. Furthermore, Resistance Spot Welding (RSW) is a broadly used and important welding process in automotive body construction. However, 22MnB5-galvannealed steels pose a great challenge to joining methods, since their microstructure and mechanical properties differ from those of milder steel grades. The aim of this research was to optimize the resistance spot welding parameters applied to 22MnB5-galvannealed steel. The initial goal was the removal of the galvannealed coating; subsequently, the objectives were to maximize the nugget width, the nugget cross-sectional area, the penetration, the strength, the joint efficiency, and the energy absorption. The process parameters selected were the effective welding time, the effective welding current, the quenching time, and the upslope time. The Response Surface Methodology was used to design the experiments and to collect and analyze the data, and the Global Criterion Method Based on Principal Components (MCGCP) was used to optimize the process. The optimization showed that the values obtained by the MCGCP approached the established targets, emphasizing the importance of using a method that considers the correlation structure when the multiple responses are correlated.

Thesis
1
  • JULIO CESAR MOSQUERA GUTIERRES
  • PERSPECTIVES OF THE USE OF DEA IN THE REGULATION OF ELECTRICITY DISTRIBUTORS IN BRAZIL

  • Advisor : RAFAEL CORADI LEME
  • COMMITTEE MEMBERS :
  • MARCELO AZEVEDO COSTA
  • MARIA CONCEIÇÃO ANDRADE E SILVA
  • PEDRO PAULO BALESTRASSI
  • RAFAEL CORADI LEME
  • VICTOR EDUARDO DE MELLO VALERIO
  • Data: Mar 3, 2020


  • Show Abstract
  • Generally, utilities are considered natural monopolies under economic regulation, since a single firm has the ability to provide one or more services within a limited territory, following rules previously established by regulators. For these types of companies, performance measurement is one of the key tasks, as it helps organizations set appropriate goals for the future. One of the strongest ways to determine an organization's performance is by measuring efficiency and productivity. For these companies, the importance of assessing efficiency and productivity is generally tied to three reasons. First, only by measuring efficiency and productivity, and separating their effects from those of the operating environment to level the playing field, can one explore hypotheses about sources of efficiency or productivity differentials; the identification and separation of controllable and uncontrollable sources of performance variation is essential for establishing private practices and public policies designed to improve the performance of industries. Second, macro performance depends on micro performance, so the same thinking applies to studying the growth of nations through performance evaluation. Third, measures of efficiency and productivity are indicators of success, indicators of performance by which producers are evaluated. In this sense, this thesis derives a new model that allows utilities to decompose efficiency among their subsidiaries, respecting all parameters specified by the Regulator in different periods of time. In addition, an approach different from the one used in the Brazilian electricity distribution sector for measuring productivity changes is applied, revealing information that the approach used in Brazil does not offer. With the proposed analyses, managers gain a tool that allows them to better understand the performance of each part of their company and thus manage their utilities in a better way. Both the proposed model for decomposing efficiency between subunits and the approach for measuring productivity changes over time are applied to electricity distribution companies in Brazil, since this is a very important sector that is constantly seeking improvements.
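As background on the DEA machinery underlying the thesis, a minimal input-oriented CCR efficiency computation can be sketched as follows (illustrative data, not the Brazilian distributors' inputs and outputs):

```python
# Input-oriented CCR DEA: the efficiency of each DMU is the optimal
# theta of a small linear program solved over the peer weights lambda.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0, 5.0],    # inputs  (m x n): rows = inputs
              [3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 1.0, 2.0, 2.0]])   # outputs (s x n): rows = outputs
m, n = X.shape
s = Y.shape[0]

for o in range(n):                      # evaluate each DMU in turn
    # decision vector v = [theta, lambda_1..lambda_n]; minimize theta
    c = np.r_[1.0, np.zeros(n)]
    A_in = np.hstack([-X[:, [o]], X])   # sum_j lam_j x_ij - theta x_io <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # -sum_j lam_j y_rj <= -y_ro
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, o]],
                  bounds=[(None, None)] + [(0, None)] * n,
                  method="highs")
    print(f"DMU {o + 1}: efficiency = {res.x[0]:.3f}")
```

A DMU scoring theta = 1 lies on the efficient frontier; theta < 1 is the proportional input contraction its peers suggest is attainable.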

2
  • WASHINGTON LUIS MOREIRA BRAGA
  • ROBUST PARAMETERS DESIGN FOR MODELING AND OPTIMIZATION OF KANBAN SYSTEMS: AN APPROACH BASED ON SIMULATED EXPERIMENTS

  • Advisor : JOSE HENRIQUE DE FREITAS GOMES
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • CARLOS EDUARDO SANCHES DA SILVA
  • FRANCISCO RODRIGUES LIMA JUNIOR
  • JOSE HENRIQUE DE FREITAS GOMES
  • MESSIAS BORGES SILVA
  • Data: Apr 23, 2020


  • Show Abstract
  • Inventory control impacts the flow of the production process, customer service, and cost. To improve this management, many companies use the kanban concept, whose purpose is to program and control production visually. However, few studies in the literature use a systematic method for its optimization, and it is also common to observe, in industrial environments, the use of Toyota's traditional formula, which assumes level demand. Therefore, this work was developed with the objective of proposing an approach to optimize kanban systems using Robust Parameter Design through simulated experiments. The strategy is based on the calculation of the safety stock to determine the experimental design, seeking to achieve the desired service level. The simulation is then executed with the experimental design data in the simulated model, considering the daily random variation of demand over 30 days, with the quantity of kanbans not delivered and the number of kanbans in stock recorded at the end. From these data, mathematical models that take into account the mean and the variance are constructed, so that the robust sizing of the supermarket can be obtained through the Mean Squared Error (MSE). To demonstrate the applicability of the proposed method, the procedure was tested on two different cases from the literature, Tubino (2007) and Hurrion (1997). The Response Surface Methodology (RSM) was used to model the optimization functions, which were mathematically programmed using the MSE. The weighted approach (MSEP) was also applied as an additional technique to achieve better results with respect to robust optimization. The proposed approach was developed and applied satisfactorily in both cases from the literature, leading to optimal results with respect to the objectives of robust optimization. It was also possible to compare the proposed approach with the results achieved by OptQuest®, the optimizer of the Arena® simulation software, demonstrating that the method proposed in this work obtained better results with regard to robustness.
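The robust (MSE) step can be pictured with a one-variable sketch; the mean and variance models below stand in for the RSM fits to the simulated experiments and are purely hypothetical:

```python
# Robust sizing via MSE = (bias to target)^2 + variance, minimized
# over the supermarket level x.
from scipy.optimize import minimize_scalar

TARGET = 0.0                                   # e.g. zero undelivered kanbans
def mu(x):  return 25.0 - 4.0 * x + 0.2 * x ** 2   # fitted mean model (assumed)
def var(x): return 1.0 + 0.05 * (x - 8) ** 2       # fitted variance model (assumed)
def mse(x): return (mu(x) - TARGET) ** 2 + var(x)

res = minimize_scalar(mse, bounds=(1, 20), method="bounded")
print(f"robust kanban level ~ {res.x:.1f}, MSE = {res.fun:.2f}")
```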

3
  • ANA PAULA SIQUEIRA SILVA DE ALMEIDA
  • Incorporation of Usability Engineering Methods in the Development of Medical Devices with a Focus on Certification

  • Advisor : CARLOS HENRIQUE PEREIRA MELLO
  • COMMITTEE MEMBERS :
  • CARLOS EDUARDO SANCHES DA SILVA
  • CARLOS HENRIQUE PEREIRA MELLO
  • DANIEL CAPALDO AMARAL
  • JANAINA MASCARENHAS HORNOS DA COSTA
  • MARCELO GITIRANA GOMES FERREIRA
  • RODRIGO DUARTE SEABRA
  • Data: May 20, 2020


  • Show Abstract
  • Regulatory agencies concerned with controlling the new Medical Devices (MD) that go on the market have started to require that manufacturers follow standards. Among these, the IEC 62366 standard establishes a Usability Engineering (UE) process. Within the scientific literature, authors present ways of including the user in product development and point out practical difficulties in the selection and execution of different methods, in understanding the requirements, and in inserting the UE process into existing processes. These difficulties can be aggravated by deficiencies in capacity and by cultural aspects, such as treating the topic only as documentary compliance. This thesis presents a model of UE methods for the development of MD, with a primary focus on meeting regulatory requirements. To assess the suitability of the proposal, case studies were planned in four different companies: first, the model was applied by outsourced specialists in two companies, and then in another two companies by the internal company teams. From the first application of the model, it was possible to conclude that the results contributed to the certification of products in the studied companies. For the application of the model in the second verification stage, the companies' teams went through training in the methods and in the structuring of the model. In both companies, a UE process was established for a product under development, with the creation of a documentary procedure, a plan, and auxiliary documents. Some of the proposed methods were successfully implemented by the teams. It was possible to conclude that the model proved to be a good option for MD companies to implement the UE process according to IEC 62366, even if they do not have specialists or professionals with experience in this area. It was also observed that, when a company implements the model and applies the proposed methods, the results can generate not only regulatory compliance but also a cultural transformation. After seeing the results, the companies that applied the model based on internal training mentioned that they could do more than what was proposed in the model, going beyond compliance with requirements and impacting general use and innovation.

4
  • FERNANDA ROCHA
  • Simulation and Virtual Reality: an experimental research in industrial engineering

  • Advisor : JOSE ARNALDO BARRA MONTEVECHI
  • COMMITTEE MEMBERS :
  • ANTONIO AUGUSTO CHAVES
  • CARLOS HENRIQUE PEREIRA MELLO
  • FERNANDO AUGUSTO SILVA MARINS
  • JOSE ARNALDO BARRA MONTEVECHI
  • RAFAEL DE CARVALHO MIRANDA
  • Data: Jul 10, 2020


  • Show Abstract
  • Improving education and learning requires in-depth study, especially when it comes to the learning of undergraduate Production Engineering students. The use of active teaching methodologies is the main interest of this work, which aims to evaluate the results of Simulation-Based Learning (SBL) on the transfer of learning. In this thesis, the use of Discrete Event Simulation and Virtual Reality was investigated to improve the teaching of chronoanalysis in Production Engineering course disciplines. Discrete Event Simulation, combined with Virtual Reality, was used to reinforce learning by replacing a real production line with virtual environments, thus connecting theory to practice. The research method used was Experimental Research, with a control group and an experimental group of students. To verify the efficiency of Simulation in conjunction with Virtual Reality, a comparative analysis was made between the results of teaching using interactive and immersive SBL and those of the traditional teaching method. The results revealed that learning with Simulation in conjunction with Virtual Reality can improve the overall quality of learning and increase student understanding, in addition to increasing students' confidence. The results also showed that the students had a positive perception of the proposed method. The study's findings support Discrete Event Simulation together with Virtual Reality as having the potential to strengthen the development of undergraduate students in Production Engineering, preparing them to meet the industry's demand for better-prepared engineers.

5
  • VINICIUS DE CARVALHO PAES
  • COMPUTATIONAL MODELING APPLIED TO EXTRACTION OF IRREGULAR GEOMETRIC CHARACTERISTICS IN MULTIOBJECTIVE PROCESSES

  • Advisor : PEDRO PAULO BALESTRASSI
  • COMMITTEE MEMBERS :
  • ANDERSON PAULO DE PAIVA
  • ANTONIO FERNANDO BRANCO COSTA
  • JOSE LEONARDO NORONHA
  • PAULO ROTELLA JUNIOR
  • PEDRO PAULO BALESTRASSI
  • WESLEY VIEIRA DA SILVA
  • Data: Jul 14, 2020


  • Show Abstract
  • Modern production systems require monitoring processes for quality assurance and production compliance. Computer vision is a strong ally in the process of object identification and feature extraction, being used in several fields of application. Extracting geometric features from irregularly shaped objects is a challenge, especially due to the complexity of the measurement system. Manual measurement methods may not guarantee the required accuracy; they also demand an experienced operator and time, and they directly impact the final cost of the product. Thus, the objective of this research is the creation of a computational model for the digital extraction of geometric characteristics of objects with irregular geometry, using computer vision. The gauge block analysis confirmed that the measurement system based on the computational model is satisfactory and has better accuracy than the manual measurement method aided by an image analyzer. The analysis of the experiments defined as the object of study also confirmed the effectiveness of the measurement system, and it was possible to estimate the bias of the manual operator. As a result, measurements of the object of study were successfully extracted and made available in a systematic way, in record time. The method proved very effective, opening the possibility for several future works in the area.
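The kind of measurement pipeline described can be sketched with OpenCV; the image file, threshold, and pixel-to-millimeter scale below are hypothetical:

```python
# Extract area, perimeter and the oriented bounding box of an
# irregular part from a binarized photograph (cv2 = opencv-python).
import cv2

img = cv2.imread("part.png", cv2.IMREAD_GRAYSCALE)     # hypothetical image
_, binary = cv2.threshold(img, 127, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
part = max(contours, key=cv2.contourArea)              # largest blob = the part
area_px = cv2.contourArea(part)
perimeter_px = cv2.arcLength(part, True)
(cx, cy), (w, h), angle = cv2.minAreaRect(part)        # oriented bounding box
scale = 0.05                                           # mm per pixel (from calibration)
print(f"area = {area_px * scale**2:.2f} mm^2, "
      f"perimeter = {perimeter_px * scale:.2f} mm, "
      f"box = {w * scale:.2f} x {h * scale:.2f} mm")
```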

6
  • LUCAS GUEDES DE OLIVEIRA
  • Prediction Capability of Pareto Optimal Solutions

  • Advisor : ANDERSON PAULO DE PAIVA
  • COMMITTEE MEMBERS :
  • ANDERSON PAULO DE PAIVA
  • ANTONIO FERNANDO BRANCO COSTA
  • PEDRO PAULO BALESTRASSI
  • ROBERTO DA COSTA QUININO
  • ROGÉRIO SANTANA PERUCHI
  • Data: Aug 7, 2020


  • Show Abstract
  • Response Surface Methodology is an effective framework for modeling and optimizing industrial processes. The Central Composite Design is the most popular experimental design for response surface analyses, given its good statistical properties, such as decreasing prediction variance in the design center, where the stationary points of the regression models are expected to be found. However, the common practice of reducing center points in response surface studies may damage this property. Moreover, stationary and optimum points are rarely the same in manufacturing processes, for several reasons, such as saddle-shaped models, convexity incompatible with the optimization direction, conflicting responses, and distinct convexities. This means that, even when the number of center points is appropriate, the optimal solutions will lie in regions with larger prediction variance. Considering this, we advocate that the prediction variance should also be taken into account in multiobjective optimization problems. To do so, we propose a multi-criteria optimization strategy based on capability ratios, wherein (1) the prediction variance is taken as the natural variability of the model and (2) the differences between expected values and nadir solutions are taken as the allowed variability. Factor Analysis with rotated scores is adopted for grouping correlated variables. The Normal Boundary Intersection method is formulated to optimize the capability ratios and obtain the Pareto frontiers. To illustrate the feasibility of the proposed approach, two case studies are presented: (1) the turning of AISI H13 steel with a wiper CC650 tool and (2) the end milling of UNS S32205 duplex stainless steel, both processes without cutting fluids. The results showed that the proposed approach was able to find a set of optimal solutions with satisfactory prediction capabilities for all responses of interest. In the first case, this occurred even with a reduced number of center points, a saddle-shaped function, and a convex function with conflicting objectives. In the second case, similar results were observed for six correlated responses, with conflicting objectives and rotated factors modeled by saddles.
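One plausible reading of the capability ratio described above, under ordinary-least-squares assumptions (the data, model, candidate point, and nadir value below are all illustrative):

```python
# Prediction variance at a candidate solution x0 for an OLS fit is
# s2 * x0' (X'X)^-1 x0; the ratio divides the distance to the nadir
# by three prediction sigmas, in the spirit of a capability index.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 30)
y = 2 + 3 * x - 1.5 * x ** 2 + rng.normal(0, 0.2, 30)

X = np.column_stack([np.ones_like(x), x, x ** 2])      # quadratic model matrix
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
s2 = resid @ resid / (len(y) - X.shape[1])             # residual variance
XtX_inv = np.linalg.inv(X.T @ X)

x0 = np.array([1.0, 0.5, 0.25])                        # candidate solution [1, x, x^2]
pred = x0 @ beta
pred_sd = np.sqrt(s2 * x0 @ XtX_inv @ x0)              # prediction std of the mean
nadir = 1.0                                            # assumed worst Pareto value
print(f"capability ratio ~ {abs(pred - nadir) / (3 * pred_sd):.2f}")
```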

7
  • PAULO ROBERTO MAIA
  • Normal Boundary Intersection method by rotated scores in an elliptical solution space

  • Advisor : ANDERSON PAULO DE PAIVA
  • COMMITTEE MEMBERS :
  • ANDERSON PAULO DE PAIVA
  • JOAO ROBERTO FERREIRA
  • PEDRO PAULO BALESTRASSI
  • ROBERTO DA COSTA QUININO
  • ROGÉRIO SANTANA PERUCHI
  • Data: Oct 16, 2020


  • Show Abstract
  • Several factors affect the performance of the Normal Boundary Intersection (NBI) method when employed in the optimization of multiple response surfaces. Such factors cause discontinuities in the Pareto-optimal solution set, instability in the detection of solutions in continuous space, and the inversion of the original relationship between the various objective functions. After an extensive research period, three fundamental causes of these inconsistencies were detected: (a) the presence of correlation between objective functions, as well as its neglect; (b) the definition of anchor, utopia, and nadir points outside the confidence regions associated with the experimental data, caused by the individual optimization step; and (c) the nature of the initialization points of the gradient algorithm used to find the solutions of the multiobjective problem for each desired weight vector. To minimize the influence of these factors, the original method was modified to allow the inclusion of elliptical constraints based on the multivariate quadratic distance, and the original objective functions were replaced with independent functions obtained from response surfaces of rotated factor scores with diversified rotation angles. To improve stability in finding viable solutions for each weight of interest, a diffuse initialization process was used, in which a matrix of viable initial solutions was tested a priori. For all tests, the quality of the obtained solutions was evaluated by calculating the Mahalanobis distance of the various solution vectors. This proposal was tested with two- and multi-dimensional arrangements associated with the hardened AISI H13 steel turning process, and the results were extremely satisfactory. During simulations with the experimental models obtained, it was identified that the emergence of correlation structures between stochastic objective functions is due, in large part, to the relationship between the ranges attributed to the decision variables within the experimental arrangements. Such simulations emphasize that natural correlations can become stronger depending on the ranges chosen for the input variable levels.
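The solution-quality check mentioned above can be sketched as follows; the solution vectors here are random placeholders, not the thesis's Pareto sets:

```python
# Mahalanobis distance of each solution vector to the centroid of the
# set, using the set's own covariance structure.
import numpy as np

sols = np.random.default_rng(2).normal(size=(50, 3))   # hypothetical solution vectors
mu = sols.mean(axis=0)
S_inv = np.linalg.inv(np.cov(sols, rowvar=False))

# d_i^2 = (x_i - mu)' S^-1 (x_i - mu), vectorized over all solutions
d = np.sqrt(np.einsum("ij,jk,ik->i", sols - mu, S_inv, sols - mu))
print("largest Mahalanobis distance:", d.max().round(2))
```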

8
  • KAROLLAY GIULIANI DE OLIVEIRA VALÉRIO
  • Systematization of general requirements for software development in industry 4.0

  • Advisor : CARLOS EDUARDO SANCHES DA SILVA
  • COMMITTEE MEMBERS :
  • MAICON GOUVÊA DE OLIVEIRA
  • CARLOS EDUARDO SANCHES DA SILVA
  • CARLOS HENRIQUE PEREIRA MELLO
  • HELDER JOSÉ CELANI DE SOUZA
  • JOSIANE PALMA LIMA
  • SANDRA MIRANDA NEVES
  • Data: Nov 21, 2020


  • Show Abstract
  • The identification and implementation of technologies and innovations, through the development of business strategies, are factors that contribute to organizational success. In this context, industry 4.0 can contribute to the development of industries that seek technological and innovative differentials. The objective of this research is to develop a systematization of general requirements for software development in industry 4.0. Because these are current topics linked to technology and innovation, they are still underdeveloped in academic studies. Through a bibliographic review, using bibliometrics and content analysis techniques, the general requirements for software development in industry 4.0 were identified, along with the challenges, risks, gaps, advantages, and trends associated with each requirement. The systematization was analyzed by academic experts, being considered theoretical with incorporations from practice. The AHP method was used to prioritize the requirements, resulting in the adjusted systematization, whose elements received the following relative scores from the specialists: analysis of opportunities, 34.98%; operational efficiency 2.0, 19.41%; optimization of the business model, 15.57%; information technology, 20.26%; IT integration, 12.29%; IT security management, 7.97%; new interfaces and data, 17.93%; data analysis and management, 6.90%; new intellectual property management, 6.69%; cloud-based applications, 4.34%; integrated and intelligent management, 10.75%; intelligent supply chain, 3.85%; lifecycle management, 3.81%; intelligent logistics, 3.09%; change and learning, 16.08%; corporate business, 9.33%; organizational learning, 6.75%. Among the criteria, the requirement with the greatest weight was the analysis of opportunities, and among the sub-criteria those with the highest weights were operational efficiency, optimization of the business model, and IT integration. Finally, an analysis by specialists from software development companies was carried out to understand how the requirements are used in these companies, leading to the final systematization. Through the AHP, the research showed the importance of the analysis of opportunities and of operational efficiency in organizations, indicating that companies are concerned with identifying business opportunities and managing processes efficiently. These results also showed the preparation stage of the systematization to be the most relevant. The interviews demonstrated a concern with cybersecurity and IT integration tools, but revealed a gap in the development of integrated and intelligent management, with emphasis on intelligent logistics.
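For readers unfamiliar with AHP, the prioritization step can be sketched as follows; the 3x3 judgment matrix is hypothetical, not the specialists' actual comparisons:

```python
# AHP priorities: the normalized principal eigenvector of a pairwise
# comparison matrix; the consistency ratio checks judgment coherence.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],         # criterion 1 vs 2 vs 3 (assumed judgments)
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                            # priority vector

n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)    # consistency index
RI = 0.58                               # Saaty's random index for n = 3
print("priorities:", w.round(3), " CR =", round(CI / RI, 3))
```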

9
  • ALEXANDRE FONSECA TORRES
  • Multivariate chance-constrained method applied in multi-objective optimization problems of manufacturing processes

  • Advisor : PEDRO PAULO BALESTRASSI
  • COMMITTEE MEMBERS :
  • RAPINDER SINGH SAWHNEY
  • ANDERSON PAULO DE PAIVA
  • ANTONIO FERNANDO BRANCO COSTA
  • MARCELO MACHADO FERNANDES
  • PEDRO PAULO BALESTRASSI
  • Data: Dec 7, 2020


  • Show Abstract
  • In multi-objective optimization problems of manufacturing processes, the responses of interest are often significantly correlated. In addition to the multivariate nature of the problems, product demands, productive capacities, cycle times, and the costs of labor, machines, and tools are just some of the many random variables involved in the optimization model. In particular, when using Design of Experiments (DOE) techniques and regression methods, the estimated coefficients of the empirical models, such as response surface models, are also stochastic. However, most of the articles published in this research area limit themselves to representing these stochastic variables deterministically. Within this context, the present study proposes the use of stochastic programming techniques combined with multivariate statistical methods, including process capability indices widely used in industry, such as the Cpk capability index and the Parts Per Million (PPM) index. The combination of these methods resulted in the proposed Multivariate Chance-Constrained Programming (MCCP). To test the applicability of the MCCP method, a multi-objective optimization problem of the AISI 52100 hardened steel turning process was selected as a case study, given its widespread use and relevance to industry today. As a starting point, a set of experimental results obtained from a central composite design was used. The decision variables were the cutting speed (Vc), the feed rate (f), and the depth of cut (ap). The responses of interest were the total machining cost per part (Kp), the material removal rate (MRR), the tool life (T), the average roughness (Ra), and the total roughness (Rt). After analyzing the data and building the mathematical models for the responses of interest, three approaches were carried out. In the first approach, the Cpk index incorporated the calculation of the variance of the response surface model of Ra. In the second approach, the probability that Kp is less than or equal to a predefined value was modeled as a stochastic objective function. Finally, the third approach described the application of the proposed MCCP method, in which the PPM index was calculated using a bivariate normal distribution for Ra and Rt. The main results of this research were: a) the demonstration and validation of an equation used to calculate the variance of a continuous, differentiable function of stochastic variables; b) the analysis of the impact of seven stochastic industrial variables (setup time, lot size, machine and labor costs, insert changing time, tool holder price, tool holder life, and insert price) on the cost of the process; c) the finding that maximizing tool life may reduce cost in some cases, for example when using wiper tools, but that changing the cutting conditions alone does not necessarily reduce the cost of the process, as occurred in the case study analyzed.
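The bivariate PPM computation described in the third approach can be sketched as follows; the means, covariance, and specification limits are illustrative assumptions, not the study's fitted values:

```python
# PPM from a bivariate normal: the nonconforming fraction is one minus
# the probability that both roughnesses meet their upper spec limits.
import numpy as np
from scipy.stats import multivariate_normal

mean = [0.30, 1.60]                          # assumed E[Ra], E[Rt] at the optimum (um)
cov = np.array([[0.0025, 0.0060],            # assumed covariance of (Ra, Rt),
                [0.0060, 0.0400]])           # correlation ~ 0.6
usl = [0.40, 2.00]                           # upper specification limits (assumed)

p_conforming = multivariate_normal(mean, cov).cdf(usl)
print(f"PPM ~ {(1 - p_conforming) * 1e6:.0f}")
```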

2019
Dissertations
1
  • HEBERT WESLEY PEREIRA ZARONI
  • A Business Intelligence Model proposal to support decision-making through the Data Science perspective

  • Advisor : ALEXANDRE FERREIRA DE PINHO
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • ANDRÉ PIMENTA FREIRE
  • FABIO FAVARETTO
  • Data: May 8, 2019


  • Show Abstract
  • Management decision-making is a recurring issue in organizations and companies, whether privately or publicly managed. The latter type of organization often remains a bureaucratic system with difficult access to quick and accurate information. With the massive amount of data available within organizations, and also from external sources, they have sought new technologies and methods from the Information Systems field to obtain higher-quality information. Business Intelligence (BI) systems are one of the means that help to gather, analyze, and propagate data, resulting in various products and management reports that ease the decision-making process; Data Science (DS), an emerging field within Information Systems, likewise involves transforming and analyzing data in a way that helps the organization in the decision-making process. Both concepts have their own methods, processes, models, and life cycles that pursue a common goal. However, the literature lacks models that aggregate the two concepts concomitantly or that apply concepts from one field to the other. Starting from this gap, the objective of this research is to propose a model that applies the concepts of BI and DS together, conceptualizing them and identifying the points where they converge and diverge, in order to obtain an efficient model. The methodology used is Modeling, which encompasses Conceptualization, Modeling, and Solution and Implementation, in which the concepts and steps of the BI and DS processes are explored and brought together, with their respective cycles and phases combined. Finally, the developed model was applied in computational software capable of incorporating it, as a way to test and validate it, generating computational products to be used. The result was applied at the Federal University of Itajubá, more specifically in the accounting and finance sector, assisting managers in decision making and also serving transparency purposes by exposing the reports produced by the model. The results that emerge from the developed model are dashboards and data visualization products, which are available on the University's website. It can be concluded that the concepts are applicable, including in public management, to generate decision support models, and that they can be extended to other sectors and organizations.

2
  • DIÊGO JEAN DE MELO
  • Robust Optimization of the Pinus taeda Wood Planing Process

  • Advisor : ANDERSON PAULO DE PAIVA
  • COMMITTEE MEMBERS :
  • ANDERSON PAULO DE PAIVA
  • JOSE HENRIQUE DE FREITAS GOMES
  • JOSE REINALDO MOREIRA DA SILVA
  • Data: May 17, 2019


  • Show Abstract
  • Planing is a machining process widely used for removing material and obtaining good-quality wood surfaces. This process is strongly present in the furniture industry, where planing is used to adjust product dimensions by removing material and improving visual quality, thus increasing the added value of the final product. To evaluate this machining process, it is important to use experimental design methods, robust modeling, and multi-objective optimization to obtain the best possible results for different responses. The present work presents the robust multi-objective optimization of the planing of Pinus taeda wood. The experiments were conducted based on a central composite design combining the control variables cutting speed, feed rate, and depth of penetration with the noise variable defined as the percentage of moisture present in the wood. Response surface methodology was applied along with robust parameter design, principal component analysis, mean squared error optimization, and the normalized normal constraint method. The effects of each control and noise variable and their interactions were discussed. The analyzed responses were electric current, voltage, torque, mechanical power, energy consumed, specific cutting energy, and mean thickness of the undeformed chips. The multi-objective optimization was performed by weighting the identified principal components. Mean and variance models were obtained for each response and for the weighted principal component. Pareto frontiers were constructed for all responses. Confirmation tests were performed to verify the robustness of the responses. It was concluded that the global optimum for planing that minimizes the effect of wood moisture is obtained using a cutting speed of 11.25 m·s⁻¹, a feed rate of 3.65 m·min⁻¹, and a depth of penetration of 0.95 mm.

3
  • VINÍCIUS DE CASTRO SEGHETO
  • ANALYSIS OF THE APPLICATION OF USER-CENTERED DESIGN FOR USABILITY IMPROVEMENTS OF A NEONATAL INCUBATOR

  • Advisor : CARLOS HENRIQUE PEREIRA MELLO
  • COMMITTEE MEMBERS :
  • CARLOS EDUARDO SANCHES DA SILVA
  • CARLOS HENRIQUE PEREIRA MELLO
  • RENATA APARECIDA RIBEIRO CUSTODIO
  • STEFÂNIA LIMA OLIVEIRA METZKER
  • Data: May 17, 2019


  • Show Abstract
  • Given the competitiveness of the current market, companies increasingly seek the satisfaction of their consumers. This requires techniques that can be used in the development of their products, focusing mainly on the needs of their customers. Among them is User-Centered Design (UCD), a philosophy based on the needs and interests of users to guarantee the success of the product. In the context of medical device development, it is even more important to engage users to ensure effective and error-free product development. However, the UCD approach is not widely used in practice, and it is necessary to develop practical studies of its application that can serve as a basis for new studies. Thus, this research had as its objective the application of the UCD approach to improve the usability of a piece of electromedical equipment, more specifically a neonatal incubator. The study can be characterized as action research, and it was possible to identify great similarity between its stages and those of UCD. To conduct the research, usability tests were carried out, the QUIS and SUS questionnaires were applied, and unstructured interviews and direct observation were conducted. By involving the users, it was possible to identify improvement points for the incubator and, after implementing the changes, to improve the usability of the new concept.

4
  • YASMIN SILVA MARTINS XAVIER
  • A systematic approach for addressing risks in Quality Management Systems based on the ISO 9001:2015 standard

  • Advisor : CARLOS EDUARDO SANCHES DA SILVA
  • COMMITTEE MEMBERS :
  • CARLOS EDUARDO SANCHES DA SILVA
  • CARLOS HENRIQUE PEREIRA MELLO
  • PAULO ALEXANDRE DA COSTA ARAUJO SAMPAIO
  • Data: Jun 26, 2019


  • Show Abstract
  • With the 2015 update of the ISO 9001 standard, one of the established requirements is risk-based thinking. Considering that the standard does not specify how this requirement can be implemented by organisations, and given the large number of companies already certified or in the process of certification, risk-based thinking has become a significant subject, discussed by many authors. Through a systematic literature review (SLR), it was possible to identify methods and methodologies used to address risk-based thinking, revealing a gap related to its implementation. In order to assist organisations in the process of adapting/implementing their Quality Management Systems, while also contributing to researchers in the area, the main objective of this research is to propose an integrated systematic approach to risks in the context of ISO 9001:2015. Based on the literary aspects identified in the SLR, the systematic approach combines these elements into a PDCA cycle, linking requirements of ISO 9001:2015 with the real needs of organisations, to assist them in understanding the risk-based thinking required by the standard while implementing the necessary actions to address it. To validate the proposal, the author selected a small/medium-sized enterprise (SME) from the electronic systems field as the study object of this research. The method used was action research, conducted through five sequential cycles with activities of planning, acting, analysing, and reflecting. Among the main results of this research are the implementation and adaptation of the systematic approach to the study object's context and the proposal of a practical guide for organisations. In addition, fundamental aspects for the incorporation of risk-based thinking were identified: Top Management commitment, the involvement of all employees, and, consequently, a change in the organisational culture.

5
  • TIAGO DELA - SÁVIA
  • Evaluation of the Complexity of Hybrid Modeling in Flow Shop Manufacturing Processes

  • Advisor : ALEXANDRE FERREIRA DE PINHO
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • DAVID CUSTÓDIO DE SENA
  • FABIANO LEAL
  • Data: Jun 28, 2019


  • Show Abstract
  • Computer simulation is considered one of the most powerful tools for decision making, as it can be used to analyze the operation of complex systems by testing hypotheses and predicting their future behavior. One of simulation's characteristics is its ability to reduce the complexity of real systems by representing them as computer models. However, as manufacturing processes themselves have grown, computer models have become increasingly complex over time, which can affect their accessibility and functionality. In the context of operations management there is a discussion regarding the applicability of agent-based simulation (ABS), which started to gain popularity in the early 2000s, in contrast to (and sometimes complementing) discrete-event simulation (DES), which has been the most common practice in operations research for more than 40 years. This work consists of a case study comparing the impact of hybrid simulation (HS for short, here meaning the use of "agent" elements in a discrete-event model) on the complexity of simulation models representing flow shop manufacturing processes, developed in the Anylogic® simulation package. A framework was developed to measure the complexity of pairs of models, each pair consisting of a DES-based and an HS-based model representing the same process through the two different approaches. The analysis showed that all but one of the models developed in hybrid simulation presented greater complexity, with the difference in complexity varying from pair to pair, according to each case.

6
  • PAULA CARNEIRO MARTINS
  • Aplicação da simulação a eventos discretos em conjunto com a pesquisa-ação para um projeto de melhoria em uma empresa de base tecnológica

  • Advisor : JOSE ARNALDO BARRA MONTEVECHI
  • COMMITTEE MEMBERS :
  • FABIANO LEAL
  • JOAO JOSE DE ASSIS RANGEL
  • JOSE ARNALDO BARRA MONTEVECHI
  • JOSE HAMILTON CHAVES GORGULHO JUNIOR
  • Data: Jul 4, 2019


  • Show Abstract
  • Input data management is considered a highly influential factor in the success of a simulation project, since problems with input data are often the reason why a model is not validated. However, the data collection and modeling phases require much time to execute. In this context, the academic objective of this work is to analyze the influence of different input data modeling strategies on the operational validation of a discrete-event simulation model. As a practical objective, the research develops a computational model to be used as a decision-making tool for the weekly planning of the company under study. A specific objective is to analyze the applicability of the BPMN and IDEF-SIM process modeling techniques, used together, in a simulation project. The research combines action research with the modeling and simulation method. The object of study is the repair sector of a technology-based company. Three action research cycles were carried out, each aimed at completing one of the phases of the simulation project: conception, implementation, and analysis. It was concluded that there are no significant differences between the models simulated with activity times modeled by the different input data strategies, since both models were validated against the real situation. In the conception phase, the use of BPMN was more advantageous, since the technique allows the relations between sectors to be visualized. In the implementation phase, the use of IDEF-SIM facilitated the translation of the modeled situation into the simulation software. And in the analysis phase, the combined use of the two techniques made the proposition of scenarios more assertive. The use of action research combined with modeling and simulation favored project performance.
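
The thesis does not publish its code; as one plausible illustration of the two input-modeling strategies it contrasts (fitting a theoretical distribution versus using the empirical data directly), the sketch below fits a gamma distribution to stand-in activity times with scipy and checks the fit with a Kolmogorov-Smirnov test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Stand-in for collected repair-activity times (minutes); illustrative only.
times = rng.gamma(shape=2.0, scale=5.0, size=200)

# Strategy 1: fit a candidate theoretical distribution and test the fit.
shape, loc, scale = stats.gamma.fit(times, floc=0)
ks = stats.kstest(times, "gamma", args=(shape, loc, scale))
print(f"gamma fit: shape={shape:.2f}, scale={scale:.2f}, KS p-value={ks.pvalue:.3f}")

# Strategy 2: sample the empirical distribution directly (resampling).
empirical_sample = rng.choice(times, size=10)
print("empirical draws:", np.round(empirical_sample, 1))
```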

7
  • ANA CLARA BARBIERI BORTOT
  • Estudo da valorização de áreas urbanas no entorno de Instituições de Ensino Superior: Um modelo multicritério de apoio à decisão

  • Advisor : JOSIANE PALMA LIMA
  • COMMITTEE MEMBERS :
  • DEIVIDI DA SILVA PEREIRA
  • EDSON DE OLIVEIRA PAMPLONA
  • JOSIANE PALMA LIMA
  • Data: Jul 8, 2019


  • Show Abstract
  • Large traffic generators (LTGs) are institutions that have short- and long-term impacts. The great majority of studies found in the literature address short-term impacts, which involve estimating trip generation rates in order to contribute to transport demand analysis. Long-term impacts, however, have a broader spectrum linked to social, historical, cultural and environmental issues, as well as issues related to urban economics. When analyzing LTGs from the perspective of urban economics, it is possible to perceive a relation older than the concept of LTG itself: the relationship between land use and transport, which also involves the concept of urban land value. This value is higher or lower according to accessibility to urban centers. Today, however, cities have attractive poles, that is, attractive localities beyond their original center, which can sometimes be linked to the presence of LTGs (shopping centers and hypermarkets, educational institutions, etc.). The study of the urban land valuation process is a complex problem that, in addition to the spatial issue and the impacts derived from LTGs, may involve a set of variables covering urban infrastructure, the built environment and mobility. Hence, the main objective of this work is to study the valuation potential of areas around Tertiary Education Institutions (TEIs) using spatial multicriteria analysis. The specific objectives are: (i) to study factors related to urban land value; (ii) to structure the evaluation model and determine the degree of importance of each factor with the help of the Analytic Hierarchy Process (AHP); and (iii) to carry out a case study applying the model to areas close to Tertiary Education Institutions located in Itajubá-MG. Twenty variables were identified, spatially analyzed and then grouped according to the model, which allowed the results to be presented in a structured way. When comparing the two large groups of criteria, Physical Aspects and Socioeconomic Aspects of the built environment, it can be concluded that the spatial analysis validates the model, with the Physical Aspects related to basic infrastructure, location and accessibility being more important for the valuation potential of the TEI surroundings. The results of the study provide subsidies for the elaboration of public policies for urban planning and transportation, since they show the main factors that can impact land value appreciation, as well as identifying areas with valuation potential in the surroundings of LTGs.
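
For readers unfamiliar with AHP's mechanics, the sketch below derives criterion weights from a pairwise comparison matrix using the common geometric-mean approximation of the principal eigenvector and checks Saaty's consistency ratio; the matrix values are illustrative, not the study's:

```python
import numpy as np

# Illustrative pairwise comparison matrix (Saaty 1-9 scale) for three
# hypothetical criteria: basic infrastructure, location, accessibility.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Geometric-mean approximation of the principal eigenvector.
gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])
weights = gm / gm.sum()

# Consistency check: lambda_max via A @ w, then CI and CR (RI = 0.58 for n = 3).
lambda_max = float(np.mean((A @ weights) / weights))
ci = (lambda_max - A.shape[0]) / (A.shape[0] - 1)
cr = ci / 0.58
print("weights:", np.round(weights, 3), f"CR = {cr:.3f}")  # CR < 0.10 is acceptable
```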

8
  • LUCAS CHILELLI DA SILVA
  • METODOLOGIA DE COVARIÂNCIA GINI APLICADA A ESTIMAÇÃO DE PARÂMETROS EM MODELOS AUTO REGRESSIVOS COM CAUDAIS LONGOS

  • Advisor : RAFAEL CORADI LEME
  • COMMITTEE MEMBERS :
  • ANTONIO FERNANDO BRANCO COSTA
  • FERNANDO LUIZ CYRINO OLIVEIRA
  • RAFAEL CORADI LEME
  • Data: Jul 19, 2019


  • Show Abstract
  • The Gini methodology was presented more than a century ago by the Italian statistician Corrado Gini, and for almost 80 years it was used to measure income distribution and inequality. Around three decades ago, researchers published works on the use of the Gini in other areas of study and on its capacity for parameter estimation and for modelling series whose distributions depart from normality. Building on these works, new methods began to arise and have matured over time. The aim of this work is to develop a model capable of applying the methods found in the literature to estimate the parameters of autoregressive models with heavy-tailed underlying distributions and extreme values. To evaluate the performance of these models, the results were compared to models created using classical autoregressive estimation, through the Ordinary Least Squares method and the Akaike model selection criterion. The results showed that the Gini approach to time series modelling presents clear advantages and superior forecasts; through its correlations and other metrics, it is possible to conclude that the Gini was a good estimator not only for low-order models, such as order one or two, but also for models of order higher than two.
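
As a rough illustration of the idea, under the assumption that the estimator follows the Gini regression form of Olkin & Yitzhaki (1992), where the usual covariance with the regressor is replaced by a covariance with the regressor's rank, the sketch below compares OLS and a Gini-style estimate of an AR(1) coefficient on a heavy-tailed series. It is not the thesis's full model:

```python
import numpy as np
from scipy.stats import rankdata

def gini_ar1(x):
    """Gini-style AR(1) estimate (assumed form, per Olkin & Yitzhaki):
    phi = cov(y_t, rank(y_{t-1})) / cov(y_{t-1}, rank(y_{t-1}))."""
    y, ylag = x[1:], x[:-1]
    r = rankdata(ylag)
    return np.cov(y, r)[0, 1] / np.cov(ylag, r)[0, 1]

rng = np.random.default_rng(1)
# AR(1) series with heavy-tailed (Student-t, 2.5 df) innovations.
phi_true, n = 0.6, 2000
eps = rng.standard_t(2.5, size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + eps[t]

ols = np.cov(x[1:], x[:-1])[0, 1] / np.var(x[:-1], ddof=1)
print(f"true=0.6  OLS={ols:.3f}  Gini={gini_ar1(x):.3f}")
```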

9
  • CARLOS HENRIQUE DOS SANTOS
  • UTILIZAÇÃO DA SIMULAÇÃO PARA TOMADA DE DECISÕES NO CONTEXTO DA INDÚSTRIA 4.0: UMA APLICAÇÃO INSERIDA NO CONCEITO DE GÊMEO DIGITAL

  • Advisor : FABIANO LEAL
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • BRENO GONTIJO TAVARES
  • FABIANO LEAL
  • JOSE ANTONIO DE QUEIROZ
  • Data: Jul 30, 2019


  • Show Abstract
  • The advent of new technologies has significant impacts on the way systems and processes are managed. Here we highlight the scenario of Industry 4.0, a term that refers to a new concept of industry advocating increasingly automated, integrated and digitized processes and systems for greater efficiency. In this context, there are also great changes in the use of several well-known tools and concepts, which have been gaining new applications and scopes of action in the face of increasingly complex processes and a more competitive market. Given this scenario, it is possible to use simulation as a Digital Twin, a concept that refers to a virtual, intelligent copy capable of reflecting real processes within the context of Cyber-Physical Systems, one of the pillars of Industry 4.0. Thus, the present work aims to analyze the applicability of the Digital Twin simulation in a real process with a low degree of automation, a challenging scenario regarding adherence to the precepts of Industry 4.0. The process chosen as the object of study is the supply of materials to kanban stations arranged on the production lines of an aeronautical industry. Applying the modeling and simulation method, it was possible to design, construct and implement a Digital Twin through simulation. We evaluated the advantages, limitations, and role of simulation as a tool to facilitate the implementation of Industry 4.0 concepts in processes with little or no preparation. As for the results, a Digital Twin was obtained through Discrete Event Simulation (DES) and an intermediate interface, which extracts data coming from the process, simulates the possible supply routes and provides the most efficient route options. The Digital Twin turned the process under study into an intelligent system connected with the other systems of the operation, in accordance with the precepts of Industry 4.0. In addition, the application also matches the Lean philosophy, since it was possible to reduce unnecessary movements and transports by around 20%. Finally, it is concluded that the use of the Digital Twin simulation in a real process with a low degree of automation proved feasible, which illustrates the versatility of the tool in the face of the challenges imposed by the evolution of industry.

10
  • Lucas Catalani Gabriel
  • COMPETÊNCIAS PARA OS GERENTES DE PROJETOS DA CONSTRUÇÃO CIVIL

  • Advisor : CARLOS EDUARDO SANCHES DA SILVA
  • COMMITTEE MEMBERS :
  • CARLOS AUGUSTO DE SOUZA OLIVEIRA
  • CARLOS EDUARDO SANCHES DA SILVA
  • RICARDO ANDRE FIOROTTI PEIXOTO
  • Data: Aug 15, 2019


  • Show Abstract
  • The Project Manager (PM) leads the responsible teams during the project implementation stage and is responsible for ensuring the achievement of all project goals (PMBOK, 2017). The present work aimed to contribute to the theory by identifying the competencies demanded of Brazilian project managers by the civil construction sector. The first stage of the research consisted of a systematic literature review using the Web of Science® and Scopus® databases. The second stage consisted of elaborating a questionnaire according to the literature's main concepts in order to survey the target audience. Based on the literature review, the principal success attributes of civil construction projects are "Delivery Within the Schedule", "Delivery Within the Budget" and "Final Quality Within the Parameters". The main competencies of project managers were classified by means of 3 factors and 27 attributes retrieved from the theory, with the following factors detected: theoretical technical knowledge, practical performance, and interpersonal skills. After the pilot test, the questionnaire was distributed, collecting 422 responses, of which 390 were valid. The results demonstrated that the most relevant competencies for project managers in civil construction are costs, risk management, project timeline management, scope management, ethics, and communication. The analysis by attribute relevance level did not identify strong agreement between the groups of respondents and the standard. Although the ANOVA test revealed similar opinions between the groups, less-experienced managers, unlike experienced ones, considered the attributes "Experience in Project Management" and "Conflict Management" less relevant.

11
  • ESTEVÃO LUIZ ROMÃO
  • Estudo Comparativo entre Redes Neurais Artificiais e Markov-Switching Model na modelagem de séries temporais não lineares

  • Advisor : PEDRO PAULO BALESTRASSI
  • COMMITTEE MEMBERS :
  • ANTONIO FERNANDO BRANCO COSTA
  • MARIANGELA DE OLIVEIRA ABANS
  • PEDRO PAULO BALESTRASSI
  • Data: Sep 17, 2019


  • Show Abstract
  • Nonlinear time series are widely encountered in real situations involving different scenarios such as economic growth, energy consumption, and climate change. These time series are usually nonstationary, and their behavior can change over time, which calls for sophisticated techniques able to capture these characteristics. This work presents a comparative study between the performances of Artificial Neural Networks (ANN) and the Markov-Switching Model (MSM) in modeling nonstationary nonlinear time series. A decision-support methodology based on Design of Experiments (DOE) was developed and applied to guide the analysis and the creation of a synthetic dataset representing distinct contexts, allowing the comparisons to be generalized. The design considered the following factors: dataset size, the random error associated with each time series, and the correlations between them. After modeling each series using ANN and MSM, we calculated the Mean Absolute Percentage Error (MAPE) and applied a paired t-test, which led us to conclude that the performance of ANN is statistically better than that of MSM regardless of the scenario considered. Furthermore, this study presents a case modeling Brazilian exports during the last 10 years as a function of three time series over the same period: exchange rate, imports, and gross domestic product, in addition to the export series itself lagged by one time unit. Thus, we could validate that ANN performs better than MSM in problems involving nonstationary nonlinear time series.
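
The comparison machinery described here reduces to computing a MAPE per series for each technique and testing the paired differences; a minimal sketch with placeholder error values (not the thesis's results):

```python
import numpy as np
from scipy import stats

def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent."""
    actual, forecast = np.asarray(actual), np.asarray(forecast)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

# Illustrative per-series MAPEs for the two competing models over the
# same synthetic datasets (values are placeholders, not thesis results).
mape_ann = np.array([4.1, 3.8, 5.0, 4.4, 3.9, 4.7, 4.2, 4.5])
mape_msm = np.array([5.2, 4.6, 5.9, 5.1, 4.8, 5.5, 5.0, 5.6])

# Paired t-test: the same datasets were modeled by both techniques.
t, p = stats.ttest_rel(mape_ann, mape_msm)
print(f"t = {t:.2f}, p = {p:.4f}")  # small p -> performances differ
```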

12
  • MÁRIO HENRIQUE SOMBRA BEUTTENMULLER VILELA
  • INTEGRAÇÃO DOS MÉTODOS: MULTICRITÉRIO PARA TOMADA DE DECISÃO E SISTEMA DE INFORMAÇÃO GEOGRÁFICA PARA SELEÇÃO DE LOCAIS

  • Advisor : CARLOS EDUARDO SANCHES DA SILVA
  • COMMITTEE MEMBERS :
  • CARLOS EDUARDO SANCHES DA SILVA
  • CARLOS HENRIQUE PEREIRA MELLO
  • JOAO BATISTA TURRIONI
  • Data: Sep 23, 2019


  • Show Abstract
  • Tools such as Multi-Criteria Decision Analysis (MCDA) and Geographic Information Systems (GIS) have become key resources for structuring, evaluating and delivering viable alternatives as site selection problems grow in complexity. The theme has been applied in the most diverse areas and, given its high versatility, it is necessary to map current publications and catalog the areas of application, the most used methods, the most researched countries, and the subject areas. This mapping, carried out through a Systematic Literature Review, identified the areas of greatest potential, supporting future studies. Publications on the subject between 1994 and 2018 were examined. From the analyses carried out, we identified the main authors and articles; the main countries, journals, and author nationalities; the main application groups of the theme, as well as their subgroups; the most commonly used software; and, finally, the main trends for future work. Among the conclusions, the following stand out: the main journals and authors focus on Waste Treatment and Disposal Sites, Energy Distribution and Generation Plants, Farming Sites, and Urban Facilities and Structures Sites; the increase in the number of publications is global in scope; and development of the theme over the next few years will focus on activities with economic, ecological, environmental and social impact, improving and extending the models existing in the literature. The literature review culminated in a 10-step proposal for the integrated use of GIS and MCDA, aiming to simplify and standardize the existing methodologies.

13
  • ANNA CAROLINA DE SIQUEIRA FERREIRA
  • Behavioral Aspects Resulting from the Performance Systems of Graduate Programs

  • Advisor : CARLOS EDUARDO SANCHES DA SILVA
  • COMMITTEE MEMBERS :
  • CARLOS EDUARDO SANCHES DA SILVA
  • DANILO HENRIQUE SPADOTI
  • EDGAR NOBUO MAMIYA
  • Data: Nov 25, 2019


  • Show Abstract
  • Performance Measurement (PM) has been implemented in a variety of contexts for the purpose of continuous improvement and competitiveness, in the private sector as well as in public management. Operating through KPIs (key performance indicators), PM-based management has different effects, including many unintended ones. Although this is a relatively recent and still under-explored topic, understanding the impacts caused by KPI use is highly relevant for improving performance measurement processes, minimizing their negative effects and enhancing the positive ones. Therefore, this work aims to analyze the behavioral effects caused by the use of Performance Measurement Systems (PMS) in the stricto sensu Post Graduate Programs of a public higher education institution. The chosen institution was the Federal University of Itajuba (UNIFEI) and its Post Graduate Programs (PGP). This is applied research with a qualitative approach, using the case study method. The participants were selected in two steps: randomly at first and then by intentional sampling. In total, 48 permanent members from all UNIFEI PGPs were selected to participate in the study, which used semi-structured interviews, participant observation and documentary research as data collection techniques. The interview script was developed based on five behavioral factors from the literature: communication, perception, cooperation, motivation and control. Each of these factors was analyzed for the CAPES, UNIFEI and PGP Performance Measurement Systems. For data analysis, statistical techniques such as boxplots, Cronbach's alpha, participant profiling, correlation and cluster analysis, and the Mann-Whitney hypothesis test were used. As a result, all the PMS analyzed somehow fail to induce cooperation and motivation among people, but they are positive in promoting communication and control. Among the three systems, UNIFEI's PMS is the most recent and the most criticized for its effects on teachers' behavior.
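
Of the techniques listed, Cronbach's alpha has a compact closed form worth showing: alpha = k/(k−1) · (1 − sum of item variances / variance of the total score). A short sketch with illustrative Likert data, not the study's responses:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Illustrative Likert responses (5 respondents x 4 items).
scores = [[4, 5, 4, 4],
          [3, 3, 2, 3],
          [5, 5, 5, 4],
          [2, 3, 2, 2],
          [4, 4, 3, 4]]
print(f"alpha = {cronbach_alpha(scores):.2f}")  # > 0.7 is commonly deemed acceptable
```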

14
  • LUCAS VIEITES SILVA
  • USE OF RELIABLE-CENTERED MAINTENANCE IN ROTATING WING INDUSTRIES

  • Advisor : CARLOS HENRIQUE PEREIRA MELLO
  • COMMITTEE MEMBERS :
  • AGNELO MAROTTA CASSULA
  • ALEXANDRE FERREIRA DE PINHO
  • CARLOS HENRIQUE PEREIRA MELLO
  • Data: Dec 13, 2019


  • Show Abstract
  • With the need to expand the portfolio of options for aircraft maintenance contracts, an opportunity envisioned by companies in this segment is to enrich their repair processes with a more deliberate use of quality tools. Such opportunities open the way for validating the use of reliability-centered maintenance (RCM) in workshops that specialize in routine inspections of rotary-wing aircraft. The use of RCM becomes a qualitative differential when offering a maintenance contract per flight hour: it combines activities that analyze the whole scenario of a maintenance and repair process, studying the behavior of critical components and assisting in deciding which maintenance mode (corrective, preventive or predictive) should be applied. Applying RCM's core activities helps maximize component life, here directed at rotary-wing aircraft. With closer monitoring of these components, it is possible to write reliable reports on which discrepancies a particular model tends to present according to the conditions and the number of hours flown. These analyses, in conjunction with supply management tools, aid the scheduled inspection process, making it possible to predict which materials will undergo replacement, repair or certification. The use of RCM in the aeronautical environment provides credibility in the client's view, offering greater fleet availability through reduced inspection times. Mapping the entire maintenance process, defining and classifying the main components to be used, and sequencing the work orders are points addressed in this study, which seeks to certify and make feasible the use of the RCM activities described in the literature.

Thesis
1
  • JOSENILDO BRITO DE OLIVEIRA
  • GESTÃO DE RISCOS LOGÍSTICOS EM CADEIAS DE SUPRIMENTOS: OTIMIZAÇÃO VIA METAMODELO DE SIMULAÇÃO

  • Advisor : RENATO DA SILVA LIMA
  • COMMITTEE MEMBERS :
  • ENZO MOROSINI FRAZZON
  • FABIANO LEAL
  • FABIO FAVARETTO
  • FERNANDO AUGUSTO SILVA MARINS
  • JOSE ARNALDO BARRA MONTEVECHI
  • RENATO DA SILVA LIMA
  • Data: Mar 29, 2019


  • Show Abstract
  • Some risks can cause losses to supply chains, producing ruptures in the flow of materials and finished goods. Logistics risks are associated with failures in transportation, warehousing, production and sales processes. Proper management of these risks is critical for integrating the flows under the responsibility of logistics and operations, whose activities are often carried out by logistics service providers. However, there is a shortage of systematic procedures for logistics risk management that take advantage of the integration of simulation and optimization methods. This research was carried out in an automotive supply chain located in Portugal, based on secondary data available in the literature. The research questions are: (a) what are the impacts of logistics risks on the performance of this chain? (b) under the influence of these risks, which adjustments to the logistics system could improve the arrangement's response to the impacts? To address these questions, the research objective is to mitigate the effects of logistics risks using a simulation metamodel for the optimization of critical parameters. The logistics processes carried out in the supply chain were selected as the object of study. Regarding its nature, problem approach and objectives, this research is classified as applied, quantitative and exploratory-normative, respectively. Discrete event simulation, implemented on the Arena® platform, was used as the research method. Black-box optimization, performed with the OptQuest® software, was employed to determine appropriate parameters for the logistics system. An OLS-based regression metamodel was created from designed experiments in order to link the outputs of the simulation model to the inputs of the optimization model. Several verification and validation techniques, such as modular implementation and sensitivity analysis, were applied to calibrate the simulation and optimization models. A DMAIC-based methodology was developed to comprise the logistics risk management steps and lead to the results of this research, encompassing the identification (Define), assessment (Measure), management (Improve) and monitoring (Control) of logistics risks. A logistics risk event was inserted into the model in order to reproduce disruptions in the physical distribution flow and allow the evaluation of their impacts on supply chain performance. The impacts were measured by the following metrics: total logistics cost, stockout risk and fill rate. For the transportation risk, two mitigation strategies, redundancy and flexibility, were applied to simultaneously minimize cost and risk and maximize the fill rate. The risk response suggested by the simulation-based multiobjective optimization model proved adequate and effective, since the adjustments to the logistics system blocked the effects of the disruption. The main contribution of the research is a set of systematic procedures to improve supply chain logistics risk management through the combined use of simulation and optimization methods.
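
A regression metamodel of the kind described simply regresses a simulated response on the experiment's factor settings, so the optimizer can query the fitted surface instead of the expensive simulator. A minimal sketch with invented factors and cost figures (the thesis itself used Arena® and OptQuest®, not this toy):

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for designed experiments run on the simulation model:
# factors (e.g. safety stock, fleet size) -> simulated total logistics cost.
X = rng.uniform([50, 2], [200, 10], size=(30, 2))           # factor settings
cost = 1000 + 4.0 * X[:, 0] - 35.0 * X[:, 1] \
       + 0.02 * X[:, 0] * X[:, 1] + rng.normal(0, 25, 30)   # simulated output

# OLS metamodel with main effects and an interaction term.
design = np.column_stack([np.ones(len(X)), X, X[:, 0] * X[:, 1]])
beta, *_ = np.linalg.lstsq(design, cost, rcond=None)
print("coefficients:", np.round(beta, 3))

# The optimizer can now evaluate candidate settings without the simulator:
candidate = np.array([1.0, 120, 6, 120 * 6])
print("predicted cost:", round(candidate @ beta, 1))
```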

2
  • GIANCARLO AQUILA
  • Contribuição para o processo de contratação de projetos de geração eólico-fotovoltaica a partir da análise econômica de decisões baseada na programação multiobjetivo

  • Advisor : EDSON DE OLIVEIRA PAMPLONA
  • COMMITTEE MEMBERS :
  • ANDERSON RODRIGO DE QUEIROZ
  • BENEDITO DONIZETI BONATTO
  • EDSON DE OLIVEIRA PAMPLONA
  • LUIZ CÉLIO SOUZA ROCHA
  • RAFAEL DE CARVALHO MIRANDA
  • WILSON TOSHIRO NAKAMURA
  • Data: Apr 18, 2019


  • Show Abstract
  • Recently, with the growth of wind power investments in Brazil and the entry of photovoltaic solar power into the country's long-term energy auctions, a trend can be noticed in some places toward building power plants that produce electricity from wind and photovoltaic sources simultaneously. However, there are still no bidding processes or specific decision criteria for contracting this type of project. Therefore, the objective of the present work is to contribute to the selection of wind-photovoltaic generation projects in the Brazilian electricity system. To this end, the mixture design technique is first used to generate scenarios employed to calculate the reduced emission density and the Levelized Cost of Electricity, the response variables of the model, for twelve Brazilian cities. Quadratic regressions are then applied to obtain the objective functions used in an optimization model based on the Normal Boundary Intersection method to construct the Pareto frontier. The proposed approach also uses a metric involving the ratio of Shannon entropy to Global Percentage Error to identify the best Pareto-optimal solution. Finally, the physical guarantee and the minimum price that would make a wind-photovoltaic plant feasible are computed for each city under analysis, in order to identify where the installation of a potential wind-photovoltaic plant would be most competitive in an auction involving this type of project. The results indicate that the model is capable of providing the optimal configuration for a wind-photovoltaic plant according to the generation potential of each site, besides identifying the most recommended locations for installing this type of project, according to the weights of the objectives being analyzed.
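
A sketch of the Pareto-point selection metric as described, under the assumption that the criterion maximizes the ratio of the weights' Shannon entropy to the solution's Global Percentage Error against per-objective targets; all numbers below are placeholders:

```python
import numpy as np

def shannon_entropy(w):
    """Entropy of a weight vector; higher means more balanced weights."""
    w = np.asarray(w, dtype=float)
    w = w[w > 0]
    return float(-(w * np.log(w)).sum())

def gpe(solution, targets):
    """Global Percentage Error of a Pareto point against per-objective
    targets (here taken as the individual optima of each response)."""
    solution, targets = np.asarray(solution), np.asarray(targets)
    return float(np.sum(np.abs(solution - targets) / np.abs(targets)))

# Illustrative Pareto points (LCOE, emission objective) and their weights.
weights = [(0.2, 0.8), (0.5, 0.5), (0.8, 0.2)]
points = [(148.0, 0.95), (141.0, 0.88), (135.0, 0.74)]
targets = (133.0, 0.97)  # hypothetical individual optima

# Pick the point maximizing the entropy-to-GPE ratio.
ratios = [shannon_entropy(w) / gpe(p, targets) for w, p in zip(weights, points)]
best = int(np.argmax(ratios))
print("best weights:", weights[best], "ratios:", np.round(ratios, 3))
```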

3
  • KÍVIA MOTA NASCIMENTO
  • OTIMIZAÇÃO DE ARRANJOS DE MISTURA DE COMPÓSITOS CIMENTÍCIOS COMPACTADOS COM INCORPORAÇÃO DE RESÍDUOS DE PEAD

  • Advisor : CARLOS EDUARDO SANCHES DA SILVA
  • COMMITTEE MEMBERS :
  • ANDERSON PAULO DE PAIVA
  • CARLOS EDUARDO SANCHES DA SILVA
  • LEANDRO JOSÉ DA SILVA
  • MARIA TERESA PAULINO AGUILAR
  • MIRIAN DE LOURDES NORONHA MOTTA MELO
  • TULIO HALLAK PANZERA
  • Data: Jul 1, 2019


  • Show Abstract
  • The high consumption of polymer materials and the consequent increase in solid waste generation demand alternative recycling routes. A significant amount of research in materials engineering has been directed towards the reuse of waste as dispersive phases in composites, especially cement-based materials, combining environmental, economic and technological concerns. Among polymer materials, high-density polyethylene (HDPE) is one of the most produced, yet it is not recycled at scale and research on the topic remains incipient. This doctoral thesis investigates the reuse of HDPE in compacted cementitious composites and its influence on physical-mechanical properties through Design of Experiments (DoE). A desirability method was used to optimise the amounts of HDPE, quartz and cement particles in the system. The water/cement ratio was kept constant at 0.5, while the aggregate/cement ratio varied from 3.75 to 5.25 and a minimum percentage of 30% of HDPE particles was defined in the experiment. Low adhesion between the HDPE particles and the cementitious matrix was verified. Increasing the percentage of HDPE led to reduced compressive strength (up to 80%), flexural strength (up to 75%), ultrasonic pulse velocity (up to 55%), dynamic modulus (up to 90%), and bulk and apparent densities (up to 50% for both), and to increased porosity (up to 60%) and water absorption (up to 200%). The relationships between the response variables were modelled by regression analysis with satisfactory fits. It should be noted that the ultrasonic pulse velocity and the dynamic modulus increase as the hydration process advances and the moisture in the sample rises. The combination of 0.21 cement matrix, 0.24 HDPE and 0.55 quartz optimises all responses except bulk and apparent densities, for which the optimal setup was 0.16 cement matrix and 0.84 HDPE. Six different scenarios, varying the importance given to the parameters, were considered in the multi-objective optimisation, yielding satisfactory individual desirability values and viable composites in all cases, which demonstrates the possibility of incorporating HDPE particles in different applications. The findings reveal a new recycling route for HDPE in non-structural microconcrete applications.
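
The desirability method referenced here maps each response onto a 0–1 scale and aggregates the individual desirabilities geometrically, in the spirit of Derringer and Suich; a minimal sketch with invented response values and limits, not the thesis's data:

```python
def d_larger_is_better(y, low, target, r=1.0):
    """Derringer-Suich individual desirability for a maximized response."""
    if y <= low:
        return 0.0
    if y >= target:
        return 1.0
    return ((y - low) / (target - low)) ** r

def d_smaller_is_better(y, target, high, r=1.0):
    """Individual desirability for a minimized response."""
    if y >= high:
        return 0.0
    if y <= target:
        return 1.0
    return ((high - y) / (high - target)) ** r

# Illustrative responses for one mixture setup (values are placeholders):
d1 = d_larger_is_better(y=22.0, low=5.0, target=30.0)    # compressive strength, MPa
d2 = d_smaller_is_better(y=12.0, target=8.0, high=20.0)  # water absorption, %

# Overall desirability: geometric mean (equal importance assumed here).
D = (d1 * d2) ** 0.5
print(f"d1={d1:.2f} d2={d2:.2f} D={D:.2f}")
```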

4
  • TAYNARA INCERTI DE PAULA
  • Método da Interseção Normal à Fronteira baseado em Análise Fatorial para otimização de problemas multivariados utilizando-se Algoritmo Genético

  • Advisor : ANDERSON PAULO DE PAIVA
  • COMMITTEE MEMBERS :
  • ANDERSON PAULO DE PAIVA
  • JOSE HENRIQUE DE FREITAS GOMES
  • GUILHERME FERREIRA GOMES
  • LUIZ CÉLIO SOUZA ROCHA
  • ROGÉRIO SANTANA PERUCHI
  • Data: Jul 4, 2019


  • Show Abstract
  • Finding solutions to multiobjective optimization problems is not a trivial task. To find a Pareto-optimal set of solutions, a common approach is to combine a weighted multiobjective optimization method with a metaheuristic. Considering the optimization of two functions with different levels of complexity, when a greater weight is given to the more complex function, the final objective function becomes far more complex and demands more effort from the search algorithm. This means that each combination of weights may require a different configuration of the algorithm parameters to reach the optimal solution. To overcome this problem, the present study addresses the simultaneous optimization of the algorithm parameters and the weights applied to the multiobjective problem. The Genetic Algorithm was chosen as the search algorithm, since it is one of the most used metaheuristics and is influenced by several parameters. As the optimization method, the Normal Boundary Intersection (NBI) method was chosen, since it is able to find solutions even in non-convex regions of the search space. However, this method does not perform well in problems with many or correlated responses. In this context, applying Factor Analysis allows the dimensionality of the problem to be reduced and a large number of responses to be replaced by a few uncorrelated objective functions formed by rotated factor scores. Considering these facts, this study proposes a method that combines the reduction of problem dimensionality, the optimization of uncorrelated factors, and the simultaneous optimization of weights and algorithm parameters, through a mixture design combined with process variables: the mixture components are the weights of the objective functions, and the process variables are the input parameters of the Genetic Algorithm. The results obtained with this method allow a Mean Square Error to be calculated for each factor which, when optimized, provides a Pareto frontier of optimal setups of weights and parameters that can be used to optimize the initial problem. The proposed method was applied to the optimization of a set of benchmark functions, to validate its applicability to other processes. In addition, the method was also applied to a real optimization problem: the laser beam machining of DIN X40CrMoV5-1 steel. In both cases the main objective was achieved, with the method determining the frontiers/surfaces of optimal configurations of weights and parameters.

5
  • WILSON TRIGUEIRO DE SOUSA JUNIOR
  • PROPOSTA DE REDUÇÃO DO TEMPO COMPUTACIONAL EM PROBLEMAS DE OTIMIZAÇÃO VIA SIMULAÇÃO A EVENTOS DISCRETOS INTEGRANDO METAHEURÍSTICAS, APRENDIZAGEM DE MÁQUINA E PARALELISMO

  • Advisor : JOSE ARNALDO BARRA MONTEVECHI
  • COMMITTEE MEMBERS :
  • DAVID CUSTÓDIO DE SENA
  • JOSE ARNALDO BARRA MONTEVECHI
  • MONA LIZA MOURA DE OLIVEIRA
  • RAFAEL DE CARVALHO MIRANDA
  • ROBSON BRUNO DUTRA PEREIRA
  • TABATA NAKAGOMI FERNANDES PEREIRA
  • Data: Jul 5, 2019


  • Show Abstract
  • Discrete event simulation techniques have been used in several industrial sectors in recent decades, with the advent and popularization of computational resources and of statistical knowledge applied to the production of goods and services. A recurring purpose of a simulation is the evaluation of a large number of scenarios in order to find an optimal combination of variables that meets constraints and minimization and/or maximization objectives. The optimization of simulation models has to deal with the exponential growth of the solution search space as the number of decision variables increases linearly, making it very difficult or impossible to evaluate all possible combinations when time and/or computational capacity are insufficient. With this increase in the number of possible solutions to be considered by decision-makers, the Operational Research community has applied techniques originally developed for combinatorial optimization to discrete event simulation problems. Among these techniques, metaheuristics have been used with success since the first optimization methods emerged. A restrictive factor in applying these optimization methods to discrete event simulation is that testing the quality of a candidate solution generally requires running the simulator, so even with an optimization method, much time is spent obtaining a good solution because the simulator must be called repeatedly. To reduce this effect, the present thesis combines three approaches to shorten the time required to obtain good solutions in discrete-event simulation optimization: machine learning metamodels, solution parallelism, and population-based metaheuristics. It was possible to integrate all of these concepts into the same platform, applying the proposal to objects of study concerning Production Engineering problems. An open-source optimization environment was built in Python to integrate the study objects with 33 machine learning methods, two metaheuristics, and parallel scenario processing. On average, the proposed method reduced computational time by 93.5% compared to the traditional approach of using only metaheuristic optimization, obtaining solutions equal to 87.5% of the reference value of the studied objects.
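
The core time-saving idea, a cheap machine-learning metamodel standing in for the expensive simulator during the metaheuristic's search, can be sketched as below. scikit-learn is used purely for illustration; the thesis built its own Python environment with 33 ML methods, and the "simulator" here is a stand-in function:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)

def slow_simulator(x):
    """Stand-in for a discrete-event simulation run (really expensive)."""
    return 50 - 8 * x[0] + x[0] ** 2 + 3 * abs(x[1] - 4) + rng.normal(0, 0.5)

# Seed the metamodel with a modest number of true simulator evaluations.
X_seen = rng.uniform(0, 10, size=(40, 2))
y_seen = np.array([slow_simulator(x) for x in X_seen])
surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_seen, y_seen)

# A metaheuristic can screen a large candidate population on the cheap
# surrogate and send only the most promising one to the real simulator.
candidates = rng.uniform(0, 10, size=(5000, 2))
best = candidates[np.argmin(surrogate.predict(candidates))]
print("candidate:", np.round(best, 2), "simulated:", round(slow_simulator(best), 2))
```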

6
  • ANDRIANI TAVARES TENÓRIO GONÇALVES
  • Integrated Solid Waste Management Index to support decision making in Municipalities 

  • Advisor : RENATO DA SILVA LIMA
  • COMMITTEE MEMBERS :
  • RENATO DA SILVA LIMA
  • FABIANO LEAL
  • JOSIANE PALMA LIMA
  • RAFAEL DE CARVALHO MIRANDA
  • MARCELO MONTAÑO
  • RUI ANTONIO RODRIGUES RAMOS
  • Data: Oct 11, 2019


  • Show Abstract
  • Municipalities around the world face a major challenge in proper integrated solid waste management, mainly due to urbanization and industrialization, which have increased the amount and diversity of the waste generated, making its management a complex process. In addition, the relation between this management and the political, economic, environmental, social and cultural aspects of the municipalities requires the development of appropriate tools to help cities understand those aspects and the interrelationships between them. In this sense, indexes and indicators are excellent tools that can help municipal administrators identify the strengths and weaknesses of integrated solid waste management, plan priority actions and direct public investments. Within this context, this research proposes the construction of the Integrated Solid Waste Management Index (ISWMI), using the Analytic Hierarchy Process (AHP) method. For the development of the ISWMI, 56 indicators (criteria) and 143 sub-criteria were identified, covering urban solid waste, health service waste and civil construction waste, respecting the limits of government participation in their management. The indicators were grouped into five dimensions: operational, environmental, political-economic, educational and social. The index was applied to municipalities in the south of Minas Gerais State, reaching global values of 0.49 (Santa Rita do Sapucaí), 0.58 (Pouso Alegre), 0.60 (Machado) and 0.70 (Três Corações), corresponding to low efficiency in integrated solid waste management in Santa Rita do Sapucaí and medium efficiency in the other municipalities. These results generally reflect the lack of a selective collection program, social inclusion, specific legislation, sustainable practices, employee training and qualification, environmental education, and integration among management stakeholders (public, private, and the population). The application of the index allowed the identification of the aspects to be improved and prioritized in integrated solid waste management, serving as a monitoring and management evaluation tool and providing subsidies for the implementation of policies and strategies aimed at increasing the efficiency of integrated solid waste management for sustainable development.

7
  • ROBERTA ALVES
  • MODELAGEM E SIMULAÇÃO BASEADA EM AGENTES APLICADA AO TRANSPORTE URBANO DE CARGAS DO COMÉRCIO ELETRÔNICO-B2C

  • Advisor : RENATO DA SILVA LIMA
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • CLAUDIO BARBIERI DA CUNHA
  • JOSE HENRIQUE DE FREITAS GOMES
  • LEISE KELLI DE OLIVEIRA
  • RENATO DA SILVA LIMA
  • WILFREDO F. YUSHIMITO
  • Data: Oct 25, 2019


  • Show Abstract
  • Brazil is the leading Latin American country in e-commerce sales. Most deliveries take place at home, in attended form (handed over to a person). This delivery model results in a large number of missed deliveries and redelivery attempts, with up to three delivery attempts for the same package. Delivery Lockers (DLs) thus represent an option for the delivery and consolidation of goods. The objective of this study is to evaluate the implementation of Delivery Lockers as a last-mile solution, considering the behavior of and interaction between e-commerce stakeholders, using an agent-based simulation model (ABSM). The model was built considering the four main stakeholders involved in the delivery process: customers, the e-commerce store, the carrier and the DLs. Design of Experiments (DOE) was also used to assist in the analysis of variables. The central idea was to present an ABSM that can be used to support decision making. Eighty-four scenarios were simulated, considering different daily demands, variations in the input parameters, deployment or not of DLs, and removal or not of the three-delivery-attempt policy. The scenarios comprise a Base Scenario Group, which resembles the real situation (no Delivery Lockers and three delivery attempts), plus three other scenario groups. The Base Scenario Group had the worst results. Scenarios simulating higher daily demand, while producing higher total costs, have the lowest cost per order, up to 56.7% lower than scenarios simulating lower daily demand. The ABSM showed that DLs are a better solution than the distribution center for receiving failed deliveries, reducing the distances traveled by customers by up to 60%. Comparing the scenarios, the implementation of DLs together with the exclusion of the three-attempt policy, simulated by Scenario Group 4, increases the number of orders delivered by 16.7%, reduces the distance traveled by trucks by up to 28.1%, increases net revenue by up to 79.1%, and adds gains in order hosting compared to the Base Scenario Group, making this configuration the most attractive delivery system for everyone involved. The ABSM proved a useful tool for modeling the urban transport of e-commerce goods, correctly simulating the interactions between the agents and the actions they take in different situations. In addition, these techniques allowed different delivery strategies to be modeled and evaluated, producing comparable results for each agent.

8
  • RENATO PONTES RODRIGUES
  • Application of Hybrid Simulation in Production Scheduling in Flexible Job Shop Systems

  • Advisor : ALEXANDRE FERREIRA DE PINHO
  • COMMITTEE MEMBERS :
  • ALEXANDRE FERREIRA DE PINHO
  • FABIANO LEAL
  • FABIO FAVARETTO
  • FERNANDO AUGUSTO SILVA MARINS
  • MARIANO DE PAULA
  • Data: Dec 9, 2019


  • Show Abstract
  • This research studies one of the most complex and important issues in production scheduling research: flexible job shop systems, which belong to a class of NP-hard problems. These systems are extremely important for industries that use a make-to-order production strategy and seek mix and volume flexibility. There is a current demand for robustness in sequencing methods: they should remain effective in the face of unexpected everyday problems that affect the efficiency of the initially defined sequence. In addition, stochastic times must be considered, and more production goals should be pursued simultaneously. Scheduling is performed through simulation, sequencing the orders at the beginning of the process and re-sequencing them when necessary, seeking a model that is more robust to everyday problems and improves several indicators at the same time, since traditional models tend to be more sensitive to unexpected events. The model uses agents within Discrete-Event Simulation models, generating a Hybrid Simulation model. The goal is to evaluate how the agent within the model can help in responding to these events, adding robustness and making the model more accessible to users. This is quantitative research, using the modeling and simulation method and following a normative empirical model. The scientific contribution lies in the final model, which is more robust to eventualities and has a broader focus. Additionally, the model constructed in this work provides a friendly interface for inserting modifications, which improves its integration with non-expert users. The work comprises the validation of the model by comparing its results with those of a Gantt chart, followed by comparisons of the results obtained with and without re-sequencing: first with the agent using one sequencing logic, and then using the same logic with sequence adjustments during batch production, seeking to neutralize the weak points of the initial logic through re-sequencing. It is also noteworthy that this schedule ensures that the Manager agent reduces makespan and increases machine utilization as its interference in the model increases.

9
  • DALTON GARCIA BORGES DE SOUZA
  • R&D project selection: which criteria should we use?

  • Advisor : CARLOS EDUARDO SANCHES DA SILVA
  • COMMITTEE MEMBERS :
  • ADLER DINIZ DE SOUZA
  • CARLOS EDUARDO SANCHES DA SILVA
  • DANIEL CAPALDO AMARAL
  • NEI YOSHIHIRO SOMA
  • PEDRO PAULO BALESTRASSI
  • Data: Dec 10, 2019


  • Show Abstract
  • Many companies around the world stake on R&D their chances of being profitable and remaining competitive in a dynamic market. To keep change going, many ideas emerge and some are transformed into projects. Since resources are limited, organizations are obliged to select only the projects most suited to their objectives. This is an old practice; however, project portfolio characteristics have changed. Today's portfolio objectives go beyond profit: strategy, the environment and society have also become important, along with many other decision criteria. Computational power has also grown, making multi-data decision approaches feasible even for small organizations. Over the last half century, many authors have proposed multicriteria decision making (MCDM) methods for project portfolio selection (PPS) in Research and Development (R&D). However, only a few gave importance to the criteria used, which is a central issue in any multicriteria decision. Thus, in order to contribute to the R&D PPS field of study, this thesis investigates two propositions: (1) most criteria used in R&D PPS can be represented by a smaller list of criteria, and (2) the criteria used in R&D PPS can be selected in a fuzzy environment, according to their influence and importance. To do so, we explore the 227 criteria used in R&D PPS from 1970 to 2019, summarizing them into a list of 23 broader-scope criteria and 8 criteria groups. A Systematic Literature Review was performed to arrive at the initial 227 criteria and to highlight the research opportunities in MCDM-based R&D PPS explored by this thesis. We also propose a novel MCDM approach for criteria selection that integrates fuzzy-based DEMATEL and Fuzzy-AHP extent analysis methods. Both the list and the method were built and validated by experts from a representative public Brazilian R&D organization in the electrical sector. Experts from other representative public Brazilian R&D organizations also contributed in other research steps. Together, the organizations involved manage R&D portfolios valued at around US$ 5 billion each year, accounting for 38% of all Brazilian annual expenditure on R&D projects. Overall, the results provide guidance on the topic and facilitate knowledge accumulation and creation concerning the criteria selection process in MCDM-based R&D PPS.
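
Both fuzzy DEMATEL and Fuzzy-AHP extent analysis operate on triangular fuzzy numbers; as background only, the sketch below shows basic TFN arithmetic and centroid defuzzification with invented expert ratings, not the thesis's integrated method:

```python
from dataclasses import dataclass

@dataclass
class TFN:
    """Triangular fuzzy number (l, m, u) with l <= m <= u."""
    l: float
    m: float
    u: float

    def __add__(self, other):
        return TFN(self.l + other.l, self.m + other.m, self.u + other.u)

    def scale(self, k):
        return TFN(k * self.l, k * self.m, k * self.u)

    def defuzzify(self):
        """Centroid defuzzification of a triangular number."""
        return (self.l + self.m + self.u) / 3.0

# Three hypothetical experts rate one criterion's influence on a fuzzy
# linguistic scale; aggregate by averaging the TFNs, then defuzzify.
ratings = [TFN(0.5, 0.75, 1.0), TFN(0.25, 0.5, 0.75), TFN(0.5, 0.75, 1.0)]
agg = (ratings[0] + ratings[1] + ratings[2]).scale(1 / 3)
print(f"aggregate=({agg.l:.2f}, {agg.m:.2f}, {agg.u:.2f})  crisp={agg.defuzzify():.2f}")
```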

10
  • ADRIANO CARLOS MORAES ROSA
  • REFERENCE MODEL FOR OPEN INNOVATION IN TECHNOLOGY-BASED COMPANIES

  • Advisor : CARLOS HENRIQUE PEREIRA MELLO
  • COMMITTEE MEMBERS :
  • ANEIRSON FRANCISCO DA SILVA
  • CARLOS EDUARDO SANCHES DA SILVA
  • CARLOS HENRIQUE PEREIRA MELLO
  • EDUARDO GOMES SALGADO
  • JOAO BATISTA TURRIONI
  • VANESSA CRISTHINA GATTO CHIMENDES
  • Data: Dec 12, 2019


  • Show Abstract
  • Several studies point out the benefits achieved by companies that innovate and, more recently, practice open innovation (OI). Nevertheless, these companies report many difficulties, mainly when they need to measure their results from these practices. In Brazil, when the search for innovation is brought to the small-company segment, more difficulties are found; however, according to recent research reviewed in this thesis, small companies do engage with these changing scenarios and manage to adapt, generating and multiplying technology and innovation. This work proposes the adoption of OI by small technology-based companies (TBCs) and, to measure it, a set of indicators suitable for these companies was identified and tested, considering the Brazilian reality. For that, the survey was adopted as the research method, together with the statistical technique of Factor Analysis to support the research instrument and the reliability of the proposed model. With these methodological procedures, it was possible to observe, choose and evaluate the application of the indicators in hubs considered references in innovation in the Brazilian states of São Paulo and Minas Gerais, with samples drawn from TBCs located in São José dos Campos (SP), Itajubá (MG) and Santa Rita do Sapucaí (MG). As contributions, the study sought to provide the reader or end user with a detailed view of how OI actions strengthen strategies, collaboration and culture in the companies that adopt them, knowledge that can be applied to small companies. The study thus aimed to fill a knowledge gap, identifying and discussing the importance of developing and measuring OI activities, as well as guiding those interested in innovative practices and promoting a culture change in adopting these practices as possible strategies for small businesses. As a main theoretical result, it was also verified that in other countries small TBCs are already included in the current management and innovation scenario and, given the positive results, it is concluded that measuring OI's impacts is an important factor for development. Companies, regardless of size, universities, researchers and government, in order to promote development, must open themselves up to testing knowledge and multiplying it collaboratively. As future work, it is expected to consolidate and disseminate the proposed set of OI indicators to the academic and business community, as well as to test it in other environments and companies.

11
  • LIVIO AGNEW BACCI
  • Multivariate optimization of the combination of forecasting methods

  • Advisor : ANDERSON PAULO DE PAIVA
  • COMMITTEE MEMBERS :
  • ANDERSON PAULO DE PAIVA
  • ANTONIO FERNANDO BRANCO COSTA
  • DANIELLE MARTINS DUARTE COSTA
  • PEDRO PAULO BALESTRASSI
  • RONÃ RINSTON AMAURY MENDES
  • Data: Dec 12, 2019


  • Show Abstract
  • This work proposes a new multi-objective approach to find the optimal set of weights for combining forecasts that is jointly efficient with respect to various performance and precision metrics. For this, the residual series of each previously selected forecasting method were calculated and, to combine them through a weighted average, several sets of weights were obtained using a Simplex-Lattice Design {m, q}. Several metrics were then calculated for each combined residual series. Next, Principal Components Factor Analysis (PCFA) was used to extract a small number of factor-score series representing the selected metrics with minimal loss of information. The extracted factor-score series were mathematically modeled with Mixture Design of Experiments (DOE-M), and the Normal Boundary Intersection (NBI) method was applied to jointly optimize these objective functions, yielding different optimal weight sets and the construction of the Pareto frontier. Shannon's Entropy Index (S) and the Global Percentage Error (GPE) were used as selection criteria for the best optimal weight set. These steps were successfully applied to predict coffee demand in Brazil as a case study. To test the applicability and feasibility of the proposed method on distinct time series, Brazilian coffee production and exports were also forecast with the new method, and the simulated series available in Montgomery et al. (2008) were used as an additional test of viability. The results showed that the proposed approach, named the FA-NBI combination method, can be successfully employed to find the optimal weights of a forecast combination.
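
As an illustration of the weight-enumeration step only (a simplex lattice lists every weight vector whose components are multiples of 1/m and sum to one), the sketch below searches such a lattice for the weights minimizing the RMSE of combined stand-in residual series; the full method's factor analysis and NBI steps are omitted:

```python
import itertools
import numpy as np

def simplex_lattice(q, m):
    """All weight vectors of a simplex-lattice design over q components:
    each weight is a multiple of 1/m and the weights sum to one."""
    return [tuple(k / m for k in c)
            for c in itertools.product(range(m + 1), repeat=q)
            if sum(c) == m]

rng = np.random.default_rng(3)
# Stand-in residual series for three previously fitted forecasting methods.
residuals = rng.normal(0, 1, size=(3, 120)) * np.array([[1.0], [1.3], [0.8]])

# Pick the lattice point whose combined residuals have the lowest RMSE.
best = min(
    simplex_lattice(q=3, m=10),
    key=lambda w: np.sqrt(np.mean((np.array(w) @ residuals) ** 2)),
)
print("best weights:", best)
```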

2015
Dissertations
1
  • GIANCARLO AQUILA
  • Análise do impacto dos programas de incentivos para viabilizar economicamente o uso de fontes de energia renovável.

  • Advisor : EDSON DE OLIVEIRA PAMPLONA
  • COMMITTEE MEMBERS :
  • ANDERSON RODRIGO DE QUEIROZ
  • EDSON DE OLIVEIRA PAMPLONA
  • RAFAEL DE CARVALHO MIRANDA
  • WILSON TOSHIRO NAKAMURA
  • Data: Nov 12, 2015


  • Show Abstract
  • Existing studies in the literature show that several countries have applied strategies aimed at encouraging electricity generation from renewable energy sources. This work presents an understanding of the context and evolution of these policies, highlighting the main types and applications, with a focus on the impact of incentive strategies on producers in the Brazilian renewable energy market. To this end, investment analyses were carried out for a wind farm in the state of Bahia, considering the uncertainties of energy generation and the exposure to the risks of settling these differences. The analyses include the possibility of the plant selling energy in a regulated environment or on the free market, examining specific situations for each environment and for different taxation regimes. In addition, scenarios were analyzed in which carbon credits can be traded through the plant's participation in the Clean Development Mechanism. Deterministic analyses were first performed for each scenario, followed by sensitivity analysis; stochastic analyses were then carried out, incorporating the uncertainties in the main variables identified in the sensitivity analysis as well as those related to the plant's monthly energy generation and the settlement price of differences; finally, Value at Risk was applied, enabling the analysis of the producer's worst expected scenario in each situation. It is noted that, in the current Brazilian scenario, the incentivized environments still present particular characteristics capable of mitigating producers' risks in a market that is still maturing.
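
The stochastic analysis described above amounts to propagating generation and settlement-price uncertainty through a discounted cash flow and reading a Value at Risk off the resulting NPV distribution; a minimal sketch with entirely illustrative figures, not the dissertation's data:

```python
import numpy as np

rng = np.random.default_rng(11)
n_sims, years, rate = 10_000, 20, 0.10
investment = 150e6  # BRL, illustrative wind-farm CAPEX

# Uncertain generation and settlement price, annualized (figures illustrative).
energy = rng.normal(180_000, 25_000, size=(n_sims, years))  # MWh/year
price = rng.normal(190, 30, size=(n_sims, years))           # BRL/MWh settlement
cash_flows = energy * price * 0.75                          # net of costs and taxes

# NPV distribution across simulated scenarios.
discount = (1 + rate) ** np.arange(1, years + 1)
npv = (cash_flows / discount).sum(axis=1) - investment

# Value at Risk at 5%: the worst expected outcome at 95% confidence.
var_5 = np.percentile(npv, 5)
print(f"mean NPV = {npv.mean()/1e6:.1f} MBRL, VaR(5%) = {var_5/1e6:.1f} MBRL")
```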
