Martin H Hollstein

Member
  • Content Count

    72
  • Joined

  • Last visited

Community Reputation

651 Good

About Martin H Hollstein

  • Rank
    Member

Personal Information

  • Website
  • Role
    Game Designer
    Level Designer
    Programmer
  • Interests
    Business
    DevOps
    Programming

Social

  • Twitter
    trollpurse
  • Github
    TrollPurse


  1. Martin H Hollstein

    Eight Hours Version 2.0.0

    Version 2.0.0 This is a short post to announce that Eight Hours has released version 2.0.0. This release brings big changes to the core game code, new assets purchased to build out levels, and a substantial pass to fix up the lighting within the game itself. We are very excited to release this on Halloween and hope that you can give the game a play. The game is free to download from either Itch.io or GameJolt.com. There will be more on this in the future, but for now we hope the game will do all the talking.
  2. Martin H Hollstein

    Eight Hours

    Take on the role of Bob Norm, paranormal investigator. Begin your night locked in a haunted house. It is your job to capture apparitions residing within the abode. But, not all is as it seems. Something sinister lurks within the house at night. Can you survive Eight Hours? Eight Hours is the horrifying debut game from Troll Purse. It features you as Bob Norm, a paranormal investigator. At your request, the owner has locked you in the house for the duration of the investigation. Now, it is up to you to discover the secrets of the haunting. You have eight hours - good luck and Godspeed.
  3. Martin H Hollstein

    Eight Hours Version 1.5

    Version 1.5 release. If you want, you can go directly to itch.io or gamejolt to play the latest free version of Eight Hours. A lot has changed over the past few months for Eight Hours. First, if you haven't, check out our trailer!

    Big Updates. Here is a list of some of our updates and details of what went into them.

    Entity AI. The entity was fairly passive and boring in previous versions of Eight Hours, so we updated it to appear sooner and more often at random intervals. It also has a difficulty curve that scales with the amount of time played (a small illustrative sketch follows this post). This helps limit the length of a playthrough and induces more stress the longer it takes a player to capture all the EVPs.

    Hints. Previously, hints would appear and automatically disappear after a few seconds. We reprogrammed the controls so that hints now stay on screen for a significantly longer period after they are triggered. Objectives and hints are also shown by default and can be toggled on and off with the space bar (default binding).

    Tutorial. A lot of the feedback we observed and received was along the lines of "How do I play?", so we introduced more helpful tutorial text during gameplay at relevant moments. We hope this will help players get into the game quicker and enjoy the investigation.

    Get It Now. Go directly to itch.io or gamejolt to play the latest free version of Eight Hours.
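    To illustrate the kind of time-based difficulty scaling described above, here is a minimal sketch in Unreal-style C++. It is only an illustration of the idea - the class name, members, and numbers are hypothetical and are not taken from the Eight Hours source.

        // Hypothetical sketch of an entity difficulty curve driven by elapsed play time.
        // AHauntingEntity and InvestigationStartTime are illustrative names only.
        float AHauntingEntity::GetSpawnIntervalSeconds() const
        {
            // Seconds since the investigation started.
            const float Elapsed = GetWorld()->GetTimeSeconds() - InvestigationStartTime;

            // Blend from a relaxed interval toward an aggressive one over ~30 minutes.
            const float Alpha = FMath::Clamp(Elapsed / (30.0f * 60.0f), 0.0f, 1.0f);
            const float BaseInterval = FMath::Lerp(120.0f, 20.0f, Alpha);

            // Random jitter keeps appearances unpredictable.
            return BaseInterval * FMath::FRandRange(0.75f, 1.25f);
        }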
  4. Martin H Hollstein

    Eight Hours - Supernatural Investigation Horror Game

    Thanks for the comment! You can check out the current demo on itch.io or gamejolt. The current version is 1.4, with 1.5 planned to come out sometime next month - so we would love your feedback!
  5. Eight Hours by Troll Purse - Supernatural Investigation Horror Game

    Jump right into it from either itch.io or gamejolt! Take on the role of Bob Norm, paranormal investigator. Begin your night locked in a haunted house. It is your job to capture apparitions residing within the abode. But, not all is as it seems. Something sinister lurks within the house at night. Can you survive Eight Hours? Eight Hours is the horrifying debut game from Troll Purse. It features you as Bob Norm, a paranormal investigator. At your request, the owner has locked you in the house for the duration of the investigation. Now, it is up to you to discover the secrets of the haunting. You have eight hours - good luck and Godspeed. See more at the official game webpage. Continue to explore more and get the download links to play today! Use supernatural detection equipment to hunt haunts. Explore a seemingly normal haunted house. You can get the game from either itch.io or gamejolt!
  6. Martin H Hollstein

    Creating Dread in Horror Games

    Can you hear it? The slight creak in your office chair, the pop inside the walls of your room, the bones of the house stretching. Are your lights off? Do you still see flickers of shadows out of the corners of your eyes? Who was that knocking on your door - or was that just the wind? These are questions leading to uncertainty within Eight Hours. Not a moment goes by where there isn't some trick of light or sleight of sound that leads to a question. Are all the sounds normal, or has something introduced itself? Are you really seeing what you are seeing, or have you been staring at the screen too long?

    Dread. What is dread? Dread is to be in great fear. How does one create great fear within a game? Is it the atmosphere - sounds, textures, visuals? Or is it in the thoughts put into the heads of the player? Is it all about how the game is played, the tension it creates? Well - it really is all of it put together.

    Setting the Mood. Audio (or lack thereof), visuals, and texture are all really great starts to set the tone for a horror game. You can see examples of this in almost all horror themed games. Take a look at some of the recent releases of horror themed games to get a sense of the horror tone. The FNAF (Five Nights at Freddy's) series of games all use dark environments and a lack of lighting, Resident Evil 7: Biohazard has a lot of gross imagery - from bugs to guts (just look at how wet those models look) - and just listen to the Silent Hill soundtracks. From each of those examples, even someone who has never played video games can tell they are trying to be scary. They all give off a sense of dread in some form.

    Beyond the Senses. What about how a player thinks when playing a game? As you walk through the halls in PT or Madison, do you think of puppies and unicorns? No, there are clues and mystery that drive your mind somewhere darker. In Outlast, do you pass by the residents with an air of confidence, or do you sit and think to yourself, "Will this one attack me, or is he too addled to care about my presence?" Amnesia - one of the great indie darlings of horror - makes you think the boogie man is in the closet, but when you open it, nothing happens. Then, when you turn around - BOO! All of these tricks make the player uncertain of what is about to happen. Uncertainty leads to nervousness and, coupled with the environment, the player is led down the path of dread.

    Perpetuating the Dread. Of course, none of this would continue to bother the player if nothing ever happened. This is why a lot of games finish off with a pop of jump scares - similar in fashion to Slender: The Eight Pages or FNAF. In both scenarios this is also used as a form of punishment for failure - creating a tension to avoid failure at all costs. Death and punishment for failure is a common pattern to keep the fear going in the player's mind. It instills a basic cause and effect pattern. To go a step further, audio and visual cues can attach to the effects of failure. An example of this is the original FNAF game: when the player runs out of power, the player knows they have failed. However, the scare doesn't come immediately. It is preceded by an indeterminately timed music loop and light flicker before punishing the player. Finally, another way to create dread throughout the game is via limited resources to manage. Why is it that most hobby or independent horror games have flashlights with batteries? Because it is an easy-to-relate-to form of resource management. Outlast does this well with the use of the video camera and night vision.

    Dread in Eight Hours. Eight Hours is a paranormal investigative horror themed game. With that description, we had to match up with what creates a good horror game. So, Eight Hours employs several methods of using the environment, audio, and game play to bring out the scares.

    Environment. The environment in Eight Hours matches that of most horror themed games. That is, the entire house is dark and the setting is a night time scene. There is also an eerie sense of loneliness due to the vacancy of objects within the home. This is further compounded by the fact that the player is not allowed to leave this enclosure and is alone the entire night.

    Audio. There is no music within Eight Hours. This was done intentionally, as some of the game play relies on audio cues. It also lets all of those seemingly harmless sounds sink in. The environment adds dread by introducing sound effects closely related to those associated with failure punishment within the game. So, is that creak really a creak, or the footsteps of a malicious entity?

    Game Play. Eight Hours increases the feeling of dread through the game play. The player is offered combat resources in the form of light switches and detection objects to help avoid the spirits. However, these are offered in small quantities, which keeps the possibility of failure present in the mind of the player. Finally, the game play does not limit the player to only fight or only flight. This adds another layer of panic as the player has to decide what to do - and the wrong choice, or indecision, can cost the player dearly.

    Links to Reality. Finally, Eight Hours is unique in that it brings real world dread into the game. All of the EVPs within Eight Hours are sourced from real world ghost hunts and investigations. Some believe that this can invite said spirits (or something more sinister) into your life. Play with caution - if you dare. View the full article
  7. Martin H Hollstein

    Designing Player World Interaction in Unreal Engine 4

    Thanks, I will fix that up in time. Any thoughts on the content other than that? Was this useful?
  8. Martin H Hollstein

    The Story of Eight Hours

    Troll Purse was started as a means to distribute games under a common title. Today, we will discuss our first distributed game, Eight Hours. Taking on a different model, rather than selling early access games, Troll Purse wants to create games that garner interest first. If enough interest is found in a game, we continue to develop it at a commercial level. Eight Hours is our first attempt at such a model.

    How it Began. Eight Hours development started in late 2016. Troll Purse was born a few months afterwards, in February of 2017. The reason Eight Hours started was to add real world paranormal elements into a horror game. Eight Hours was also meant to change the usual models of either using weapons in horror games or running and hiding as a means to survive. These two thoughts created the first iterations of Eight Hours.

    Two Prototypes. Originally, there were two games slated for development, or at least designed in documentation. The first was titled "Sick Day", the second "The Hauntings".

    Sick Day. "Sick Day" was meant to be a story rich, deeply psychologically driven horror game focused on the impacts of guilt and denial. It was to feature a new way to interact with players from the game's perspective. Rather than displaying data directly on an in game HUD, a cell phone held by the player character would drive all interactions within the game. This was to be achieved via text messages, voice mails, and smart home controls. Eventually, the plot would change and remove these controls (literally) from the player. Because the phone was a physical in game element, the idea was to use the phone as a natural way to remove control from the player - rather than "randomly" disabling buttons. Unfortunately, creating a story rich game with fear based on psychology is expensive in nature. Voice actors, writers, and strong animators would have been required to create the level of immersion required by the game play and story of "Sick Day". Thus, the game stopped at the prototype phase. However, all was not lost, as most of the code moved onwards to a new project.

    The Hauntings. "The Hauntings" was designed to be a horror themed strategy game. The idea was to have players assume the role of a paranormal investigator looking to launch a career in mainstream ghost hunting. The core of the game was to select locations and equipment for filming ghostly places. Each haunted area would randomly generate haunting events that had to be captured by the crew to score higher ratings. The technical work required to create a playable and production ready game with such a small team far outweighed the time frame in which a final product could be developed. From here, the majority of the game play events in Eight Hours were born.

    EVPs. Eight Hours sought to introduce the novel idea of real world Electronic Voice Phenomenon into a horror game. Before, most of the spooky voices and "captured" voices in other horror games were a work of art, not fact. Sourcing from the Paranormal Investigators of Milwaukee, Troll Purse was able to grab raw, unedited EVP captures from real paranormal investigations. So far, four of these EVPs are in the demo. The plan is to collect more and make collecting these EVPs a core part of the game play. So, from those previous prototypes and a desire to bring the real world of the paranormal into a game, Eight Hours was born. Here is a glimpse into what it is all about.

    Game Play. Eight Hours plays the same as most games when it comes to controls. It uses standard first person movement controls for PC: the mouse for looking, the W, A, S, D keys for moving and strafing, and the mouse button to interact with objects within the world. What changes within Eight Hours in comparison to other horror themed action or adventure games are the objectives and the way the player survives.

    You can Run, You can Hide, but you Can't Fight. A long lasting trend in hobby and indie horror game development is to follow the pattern of Amnesia: The Dark Descent, wherein the player is unable to combat the monsters within the game. Rather, the player must run and hide to survive. Another excellent game that follows this is Outlast. These types of horror games rely on the flight side of fight or flight. So, they have chosen to alienate the portion of players that would actually choose to fight, which weakens player choice within a dynamic medium such as games. Granted, this does create a strong feeling of helplessness, as the player literally can't do anything to save themselves. Backed into a corner? Too bad, YOU DEAD!

    Guts and Glory in Horror. The "run and hide to survive" pattern emerged because a lot of scary games were actually just turning into shooters with horror elements. While good games like F.E.A.R., Resident Evil, and Dead Space existed, the shift from horror game with shooting elements to shooter with horror elements became more apparent with each new installment. Games like this focus on the fight mechanic of human survival. On top of that, there is the added stress of resource management in regards to ammunition. However, some of these games would simply stall if you ran out of ammunition, leaving you unable to progress. Another issue is that, again, an entirely different group of people with a different mindset is alienated from the game due to a lack of game play choices. Out of ammo, can't progress, can't skirt enemies? Too bad, YOU IN LIMBO!

    Eight Hours Spin. Troll Purse wanted to take a different approach. It wanted to introduce a choice between non-typical weapons and the run and hide mechanics. This was achieved in several ways.

    Weapons. Eight Hours does not use weapons in the typical sense. The rule within the reality of the game is that spirits cannot exist within lighted areas. Why? Because the wave and particle behavior of light disrupts the physical manifestation of the spirit. So, the spirit does actually still exist, but cannot interact within the physical realm. Within the game, the player is able to activate lights within the haunted abode to attempt banishment of an entity. This mechanic also follows the limited resource style: only a fixed number of lights are installed, and they can only be activated from a fixed location. Finally, it takes several indeterminate seconds for the light to fully banish the entity (see the sketch that follows this post).

    Detection. To add a layer of strategy to the game, the player is able to detect the spirit. There are two forms of detection: visual and audio. Alternative light detection devices or enhanced night vision devices have the ability to capture the movement of a "shadow" "cast" by the entity's manifestation within the physical realm. Also, as it is being disrupted by light, the entity will cast a brief shadow. The second form of detection is audio. The player will have access to devices that detect the electromagnetic force given off by the physical manifestation of a spirit. These warn the player by playing a sound as the spirit closes within the proximity of the devices. To enhance the level of doubt, the player can only place these devices in a static location. They will not detect the proximity of the spirit relative to the player's position.

    Survival. If fighting is not within the skills or desires of the player, the player will always have the choice to evade the physically manifested spirits within the game. Combined with the different strategies to detect a spirit, the player can find rooms to dodge into to avoid the spirit chasing them. Not much more to it than that.

    Of Course, that is all just in the game. It is believed by some that playing with the EVPs of spirits, or entertaining the idea of the existence of spirits, invites them into your life - within reality. Not sure how to help out there. But, perhaps you should turn on the lights, just in case. View the full article
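    As a purely illustrative sketch of the light-banishment rule described in the post above (not actual Eight Hours code - every name and number below is hypothetical):

        // Hypothetical: a lit room disrupts the entity's manifestation after an
        // indeterminate delay; turning the light off first cancels the banishment.
        void AHauntingEntity::OnRoomLightToggled(bool bRoomIsLit)
        {
            if (bRoomIsLit)
            {
                // Banishment takes a random number of seconds, so the player
                // cannot rely on exact timing.
                const float BanishDelay = FMath::FRandRange(3.0f, 8.0f);
                GetWorldTimerManager().SetTimer(BanishTimerHandle, this, &AHauntingEntity::Banish, BanishDelay, false);
            }
            else
            {
                GetWorldTimerManager().ClearTimer(BanishTimerHandle);
            }
        }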
  9. Martin H Hollstein

    Designing Player World Interaction in Unreal Engine 4

    Unreal Engine 4 is an awesome game engine and the Editor is just as good. There are a lot of built in tools for a game (especially shooters) and some excellent tutorials out there for it. So, here is one more. Today's topic is different methods to program player-world interaction in Unreal Engine 4 in C++. While the context is specific to UE4, it can also easily translate to any game with a similar architecture.

    Interaction via Overlaps. By far, the most common tutorials for player-world interaction use Trigger Volumes or Trigger Actors. This makes sense: it is a decoupled way to set up interaction and leverages classes already provided by the engine. Here is a simple example where overlap code is used to interact with the player:

    Header

        // Fill out your copyright notice in the Description page of Project Settings.
        #pragma once

        #include "CoreMinimal.h"
        #include "GameFramework/Actor.h"
        #include "InteractiveActor.generated.h"

        UCLASS()
        class GAME_API AInteractiveActor : public AActor
        {
            GENERATED_BODY()

        public:
            // Sets default values for this actor's properties
            AInteractiveActor();

            virtual void BeginPlay() override;

        protected:
            // Enables input for an overlapping player controller.
            UFUNCTION()
            virtual void OnInteractionTriggerBeginOverlap(UPrimitiveComponent* OverlappedComp, AActor* OtherActor, UPrimitiveComponent* OtherComp, int32 OtherBodyIndex, bool bFromSweep, const FHitResult& SweepResult);

            // Disables input once the player leaves the trigger.
            UFUNCTION()
            virtual void OnInteractionTriggerEndOverlap(UPrimitiveComponent* OverlappedComp, class AActor* OtherActor, class UPrimitiveComponent* OtherComp, int32 OtherBodyIndex);

            // Called when the bound "Interact" action fires for this actor.
            UFUNCTION()
            virtual void OnPlayerInputActionReceived();

            UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = Interaction)
            class UBoxComponent* InteractionTrigger;
        };

    This is a small header file for a simple base Actor class that can handle overlap events and a single input action. From here, one can start building up the various entities within a game that will respond to player input. For this to work, the player pawn or character will have to overlap with the InteractionTrigger component. This will then put the InteractiveActor into the input stack for that specific player. The player will then trigger the input action (via a keyboard key press, for example), and then the code in OnPlayerInputActionReceived will execute. Here is a layout of the executing code.

    Source

        // Fill out your copyright notice in the Description page of Project Settings.
        #include "InteractiveActor.h"
        #include "Components/BoxComponent.h"
        #include "Components/InputComponent.h"
        #include "GameFramework/Pawn.h"
        #include "GameFramework/PlayerController.h"

        // Sets default values
        AInteractiveActor::AInteractiveActor()
        {
            PrimaryActorTick.bCanEverTick = true;

            RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));
            RootComponent->SetMobility(EComponentMobility::Static);

            InteractionTrigger = CreateDefaultSubobject<UBoxComponent>(TEXT("Interaction Trigger"));
            InteractionTrigger->InitBoxExtent(FVector(128, 128, 128));
            InteractionTrigger->SetMobility(EComponentMobility::Static);
            InteractionTrigger->OnComponentBeginOverlap.AddUniqueDynamic(this, &AInteractiveActor::OnInteractionTriggerBeginOverlap);
            InteractionTrigger->OnComponentEndOverlap.AddUniqueDynamic(this, &AInteractiveActor::OnInteractionTriggerEndOverlap);
            InteractionTrigger->SetupAttachment(RootComponent);
        }

        void AInteractiveActor::BeginPlay()
        {
            Super::BeginPlay();

            if (InputComponent == nullptr)
            {
                // ConstructObject was deprecated; NewObject is the modern replacement.
                InputComponent = NewObject<UInputComponent>(this, TEXT("Input Component"));
                InputComponent->bBlockInput = bBlockInput;
            }
            InputComponent->BindAction("Interact", EInputEvent::IE_Pressed, this, &AInteractiveActor::OnPlayerInputActionReceived);
        }

        void AInteractiveActor::OnPlayerInputActionReceived()
        {
            // This is where the actor's logic executes when it receives input.
            // You could add something as simple as a log message to test it out.
        }

        void AInteractiveActor::OnInteractionTriggerBeginOverlap(UPrimitiveComponent* OverlappedComp, AActor* OtherActor, UPrimitiveComponent* OtherComp, int32 OtherBodyIndex, bool bFromSweep, const FHitResult& SweepResult)
        {
            // Only pawns have controllers, so cast before asking for one.
            APawn* OtherPawn = Cast<APawn>(OtherActor);
            if (OtherPawn)
            {
                APlayerController* PC = Cast<APlayerController>(OtherPawn->GetController());
                if (PC)
                {
                    EnableInput(PC);
                }
            }
        }

        void AInteractiveActor::OnInteractionTriggerEndOverlap(UPrimitiveComponent* OverlappedComp, class AActor* OtherActor, class UPrimitiveComponent* OtherComp, int32 OtherBodyIndex)
        {
            APawn* OtherPawn = Cast<APawn>(OtherActor);
            if (OtherPawn)
            {
                APlayerController* PC = Cast<APlayerController>(OtherPawn->GetController());
                if (PC)
                {
                    DisableInput(PC);
                }
            }
        }

    Pros and Cons. The positives of the collision volume approach are the ease with which the code is implemented and the strong decoupling from the rest of the game logic. The negatives are that interaction becomes spatially broad when considering the game space, and that a new trigger volume has to be introduced for each interactive object within the scene.

    Interaction via Raytrace. Another popular method is to use the player's look-at viewpoint to ray trace for any interactive world items for the player to interact with. This method usually relies on inheritance for handling player interaction within the interactive object class. This method eliminates the need for another collision volume for item usage and allows for more precise interaction targeting.

    Source

    AInteractiveActor.h

        // Fill out your copyright notice in the Description page of Project Settings.
        #pragma once

        #include "CoreMinimal.h"
        #include "GameFramework/Actor.h"
        #include "InteractiveActor.generated.h"

        UCLASS()
        class GAME_API AInteractiveActor : public AActor
        {
            GENERATED_BODY()

        public:
            // Called by the player controller when its interaction ray hits this actor.
            virtual void OnReceiveInteraction(class APlayerController* PC);
        };

    AMyPlayerController.h

        // Fill out your copyright notice in the Description page of Project Settings.
        #pragma once

        #include "CoreMinimal.h"
        #include "GameFramework/PlayerController.h"
        #include "AMyPlayerController.generated.h"

        UCLASS()
        class GAME_API AMyPlayerController : public APlayerController
        {
            GENERATED_BODY()

        public:
            AMyPlayerController();

            virtual void SetupInputComponent() override;

            // How far ahead of the camera the interaction trace reaches.
            float MaxRayTraceDistance;

        private:
            class AInteractiveActor* GetInteractiveByCast();

            void OnCastInput();
        };

    These header files define the functions minimally needed to set up raycast interaction. Also note that there are two files here, as two classes need modification to support input. This is more work than the first method shown that uses trigger volumes. However, all input binding is now constrained to a single class - the ACharacter or, if you designed it differently, the APlayerController. Here, the latter was used. The logic flow is straightforward. The player points the center of the screen towards an object (ideally a HUD crosshair aids in the coordination) and presses the desired input button bound to Interact. From here, the function OnCastInput() is executed. It invokes GetInteractiveByCast(), returning either the first camera ray cast collision or nullptr if there are no collisions. Finally, the AInteractiveActor::OnReceiveInteraction(APlayerController*) function is invoked. That final function is where inherited classes will implement interaction specific code. The simple execution of the code is as follows in the class definitions.

    AInteractiveActor.cpp

        #include "InteractiveActor.h"

        void AInteractiveActor::OnReceiveInteraction(APlayerController* PC)
        {
            // Nothing in the base class (unless there is logic ALL interactive actors will
            // execute, such as cosmetics - i.e. sounds, particle effects, etc.).
        }

    AMyPlayerController.cpp

        #include "AMyPlayerController.h"
        #include "InteractiveActor.h"

        AMyPlayerController::AMyPlayerController()
        {
            MaxRayTraceDistance = 1000.0f;
        }

        void AMyPlayerController::SetupInputComponent()
        {
            Super::SetupInputComponent();
            InputComponent->BindAction("Interact", EInputEvent::IE_Pressed, this, &AMyPlayerController::OnCastInput);
        }

        void AMyPlayerController::OnCastInput()
        {
            AInteractiveActor* Interactive = GetInteractiveByCast();
            if (Interactive != nullptr)
            {
                Interactive->OnReceiveInteraction(this);
            }
        }

        AInteractiveActor* AMyPlayerController::GetInteractiveByCast()
        {
            FVector CameraLocation;
            FRotator CameraRotation;
            GetPlayerViewPoint(CameraLocation, CameraRotation);

            FVector TraceEnd = CameraLocation + (CameraRotation.Vector() * MaxRayTraceDistance);

            FCollisionQueryParams TraceParams(TEXT("RayTrace"), true, GetPawn());
            TraceParams.bTraceAsyncScene = true; // Note: removed in later UE4 versions.

            FHitResult Hit(ForceInit);
            GetWorld()->LineTraceSingleByChannel(Hit, CameraLocation, TraceEnd, ECC_Visibility, TraceParams);

            AActor* HitActor = Hit.GetActor();
            if (HitActor != nullptr)
            {
                return Cast<AInteractiveActor>(HitActor);
            }
            return nullptr;
        }
    Pros and Cons. One pro for this method is that control of input stays in the player controller, while implementation of the input actions is still owned by the Actor that receives the interaction. Some cons are that the interaction fires as many times as the player presses the button, and the controller does not continuously detect interactive state without a refactor using a Tick function override.
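    One possible way to soften the repeated-firing con (an illustrative sketch, not from the original post) is a small cooldown in the controller; LastInteractTime and InteractCooldownSeconds would be hypothetical float members added to the header:

        // Hypothetical debounce: ignore Interact presses that arrive within a short cooldown window.
        void AMyPlayerController::OnCastInput()
        {
            const float Now = GetWorld()->GetTimeSeconds();
            if (Now - LastInteractTime < InteractCooldownSeconds) // e.g. 0.25f
            {
                return;
            }
            LastInteractTime = Now;

            AInteractiveActor* Interactive = GetInteractiveByCast();
            if (Interactive != nullptr)
            {
                Interactive->OnReceiveInteraction(this);
            }
        }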
    Conclusion. There are many methods for player-world interaction within a game world. When creating Actors within Unreal Engine 4 that allow for player interaction, two of the potential methods are collision volume overlaps and ray tracing from the player controller. There are several other methods discussed out there that could also be used. Hopefully, the two implementations presented help you decide on how to go about player-world interaction within your game. Cheers! View the full article
  10. Martin H Hollstein

    That Is Right, Discord Update Automation in AWS

    Recently, Troll Purse set up a public invite for the Troll Purse Discord Server. And, as with all things, we decided to test out using Discord Webhooks to push updates to our members in real time. This is by far the most effective real time pushing we have conceived yet. It was so easy, sharing it will be just as easy.

    Using A Simple Webhook. Usually, the pattern at Troll Purse for pushing to third party accounts follows these steps:

      1. Sign up for the third party account
      2. Register an application
      3. Find an API wrapper library for said third party account
      4. Publish an AWS Lambda
      5. Post about it!

    This time, we decided to skip step 3. For the most part, the developers at Troll Purse recognized that this push would require very little in the way of data transformation and authentication routines. In fact, all of the work was done in one POST request to the Troll Purse Discord Server.

    The Code, Kind Human

        public async Task<string> FunctionHandler(SNSEvent input, ILambdaContext context)
        {
            try
            {
                var messageJSONString = input.Records[0]?.Sns.Message;
                context?.Logger.LogLine($"Received({input.Records[0]?.Sns.MessageId}): {messageJSONString}");
                if (messageJSONString != null)
                {
                    var messageContent = JsonConvert.DeserializeObject<BlogContentUpdated>(messageJSONString);
                    using (var httpClient = new HttpClient())
                    {
                        // Build the minimal Discord webhook body: { "content": "..." }.
                        // (A production version should JSON-escape the interpolated values.)
                        string payload = $"{{\"content\":\"{messageContent.PostTitle}. {messageContent.ContentSnippet}... {messageContent.PostLink}\"}}";
                        var response = await httpClient.PostAsync(
                            Environment.GetEnvironmentVariable("discord_webhook"),
                            new StringContent(payload, Encoding.UTF8, "application/json"));
                        return response.StatusCode.ToString();
                    }
                }
                else
                {
                    return null;
                }
            }
            catch (Exception e)
            {
                context?.Logger.LogLine("Unable to Discord the SNS message");
                context?.Logger.LogLine(e.Message);
                context?.Logger.LogLine(e.StackTrace);
                return null;
            }
        }

    Notes: BlogContentUpdated is code defined in an external Troll Purse binary. WE USE SECURE ENVIRONMENT VARIABLES!!! THIS IS IMPORTANT!!!! (As opposed to plaintext credentials in our source code.)

    The Joy of Lambda. All of these features that Troll Purse has blogged about are done within a few hours. This is easily aided by the idea of serverless programming. There is no overhead of provisioning servers, testing different server environments, and configuring a network for these functions. It removes a lot of network infrastructure and enables Troll Purse developers to create fast, reactive, internal services. Please, if you spend too much time configuring and setting up, try using AWS Lambda to speed up development time.

    Would You Look At That. In two lines, without a library or API wrapper, our developers can now push blog updates to our Discord server. This is a nice quick feature that we plan on integrating into our automated build environment to push updates about new versions released to the public. Enjoy! View the full article
  11. Martin H Hollstein

    Even More AWS Automation With Tumblr

    Yes! Troll Purse has done it again! Looking at all the great social platforms out there, we were missing one that needed our posting! Tumblr has now been integrated into our automation platform. Following our desire to automate a lot of our processes, our foray into blog sharing automation has brought us upon the beaches of Tumblr's API. Today, we again share with you our implementation.

    How We Did It. Just like our Reddit link posting automation, Troll Purse decided to use NodeJS and a JavaScript API library for Tumblr, by Tumblr. The big reason behind this was that C# is not supported by the official Tumblr APIs. Seeing that we already had 90% of the code from writing the other JavaScript integration, we went that route for the Tumblr API as well. All the Troll Purse developers had to do was implement the same logic as we did for Reddit link posting, then swap out the client provider. This does mean some refactoring needs to be done, as common logic was copied across two different projects with the same parameters. But, let us show you our first iteration that worked right off the bat! Again, note that we used secured environment variables for our configuration data.

    Ze Code

        var tumblr = require('tumblr.js');

        // Scan the post title and snippet for predetermined, relevant hashtags.
        var createTags = function (content) {
            var hashTagStr = 'indie,video games';
            var tagMap = {
                AWS: 'AWS,Amazon',
                UE4: 'UE4,Unreal Engine'
            };
            for (var tag in tagMap) {
                if (tagMap.hasOwnProperty(tag)) {
                    var search = tagMap[tag].split(',');
                    for (var i = 0; i < search.length; ++i) {
                        var str = search[i];
                        if (content.indexOf(str) >= 0) {
                            hashTagStr += ',' + tag;
                            break;
                        }
                    }
                }
            }
            return hashTagStr;
        };

        exports.handler = function (event, context, callback) {
            if (event != null) {
                var record = event.Records[0];
                if (record != null) {
                    console.log('Received(' + record.Sns.MessageId + '): ' + record.Sns.Message);
                    var body = JSON.parse(record.Sns.Message);
                    var client = tumblr.createClient({
                        credentials: {
                            consumer_key: process.env.consumer_key,
                            consumer_secret: process.env.consumer_secret,
                            token: process.env.token,
                            token_secret: process.env.token_secret
                        },
                        returnPromises: true
                    });
                    var params = {
                        title: body.PostTitle,
                        url: body.PostLink,
                        description: body.ContentSnippet + '...',
                        tags: createTags(body.PostTitle + ' ' + body.ContentSnippet)
                    };
                    console.log('Sending: ' + JSON.stringify(params));
                    client.createLinkPost(process.env.blog_name, params)
                        .then(resp => callback(null, JSON.stringify(resp || 'Success')));
                } else {
                    console.log('null record');
                    callback(null, null);
                }
            } else {
                console.log('No event object');
                callback(null, null);
            }
        };

    The one difference is the createTags function. This stupid little function just scans our title and content snippet to apply predetermined hashtags that are relevant to the content. That way, Troll Purse is not just making another reposting spam bot with dozens of non-relevant hashtags.

    Farewell. Since this was pretty much a cut and dry repeat of our Reddit link posting, this was a short and sweet post. Here are all the resources we used to write this Lambda function for Tumblr posting:

      • Tumblr API Get Started
      • Creating a Tumblr App
      • The Tumblr npm package

    View the full article
  12. Martin H Hollstein

    Learning How Troll Purse Easily Setup Forums in AWS

    How Troll Purse Easily Setup Forums in AWS. Originally Posted on Troll Purse Dev Blog. After our migration to AWS, Troll Purse removed the old forums running in Digital Ocean. Troll Purse decided to start with a clean slate. Which was easy - as nobody registered (no migrations needed, just nuclear destruction of the service)! A curse that turned into a blessing. Troll Purse can now scale the forums based on usage and save some money on infrastructure. This will allow us to put more effort into our games!

    How To. Troll Purse decided to share with you how to set up this type of environment.

    S3 Configuration. For hosting content uploaded by Troll Purse forum users, S3 was used to store images. Since NodeBB has a nice S3 upload plugin, there was little to no work other than configuration needed to enable the feature. S3, on the other hand, required configuration to allow access from http://forums.trollpurse.com. However, it also needed to allow access to the real DNS hostname (according to AWS) for the actual server to update data. This meant a custom S3 CORS policy and S3 Bucket Policy. Finally, the role our server would assume needed to have full access to S3 buckets. Further, Troll Purse could restrict access by bucket name. Below are examples Troll Purse built up to help restrict access to an S3 bucket. Note, AWS will still mark it as public. However, there was a configuration that allowed public GET without S3 being marked public.

    S3 Bucket Policy

        {
            "Version": "2012-10-17",
            "Id": "website access bucket Policy",
            "Statement": [
                {
                    "Sid": "Allow get requests originating from your domain.",
                    "Effect": "Allow",
                    "Principal": "*",
                    "Action": "s3:GetObject",
                    "Resource": "arn:aws:s3:::your-bucket-name/*",
                    "Condition": {
                        "StringLike": { "aws:Referer": "http://your-domain-name/*" }
                    }
                },
                {
                    "Sid": "Deny get requests not originating from your domain.",
                    "Effect": "Deny",
                    "Principal": "*",
                    "Action": "s3:GetObject",
                    "Resource": "arn:aws:s3:::your-bucket-name/*",
                    "Condition": {
                        "StringNotLike": { "aws:Referer": "http://your-domain-name/*" }
                    }
                },
                {
                    "Sid": "Create, Update, Delete for ARN",
                    "Effect": "Allow",
                    "Principal": {
                        "AWS": "arn:aws:iam::xxxxxxxxx:role/your-role-used-for-s3-access-and-management"
                    },
                    "Action": [
                        "s3:PutObject",
                        "s3:GetObject",
                        "s3:DeleteObject"
                    ],
                    "Resource": "arn:aws:s3:::your-bucket-name/*"
                }
            ]
        }

    S3 CORS Policy

        <?xml version="1.0" encoding="UTF-8"?>
        <CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
            <CORSRule>
                <AllowedOrigin>your-domain-name</AllowedOrigin>
                <AllowedMethod>GET</AllowedMethod>
                <AllowedMethod>PUT</AllowedMethod>
                <AllowedMethod>POST</AllowedMethod>
                <AllowedMethod>DELETE</AllowedMethod>
                <MaxAgeSeconds>3000</MaxAgeSeconds>
                <AllowedHeader>Authorization</AllowedHeader>
            </CORSRule>
        </CORSConfiguration>

    Redis Configuration. For Redis, Troll Purse used the default configurations provided by AWS ElastiCache. This cache was put in a private subnet, accessible only to services in the Troll Purse VPC as configured. Currently, Troll Purse is using the free tier cache.t2.micro instance. Other than that, the Launch Configuration just needs a reference to the DNS endpoint of the cache.

    VPC Configuration. AWS VPC is great for creating logically segregated services for an environment.

    Subnets. Following the standard AWS architecture diagram for a public/private subnet setup, Troll Purse created two subnets. There is the public subnet, which hosts the forum instances and the load balancer. There is then the private subnet, which has no internet access. The private subnet contains the forum's Redis service.

    Security Groups. Troll Purse set up two different Security Groups: one for services bound to the public subnet and another for services bound to the private subnet. The only real difference is how inbound internet traffic is configured. The public security group allows inbound internet traffic; the private security group does not. This is further strengthened by Route Tables.

    Route Tables. The Route Tables used were configured according to the same architecture. There were two Route Tables. The first route table was created for the public subnet. This allows internet traffic in via the Internet Gateway bound to the public subnet. The second route table was created for the private subnet. This Route Table did not receive configuration for public internet access.

    IAM Role Configuration. To get our environment up using NodeBB with Redis, Troll Purse created a new IAM Role for EC2 instances meant to host NodeBB. This role did not need a lot of thought put into it. All it needed was full S3 access and full Redis access. From here, Troll Purse uses two more AWS services to provide data storage for the forums.

    Auto-Scaling Configuration. Using our existing configuration, Troll Purse created an Auto Scaling configuration using the base Amazon Linux AMI on a t2.micro instance. We don't do anything else special. Troll Purse set the default configuration of Min instances to 1 and Max instances to 1. This ensures the service will always be running one instance, whether it fails or not. Note: Make sure to use ELB Health Checks - this will verify the web service is actually running on the instance.

    Launch Configuration User Data. Here is a wonderful gist provided by one of our AWESOME developers (Disclaimer: I authored this post - totally biased opinion) used as a Launch Configuration. Soon Troll Purse will take away half of that setup and make an image for EC2 to use. Then only NodeBB configuration and launch information will be required for the Launch Configuration.

    EC2 Configuration. There wasn't anything to do for EC2, since all of our instance information was set up using Auto Scaling.

    Conclusion. Setting up an environment in AWS for our forums took about two days of building and verifying. These changes required no code whatsoever. All Troll Purse had to do was select from a large suite of services to support the desired results. So, now that they exist, join up on the forums! Originally Posted on Troll Purse Dev Blog