Bj

Members

  • Content count: 19
  • Community Reputation: 260 Neutral

About Bj

  • Rank: Member
  1. It appears I had the wrong version of the Media Feature Pack installed. Downloading this one fixed the problem: https://www.microsoft.com/en-US/download/details.aspx?id=49919
  2. Ah, this was supposed to go into 'For Beginners' :O
  3. Hello, I'm trying to add audio to my game, and to load audio files I want to use Media Foundation. However, when I try to initialize it with this line:

         MFStartup(MF_VERSION);

     it crashes before the program even loads, with the following error:

     "Unable to activate Windows Store App '031...'. The process started but failed with error 'The App didn't start'."

     If I comment this line out the game starts, but I need it to use audio. What am I doing wrong? :O

     /Björn
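     For readers landing here: a crash before the program even starts is consistent with a missing mfplat.dll, since N/KN editions of Windows ship without it until the Media Feature Pack is installed (see the fix in item 1 above). Once the DLL is present, the standard startup/shutdown pair looks like this; a minimal sketch, with the function name being mine:

         #include <mfapi.h>
         #pragma comment(lib, "mfplat.lib")

         bool InitMediaFoundation()
         {
             // MFStartup verifies the Media Foundation version and loads the platform.
             HRESULT hr = MFStartup(MF_VERSION);
             if (FAILED(hr))
                 return false;   // platform present but failed to start

             // ... create source readers, load audio files, etc. ...

             MFShutdown();       // pair every successful MFStartup with MFShutdown
             return true;
         }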
  4. The problem was a simple one:

         file.open(Filename, std::ios::out)

     needed to be

         file.open(Filename, std::ios::out | std::ios::binary);

     since the data has to be written in binary mode.
  5. Thank you for your answers. =)

     I do store them that way:

         for (auto it = ImageData.begin(); it != ImageData.end(); ++it)
         {
             char red = it->r;
             char green = 0;
             char blue = 0;
             file.write(&blue, sizeof(char));
             file.write(&green, sizeof(char));
             file.write(&red, sizeof(char));
         }

     I thought a width of 640 would already be aligned, since 640 pixels * 3 bytes = 1920 bytes, and 1920 % 4 = 0. Isn't this correct?

     Edit: If I change to constant values it works and generates a red picture as expected:

         temp.r = i % 255;

     becomes

         temp.r = 255;
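     For reference, the alignment reasoning above is correct. The general BMP rule is that each scanline is padded up to a 4-byte boundary, so 640 pixels at 24 bpp needs no padding. A sketch of the general computation (function and variable names are mine):

         // Each BMP scanline must be padded up to a 4-byte boundary.
         unsigned int RowPadding(unsigned int width, unsigned int bytesPerPixel)
         {
             unsigned int rowBytes = width * bytesPerPixel;   // 640 * 3 = 1920
             return (4 - (rowBytes % 4)) % 4;                 // 1920 % 4 == 0 -> no padding
         }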
  6. I am having some trouble generating a .bmp file. From the code below I expected it to come out in shades of red, but as you'll see from the included picture it becomes both blue and green instead. Any ideas why this is happening?

         struct Pixel
         {
             unsigned char r;
             unsigned char g;
             unsigned char b;
         };

     The main function generates the vector that holds all the pixel data; the color depends on the height of the image.

         int main()
         {
             std::vector<Pixel> ImageData;
             unsigned int Width = 640;
             unsigned int Height = 480;

             for (unsigned int i = 0; i < Height; i++)
             {
                 for (unsigned int j = 0; j < Width; j++)
                 {
                     Pixel temp;
                     temp.r = i % 255;
                     temp.g = 0;
                     temp.b = 0;
                     ImageData.push_back(temp);
                 }
             }

             WriteBitMap("raytracer.bmp", Width, Height, ImageData);

             return 0;
         }

     This writes the two headers needed for BMP files and then the image data:

         bool WriteBitMap(std::string Filename, const unsigned int Width, const unsigned int Height, std::vector<Pixel> ImageData)
         {
             // Create file format headers
             BitmapFileHeader fh;
             BitmapInfoHeader ih;

             fh.bmtype[0] = 'B';
             fh.bmtype[1] = 'M';
             fh.iFileSize = sizeof(BitmapFileHeader) + sizeof(BitmapInfoHeader) + (Width * Height * 3);
             fh.iOffsetBits = sizeof(BitmapFileHeader) + sizeof(BitmapInfoHeader);

             ih.iSizeHeader = 40;
             ih.iWidth = Width;
             ih.iHeight = Height;
             ih.iPlanes = 1;
             ih.iBitCount = 24;
             ih.Compression = 0;

             // Open file for writing
             std::ofstream file;
             file.open(Filename, std::ios::out);

             file.write(reinterpret_cast<char*>(&fh), sizeof(BitmapFileHeader));
             file.write(reinterpret_cast<char*>(&ih), sizeof(BitmapInfoHeader));

             for (auto it = ImageData.begin(); it != ImageData.end(); ++it)
             {
                 char red = it->r;
                 char green = 0;
                 char blue = 0;
                 file.write(&blue, sizeof(char));
                 file.write(&green, sizeof(char));
                 file.write(&red, sizeof(char));
             }

             file.close();
             return false;
         }

     Edit: See solution further down. The file needed to be opened in binary mode.
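     The post doesn't show the header structs, so here is a minimal sketch of definitions compatible with the field names used above (the definitions are my assumption; the key detail is the packing pragma, since without it the 14-byte file header would get padded by the compiler and every offset in the file would be wrong):

         #include <cstdint>

         #pragma pack(push, 1)  // BMP headers must be written without compiler padding
         struct BitmapFileHeader
         {
             char     bmtype[2];    // "BM"
             uint32_t iFileSize;    // total file size in bytes
             uint16_t reserved1;
             uint16_t reserved2;
             uint32_t iOffsetBits;  // byte offset to the pixel data
         };

         struct BitmapInfoHeader
         {
             uint32_t iSizeHeader;  // 40 for BITMAPINFOHEADER
             int32_t  iWidth;
             int32_t  iHeight;
             uint16_t iPlanes;      // must be 1
             uint16_t iBitCount;    // 24 for B,G,R bytes
             uint32_t Compression;  // 0 = BI_RGB (uncompressed)
             uint32_t iSizeImage;
             int32_t  iXPelsPerMeter;
             int32_t  iYPelsPerMeter;
             uint32_t iClrUsed;
             uint32_t iClrImportant;
         };
         #pragma pack(pop)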
  7. Hello, when loading data from my application into the vertex buffer, it seems that random data is loaded instead of my vertices. See picture:

     [attachment=25741:vb.jpg]

     However, my vertices are defined as follows (there are 24 in total; the model is a cube):

     [attachment=25742:vertices.jpg]

     I create my buffer like this:

         D3D11_BUFFER_DESC vertexBufferDesc;
         ZeroMemory(&vertexBufferDesc, sizeof(vertexBufferDesc));
         vertexBufferDesc.Usage = D3D11_USAGE_DEFAULT;
         vertexBufferDesc.ByteWidth = sizeof(SimpleVertex2) * _vertexes.size();
         vertexBufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
         vertexBufferDesc.CPUAccessFlags = 0;
         vertexBufferDesc.MiscFlags = 0;

         D3D11_SUBRESOURCE_DATA InitialVertexData;
         ZeroMemory(&InitialVertexData, sizeof(InitialVertexData));
         InitialVertexData.pSysMem = &_vertexes;
         InitialVertexData.SysMemPitch = 0;
         InitialVertexData.SysMemSlicePitch = 0;

         _gfx->GetDevice()->CreateBuffer(&vertexBufferDesc, &InitialVertexData, &_vertexbuffer);

     I set my vertex buffer before rendering:

         unsigned int _stride = sizeof(SimpleVertex2);
         unsigned int _offset = 0;
         _gfx->GetContext()->IASetVertexBuffers(0, 1, &_vertexbuffer, &_stride, &_offset);

     My vertex format:

         struct SimpleVertex2
         {
             XMFLOAT4 Pos;
             XMFLOAT3 Normal;
         };

     I am having trouble locating the source of the error. Hoping you can help me once again with my problems.

     Edit: The problem was on this line:

         InitialVertexData.pSysMem = &_vertexes;

     When using a std::vector to initialize a vertex buffer you need to point at the elements, not at the vector object itself:

         InitialVertexData.pSysMem = &_vertexes[0];
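     A note on the fix: &_vertexes[0] works, but since C++11 std::vector::data() says the same thing more directly. A sketch reusing the names from the post:

         // Point D3D11 at the vector's contiguous element storage, not at the
         // std::vector object itself (which is just a small control block).
         InitialVertexData.pSysMem = _vertexes.data();
         InitialVertexData.SysMemPitch = 0;
         InitialVertexData.SysMemSlicePitch = 0;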
  8. Thank you for the answers. I've uploaded a file here which contains the raw data I receive:

     http://speedy.sh/69QPv/weather4.txt

     I read your link, Endurion, and it seems you are correct! I was thinking along these lines in the beginning, but reading 8000 bytes landed me in the middle of the XML; however, the number was encoded in hexadecimal, and 0x8000 is 32768 bytes.

     Thank you Glass_Knife and Endurion for your help :)
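     For anyone else hitting this: the stray "008000" is HTTP chunked transfer encoding, where each chunk of the body is prefixed by its size in hexadecimal on its own line. Roughly, the raw stream looks like this (illustrative, not the exact yr.no response):

         HTTP/1.1 200 OK
         Content-Type: text/xml
         Transfer-Encoding: chunked

         008000
         ...32768 bytes of XML...
         008000
         ...next 32768 bytes of XML...
         0

     A raw-socket client has to strip the size lines and their trailing CRLFs itself; an HTTP library (for example Java's HttpURLConnection) decodes the chunking automatically.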
  9. This is how I parse it:

         NetClient client = new NetClient();
         String temp = client.GetData(60.10F, 9.58F, 70);
         DocumentBuilderFactory dbFactory = DocumentBuilderFactory.newInstance();
         DocumentBuilder dBuilder;

         try
         {
             dBuilder = dbFactory.newDocumentBuilder();
             InputStream is = new ByteArrayInputStream(temp.getBytes());
             Document doc = dBuilder.parse(is);
         }

     And thank you for the fast reply.

     I do save it as a txt file as well, just the pure data received, but it contains the "008000" as well, which is why I checked the packets using Wireshark. Perhaps I should try another server and see if it remains.
  10. I have a project where I download an XML file from a server using sockets, making the request with GET. This is the file:

      http://api.yr.no/weatherapi/locationforecast/1.9/?lat=60.10;lon=9.58;msl=70

      However, the received file contains the string "008000" in some places, which is not present when the URL is opened in my browser, and this string breaks the XML formatting. Example:

          <location altitude="70" latitude="6 008000 0.1000" longitude="9.5800">

      I used Wireshark to see whether this was also sent from the server or created on my side, and it seems the server sends it. Any ideas on how to fix this?

      My code:

          import java.io.DataInputStream;
          import java.io.FileNotFoundException;
          import java.io.IOException;
          import java.io.OutputStream;
          import java.io.PrintWriter;
          import java.net.*;

          public class NetClient
          {
              Socket clientSocket = new Socket();
              InetSocketAddress ip = new InetSocketAddress("api.yr.no", 80);

              public String GetData(float _latitude, float _longitude, int _msl)
              {
                  try
                  {
                      byte[] data = new byte[65000];
                      String translateddata = "";

                      clientSocket.connect(ip);
                      DataInputStream inData = new DataInputStream(clientSocket.getInputStream());
                      OutputStream outData = clientSocket.getOutputStream();

                      PrintWriter pw = new PrintWriter(outData, false);
                      pw.print("GET /weatherapi/locationforecast/1.9/?lat=" + _latitude + ";lon=" + _longitude + ";msl=" + _msl + " HTTP/1.1\r\n");
                      pw.print("Host: api.yr.no\r\n");
                      pw.print("Accept: text/xml\r\n");
                      pw.print("\r\n");
                      pw.flush();

                      Thread.sleep(1000);

                      int bytesread = 0;
                      int i = 0;
                      while (bytesread != -1)
                      {
                          bytesread = inData.read(data);
                          if (bytesread != -1)
                          {
                              // (note: this converts the whole 65000-byte buffer, not just bytesread bytes)
                              translateddata = translateddata + new String(data);
                              String temp = new String(data);

                              PrintWriter file = new PrintWriter("weather" + i++ + ".txt");
                              file.write(temp);
                              file.close();
                          }
                      }
                      clientSocket.close();
                      return translateddata;
                  }
                  catch (IOException e)
                  {
                  }
                  catch (InterruptedException e)
                  {
                      e.printStackTrace();
                  }
                  return "Something went wrong when trying to download data from remote server!";
              }
          }
  11. OK, I found the problem. I was using the DirectX Tool Kit to render text, which disables the depth buffer to do so. I just had to enable it again :)

          D3DDeviceContext->OMSetDepthStencilState(pDSState, 1);
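      For context: DirectX Tool Kit's SpriteBatch sets its own render states in Begin(), including a depth-disabled depth-stencil state, and does not restore yours afterwards, so the usual pattern is to re-apply your own state after End(). A sketch, assuming a spriteBatch/spriteFont pair alongside the pDSState created in the post below:

          // SpriteBatch::Begin() switches to its own states (depth test off, etc.)
          spriteBatch->Begin();
          spriteFont->DrawString(spriteBatch.get(), L"Hello", DirectX::XMFLOAT2(10.0f, 10.0f));
          spriteBatch->End();

          // Restore the 3D pipeline's depth-stencil state before drawing geometry again.
          D3DDeviceContext->OMSetDepthStencilState(pDSState, 1);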
  12. I have a problem where objects behind other objects are rendered.

      I have set up my depth buffer as follows (code from MSDN):

          ID3D11Texture2D* pDepthStencil = NULL;
          D3D11_TEXTURE2D_DESC descDepth;
          descDepth.Width = 640;
          descDepth.Height = 480;
          descDepth.MipLevels = 1;
          descDepth.ArraySize = 1;
          descDepth.Format = DXGI_FORMAT_D32_FLOAT_S8X24_UINT;
          descDepth.SampleDesc.Count = 1;
          descDepth.SampleDesc.Quality = 0;
          descDepth.Usage = D3D11_USAGE_DEFAULT;
          descDepth.BindFlags = D3D11_BIND_DEPTH_STENCIL;
          descDepth.CPUAccessFlags = 0;
          descDepth.MiscFlags = 0;
          D3DDevice->CreateTexture2D(&descDepth, NULL, &pDepthStencil);

          D3D11_DEPTH_STENCIL_DESC dsDesc;

          // Depth test parameters
          dsDesc.DepthEnable = true;
          dsDesc.DepthWriteMask = D3D11_DEPTH_WRITE_MASK_ALL;
          dsDesc.DepthFunc = D3D11_COMPARISON_LESS;

          // Stencil test parameters
          dsDesc.StencilEnable = true;
          dsDesc.StencilReadMask = 0xFF;
          dsDesc.StencilWriteMask = 0xFF;

          // Stencil operations if pixel is front-facing
          dsDesc.FrontFace.StencilFailOp = D3D11_STENCIL_OP_KEEP;
          dsDesc.FrontFace.StencilDepthFailOp = D3D11_STENCIL_OP_INCR;
          dsDesc.FrontFace.StencilPassOp = D3D11_STENCIL_OP_KEEP;
          dsDesc.FrontFace.StencilFunc = D3D11_COMPARISON_ALWAYS;

          // Stencil operations if pixel is back-facing
          dsDesc.BackFace.StencilFailOp = D3D11_STENCIL_OP_KEEP;
          dsDesc.BackFace.StencilDepthFailOp = D3D11_STENCIL_OP_DECR;
          dsDesc.BackFace.StencilPassOp = D3D11_STENCIL_OP_KEEP;
          dsDesc.BackFace.StencilFunc = D3D11_COMPARISON_ALWAYS;

          // Create depth stencil state
          ID3D11DepthStencilState* pDSState;
          D3DDevice->CreateDepthStencilState(&dsDesc, &pDSState);

          // Bind depth stencil state
          D3DDeviceContext->OMSetDepthStencilState(pDSState, 1);

          D3D11_DEPTH_STENCIL_VIEW_DESC descDSV;
          descDSV.Format = DXGI_FORMAT_D32_FLOAT_S8X24_UINT;
          descDSV.ViewDimension = D3D11_DSV_DIMENSION_TEXTURE2D;
          descDSV.Texture2D.MipSlice = 0;
          descDSV.Flags = 0;

          // Create the depth stencil view
          D3DDevice->CreateDepthStencilView(pDepthStencil, // Depth stencil texture
                                            &descDSV,      // Depth stencil desc
                                            &pDSV);        // [out] Depth stencil view

          // Bind the depth stencil view
          D3DDeviceContext->OMSetRenderTargets(1,                 // One rendertarget view
                                               &RenderTargetView, // Render target view, created earlier
                                               pDSV);             // Depth stencil view for the render target

      And I clear the buffer every frame:

          D3DDeviceContext->ClearDepthStencilView(pDSV, D3D11_CLEAR_DEPTH | D3D11_CLEAR_STENCIL, 1.0f, 0);

      I get no warnings in the output. Have I forgotten something?
  13. Solved! The problem was

          Effects.push_back(new Position3NormalColor());

      which should have been

          Effects.push_back(new EffectPosition3NormalColor());

      Hi, I have a problem with the following code:

          class EffectPosition3NormalColor : virtual public IEffect
          {
              // code
          };

          vector<IEffect*> Effects;
          Effects.push_back(new Position3NormalColor());

      It gives me the following error:

          1>error C2664: 'void std::vector<_Ty>::push_back(IEffect *&&)' : cannot convert parameter 1 from 'Position3NormalColor *' to 'IEffect *&&'
          1>          with
          1>          [
          1>              _Ty=IEffect *
          1>          ]
          1>          Reason: cannot convert from 'Position3NormalColor *' to 'IEffect *'

      Isn't this possible when Position3NormalColor inherits from IEffect?
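      In hindsight the error message says it all: Position3NormalColor was a different type that does not derive from IEffect, while the intended class was EffectPosition3NormalColor. The derived-to-base pointer conversion itself is implicit and needs no cast, as this minimal sketch (with stand-in members) shows:

          #include <vector>

          struct IEffect
          {
              virtual ~IEffect() = default;  // virtual destructor so delete-through-base is safe
          };

          struct EffectPosition3NormalColor : virtual public IEffect
          {
              // ...
          };

          int main()
          {
              std::vector<IEffect*> Effects;
              // Derived* converts to Base* implicitly; no cast needed.
              Effects.push_back(new EffectPosition3NormalColor());
              for (IEffect* e : Effects) delete e;
          }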
  14. Never mind, solved it by changing x and z to float instead of double.
  15. Hello, I get a warning when compiling my project. The project works as intended, but it would be nice to resolve the warning. It says:

          warning C4244: 'argument' : conversion from 'double' to 'float', possible loss of data

      for each of these lines:

          DirectX::XMMATRIX rotationMatrix = DirectX::XMMatrixRotationY(0.5f * DirectX::XM_PI);
          Eye = DirectX::XMVectorSet(x, 3.0f, z, 0.0f);
          At = DirectX::XMVectorSet(x + sin(CameraRotationHorizontal), 3.0f, z + cos(CameraRotationHorizontal), 0.0f);

      They are declared as the following types:

          DirectX::XMVECTOR Eye;
          DirectX::XMVECTOR At;
          DirectX::XMMATRIX rotationMatrix;

      How do I solve these? They're all predefined types from the DirectX API with their associated functions; I'm not sure what I can do here.
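      As item 14 above notes, the warning disappears once everything stays in float: XMVectorSet takes float arguments, while x, z, and sin()/cos() applied to a double all produce doubles. A sketch of the two usual fixes (function names are mine, variable names reused from the post):

          #include <DirectXMath.h>
          #include <cmath>

          // Option 1: keep the camera state in float so nothing narrows to begin with.
          DirectX::XMVECTOR MakeAt(float x, float z, float cameraRotationHorizontal)
          {
              return DirectX::XMVectorSet(x + sinf(cameraRotationHorizontal), 3.0f,
                                          z + cosf(cameraRotationHorizontal), 0.0f);
          }

          // Option 2: if the values must stay double, cast explicitly at the call site.
          DirectX::XMVECTOR MakeAtFromDouble(double x, double z, double rot)
          {
              return DirectX::XMVectorSet(static_cast<float>(x + sin(rot)), 3.0f,
                                          static_cast<float>(z + cos(rot)), 0.0f);
          }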